Beyond the Congregation: The World of Christian Nonprofits

Christopher P. Scheitle

Print publication date: 2010

Print ISBN-13: 9780199733521

Published to Oxford Scholarship Online: September 2010

DOI: 10.1093/acprof:oso/9780199733521.001.0001



Appendix

Data Collection Methods

Beyond the Congregation
Oxford University Press

Most of the data presented throughout this book come from an original collection conducted by the author. The collection consisted of three stages that began in 2006. First, organizations fulfilling certain criteria had to be identified. Second, tax returns for these organizations had to be located and financial information from them entered into a database. Third, organizational narratives provided on these tax returns had to be coded and combined with the financial measures.

Defining the Target Population

While a significant amount of research has examined local religious social service organizations, almost no systematic research has looked at the types of organizations more commonly cited as representing the parachurch sector. Specifically, little research has examined the larger Christian organizations with a national and/or international reach. The goal of this book has been to fill this gap. The target population of interest for this project was therefore defined as large Christian nonprofit organizations based in the United States that operate on a national or international scope and are not under the financial or administrative control of official denominational organizations. This definition still required some specification and operationalization, since “large” and “national/international” are not entirely clear concepts.1

The national or international scope requirement was defined as having operations either in a foreign country (i.e., international) or in at least two states in the United States (i.e., national). The “large” requirement is inherently relative, but it ended up being tied to the national/international scope standard. Below a total revenue of $200,000, organizations were increasingly unlikely to fulfill the national/international scope standard and were also more vague in their descriptions of activities (likely because smaller organizations are still trying to figure out exactly what they are doing). The data collection was therefore limited to organizations with total revenue greater than $200,000.2 Because the availability of scanned tax returns lags by about two years, the most consistently available recent data when this collection began came from 2004, so that is the year utilized in these data.

So to be included in these data, the organization had to

1. be a 501(c)(3) public charity,
2. identify as Christian,
3. have revenue of more than $200,000 in 2004,
4. operate in at least two U.S. states and/or in a foreign country, and
5. not be under the financial or administrative control of a church or denomination.
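As a rough sketch, these five criteria amount to a simple filter applied to each candidate organization. The field names below are hypothetical illustrations, not drawn from the study's actual database.

```python
# Sketch of the inclusion filter; all field names are hypothetical.
def meets_criteria(org):
    """Return True if an organization satisfies all five inclusion criteria."""
    return bool(
        org["is_501c3_public_charity"]
        and org["identifies_as_christian"]
        and org["revenue_2004"] > 200_000
        # national (two or more states) OR international (foreign operations)
        and (len(org["us_states"]) >= 2 or org["operates_abroad"])
        and not org["denominationally_controlled"]
    )

# A hypothetical organization: single-state but with foreign operations
sample = {
    "is_501c3_public_charity": True,
    "identifies_as_christian": True,
    "revenue_2004": 350_000,
    "us_states": ["TX"],
    "operates_abroad": True,
    "denominationally_controlled": False,
}
print(meets_criteria(sample))  # True: international scope substitutes for multi-state reach
```

Note that criterion 4 is a disjunction: an organization operating in only one state still qualifies if it works in a foreign country.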

Identifying the Organizations

With the criteria for inclusion set, the next step was to find the organizations that met the criteria. In the absence of a list of Christian nonprofits, I had to look for other sources. The closest thing to a list of Christian nonprofits is the National Center for Charitable Statistics’ (NCCS) database of all nonprofits that submit annual tax returns.3 Looking through every one of the over 303,000 nonprofits contained in the 2004 database to determine which ones fulfill the criteria would be an impossible task. Fortunately, the NCCS provides some guidance by assigning each organization a code based on its activities or mission. This coding system is called the National Taxonomy of Exempt Entities (NTEE). The NTEE has 10 broad categories:4

1. Arts, Culture, and Humanities
2. Education
3. Environment and Animals
4. Health
5. Human Services
6. International, Foreign Affairs
7. Public, Societal Benefit
8. Religion Related
9. Mutual/Membership Benefit
10. Unknown, Unclassified.

Under each of these broad categories are a variety of subcategories. Of interest to my data collection were those subcategories listed under the “Religion Related” category:5

• 1 Alliances & Advocacy
• 2 Management & Technical Assistance
• 3 Professional Societies & Associations
• 5 Research Institutes & Public Policy Analysis
• 11 Single Organization Support
• 12 Fund-Raising & Fund Distribution
• 19 Support—Not Else Classified
• 20 Christian
• 21 Protestant
• 22 Roman Catholic
• 30 Jewish
• 40 Islamic
• 50 Buddhist
• 70 Hindu
• 80 Religious Media & Communications
• 81 Religious Film & Video
• 82 Religious Television
• 83 Religious Printing & Publishing
• 84 Religious Radio
• 90 Interfaith Coalitions
• 99 Religion-Related—Not Else Classified.

The bold categories were examined in detail to determine whether the organizations within them fulfilled the criteria.6 Using the NCCS database, queries were run to limit the returned organizations to those with total revenue above $200,000.7 Once the names of the organizations were found, the actual digital (i.e., scanned) 990 tax return for each organization was located using either www.guidestar.org or www.foundationcenter.org. Using the self-description on the 990 form, it was determined whether the organization fulfilled the remaining criteria: national/international scope, Christian identity, and independence from official denominational control.8 Organizations that fulfilled these criteria were added to the list of included organizations.

Because some Christian nonprofits would not have been identified using just the methods described above, particularly those classified under a different NTEE code (e.g., Education), four other sources were also consulted. The first was the membership list of the Evangelical Council for Financial Accountability (ECFA), which has just over 2,000 members, many of them large national organizations. Second, the over 500 organizations listed on the Christian “watchdog” Web site www.ministrywatch.org were examined for eligibility. Finally, keyword searches (e.g., “Christian,” “ministry,” “bible,” “god,” “faith”) were conducted on www.charitynavigator.org and in the Associations Unlimited database to identify other potential organizations for inclusion. While a handful of organizations were identified solely through one of these four sources, the large majority were confirmations of organizations already identified through the NCCS database.

Entering Financial Data

After creating the list of included nonprofits, the first step of data collection was to enter the financial information for each organization from its 990 tax return for the year ending in 2004.9 All of the revenue, expense, and asset lines on page 1 (Revenue, Expenses, and Assets) and page 2 (Functional Expenses) of the 990 form were entered into a database. For those organizations required to fill it out (specifically, those not claiming church status), information from Schedule A on revenue for the previous four years and on lobbying activity was also entered.

After this information was entered, internal accuracy checks were conducted by comparing the totals entered from the forms against totals computed from the entered parts. For example, Total Revenue was entered directly from the 990 form, but a second “Total Revenue” was computed by summing the component lines that feed into that number. Any discrepancies were then checked against the forms and corrections made. In some cases, the discrepancy was due to an error in the original form. Because it was sometimes impossible to know exactly what was incorrect in the form (i.e., was the total incorrect, or was one of the subparts incorrect? If the latter, which subpart?), these cases were left as they appeared on the original forms.10
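A minimal sketch of this kind of internal accuracy check, assuming each reported total and its component lines are available as numbers (the function name and tolerance are illustrative assumptions, not the study's actual procedure):

```python
def check_total(reported_total, parts, label):
    """Compare a total entered from the 990 form against the sum of its
    entered component lines. Returns None if they agree, or a message
    flagging the discrepancy for manual review against the form."""
    computed = sum(parts)
    if abs(computed - reported_total) > 0.005:  # allow for rounding to cents
        return f"{label}: reported {reported_total} vs computed {computed}"
    return None

# Hypothetical revenue lines for one organization:
# contributions, program service revenue, interest income
parts = [120_000.0, 45_500.0, 3_200.0]
flag = check_total(168_700.0, parts, "Total Revenue")
print(flag)  # None: reported total matches the sum of its parts
```

Flagged cases would then be re-examined against the scanned form itself, since the error may lie in the entry or in the original filing.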

Coding Statements of Purpose

The financial data collected from the 990 forms provided critical information on the identified organizations. However, some key concepts of interest could not be measured by financial numbers alone, such as the organization’s activities and how each organization expresses its religious identity. For both these measures, I looked to the “Statement of Program Service Accomplishments” provided in Part III on page 2 of the 990 forms. Each form was coded as having 1 of 27 primary activities. These 27 activity codes were contained within the nine larger sectors featured in the preceding chapters. These primary activity codes were created inductively based on informal examinations of many organizations and their 990 forms.

Each form was coded independently by two coders. This allowed the reliability of the measures to be assessed using Cohen’s kappa, which measures agreement in coding between the two coders. Kappa adjusts for the agreement expected by chance and is therefore a more stringent measure than simple percent agreement. Reliability scores for each sector, the unweighted average, and the weighted average are shown in table A.1.11
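For illustration, Cohen’s kappa can be computed directly from two coders’ assignments. The sector labels below are hypothetical examples, not the study’s data:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: observed agreement between two coders,
    corrected for the agreement expected by chance."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # chance agreement: product of each coder's marginal frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two coders assigning hypothetical sector codes to ten organizations
a = ["relief", "education", "media", "relief", "missions",
     "relief", "education", "media", "missions", "relief"]
b = ["relief", "education", "media", "relief", "missions",
     "education", "education", "media", "relief", "relief"]
print(round(cohens_kappa(a, b), 2))  # 0.72
```

Here the coders agree on 8 of 10 cases (80 percent), but because chance alone would produce 28 percent agreement given these marginal frequencies, kappa comes out lower, at .72.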

Table A.1. Reliability of Sector Codes.

Average Percentage of Organizations in Category

1. “Charismatic Evangelism”
2. “Relief & Development”
3. “Education & Training”
4. “Publishing & Resources”
5. “Radio & Television”
6. “Missions & Missionary”
7. “Fellowship & Enrichment”
8. “Advocacy & Activism”
9. “Fund-Raising, Grant-Making, & Other”
10. “Unspecific or missing”

Unweighted average kappa
Weighted average kappa

After assessing the reliability of the activity codes, the two coders discussed those organizations on which their codes disagreed, and each organization was assigned an agreed-upon final code. In addition to organizational sector, both coders also assessed the religious expression and identity of each organization by looking for certain religious keywords within the self-description of program service accomplishments. These words are listed in table A.2 with their respective kappa reliability scores.
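A minimal sketch of this keyword coding, assuming simple pattern matching over the statement text. The patterns below cover only two of the keyword groups and are illustrative assumptions, not the study’s full coding scheme:

```python
import re

# Hypothetical patterns for two keyword groups from Table A.2;
# the study's full keyword list is longer.
KEYWORDS = {
    "christ": r"\b(jesus\s+christ|jesus|christ)\b",
    "great_commission": r"\bgreat\s+commission\b",
}

def code_keywords(statement):
    """Flag which religious keyword groups appear in a
    program-service accomplishments statement."""
    text = statement.lower()
    return {name: bool(re.search(pattern, text))
            for name, pattern in KEYWORDS.items()}

desc = "Proclaiming the Gospel of Jesus Christ in fulfillment of the Great Commission."
print(code_keywords(desc))  # {'christ': True, 'great_commission': True}
```

In the study itself the coding was done by human coders rather than automatically, which is why intercoder reliability needed to be assessed; a pattern matcher like this only illustrates what the coders were looking for.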

Table A.2. Reliability of Religion Keywords Codes.

Christ/Jesus/Jesus Christ
Great Commission/Commission
Again, any discrepancies were investigated and corrected. The coding process also assessed whether the organization operated on an international level, whether it was affiliated with a particular religious tradition (e.g., Lutheran, Baptist), and whether it was named after someone listed as an officer on the 990 form. The same reliability scores for each of these measures were computed and are shown in table A.3.

Table A.3. Reliability of International Activity, Faith Tradition, and Executive-Named Codes.

International activities
No international activities mentioned
South/Central America (including Mexico and Caribbean)
Europe (including Russia)
Middle East
Australia (including New Zealand)
Unspecific, but international activities mentioned
Affiliated with faith tradition
Named after executive

(1.) See, for instance, Campbell, David. 2002. “Beyond Charitable Choice: The Diverse Service Delivery Approaches of Local Faith-Related Organizations.” Nonprofit and Voluntary Sector Quarterly 31: 207–230; Pipes, Paula F. and Helen Rose Ebaugh. 2002. “Faith-Based Coalitions, Social Services, and Government Funding.” Sociology of Religion 63: 49–68; Ebaugh, Helen Rose, Paula F. Pipes, Janet Saltzman, and Martha Daniels. 2003. “Where’s the Religion? Distinguishing Faith-Based from Secular Social Service Agencies.” Journal for the Scientific Study of Religion 42: 411–426; Wuthnow, Robert, Conrad Hackett, and Becky Yang Hsu. 2004. “The Effectiveness and Trustworthiness of Faith-Based and Other Social Service Organizations: A Study of Recipients’ Perceptions.” Journal for the Scientific Study of Religion 43: 1–17.

(2.) Nonprofits with gross receipts less than $25,000 do not have to file annual returns. “Gross receipts” is the equivalent of revenue (e.g., contributions + profits from commercial activities) added to the expenses for those commercial activities (e.g., cost of goods sold). Those with gross receipts less than $100,000 file a shorter 990-EZ form.

(3.) Called the “Core File,” this database is produced yearly for 501(c)(3) nonprofits, private foundations, and other tax-exempt organizations.

(4.) “Guide to the National Taxonomy of Exempt Entities.” http://nccs2.urban.org/ntee-cc/

(5.) “National Taxonomy of Exempt Entities-Core Codes.” http://nccs2.urban.org/ntee-cc/summary.htm#x

(6.) Another code (Religious Youth Leadership) in a different major category was also examined.

(7.) The database limits search results to 500 organizations, so multiple queries were required within each NTEE code and revenue group (e.g., Search 1: Code=X01 AND Revenue>1000000; Search 2: Code=X01 AND Revenue<1000000 AND Revenue>500000).

(8.) If necessary and available, the organization’s Web site was also consulted.

(9.) Nonprofits can go by a calendar year beginning January 1 and ending December 31 or a fiscal year, which can begin and end at any time during the year.

(10.) For the 990 sections of Revenue, Expenses, Assets, Functional Expenses, and Schedule A, there were 13, 27, 24, 54, and 35 organizations, respectively, that showed errors in the original forms (although some of these were the same organization with errors in multiple places).

(11.) The weighted average takes into account the percentage of cases assigned to a particular code. This prevents a small category’s reliability score from counting as much as a larger category’s. For the uncondensed activity codes, the unweighted average was .58 and the weighted average was .62.