Did You Really Go To Church This Week? Behind the Poll Data
by C. Kirk Hadaway and P.L. Marler
Kirk Hadaway is minister for research and evaluation at the United Church of Christ’s Board for Homeland Ministries. Penny Long Marler is associate professor of religion and philosophy at Samford University in Birmingham, Alabama. This article appeared in The Christian Century, May 6, 1998, pp. 472-475. Copyright by The Christian Century Foundation; used by permission. Current articles and subscription information can be found at www.christiancentury.org. This article prepared for Religion Online by Ted & Winnie Brock.
Church attendance in the U.S. is, apparently, stable and strong. Year after year 40 percent of Americans tell pollsters that they attended church or synagogue in the last seven days. From this evidence, American religion seems quite hardy, especially compared to the statistics from European nations. If the poll data can be believed, three decades of otherwise corrosive social and cultural change have left American church attendance virtually untouched.
Public opinion polls, which measure everything from church attendance to confidence in the president, provide many of the "hard facts" that social scientists (and the general public) use to understand the social world. But how much trust should we put in the polls, particularly in the accounts people give of their own behavior?
Numerous studies show that people do not accurately report their behavior to pollsters. Americans misreport how often they vote, how much they give to charity and how frequently they use illegal drugs. Their misreporting is in the expected direction: people report higher than actual figures for voting and charitable giving, lower for illegal drug use. People are not entirely accurate in their self-reports about other areas as well. Males exaggerate their number of sexual partners; university workers are not very honest about reporting how many photocopies they make. Actual attendance at museums, symphonies and operas does not match survey results.
We should not expect religious behavior to be immune to such misreporting. Several years ago we teamed up with sociologist Mark Chaves to test the 40 percent figure for church attendance. Our initial study, based on attendance counts in Protestant churches in one Ohio county and Catholic churches in 18 dioceses, indicated a much lower rate of religious participation than the polls report. Instead of 40 percent of Protestants attending church, we found 20 percent. Instead of 50 percent of Catholics attending church, we found 28 percent. In other words, actual church attendance was about half the rate indicated by national public opinion polls.
Gerald Marwell, then editor of American Sociological Review, said our research raised questions about "stylized facts" that are passed around "as if they were the truth." Of course, much depends on whose experience does or does not match the presumed "truth" about American church attendance.
Many people, and particularly local church pastors, did not seem surprised by our findings. In fact, a story in the Cleveland Plain Dealer reported that "plenty of religious leaders express private doubts about polls that find almost half of American adults say they worship God each week." Less congratulatory, although still confirming, were the reactions of some of our colleagues, friends and family, which tended to go something like, "So you discovered what everybody else already knew," or "Well, I could have told you that church attendance wasn’t that high without doing a study about it."
Others saw the truth about American church attendance quite differently. While very few laypersons or clergy seemed troubled by our findings, one active laywoman did call to protest that her suburban church was "packed" at every service.
The greatest outcry, however, came from the survey organizations that produce the polls, social scientists who use poll findings to bolster arguments about the vitality of American religion, and a number of Roman Catholic researchers who argued that we exaggerated the overreporting in their constituency. One prominent sociologist, who represents all three groups, said our research was "a sloppy piece of work" and added, "I doubt if the subject was anything but religion that a serious science journal would publish it."
Rather than attack the research directly, the Gallup Organization tried to explain the positive (and, in its eyes, erroneous) response to it. In Emerging Trends Gallup suggested that those who doubt the validity of the 40 percent figure may be reacting to their own experience in places (such as large cities) where attendance is likely to be low.
We did not begin our research with the assumption that the Gallup figures were "wrong." Like other social scientists who use survey data, we trusted Gallup poll results because we knew they employed sound sampling methods. Doubts emerged, however, when we compared statistics on church membership from American denominations to Gallup’s reports on church attendance. If the percentage of Americans attending church is stable, aggregate church membership should have increased as the American population grew. But after adding together denominational membership statistics (including estimates of membership for independent congregations) we found that the aggregate membership total has been virtually static since the late 1960s. This contradiction led us to wonder if Americans were reporting the same level of attendance to pollsters while their actual church participation was dropping. Our first study provided an initial test of this dynamic. Subsequent research confirmed it in important ways.
We returned to Ashtabula County, Ohio, to add a Roman Catholic attendance count to our previous count of Protestants. Because Catholic parishes did not regularly record attendance, we counted Catholic mass attendance ourselves by attending each scheduled mass at every Catholic parish in the county. We attended a total of 38 masses in 13 parishes over several months, counting attendance at each mass. Our counts showed that 24 percent of Catholics attended mass during an average week. In a poll of Ashtabula County residents, however, 51 percent of Roman Catholic respondents said they attended church during the past week. The gap between what people say and do in this rural county is roughly the same as that found in the original study among Catholics in 18 metropolitan dioceses.
We also conducted a study of church attendance in a county in Canada (Oxford County in southern Ontario) using the same methods employed in Ashtabula (using a survey of county residents, attendance reports from Protestant churches and personal counts of Catholic attendance). The results confirm that a large church attendance gap also exists north of the border. The same is true for Great Britain. In the U.S., Canada and Great Britain, the number of people who say they attend church is much higher than the number who actually attend. The proportion of residents who say they attend church is lower in Canada and Britain, of course, but the proportionate size of the discrepancy is remarkably similar.
These studies increased our confidence that church attendance is overreported and that it is not a uniquely American phenomenon. But we also wanted to know why people overreport. Although some colleagues have (somewhat) jokingly accused us of calling decent Americans "liars," we have never argued that people "lie" about their church attendance. Follow-up questions about what people meant by "attending church" revealed that a few were counting things other than attending worship—such as going to weddings, funerals, committee meetings, Sunday school and choir practice. One individual in Ashtabula County even said his attendance consisted of mowing the church lawn on the previous Saturday. Being at church for reasons other than worship "counts" as church attendance for some people who answer poll questions. But these cases represent less than 2 percent of all persons polled, and a large attendance gap remains when they are removed. Why do other people misreport attendance?
A few years ago a longtime staff member of the National Council of Churches and an active church member responded to our findings by admitting that if Gallup called her to inquire about her attendance in the last seven days, she would say she attended even if she had not done so, and she would not consider her response to be a lie. Her reasoning? Saying yes was an affirmation of her involvement in and support of the church. Not attending was atypical, so to count her as a "nonattender" would be inaccurate and misleading.
Most overreporting occurs among those who consider themselves to be regular church attenders. In another study, conducted among members of a large evangelical church in the South, we were able to determine exactly who misreported their attendance. Most of those who said they attended and who, in fact, did not were people who report that they normally attend church "every week." People who attend less often—particularly those who say they normally attend once a month or less—accurately reported that they did not attend church in the previous week.
Researchers who study how people answer survey questions have long known that responses to behavioral questions represent more (or less) than "just the facts." When asked how many times they ate out last week, how frequently they have sex, and whether or not they voted in the last election, most people report what they usually do, what they would like to do or what they think someone like them ought to do. The question that Gallup asks, "Did you, yourself, happen to attend church or synagogue in the last seven days?" provokes similar, often less than factual responses.
Active church members who did not happen to attend church last Saturday or Sunday are expected to say no in response to Gallup’s question. But this creates problems for people who see themselves as committed church members and "weekly attenders." Many have an internal rule that says, "I am a person who attends church every week." Saying "No, I did not attend church" violates that internal rule and identifies them, symbolically, as nonchurchgoers. On the other hand, saying, "Yes, I went" is consistent with their internal rule, counts them on the side of active churchgoers, is in line with their usual behavior (including what they hope to do next week) and affirms their support of the church.
It is possible to reduce the gap between poll-based estimates of church attendance and actual attendance by using questions that do not make the respondent symbolically choose between being churched and unchurched. This is illustrated by the different rates of church attendance produced by different kinds of questions. In Great Britain, for instance, Gallup asks people what they did the previous weekend and presents a list of likely possibilities. Going to church is listed alongside watching television, taking a walk, reading a newspaper and a number of other options. This question produces a weekly attendance rate of about 14 percent.
When the U.S. version of the question is asked in Great Britain, the weekly attendance rate rises to 21 percent. How many people in Great Britain really attend church in a typical week? Peter Brierley’s figures from the 1989 English Church Census and additional attendance data from the 1996-97 UK Christian Handbook indicate that only around 10 percent attend worship services each week. Typically, these lower figures are used when religious activity in Britain is compared to the U.S., which means that churchgoing in America appears to be three or four times greater than in Britain. When the same poll question is used in both countries, however, attendance in the U.S. is only twice as high as it is in Great Britain.
An Australian wording of the church attendance question also produces a lower rate. When asked, "How long is it since you last went to church, apart from weddings, funerals and similar occasions?" 15 percent of Australians said they attended church in the previous week. But when the U.S. version of the question was asked on a national poll in Australia, attendance claims rose to slightly over 20 percent.
Clearly, poll data should not be taken at face value. Moreover, it appears that poll results are not equal: different wording produces significantly different results. Why does it matter? Because the image of religion in America as exceptionally strong and stable has been at least partially supported by poll data. Our research raises doubts about that image.
If the portrait of American religion painted by poll data is not as strong as once thought, does it necessarily follow that it is less stable? Has a large gap always existed between what people say about attendance and what they actually do, or have consistent responses to the polls masked declines in actual church attendance?
The San Francisco Bay area provides the ingredients for testing the possibility of a changing attendance gap. We have accurate attendance counts along with poll-based estimates of church attendance in the region for several decades. Although the Bay area may seem atypical, it does reflect clear trends in the western region of the U.S. and, to a lesser degree, the rest of the country. Mainline Protestants have declined, whereas nontraditional groups, including once-marginal Protestant churches, smaller sects and non-Western religions, have increased. At the same time, a growing number of people have shed their particular religious affiliations, saying they are just "religious" or "spiritual," or have no religion at all.
The Archdiocese of San Francisco has collected attendance data from all its parishes since 1961. In the subsequent 35 years mass attendance fell by almost half, dropping from 205,000 to 107,000. Yet two surveys of community residents in the three-county archdiocese area (one in 1972 and one in 1996) reveal a very stable Roman Catholic population and a stable proportion of Catholics who say they attended church. The net result is an increasing gap between saying and doing. Actual mass attendance dropped while self-reported attendance remained the same.
An increasing attendance gap also was found in Great Britain. When identical survey questions are compared, poll-based rates of church and synagogue attendance are static from 1970 to the mid-1990s. At the same time, actual attendance counts in churches and synagogues dropped by more than a third.
What does a growing gap between saying and doing mean? The issue is one of self-identity at a couple of levels. First, a "churched" identity, once established, seems remarkably resilient and long-lasting. Second, whereas "churched" behavior might be important for establishing such an identity, continued frequent attendance does not seem necessary for people to maintain it.
A middle-aged woman we interviewed in Connecticut—let’s call her Carol—is typical of many people who continue to see themselves as "regular" churchgoers despite increasingly irregular attendance. She was raised in the 1950s and 1960s by parents who were United Church of Christ members and active churchgoers. Carol went to church or church-related youth events almost weekly through her teens but dropped out during college and the early years of marriage, childbearing and raising children. In their early 30s Carol and her husband returned "for the sake of the children" to a Presbyterian church (a compromise between his Episcopal background and her own Congregational one).
Before long, however, Carol’s kids lost interest in church school and the youth group to which few of their best friends belonged. Neither she nor her husband was inclined to fight their children’s (or their own) competing interests. Carol, however, retains a lingering commitment to the church and likes to see herself as a "regular" member. She continues to go when she can, and she has managed to stay connected by donating her silk-screening services for youth retreats and other church events. Now the family attends church together only at Christmas or Easter or for other special services, and even then they may opt for the local Methodist or Episcopal church, depending on service times, the preacher, the music or which family members are going.
Carol and her family don’t know the current minister or many active church members very well. There is less and less pressure to attend. Still, the church seems welcoming and familiar whenever they do go. And if a pollster calls? Well, depending on the time, circumstances or the question, Carol will either say she’s Congregationalist or Presbyterian. And if asked about her church attendance? Considering her volunteer work, her own solo attendance and participation with family members for special observances, she may easily reason that she’s pretty active. She may even, if pushed, say she went "last Sunday." After all, she went the week before and made quite an effort to do so—and there was that memorial service at mid-week at the Episcopal church, and she was expecting her daughter to visit this weekend and certainly they would try to go together...
Regular church attendance is increasingly difficult, even for those committed to it. Sunday morning is no longer "sacred" time: job responsibilities, sports leagues, family outings, housework and many other things get in the way of traveling to a church building for worship at a scheduled time. And if you happen to miss church next weekend, will anyone know if you slept in, comforted a sick child, left town on business, or decided to have brunch at the Hyatt? Church attendance is increasingly a private matter, and it is correspondingly easier for each of us to maintain an idealized image of ourselves as regular attenders when in fact we may only manage to attend church two or three times a month at the most.
As long as the proportion of Americans who see themselves as regular, fairly active churchgoers is stable, the proportion of Americans who say they attend church each week will remain about the same—regardless of the actual level of attendance. Change in self-reported attendance will occur only when it becomes less important for Americans to see themselves as regular churchgoers or when the definition of "regular churchgoer" changes.
An identity transformation of this type occurred among many Roman Catholics in the U.S. following Vatican II, and it may happen to the next generation of Protestants if lower levels of childhood involvement in the church result in a different interpretation of what it means to be a Christian and an "active church member." Similar changes are happening now in Australia, where an increasing number of people are shedding their nominal church identities and saying they have "no religion."
Too much trust in survey data has produced a distorted image of religion in America by masking declines in church participation. Church attendance is less strong and stable than poll data show. Still, many Americans continue to hold the church in great esteem and define themselves in traditional religious terms. The increasing gap between doing and saying reflects these countertrends.
But we do not think that this pattern can continue indefinitely. Enduring church-related identities are a legacy of involvement in the church. When experience is diminished over many years, church identity is likely to erode, and with it the need to say you went to church when you did not. The challenge for American churches is to help reconnect the doing and the saying, before all is said and done.