A non-Christian coworker today mentioned that he read an article (he’s a news junkie) saying that the number of Christians in America is on the decline. I did a bit of searching and found this NPR article published this morning… “Christians In U.S. On Decline As Number Of ‘Nones’ Grows, Survey Finds.”
The article is pretty short, and mostly stats, but it did make me a bit sad, for two reasons I think. I don’t know if I believe that the actual number of Christians has decreased all that much–I think it’s more that people are no longer self-identifying as Christians if they don’t believe. It used to be that if you were raised in a Christian household, you were both externally and self-identified as being Christian. Thankfully, that is no longer the case. The idea of a “cultural” Christian is going away. Now this isn’t necessarily a bad thing; I mean, if someone is not behaving or showing the fruit of salvation, I believe it is probably better that they label themselves as a non-Christian. Perhaps that is insensitive to say–I don’t see Christianity as a club that must maintain its exclusivity, but in a world in which we already face persecution and ridicule and accusations of hypocrisy, I feel it can only help us to have those who are not striving to be Christlike abandon the title.
On the other hand, I find it saddening because I feel that popular sentiment now views Christianity as being out of touch, or “uncool.” As many debates rage in our nation, Christianity is now viewed, at its worst, as a raging, frothing, hatemongering, pitchfork-wielding group of extremists out to wrest civil rights from the cold dead hands of innocent Americans; and, at its best, as the out-of-touch religious equivalent of the slightly drunken and racist uncle at the family picnic, whose hugs from behind should be avoided at all costs, and whose whisky-soured offensive jokes should be met with teeth-grinding–to use the buzzword of the times–tolerance.
Conservative Christian values are just not trendy anymore. They are seen as oppressive and/or a damper on one’s moral compass. To be fair, this could probably be attributed to religion as a whole. Religion is seen as an additional authority figure; in a land that got its start by bucking religious authority it perceived as unfair, it seems that resistance has extended not only to unreasonable authority, but to ANY authority. That thought is backed up by the fact that religious affiliation as a whole has decreased since 2007.
What do you think? Do you agree that Americans see Christianity as no longer being trendy, and religion as being oppressive? What explanations do you have for those statistics? In other words, where are we going? Leave a comment and let me know!