In recent years, I've noticed a couple of trends (and no, I'm not saying that correlation equals causation, so you can save that objection). Also, spare us the "false dichotomy" objection that will surely come.

At any rate, here are two observations. I find it interesting that (a) Christianity is dying (dead) in countries where Christianity was once strong and dominant and where, at some major point in history after the Reformation, Christian doctrine was used as a rationalization and/or means of politically and/or violently oppressing other people. Consider, for example, all the European countries that colonized Africa, Latin America, and Asia under the banner of God's calling to "mission," and the United States, where Christianity was used as a justification for the oppression of Native Americans, African slaves, and later African Americans. I can't think of a Western colonial power that used Christianity as a justification for oppression and is a place where Christianity is thriving today.

And (b) that orthodox, traditional Protestant Christianity is also not thriving anywhere in the West that has a large, government-run social safety net (i.e., large-scale welfare programs). It makes me wonder if there is something necessary about the church meeting people in the midst of their suffering that's beyond mere benevolence. Maybe acts of mercy are more essential to pointing people to Christ than some modern Western evangelicals are willing to admit. The social assistance state often pushes the church to the margins (and many churches are more than willing to let that happen), and the church therefore loses contact with many people who are in need and who need the gospel. This is seen today in Western Europe and increasingly in the US.

Also, I am fully aware that other factors, like the Enlightenment, anti-supernaturalism, scientific modernism, etc., contributed to the decline, but I'm wondering about these trends as a slow cancer. Thoughts?

Posted by Anthony Bradley