Friday, October 06, 2017

Did the Reformation Secularize the West?


We’re all aware that the West is more secular than it used to be. Christianity no longer exerts the same influence over our public institutions as it did centuries ago, and at a personal level churchgoing and Christian belief have been declining in most Western countries for half a century or more.

Scholars’ attempts to explain this phenomenon can sometimes sound like a game of Clue, except instead of trying to solve the murder of poor Mr. Boddy—“It was Colonel Mustard with the candlestick in the billiard room!”—they’re trying to find out who killed religion in the West: “It was the philosophes with the Enlightenment in the eighteenth century!” “It was Darwin with On the Origin of Species in 1859!” “It was the Sexual Revolution with the birth control pill in the 1960s!”
