Are we living in a post-Christian country?

I agree that many early American leaders were closer to Deism than to any specific Christianity. But in the population as a whole, many of the unchurched, or rather non-denominational, early Americans presupposed the framework of Christian belief and morality. Passages from the Bible were written right into some state legal systems. It was not so much "we have to push our religion on all" as that the whole population, even the deists, took the Natural Law for granted (cf. the Declaration of Independence, for instance).

America was a Protestant-compatible society certainly by the late 1800s, when churches were going up in every town. Those who did not go to church on Sunday were mostly respectful of Christianity, and even they took Christian assumptions about morality for granted, to some extent.

Secularism was gaining before WWII, and the Natural Law was weakening. But as a result of revulsion against Hitler, the Natural Law regained strength during and after the War, which was practically a moral crusade, and Christianity (Catholic and Protestant) gained heavily. It was a Christian country certainly in the 1950s. But that did not last, and of course the Natural Law has been in the crosshairs since 1960.

I realize the Natural Law goes beyond Christianity, but in the West the Catholic Church has hung onto it while others abandoned it, so it is kind of "the canary in the coal mine" for Christianity, to stretch a metaphor.
 