Apparently, this is no longer a Christian nation, and things aren’t so black and white.
A couple of recent (as in today) stories floating around out there caught my attention…because I wonder what the religious right would have to say about either/both of them.
This from USAToday: Most religious groups in USA have lost ground, survey finds
And this from the BBC (though it’s all over the news on all major (and many minor) news networks): Obama ends stem cell funding ban
One of the more interesting quotes from the USAToday article is this:
“‘The piety gap defines the primary sides in the culture wars,’ Kosmin says.
‘It’s about gay marriage and abortion and stem cells and the family. If a personal God says, ‘Thou shalt not’ or ‘Thou shalt’ see these a certain way, you’d take it very seriously. Meanwhile, three in 10 people aren’t listening to that God,’ he says.”
I wouldn’t be surprised to hear some link these two stories, saying that because of the “falling away” from Christianity, we, the people, are more prone to accept things such as lifting the ban on stem-cell research…which is simply not allowed according to their God.
I wonder about the right’s opinion, because I’d really like to engage in the dialogue. What do people (conservative and/or evangelical Christians) think about the movement away from God? About the movement away from organized religion? And what do you think is the appropriate response?