A Christian Nation?
For some reason there are people who like to fight about whether America is a Christian nation or not. I’m not sure why. I guess it’s to make some kind of a point about America? Maybe it’s to argue that America = Christianity? Maybe it’s to argue that we are somehow special? Maybe it’s to argue that we can take Biblical things out of context, impose restrictions on people, know who to exclude from society, and do all of this with divine blessing?
These arguments don’t usually encourage critical thinking or conversation. They usually devolve into right/wrong bumper-sticker rhetoric, or…