A Christian Nation?
For some reason there are people who like to fight about whether America is a Christian nation or not. I’m not sure why. I guess it’s to make some kind of a point about America??? Maybe it’s to argue that America = Christianity??? Maybe it’s to argue that we are somehow special? Maybe it’s to argue that we can take Biblical passages out of context, impose restrictions on people, decide who to exclude from society, and do it all with divine blessing?
These arguments don’t usually encourage critical thinking or conversation. They usually devolve into right-or-wrong bumper-sticker rhetoric, or, to put it in a more current context, Tweet-level memes.
What’s the point of arguing about whether America is a Christian nation anyway? What does that prove?
How about instead of worrying about whether we can be labeled a Christian nation or not, we Christians just act Christ-like? Let others worry about the label. How revolutionary would that be?
But I know, it’s a whole lot less risky to make arguments, verbally fight with people, and figure out who’s a heretic or unpatriotic than to actually go and live in a Christ-like manner. It takes a whole lot less energy to have these fights than to die to self by living as a disciple. Making claims about whether America is a Christian nation means we get to be in control, and we get to be right! Because apparently being right is what Jesus was most concerned with, in spite of all the Scripture pointing to living faithfully.