For those who think that the USA is, or ever has been, a Christian nation, remember that the good European Christians who founded it did so by killing and enslaving non-Christians. Those good Christians robbed the land from the Natives and kidnapped Africans to work the fields, killing huge numbers of them in the process. Do you feel comfortable claiming the name “Christian” for your country, given those facts? If the U.S. is a Christian nation, it is so in the most depraved sense of the term.
A Melting Pot?
As an American, I would rather proclaim that we are a nation of diverse origins, always struggling to become a better place to live. I would rather admit that we started out badly: that many of us were genocidal, that many of us enslaved people, and that the rest of us were content to let it happen and reap the benefits. I would rather try to rid our country of the remnants of that horrendous past, not by ignoring it, but by correcting the inequalities that remain. I would rather focus on building on the good parts of what we did: welcoming immigrants and refugees, abolishing child labor, finally extending the right to vote and own property to minority groups and women, and improving our experiment with democracy.
If you want to call the U.S. a Christian nation, do so. I don’t think it’s accurate, but I am not interested in political correctness for its own sake, nor in trying to eliminate religious references or practices in this country. There is no doubt that America has been strongly influenced by various forms of Christianity, in both positive and negative ways. I suggest, though, that you figure out which Christian values you intend to promote by such nomenclature. You could start by specifying that slavery and genocide are bad, and that respecting human beings and being good stewards of the land are good.