Was America Founded as a Christian Nation? Part 1: The Founding Fathers

It is often preached from the pulpit that America was founded as a "Christian nation." Perhaps worse than the claim itself is how many people accept it uncritically. To anyone who has spent even a little time investigating the matter, however, the claim is unequivocally false. It scarcely needs refuting: America was the first country founded on the principle that all religions deserved equal respect and none deserved special favor. The Christian doctrine of exclusivity was, in the minds of the founding fathers, incompatible with their loftier principle of a united republic, a United States. Their vision was of an autonomous nation in which religion was just one part of what defined a person, but at the end of the day every citizen, man or woman, could proudly call themselves free; they could call themselves Americans.