Was America Founded as a Christian Nation?

Timothy J. Sabo
Published in Leftovers, Again
Jul 13, 2021 · 4 min read


Photo by Brad Dodson on Unsplash

This is one of the most challenging questions in American history, and no one ever answers it completely correctly. Here is my take.

No, the United States was not founded or established as a Christian nation. Before the United States of America was established as a nation (in the late 18th century, upon ratification of the Constitution), this land was a multicultural destination for Europeans seeking to practice their faith freely. Many of these colonists came here specifically to escape government-mandated worship: England, the German states, and many other European powers had established, state-sanctioned religions. After Columbus landed on Hispaniola in 1492, frustrated Europeans sought refuge in this land, where they believed they could practice their faith with liberty.

Upon arrival, Europeans were met by people they called "Indians," a name born of the misconception that Columbus had sailed all the way to Asia. These people had long-established cultures, with distinct nations maintaining their own societies. Colonial leaders claimed the land for their kings, and many treaties were signed with the native peoples. William Penn was one of the few who went out of his way to purchase land at a fair price from the natives. Over time, hundreds of treaties were broken, always by the ever-expanding white culture.
