Is the USA a Christian Nation?
This is an interesting article from a secular news source. There can be no doubt that God shed His grace on thee (America) and blessed us to become the greatest nation on God’s green earth. We believe God used America to bless Israel and to incubate Israel as she became a powerful nation capable of defending herself. We also believe that America is great because it has the highest per-capita share of citizens indwelt by the Holy Spirit, and that the Restrainer works through them to restrain the forces of evil. Of course, that is all changing. At some point God will say “enough!” and America will implode. We hope that day comes immediately following the rapture.
********************
Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and historians say that while the issue is complex, the founding documents prioritize religious freedom and do not create a Christian nation.
Does the U.S. Constitution establish Christianity as an official religion?
No.
What does the Constitution say about religion?
“(N)o religious Test shall ever be required as a Qualification to any Office or public Trust under the United States.” (Article VI)
“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.” (First Amendment)
If it says “Congress,” does the First Amendment apply to the states?
It does now. Early in the republic, some states officially sponsored particular churches, such as the Congregational Church in Connecticut and Massachusetts. Within a few decades, all had removed such support. The post-Civil War 14th Amendment guaranteed all U.S. citizens “equal protection of the laws” and said states couldn’t abridge their “privileges or immunities” or deprive them of liberty without due process. In the 20th century, the Supreme Court applied that to a number of First Amendment cases involving religion, saying states couldn’t forbid public proselytizing, fund religious education or sponsor prayer in public schools.
What does it mean to say America is a Christian nation?
It depends on whom you ask. Some believe God worked to bring European Christians to America in the 1600s and secure their independence in the 1700s. Some take the Puritan settlers at their word that they were forming a covenant with God, similar to the Bible’s description of ancient Israel, and see America as still subject to divine blessings or punishments depending on how faithful it is. Still others contend that some or all of the American founders were Christian, or that the founding documents were based on Christianity.