America Was Founded As A Christian Nation By Christians Who Believed That The Bible Is The Word Of God

One of the reasons why America has gotten so far off track is that most of the population has forgotten that our founders intended our country to be a Christian nation with laws based upon the principles found in the Word of God. The other day I encouraged my readers “to look into why our founders came to this country in the first place, what they believed was most important in life, and how they viewed the world”, and this is precisely what I was talking about. The United States was founded by waves of Christian immigrants from Europe, and these were people who took their faith extremely seriously. These days there are so many people running around saying that we should “get back to the Constitution”, but the Constitution itself was based upon the laws, values and principles in the Bible. If we truly want to get back to the way our founders intended this country to run, we have no choice but to get back to the Bible. (Read More...)

America Became A Lot Less Christian During The Presidency Of Barack Obama

In 2006, Barack Obama famously said that “we are no longer just a Christian nation”, and after the last eight years that is now more true than ever. The Pew Research Center has just released a major report entitled “How America Changed During Barack Obama’s Presidency”, and what struck me most about the report was that it showed the United States became a lot less Christian while Obama was in the White House. Of course this trend did not begin under Obama, but it seems to have accelerated during his presidency. (Read More...)