How Will The Shocking Decline Of Christianity In America Affect The Future Of This Nation?

Is Christianity in decline in America? When you examine the cold, hard numbers, it is difficult to come to any other conclusion. Over the past few decades, the percentage of Americans who identify as Christian has been steadily shrinking, and this has been especially true among young people. As you will see later in this article, there has been a mass exodus of teens and young adults out of U.S. churches. In addition, what "Christianity" means to American Christians today is often far different from what it meant to their parents and grandparents. Millions upon millions of Christians in the United States no longer believe many of the fundamental principles of the Christian faith. Without a doubt, America is becoming a less "Christian" nation, and this has staggering implications for the future of the country. The United States was founded primarily by Christians who were seeking to escape religious persecution. For those early settlers, the Christian faith was the very center of their lives, and it deeply shaped the laws they made and the governmental structures they established. So what will the future of America look like if we totally reject the principles that this nation was founded on?
