Is America Becoming An Anti-Christ Nation?
Once upon a time, America was a Christian nation. I know that many on the left cringe when they read a statement like that, but it is true. For most of our history, the population of the United States was overwhelmingly Christian, and the values that governed our society were primarily Christian values. But everything has changed in recent decades. When Barack Obama boldly declared in 2009 that "we do not consider ourselves a Christian nation," he was speaking the truth. We are no longer a Christian nation, and we haven't been one for quite some time. So if we aren't a Christian nation at this point, what exactly are we?