America Was Founded As A Christian Nation By Christians Who Believed That The Bible Is The Word Of God May 4, 2017
America Became A Lot Less Christian During The Presidency Of Barack Obama January 12, 2017