Rape Culture, Feminism, And The Shocking Truth That You Aren’t Being Told

Does a “culture of rape” exist in the United States? For years, feminists have been speaking of a “rape culture” that exists in this country. They claim that rape has become “pervasive and normalized due to societal attitudes about gender and sexuality”. But what they won’t tell you is that the “sexual revolution” that they championed back in the 1960s and 1970s is at the very root of the explosion of sexual violence that we have seen in this nation since that time.

Once upon a time, you could actually let your kids run over to the local playground and play for hours unattended. I know, because that is what my parents did with me when I was growing up. But now you have to watch your children like a hawk because there are hundreds of thousands of sexual predators running around out there.

Today, the average American spends more than five hours a day watching television, and most of the “programming” that Americans allow to be poured into their minds is hypersexualized. The filth that spews forth from our television sets teaches us that the value of a woman is in how she looks, that the number one goal in life for men is to “score”, and even our children are taught to “dress sexy”. Our hypersexualized culture is constantly fanning the flames of sexual desire while at the same time teaching us to disregard all of the traditional boundaries for sex. Needless to say, the results have been absolutely disastrous.

End Of The American Dream