I keep hearing about "rape culture," which is defined as a society or environment whose prevailing social attitudes have the effect of normalizing or trivializing sexual assault and abuse.
This is a liberal term, invented by liberals. I'd just like to point out that the biggest "rape culture" contributors are filmmakers and pop culture in general. I'd also like to point out that the vast majority of pop culture and film is created and funded by liberals, like Harvey Weinstein.
My point is that liberalism has been the driving force behind this so-called "rape culture." Maybe it's liberalism that is causing these problems.
I'd also like to point out that the definition of rape is forceful, unwanted penetration of a bodily orifice. Culture is defined as the arts and other manifestations of human intellectual achievement regarded collectively. That means that the term "rape culture" is unfairly and inaccurately defined.
I completely agree. It's just those pesky politically correct liberals who like to stir up misunderstandings, incite violence, then turn around and ask, "How did this all happen? It must be those right-wing extremists who forced us to engage in violence, looting, and abuse."
I just laugh when this happens. You want rape culture? Look at Boko Haram, ISIS, South Africa (just the bad parts), etc. That's rape culture.
In the West, we have women falsely accusing men for money, ruining their lives, and getting away with it scot-free. Seems to me like we have a "false accuser" culture. Jokes.
Exactly! Given the data, it appears that an overwhelming majority of contributors to most of our societal problems are liberals.