Woke in Plain English: "Rape Culture"
Rape Culture. When the woke use the term "rape culture," they mean any setting where sexual assault, particularly against women, is treated as normal and largely ignored. In a rape culture, consent is not the norm, victims are routinely blamed, and the legal system favors the accused. To be clear: rape cultures do exist. There are cultures where rape is frequent, where attitudes toward rape victims are hostile, and where the legal system rarely, if ever, convicts those guilty of rape. Examples can be found in parts of India, Pakistan, and South Africa. But when the woke use the term "rape culture," they are not referring to these places, because these regions are largely populated by people with dark skin, a different religion, and a different culture. Instead, the woke are referring to the United States in general and to college campuses in particular, where rates of rape are actually among the lowest in the world.