Rape culture is defined by the Oxford Dictionary as “a society or environment whose prevailing social attitudes have the effect of normalizing or trivializing sexual assault and abuse.” Rape culture isn’t a society that says rape is fine; rather, it’s a society that minimizes rape through things like victim blaming and denying the prevalence of sexual …