So rape isn’t happening anymore.
If you look at TV, movies, or any media, that is.
When glorifying rape and making rape sexy started becoming problematic (as in, yeah, it's actually not any fun for women), Hollywood had no idea what to do.
But if we can't show a conventionally stunning woman getting raped by a conventionally stunning man, what are we going to do? Especially if we're not going to show her struggling just a little bit and then lying down to take it?
So if you notice, no one is a rapist anymore.
They're murderers, torturers, maybe gropers, stalkers… but no one actually rapes anymore. Even when you know they've clearly created this character to be a rapist.
Do I want to watch women get brutally raped? No, I do not.
But it doesn't help society to take the idea that women are still being raped completely out of the culture.
If rape is alluded to, it's kind of like exposition lite. Meanwhile, we can see every manner of living being stabbed, shot, burnt alive, buried alive, tortured, kidnapped, and brutalized in every possible way. But God forbid we actually show what really happens during rape.
It’s like Hollywood just froze when women started gaining power and changing the culture’s idea of what rape really is.
If it’s not a secret male fantasy then what are we going to do with it?
Luckily, a lot of women producers, directors, writers, and actors are gaining power. The Tale (written and directed by Jennifer Fox) was the first movie I've ever seen to be that brave.
Hopefully the future will represent what actually happens during sexual abuse and rape, so that we can face it and inform our growing girls and women on how to continue the narrative, speak out, and change the world.