I hear this term, rape culture, bandied about, but I’ve never read a definition of it that seemed consistent with reality. Mostly the definitions seem designed to fit a feminist narrative.
Wikipedia puts it this way: “In feminist theory, rape culture is a setting in which rape is pervasive and normalized due to societal attitudes about gender and sexuality.” That seems to be the way it’s mostly used, but the obvious problem is that rape is not pervasive and normalized in our Western culture. Rape is criminalized and condemned. Recent experience in Europe makes it pretty clear that some of the Middle Eastern culture immigrating into Europe treats rape as normal, but Western culture does not.
Unless the definition of rape is changed, of course. Feminists two or three decades ago greatly expanded the definition of rape to better fit their agenda. They actually wanted it to be law that any wife who agreed to sex with her husband when she didn’t really want to was a victim of rape. Any woman who later regretted agreeing to sex was retroactively a rape victim.
Anyway, it would help a great deal if you would give us your definition of rape and your definition of rape culture. It’s difficult to assess your article without first understanding what you mean by those terms.