We Need to Give Revenge Porn Victims the Same Rights as Copyright Victims

  • Revenge porn is alarmingly common: nearly 10% of US women have been threatened with revenge porn, and in almost half of those cases the threat was carried out.
  • While more and more states are criminalizing revenge porn (obviously a good thing), those laws are meant to punish the person who posted it (which is almost impossible to prove), not the platform hosting it, which is still protected by Section 230. That means even if you catch the perpetrator, the content still shows up on platforms like Google.
  • The problem is that most of the long-term damage happens because the images live on in Google, humiliating the victim before every job interview, client meeting, or even date. There is no effective mechanism to get these posts taken down.
  • This recent article in the NY Post highlights the issue: “Without exception, my clients’ №1 urgency is always the same — their horrific Google results!… The current policy says Google may remove nude or sexually explicit images that were shared without consent, but the company maintains sole discretion about when to remove nonconsensual pornography. If Google decides it will keep linking to a website that contains your nude images, victims are just out of luck. And there’s no appellate body. There is no law, only corporate policy, that protects (or fails to protect) victims’ most private information.”
  • This is something we have a lot of firsthand experience with. When a victim comes to us, we do everything we can to get the content removed. When that is unsuccessful, we have to take a “suppression” route: creating more positive content with the goal of burying the revenge porn and making it hard to find. This process is long, time-consuming, and can be expensive.
  • There is a simple solution: give revenge porn victims the same rights as copyright victims. These platforms are protected by Section 230, which is important, but there are obvious exceptions, like DMCA takedowns, which grant platforms safe-harbor protection as long as they remove reported violations within a reasonable time frame. The same thing can be done for revenge porn. I wouldn’t necessarily hold platforms liable for the content as long as they maintain a reasonable mechanism to report and remove it.
  • I hope initiatives are taken to put something like this into effect. To do our part, we’re currently in talks with organizations like the Center for Humane Technology to figure out ways forward. Until then, we do our best to help victims with the tools we have.


Patrick Ambron, Co-founder & CEO of BrandYourself.com. Check out my website: http://patrickambron.com/. Or check me out at patrickambron.brandyourself.com
