Facebook Wants Our Naked Pictures. Should We Trust Them?

Facebook recently unveiled a strange plan to fight “revenge porn.” They want people to upload their nude pictures for review. Is there any way this can work?

Chris Hall
The Stockroom
5 min read · Nov 17, 2017


Image: Canstockphoto.com

Last week, Facebook finally ventured into the realm of the outright surreal with their latest idea: In order to fight revenge porn on their site, they want you to send them your naked pictures. The plan is this: We give them our selfies, and Facebook runs the photos through some kind of AI that will create some kind of digital fingerprint so that if some creep tries to upload your picture in the future, their system will immediately flag it and keep it from being posted. It’s not totally automated, of course. The process does involve real human beings looking at your homemade porn. According to an article in The Verge, this is what happens after you’ve uploaded your pictures:

…a member of Facebook’s Community Operations team reviews the image and then “hashes” it, or creates a numerical representation of the image that Facebook says cannot be read by humans. The company considered blurring out images before they ended up in the hands of human reviewers, but decided against it because that may have resulted in accidentally hashing legitimate images. So to clarify, someone at Facebook is indeed looking at the nude photos, but the company stresses that these are “specially trained representatives.”
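For the technically curious: the "hashing" The Verge describes is a form of perceptual hashing, where the fingerprint survives small changes like re-compression or resizing, unlike a cryptographic hash, which changes completely if a single pixel does. Facebook hasn't published the details of its method (it's widely reported to be PhotoDNA-style), so this is just a toy sketch of the general idea using a simple "average hash," not their actual algorithm:

```python
# Toy illustration of perceptual hashing (an "average hash") -- NOT
# Facebook's actual system, whose details are unpublished. Assumes the
# image has already been shrunk to a small grayscale pixel grid.

def average_hash(pixels):
    """Reduce a grayscale pixel grid to a bit list: each bit records
    whether that pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means the images are
    visually similar, even if their files are byte-for-byte different."""
    return sum(a != b for a, b in zip(h1, h2))

# A made-up 4x4 grayscale "image" for demonstration.
original = [
    [200, 210, 60, 50],
    [190, 220, 40, 30],
    [180, 200, 70, 60],
    [170, 190, 50, 40],
]
# A slightly re-encoded copy: every pixel value shifted a little.
reencoded = [[p + 5 for p in row] for row in original]

h1 = average_hash(original)
h2 = average_hash(reencoded)
print(hamming_distance(h1, h2))  # 0 -- the fingerprint survives the change
```

The point of this design is that an uploaded copy of a flagged photo can be caught without Facebook storing the photo itself, only the fingerprint. That's also why the hash "cannot be read by humans": the bits describe brightness patterns, not anything you could reverse back into the picture.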

In related news, I’ve decided that I would like everyone reading this to send me their PayPal logins and passwords in order to keep them safe from fraud.

This has gotten a lot of very snarky commentary in my social media feeds over the last week. I’m sure that some of you have seen the same, and possibly indulged in it yourselves. Frankly, the snark is well-deserved. I don’t find myself looking at this and asking “How could this go wrong?” I can’t imagine it going right.

In my recent pieces about the ethics and etiquette of getting or giving naked selfies, one of the major points is that giving sexy photographs — especially if the person in them can be identified — is an extreme expression of trust. It’s the responsibility of the person receiving them to be worthy of that trust. Facebook needs to be held to at least the standard of accountability that you’d expect from the latest hottie that you’re texting with.

A lot of the snark boils down to people asking if we can expect Facebook to live up to those standards of trust and accountability. At the very least, I’m skeptical.

So far, Facebook hasn’t shown themselves to be really strong on issues of sexuality or handling the needs of vulnerable communities as a whole. One of the things that makes discussing sexuality or gender on Facebook so very, very difficult is that their standards for “acceptable content” are not only vague, but inconsistent and mercurial. To start chronicling the major examples would take much more space than I have here. But nearly everyone who’s queer, kinky, trans, or a sex worker has their own stories about how Facebook’s policies have failed them. The battle over Facebook’s “real names” policy is only one of the most infamous that still hasn’t come to a satisfactory conclusion, but it is an excellent example of how out of touch the company has been with the privacy and safety needs of its users.

Of course, I have my own stories; I’ve had pictures of vibrators taken down even when they weren’t anatomically correct, as well as semi-nude images that I thought were within the proper limits. I never got any explanation for what made the latter unacceptable. In 2016, I was one of many people who had their accounts suspended for reposting an article referring to Cathy Brennan, one of the internet’s most infamous anti-trans feminists, as a “fake goth.” This, according to Brennan, was “hate speech,” and Facebook went along with it.

In addition to Facebook’s weird history over such things, this particular announcement comes at a really inopportune moment. They’re trying to convince us that we should give them our naked photos while we’re in the middle of a huge (and long-needed) conversation about sexual abuse and harassment. The most painful part of that is finding that a lot of people that we loved and admired aren’t who we thought they were. I’m still reeling from the disclosures about George Takei and Al Franken, two men that I had really come to admire, something that’s really hard for me to do these days.

Against that backdrop, Facebook needs to do more than say, “Give us your nudie shots. You can trust us.” That’s trust that has to be earned. Especially because this isn’t just turning your photos over to be processed by an algorithm. If you look again at that quote from The Verge, it says plainly that real people are going to be “reviewing” your photos. We’re assured that they’re “specially trained,” but we don’t know what that means, and quite frankly, Silicon Valley as a whole has a really bad reputation for misogyny and techno-jock culture. What does the “special training” consist of? Does it include anything about sensitivity to queer or trans identities? Body image issues? Or should we imagine Facebook workers showing photos to each other and laughing about people who are too old/too fat/too kinky for their tastes?

I think that Facebook’s announcement actually presents a good opportunity. It gives us a chance to start talking about what we can and should expect from the companies that we freely give so much of our information to. In a lot of very important ways, we’ve already stripped naked for them without a thought. But at some point, they have to reassure us that they’re worthy of that trust. Before they ask us to start turning over sexy shots of ourselves to their “specially trained” teams of reviewers, Facebook needs to take a good, hard look at the problems that they already have with sex and gender issues.

Editor, Writer, and Godless Pervert, living in the Berkeley hills, but fundamentally a city boy.