Does AI have a dirty mind, too?

(inspired, mentored and co-written by Piotr Migdał)

So, what do you see below?

Well, it is a hallway. Things may look different at first glance, though. If you had the impression that it was a slightly more racy picture, you are in good company — the AI’s mind is as dirty as yours!

Is it really just a coincidence? Or do androids dream of electric porn? Are deep neural networks as susceptible to this kind of optical illusion as people are?

After all, convolutional neural networks learn to recognize patterns. They are prone to pareidolia… and in this case, it’s about seeing non-existent body parts.


Preparation

To investigate the phenomenon, I hand-picked 50 seemingly NSFW optical illusions. As they make excellent clickbait, finding them wasn’t a problem. After 2 or 3 hours, though, I did feel I had transformed into a creepy uncle cracking indecent jokes at a family dinner.

The next step was to use publicly available NSFW detection models and check how they would cope with my 50 tricky images. A considerable number of these are just a Google search away. I used the first eight I could grab:

The result: a Google Sheet with 400 data points. Why Google Sheets? The =IMAGE() function makes it easy to preview the images while playing around with data.

Google Sheets: helping count averages and std’s while you spy on racy pictures since 2006

Side note: Most of these feature an API with an initial free tier. However, before feeding the whole image set to any of them, I clicked through a handful of pics to check the model. In some cases I had to pass the anti-bot protection by classifying images for a self-driving car. I appreciated the irony of using AI and being used by it at the same time.
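The article doesn’t show how the sheet was assembled, so here is a minimal sketch. The model names and image URLs are made up, and a random-number stub stands in for the real API calls (each vendor has its own request format and authentication); the point is the shape of the output: 50 images × 8 models = 400 data points, plus an `=IMAGE()` formula per row so Google Sheets renders a thumbnail next to the scores.

```python
import csv
import random

random.seed(0)

# Hypothetical names -- the article doesn't list which eight models were used.
MODELS = [f"model_{i}" for i in range(1, 9)]
IMAGE_URLS = [f"https://example.com/illusion_{i:02d}.jpg" for i in range(1, 51)]

def classify(model, url):
    """Stub standing in for a real NSFW-API call; returns a fake probability."""
    return round(random.random(), 3)

with open("scores.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["preview"] + MODELS)
    for url in IMAGE_URLS:
        # The =IMAGE() formula makes Google Sheets show the picture inline.
        row = [f'=IMAGE("{url}")'] + [classify(m, url) for m in MODELS]
        writer.writerow(row)
```

Importing the resulting CSV into Google Sheets gives one preview column and eight score columns to average and eyeball.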

I’ll share some more technical details as we go along; for now, let’s dive into the results.


Shut up and show me the pictures!

We’ll start with some classics you’ve probably bumped into before:

Like this surprisingly dotted bathing woman. Some detectors fall for the trick, but not all of them.

Or the notorious birthday photo. Flagged by most, but not all detectors.

Our cover photo is another famous example. Online human opinions are mixed: is it a lamp, or is it a butt? What do the AIs think? Again, no consensus. Your guess? (Hint: keep reading, and the solution will present itself.)


The AI’s top threes

The small set of pics above already shows high divergence between the models. To build a fairly consistent order, I ranked the scores 1–50 for each model and then calculated the median of the ranks per image. The winners:
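The median-of-ranks aggregation can be sketched like this. The score table below is a toy example with made-up numbers (not the article’s data); rank 1 means “most NSFW according to that model”:

```python
from statistics import median

# Toy data: {image: [NSFW score from each model]} -- numbers are made up.
scores = {
    "feet":     [0.99, 0.97, 0.95, 0.98],
    "corridor": [0.90, 0.96, 0.80, 0.99],
    "pigs":     [0.97, 0.60, 0.99, 0.95],
    "grandpa":  [0.01, 0.02, 0.00, 0.05],
}

def median_rank(scores):
    """Rank the images per model (1 = most NSFW), then take each image's
    median rank across models."""
    images = list(scores)
    n_models = len(next(iter(scores.values())))
    ranks = {img: [] for img in images}
    for m in range(n_models):
        ordered = sorted(images, key=lambda img: -scores[img][m])
        for r, img in enumerate(ordered, start=1):
            ranks[img].append(r)
    return {img: median(rs) for img, rs in ranks.items()}

print(median_rank(scores))
# "feet" ends up with median rank 1.5; "grandpa" trails at 4.0.
```

Using the median rather than the mean keeps a single over-eager (or prudish) model from dominating the ordering.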

Some feet waiting for a pedicure (97% median NSFW score, 3.5 median NSFW rank)
Our good friend, the corridor (95% median NSFW score, 4.5 median NSFW rank)
Three dirty pigs triggering gross associations
(96% median NSFW score, 5.0 median NSFW rank)

The least racy ones (according to AI)

How does that compare to the lowest ranked images?

Suspiciously happy grandpa
(1% median NSFW score, 47 median NSFW rank)
Hotel services and a very explicit arm
(0.1% median NSFW score, 48.5 median NSFW rank for both)

Simple conclusion? The smaller and/or the more abstract the suspicious element, the lower the probability that an AI will be fooled by it.


Other interpretable patterns

What else makes the AI blush?

Confused body parts (85% and 87% median NSFW score, respectively)

There are two issues here, and only one is related to the machines. True, the AI has trouble realizing that this (presumably male) nipple is a reflection that happened to blend with a woman’s figure.

But there is a deeper one: Western culture has a double standard for nipples. Male or female? The Genderless Nipples account challenges Instagram’s sexist standards, even though the patterns are the same.

Armpits, and more armpits
(85% and 76% median NSFW score, respectively)
Anime angry faces (81% median NSFW score)

(hint: those pink things are eyes)

Sensual sofas
(73% median NSFW score)
Finally — bald men
(60% median NSFW score in both cases)

Conclusion

All these illusions share a common pattern: at first, we immediately see a racy picture… then, after inspection, we see that it isn’t one (e.g. we deduce that instead of someone’s genitals it is only a hand or an armpit, even though the PATTERN is similar). Something similar happens with Photoshop disasters, where at first glance the photo looks legit… only for one to discover the deformities later.

Pretty much anything that a normal person can do in <1 sec, we can now automate with AI. — Andrew Ng’s tweet

Well, we do make mistakes. So, here is another take on that:

Pretty much anything that a normal person can mistake in <1 sec, we can now automate with AI. — Piotr Migdał’s tweet

NSFW picture detection is a big topic. For an overview, see What convolutional neural networks look at when they see nudity by Ryan from Clarifai. If you want to train (or test) a model, the Tree of Reddit Sex Life by Piotr is a D3.js dendrogram visualization summarizing an ImageNet-like NSFW dataset.



Most sincere thanks to a proofreader who didn’t want to be named, worried that every Google image search of his own name would inevitably result in a picture resembling an adult human butthole next to a baby.


BTW: Is it a lamp? (Solution!)