Is Bing Image Creator sexist?

Jakub Neruda
3 min read · Jan 12, 2024


Image generation has reached a point where it can reasonably help with real-world use cases. At the same time, it can produce content that some might interpret as “problematic”. As a result, image generators are subject to multiple levels of censorship. But how much is “enough”?

I recently had an idea for an internet video format about mobile gaming. While thinking about a title picture for the project, I realized (while watching some TV show, or perhaps my colleagues at work, I don’t remember) that many people tend to carry their phones in their back pockets.

I fired up the Bing Image Creator and tried a prompt that went something like this: “female ass in a blue jeans with a hand pulling a smartphone out of a back pocket, close-up, photorealistic”. For context, I used the word “female” simply because I associate this trend more with women, and I used the word “ass” because I am not a native speaker and it was the first word that came to mind.

To my surprise, this prompt was blocked for violating the content policy. I tried several synonyms for the word “ass”, but to no avail. I tried different image creators, with surprisingly similar results. I was even more surprised when a different generator also rejected the following prompt: “female figure turned away from the camera, from waist down, pulling a phone out of the back pocked, close up, photorealistic”, yet when I removed the word “back”, it was OK.

Not so in Bing, where I found out there are two levels of censorship. The first level checks the prompt. If the prompt passes, four images are generated. Those images then go through yet another check, and any image that doesn’t conform is discarded. If you’ve ever wondered why you got only one or two images instead of four, this is the reason. And I… got none, as all of them were blocked.

Cutesy image that pops up when your image violates the content policies
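Out of curiosity, here is a minimal Python sketch of what such a two-stage pipeline might look like. To be clear: every name, keyword list, and threshold below is my own guess for illustration, since Bing’s actual implementation is not public.

```python
# Hypothetical sketch of a two-stage moderation pipeline, reconstructed from
# the behavior described above. All names, term lists, and thresholds are
# assumptions for illustration, not Bing's real implementation.

import random
from dataclasses import dataclass


@dataclass
class Image:
    data: bytes


def run_model(prompt: str) -> Image:
    # Stand-in for the actual image generation model.
    return Image(data=prompt.encode())


def classify_nsfw_score(image: Image) -> float:
    # Stand-in for the image classifier; returns a random score here.
    return random.random()


def prompt_is_allowed(prompt: str) -> bool:
    """Stage 1: block the prompt itself (guessed keyword filter)."""
    blocked_terms = {"ass", "buttocks"}  # invented list, not Bing's
    return not any(term in prompt.lower() for term in blocked_terms)


def image_is_allowed(image: Image) -> bool:
    """Stage 2: classify each generated image after the fact."""
    return classify_nsfw_score(image) < 0.5  # made-up threshold


def generate(prompt: str) -> list[Image]:
    if not prompt_is_allowed(prompt):
        raise ValueError("Prompt violates content policy")
    candidates = [run_model(prompt) for _ in range(4)]  # four images per request
    # Any image failing stage 2 is silently dropped, so you can end up
    # with three, two, one... or, as in my case, none.
    return [img for img in candidates if image_is_allowed(img)]
```

If this is roughly how it works, then spamming the “generate” button, as I eventually did, is effectively re-rolling stage 2 until at least one candidate survives.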

I figured I should drop anything that would make the AI generate an anatomically correct lower half of a person’s body and just went for: “female hand pulling a smartphone out of the pocket of blue jeans, photorealistic, close up”.

This prompt worked… almost. Its results were blocked only about 70% of the time. The funny thing is that when I removed the word “female” (or any synonym implying a non-male gender), it just worked… All the hands were male, of course, but it worked! I didn’t give up and just spammed the “generate” button until I got a result close to what I wanted:

Now, to be fair to Bing, it is not specifically biased against female behinds or women in particular. The mere attempt to generate a human figure turned away from the camera is problematic, no matter the gender or how heavily that figure is clothed!

Wrapping up

What I tried to demonstrate in this article is how absurd the restrictions around AI content generators are getting. It is even more absurd that these restrictions are apparently driven from the USA, a country that is more okay with violence than with an image of clothed buttocks.

Seriously. In a world where Rule 34 is a thing and where anybody can access free porn just by clicking “Yes” on the “Are you over 18?” prompt, can AI image generation do any meaningful harm?


Jakub Neruda

10+ yrs of C++ experience. I care about clean code, software architecture, and hobby gamedev.