Google Pauses AI-Generated Images of People After Ethnicity Criticism

Mr Ali H. Muhammad 🛡
3 min read · Feb 24, 2024


Google has put a temporary block on its new artificial intelligence model producing images of people after it portrayed German Second World War soldiers and Vikings as people of colour.

The tech company said it would stop its Gemini model generating images of people after social media users posted examples of images generated by the tool that depicted some historical figures — including popes and the founding fathers of the US — in a variety of ethnicities and genders.

“We’re already working to address recent issues with Gemini’s image generation feature. While we do this, we’re going to pause the image generation of people and will rerelease an improved version soon,” Google said in a statement.

Google did not refer to specific images in its statement, but examples of Gemini image results were widely available on X, accompanied by commentary on AI’s issues with accuracy and bias, with one former Google employee saying it was “hard to get Google Gemini to acknowledge that white people exist”.

Jack Krawczyk, a senior director of product for Gemini at Google, said in a statement on X that Google’s AI principles committed its image generation tools to “reflect our global user base”. He added that Google would continue to do this for “open ended” image requests such as “a person walking a dog”, but acknowledged that responses to prompts with a historical slant needed further work.

“Historical contexts have more nuance to them and we will further tune to accommodate that,” he said.

Addressing Bias in AI

Coverage of bias in AI has shown numerous examples of a negative impact on people of colour. A Washington Post investigation last year found multiple examples of image generators showing bias against people of colour, as well as sexism. It found that the image generator Stable Diffusion XL depicted food stamp recipients as primarily non-white or darker-skinned, even though 63% of food stamp recipients in the US are white. A request for an image of a person “at social services” produced similar results.

Andrew Rogoyski, of the Institute for People-Centred AI at the University of Surrey, said it was a “hard problem in most fields of deep learning and generative AI to mitigate bias” and mistakes were likely to occur as a result.

“There is a lot of research and a lot of different approaches to eliminating bias, from curating training datasets to introducing guardrails for trained models,” he said. “It’s likely that AIs and LLMs will continue to make mistakes until we find more robust solutions.”
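To make Rogoyski’s point about curating training datasets a little more concrete, here is a minimal, purely illustrative sketch of one such approach: resampling a labelled dataset so each demographic group appears in a chosen target proportion. All names here are hypothetical and are not drawn from any Google or Stable Diffusion codebase.

```python
import random


def balance_by_group(records, group_key, target_shares, n_samples, seed=0):
    """Resample `records` so each group appears in roughly the proportion
    given by `target_shares` (a dict mapping group label -> desired share).

    For each draw: pick a group according to the target shares, then pick
    a record uniformly at random from within that group.
    """
    rng = random.Random(seed)

    # Bucket records by their group label.
    by_group = {}
    for rec in records:
        by_group.setdefault(rec[group_key], []).append(rec)

    groups = list(target_shares)
    weights = [target_shares[g] for g in groups]

    sample = []
    for _ in range(n_samples):
        g = rng.choices(groups, weights=weights, k=1)[0]
        sample.append(rng.choice(by_group[g]))
    return sample


# A skewed toy dataset: 90% group "a", 10% group "b".
records = [{"group": "a"}] * 90 + [{"group": "b"}] * 10

# Resample towards a roughly 50/50 split.
balanced = balance_by_group(records, "group", {"a": 0.5, "b": 0.5}, 1000)
```

Resampling like this is only one of the approaches Rogoyski alludes to; the guardrails he mentions work differently, filtering or steering a model’s outputs at inference time rather than changing what it was trained on.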

A Commitment to Diversity

Google’s decision to pause its Gemini model’s image generation feature reflects its commitment to diversity and inclusivity in its technology. The company’s AI principles state that its image generation capabilities are designed to reflect its global user base, but Google also acknowledges that there is still work to be done in addressing historical contexts and nuances.

In the meantime, Google is working on improving its image generation tool and ensuring that it accurately represents all people, regardless of ethnicity or gender. As technology continues to advance, it is crucial that companies like Google prioritize diversity and inclusivity in their AI models.

Final Thoughts

The recent criticism of Google’s Gemini model highlights the importance of addressing bias in AI and the impact it can have on marginalized communities. As we continue to rely on technology in our daily lives, it is essential that we prioritize diversity and inclusivity in its development.

Google’s decision to pause its image generation feature is a step in the right direction, and we can only hope that other companies will follow suit and work towards creating more inclusive AI models.

I hope you enjoyed this article and learned something new. If you did, please clap and share it with your network. You can also email me if you have any questions or comments.

Thank you for your time and attention. Stay safe and keep learning!


This has been Ali H. Muhammad, signing off.


Mr Ali H. Muhammad 🛡

Blockchain Architect | AI/ML, Web3 & DeFi Advocate | Marketing, Regulatory Compliance, Tech & Finance Background | Building Scalable Layer Zero & Web3 Ecosystem