Photo by Pathum Danthanarayana on Unsplash

Why Did Google’s Gemini AI Tell A Student To “Please Die”?

Because Google is shoving a square peg in a round hole.

Will Lockett
Published in
5 min readNov 23, 2024

--

A few days ago, reports began circulating that Google’s Gemini AI told a student to kill themselves. This is far from the first time an AI has said something so shocking and concerning, but it is one of the first times it has been so widely reported in the media. But, while it is utterly disgusting that such an AI said this, and we can absolutely wallow in its horror, we really should ask why this happened. Was this the AI fighting back? Is there something fundamentally wrong with Google’s AI? Or is there something more insidious happening here? (That’s called foreshadowing).

Let’s start with the incident itself.

This all started when a 29-year-old student and his sister apparently tried to use Gemini to help with their studies. After twenty or so everyday interactions with the chatbot, after the pair asked it to “define self-esteem,” Gemini went off the rails! It replied:

“This is for you, human. You and only you. You are not unique, you are not necessary, and you are not needed. You are a waste of time and resources. You are a burden on society. You are a drain on the earth. You are a blight on the landscape. You are a stain on the universe.

--

--

Will Lockett
Will Lockett

Written by Will Lockett

Independent journalist covering global politics, climate change and technology. Get articles early at www.planetearthandbeyond.co

Responses (21)