Ouroboros of AI: The peril of generative models feeding on their creations

Gilles de Peretti
4 min read · Nov 19, 2023

The ancient symbol of the ouroboros, a serpent devouring its own tail, serves as an apt metaphor for the recursive dilemma unfolding in the field of artificial intelligence (AI). As generative models increasingly consume the content they produce, a significant concern arises: what happens when AI begins to learn from itself?

The Generative AI conundrum

To improve and evolve, AI requires vast datasets, typically sourced from the internet's expanse of human-generated content. However, as AI-generated texts, images, and videos proliferate, these models face a new challenge.

They begin to feed on the output they create, potentially spiraling into a recursive feedback loop in which errors accumulate and performance degrades, a phenomenon known as 'model collapse.'
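This feedback loop can be illustrated with a minimal sketch: stand in a toy Gaussian "model" for a real generative network, fit it to data, sample new "content" from the fit, and train the next generation only on those samples. The sample size and generation count below are arbitrary illustrative choices, not values from any study, but the qualitative outcome is the same: the estimated spread of the distribution shrinks generation after generation, and the model forgets the tails of the original data.

```python
import random
import statistics

def fit(samples):
    # "Train" a toy generative model: estimate a Gaussian from the data.
    return statistics.mean(samples), statistics.pstdev(samples)

def generate(mean, std, n, rng):
    # "Generate" synthetic content by sampling from the fitted model.
    return [rng.gauss(mean, std) for _ in range(n)]

rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) for _ in range(10)]  # generation 0: "human" data

stds = []
for generation in range(1000):
    mean, std = fit(data)
    stds.append(std)
    # Each new generation trains only on the previous generation's output.
    data = generate(mean, std, len(data), rng)

print(f"estimated std, generation 0:   {stds[0]:.3f}")
print(f"estimated std, generation 999: {stds[-1]:.3e}")
```

The estimated standard deviation collapses toward zero: sampling error in each fit compounds across generations, so diversity present in the original data is progressively lost, which is the intuition behind model collapse.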

The experiment and its revelations

Credit: https://arxiv.org/pdf/2305.17493.pdf

A study conducted by researchers in Canada and the U.K. has brought to light the issue of model collapse, which occurs when generative models are trained on AI-generated content, leading to a gradual…

