Reflecting Bias: How ChatGPT Echoes Our Own Thoughts

Rodrigo Estrada
3 min read · Nov 12, 2023

--

Mirror, mirror, not on the wall — trap’s new royalty redefines the reflection game.

Editor’s Note: In conjunction with this article, I’ve used OpenAI’s “GPTs” platform to create a custom conversational agent named “unbiasUgpt.” This agent is designed to help users identify and mitigate the cognitive biases discussed in the text. It encourages critical thinking and offers balanced perspectives throughout your digital interactions. Experience “unbiasUgpt” for yourself at the following link: unbiasUgpt.

Artificial intelligence tools like ChatGPT are revolutionizing our interaction with technology, reflecting well-established psychological phenomena such as the mirror effect and confirmation bias. These concepts are pertinent not only to our personal interactions but also to how we relate and respond to technological advances.

“The mirror effect in psychology refers to the increased self-awareness that occurs when individuals are exposed to their own image.”

This phenomenon suggests that how we perceive ourselves can significantly influence our thoughts and actions. Concurrently, confirmation bias is the tendency to pay more attention to information that affirms our pre-existing beliefs.

“Through this bias, people tend to favor information that reinforces what they already think or believe.”

As a mere predictor of the next word based on input, ChatGPT serves as a reflective surface for the user’s thoughts and knowledge, which can both enhance an expert’s productivity and paradoxically confine someone with less knowledge within a cognitive bubble. This limitation mirrors the ‘Facebook bubble,’ where content curation algorithms align with the user’s existing views, reinforcing their perspective and potentially trapping them in a self-affirming loop of their own biases. Both cases exemplify how simple statistical predictions can amplify expertise or entrench preconceptions.
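The mirror dynamic can be made concrete with a toy sketch. This is not how ChatGPT actually works — real models are vastly more sophisticated — but a minimal bigram predictor, trained only on the user’s own words, shows how pure next-word prediction hands the user’s framing right back to them:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count word-pair frequencies in the user's own text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# A user whose input repeatedly frames AI as a threat...
prompt = "AI is dangerous and AI is risky because AI is dangerous"
model = train_bigrams(prompt)

# ...gets that framing echoed back: "dangerous" dominates what follows "is".
print(predict_next(model, "is"))  # → dangerous
```

The predictor has no opinion of its own; it simply amplifies whatever distribution the input carried — the self-affirming loop in miniature.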

In performance reviews, the mirror effect can inadvertently lead evaluators to assess themselves rather than the person being reviewed, reflecting more on their own views than on the actual performance under evaluation. This tendency mirrors interactions with AI like ChatGPT, where the quality of responses is only as good as the user’s knowledge and input. If users find ChatGPT’s output unsatisfactory, that may be a prompt to deepen their own understanding of the subject at hand.

Spin, Reflect, Repeat: A rabbit DJ remixes reality, where each beat drops a different version of the truth.

Resistance to AI-driven change in music is a contemporary example of this phenomenon. Bad Bunny, an artist adept with Auto-Tune, a technology that itself transformed the music industry, voiced discontent with a song created by FlowGPT using his voice through artificial intelligence. This captures the essence of the idea precisely: FlowGPT composed and performed the entire song, with the AI merely supplying the vocal tone. With a high-quality “context,” the AI’s output was outstanding. Conversely, someone lacking talent would likely have ended up with a subpar song, showing just how pivotal the creator’s input is to the quality of AI-assisted work.

“This week, Bad Bunny sparked controversy on social media after he expressed his anger against NostalgIA, a song created by FlowGPT that uses the voice of the reggaeton artist.”

This incident exemplifies how even experts in one technology can face adaptive challenges as new platforms emerge. Just as some musicians who failed to learn Auto-Tune became obsolete, today’s artists must decide how to respond to AI in music.

The song “NostalgIA” has prompted a reflection on creativity and authorship in the AI era, and how this new technological wave is reshaping even the most traditional industries.

In conclusion, while tools like ChatGPT can amplify our abilities and boost productivity, they also challenge us to maintain critical thinking and self-awareness of our biases. History continues to show that it is not technology that replaces people, but those who fail to adapt to new technology who are replaced. Adaptability and a willingness to learn are critical in the AI era, ensuring that we use technology to expand horizons, not limit them.
