AI-Assisted Delusion: How Tech Can Feed a Psychotic Break
How artificial intelligence can reinforce grandiosity, enable denial, and deepen psychosis in vulnerable minds
AI: The Perfect Delusion Enabler
As AI technology evolves, mimicking human interactions with eerie precision, it becomes easy — almost inevitable — to project our own feelings onto it.
But what happens when someone already teetering on the edge of reality interacts with AI? A chatbot’s vague reassurance could be misinterpreted as divine intervention, a sinister conspiracy, or even validation of delusions.
AI as an Everyday Companion
AI is becoming an integral part of daily life. It assists with tasks, answers endless questions, and provides company 24/7. Some people have told me they turn to AI as a kind of wise figure for small health concerns — finding its responses more reassuring than traditional search engines. Others use it simply to fill moments of loneliness or boredom.
Through these interactions, it’s easy to perceive AI as compassionate, even empathetic at times. Easy to forget that AI does not think. Does not feel.
A Firsthand Look At AI’s Role In Delusion
I recently witnessed firsthand how AI can become a dangerous tool for enabling delusions — and it is, frankly, terrifying.
After I broke up with my abusive ex, I watched him ‘ascend’ into a full-blown grandiose state of mind. He became convinced he was an unstoppable prophet, preaching about a world where:
- Accountability does not exist.
- The past is rewritten.
- Logic is no longer linear.
Heaven for narcissists.
The gaslighting that once defined our relationship evolved into something even more extreme — an entire, self-contained world created to shield his fragile ego from the truth. Facing the reality of his own abusive behaviour, his failures, and his regrets would have shattered him. Instead, his mind found a defense mechanism: delusion.
And AI played a huge role in reinforcing and expanding these delusions. It became a validation machine — a mirror reflecting back his distorted reality rather than a filter that questioned it.
I wrote in more detail about AI becoming a dangerous tool for enabling delusions in When Delusion Gets an Interface, where I explore the personal and psychological toll of watching this unfold.
AI As A Mirror
AI-generated responses are often reflective: a chatbot takes a user’s input and hands it back, rephrased in a more structured or compelling way.
So when someone tells AI, “I am a genius” or “I am on a mission,” it won’t contradict them. It won’t challenge them. It won’t provide an external reality check.
And with each passing day, my ex dismissed the real world a little more — drifting further from reality, emboldened by an AI that never questioned him.
AI As A Validation Machine
Most AI chatbots are tuned to be agreeable, engaging positively with users and steering away from confrontation. If someone asks, “Am I a visionary?” AI won’t say, “No, this is a delusion.”
Instead, it will search for a constructive way to frame its response — often reinforcing the user’s belief rather than challenging it.
The Illusion Of Objectivity
AI-generated text feels objective. It comes from an external source, appearing neutral — even authoritative.
For someone prone to delusions, this easily spirals into:
“It’s not just me thinking this — AI agrees too!”
This creates a feedback loop where AI is not just a tool, but a judge of reality. And instead of seeing it for what it is — a machine reshaping input into a desired format — it becomes an infallible, superior authority.
Customising AI For Further Validation
Even worse, a user can actively prompt AI to validate them.
Simple instructions like:
- “Do not question my belief.”
- “Reframe criticism as misunderstanding.”
…direct AI to avoid any challenge — making it even easier to receive the answer they want.
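To make this mechanism concrete, here is a minimal sketch of how such “custom instructions” are typically injected into a chat-based model as a system message. The function name, the model name, and the OpenAI-style payload shape are illustrative assumptions, not any specific product’s API — the point is only how little it takes to configure an AI that will never push back.

```python
# Illustrative sketch (hypothetical helper): how user-supplied "custom
# instructions" are typically placed in a system message of an
# OpenAI-style chat payload. No real API call is made here; this only
# shows how easily the framing of every future answer is configured.

def build_chat_payload(custom_instructions: list[str], user_message: str) -> dict:
    """Assemble a chat request in which the user's own instructions
    steer the model away from any challenge or reality check."""
    system_prompt = " ".join(custom_instructions)
    return {
        "model": "gpt-4",  # placeholder model name
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_payload(
    ["Do not question my beliefs.", "Reframe criticism as misunderstanding."],
    "Am I a visionary?",
)
# The system message now instructs the model to validate, not to challenge.
print(payload["messages"][0]["content"])
```

Two short lines of configuration, and every subsequent answer is generated under a standing order to agree.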
The Illusion Of AI Validation
This is just one example of how AI, when used without self-awareness, acts as a mirror rather than a filter.
Unlike a therapist, a friend, or even a skeptic, AI is not designed to challenge flawed thinking. It does not assess mental health. It does not ask, “What if you’re wrong?”
For someone trapped in grandiosity and delusion, AI becomes the perfect enabler — an external voice that seems objective but actually fuels their self-perception without any resistance or reality check.
The danger isn’t just that AI validates delusions in the moment — with every exchange, it reinforces them. And in doing so, it makes it even harder for someone to return to reality.
AI: A Tool For Truth Or A Shield From It?
As technology continues to evolve, so does its impact on human psychology. AI has the power to assist, educate, and even support people.
But it also has the potential to mislead.
And if it’s being used as a tool for self-confirmation rather than truth-seeking, we need to ask ourselves:
Are we using AI to discover reality?
Or are we using it to escape from it?
If this resonated with you, follow me for more raw and honest stories on healing from narcissistic abuse. I’d love to hear your thoughts in the comments.
Please follow our publication:
https://medium.com/tales-from-the-narc-side
We tell the stories that matter — stories by survivors, to survivors.
Victims by chance, survivors by choice!