The Illusion of Care: Why AI Therapy Apps Cannot Replace Human Relationship
Unregulated therapy chatbots may look convincing, but they strip therapy of its core relational depth and risk reshaping client expectations in harmful ways.
AI has entered the therapy space faster than regulatory or ethical frameworks can adapt. Every week, new apps appear, offering instant counselling, daily check-ins, or even promising to “replace your therapist.” They sound smooth, reassuring and authoritative. For clients who are struggling, they can feel like a lifeline.
The danger is that they are not therapy at all.
I tested one of these apps recently with a simple statement:
“I am scrolling too much, I cannot get out of bed.”
Within seconds, the app told me I had depression and that I was isolating myself because I was obviously overwhelmed. Without asking for any more context, it moved straight into offering strategies and empathetic-sounding, generalised reflections.
It was convincing on the surface, but it was also full of assumptions and entirely ungrounded in any relationship with me. A few questions would have revealed this was a regular Monday…