Mind Cafe

Relaxed, inspiring essays about happiness.

Using AI Chat for Personal Support Is Dangerous

Why your personal support should come from books, people, or anywhere other than a chat thread

6 min read · Oct 2, 2025


Image generated by the author on Midjourney, which is thankfully not a chatbot

ChatGPT (and its siblings Claude, Copilot, Gemini, etc.) are extraordinarily powerful tools for nearly every conceivable intellectual task. But perhaps one of the most common uses of these AI chatbots is for personal support. Unlike real people, who aren’t always available and who have limited emotional energy, friendly ChatGPT agents are always available and never run out of patience for you. They literally exist only to help you, after all, and are thrilled to do so for however long they are required.

I’m quickly coming to think this kind of use of AI is dangerous: it twists natural human instincts and deploys them against us in a way that is destructive. Not intentionally, of course (it never is), but it does so all the same.

How does AI do this? Well, AI has a few qualities that are neutral at first glance but that reinforce dangerous long-term patterns.

  • AI chatbots always respond instantly. This means there’s no upper limit to how long someone can ruminate. In real life, conversations reach a natural stopping point, and someone ruminating alone in an empty room eventually runs out of steam. But the endless AI chat can fuel…



Written by Sam Holstein

I make software, teach people about AI, and do a few other things besides. www.samholstein.com
