The Future of AI Therapy
With the arrival of AI, nothing will ever be the same again. Including, presumably, therapy. So, if therapy is about to get an upgrade, what might it look like? The only thing we can be sure of is that it won’t be quite what we expect.
Freeing you up
The first thing you’ll use AI for is your notes. Already, AI-based platforms such as Upheal can record your sessions, then transcribe them, summarise them, and generate Progress Notes. Other forms of documentation are coming soon: say goodbye to writing your own treatment plans. Soon, you’ll put all your energy into being a better therapist rather than a scribe.
But that’s just the beginning. Get ready for AI-generated session analysis, insights, and even recommendations. If that sounds interesting, read on.
You, but even better
Ever realised after a session that you missed something? Or wondered what you missed when a client doesn’t come back? Already, AI can give you a clue: look at a handy visual map of the session to see where the pauses were, where the client’s speech was more present- or future-focused, and where it was more positive or negative in tone.
This analysis is going to get much more sophisticated. You’ll be able to tell the AI what to highlight: do you want to see the cognitive distortions? Or the defences? Or the subtle moments of disengagement when the client wasn’t quite on board? Tell the AI what you want, and it’ll show you in its summary of the session, or perhaps even alert you during the session, as it happens. Of course, you’ll be able to switch off these alerts if you find them distracting, or choose how they appear: anything from an unobtrusive icon to something more attention-grabbing.
Soon, the AI won’t just be relying on what’s said to draw its conclusions. Instead, as the technology evolves, it will be incorporating visual data to read posture, eye movements, and expression. Throw in some measurements of biomarkers via wearables — heart rate, skin conductance, and the like — and it’ll have more information to go on than any therapist ever had.
But all the insight in the world can only take you so far; you also need to know what to do with your client. All too often, we rely on the same old ways of formulating and the same old bag of tricks: the things we’ve done so many times that they’re second nature.
Instead, AI will soon turbocharge our decision-making. How about a treatment plan that includes the AI’s own suggestions, which you can incorporate into your thinking? Maybe also a formulation that evolves session by session as new information is integrated into it. Want some insight from a therapy model you don’t usually use? Just tell the AI, and it’ll analyse your session using that model and suggest what you might do next time.
But we don’t just want to rely on AI to supplement our skills. We want to get better as therapists. So, get ready for AI-designed training exercises. Maybe you’d like to work on building rapport, or on setting goals. Ask the AI, and it will design exercises for you, graduated in difficulty. Perhaps you’ll practise your skills using video clips selected for the purpose, or even identified by the AI from your own video-recorded sessions. Perhaps you won’t even need to identify your own training needs: the AI will analyse all your sessions and tell you what you might want to work on.
Beyond you and your clients
This might all sound good, but it’s based on what we currently know about effective therapy: rapport, goal-setting, and so on. AI is going to take us into a brave new world of psychotherapy research, one that will shed new light on what really makes therapy work.
As more and more therapy sessions are fed into AI platforms, huge datasets are going to become available, which can be analysed at the speed of AI, instead of the snail’s pace at which researchers can transcribe sessions and analyse the transcripts. And who knows what it’s going to reveal: maybe some big surprises, and certainly much more fine-grained insight into what works, for whom, and when. No longer will we have to be guided by what works for the average client (who, of course, doesn’t exist); we’ll be able to personalise treatment according to AI-assisted research findings. And this new information can be fed directly back into your work, via AI-generated recommendations as to which therapist and which approach would work best for this specific client, and even what technique would be most helpful at this moment.
Into the sci-fi future
So far, so exciting. But we’re still in the same old therapy room (or video-conferencing platform), doing the same old thing: talking. Which is great, but not really the best we can imagine.
The major tech companies are betting on virtual reality as the next big thing, and so, before too long, therapy might not just be a conversation between two people, but instead a trip into any kind of virtual world that you can imagine.
Already, it’s possible to use virtual reality for certain limited purposes. Scared of flying? Come join me, dear client, for exposure work in a VR flight simulator. And with AI added into the mix, the sky is the limit (as it were). Soon enough, by combining VR with AI’s capacity to generate convincing images from just a brief description, you might be able to design any kind of therapeutic environment you want in order to work on your particular issue. Imagery rescripting, for example, is likely to be a lot more powerful when you can literally enter the memory and change its outcome.
But won’t it be…weird?
Of course, we might have some worries about letting AI into our therapy sessions. What about data security? What if the AI gets things wrong, or is biased? What if it replaces us altogether? And isn’t it just a bit weird, having soulless technology intruding into the profoundly human activity of therapy?
Well, yes and no. To be sure, these are all important questions. We shouldn’t use any platform that doesn’t meet the highest standards of data security. And we shouldn’t substitute its judgment for our own — its suggestions should be just that, rather than dictates that we unthinkingly follow. And is it weird? Well, that’s for each of us to judge. To therapists of an older generation, telehealth is weird; to Socrates, writing was weird. But if AI has the potential to make us more effective — more helpful to our clients — shouldn’t we give it a try?
Again: when it comes to therapy and AI, no one knows what’s going to happen. But I for one am keen to find out.
Note: The author is a paid advisor for Upheal, an AI company referenced in this article. A version of this article originally appeared on Upheal’s website.
About the Author:
Michael Eisen is a Clinical Psychologist in the UK and the director of Intend Therapy.