Your Facebook Messenger AI Therapist Will See You Now
If you’re feeling blue, Facebook Messenger’s Woebot looks to provide some comfort.
Artificial intelligence bots can give you news, help you shop, and troubleshoot tech problems. But a new startup launching today on Facebook Messenger, Woebot, wants to go a bit deeper and act as a virtual therapist.
Woebot is the brainchild of Dr. Alison Darcy, an adjunct faculty member at Stanford’s Department of Psychiatry and Behavioral Sciences. The Ireland native worked as a software developer for an investment bank in London before pursuing her PhD at University College Dublin, where her research centered on new mHealth applications and group cognitive behavioral therapy for those with chronic anorexia nervosa. PCMag spoke with Dr. Darcy recently about what to expect with Woebot.
PCMag: Dr. Darcy, Woebot is not your grandparents’ therapist, with framed credentials on wood-panel walls and an expensive couch; its conversational style is refreshingly modern.
Dr. Darcy: That’s deliberate. We want to create a dialogue that reflects the language users feel comfortable with while at the same time addressing what we think are some common fears around therapy.
Which is why, presumably, you’re launching Woebot on Facebook Messenger — because that’s where the young people hang out?
Exactly. We built our prototype inside Facebook, and it turns out that in early testing that was the most favorable aspect. The number-one rule of therapy is to meet people where they’re at, just like the number-one rule of product development [is] go where your audience is. Having said that, as a business, we are planning to build out native apps in the future to give users a choice of platforms in which to interact with Woebot.
We would be remiss if we didn’t point out privacy concerns on the Facebook platform, considering its business model is based on analyzing data to serve contextual ads. For instance, wouldn’t it be disconcerting for a user to be chatting with Woebot about binge triggers and then see an ad pop up for a popular high-calorie snack?
It’s a tricky subject to address but, safe to say, at Woebot, we take user privacy very seriously and, in our [terms and conditions], confirm that ALL conversations are completely anonymized from profiles and kept private. Having said that, of course, all Facebook users are subject to Facebook’s rules and privacy regulations as well.
Where did the inspiration for Woebot come from?
When I was a software developer creating decision-support software for investment bankers, my interest was sparked by the possibilities of text-based help programs and their impact on democratizing access to just-in-time information. Later, during my PhD at University College Dublin, and then at Stanford, I started to wonder why we didn’t have the same level of digital sophistication in healthcare applications. Cognitive Behavioral Therapy (CBT) models are, like computer science principles, formulaic, rules-based, and built on distinct procedures, so the two work very well together.
And, of course, people have been anthropomorphizing bots since the 1960s, most notably with ELIZA, modeled on a Rogerian psychotherapist, which many people “treated” as human, thereby, in some small way, passing Alan Turing’s famous test?
An empathy-based conversation is helpful, no matter where it comes from. People want to be heard and understood. In fact, we have found that many of our users prefer to divulge things that are really hard to deal with to Woebot rather than to a real person. I imagine it’s because there’s no fear of being judged. There’s an understanding that it’s a robot; the expectation is that it has perfect recall and doesn’t mind being burdened with painful information at 2 a.m., when one wouldn’t want to call a therapist, even if one were available.
Without making too many sweeping assumptions around gender or ethnicity, in your trials, was Woebot more popular with certain sectors of society? The suicide stats among US college students are distressing, for instance.
Let’s just say there’s a stigma in many cultures that have a relentless focus on success, and/or those that are uncomfortable addressing mental health issues. Woebot also treats everyone the same way and doesn’t have unconscious bias around gender, ethnicity, disability, or sexual orientation, which is a relief for our users. So far, we’ve seen that many of our most frequent users are male, though Woebot is built for everyone. Woebot provides a stigma-free environment for people to talk candidly about their feelings.
Quick case study scenario for you: Mike, 20, comp-sci major facing final exams and going into a panic spiral showing suicidal ideation. How will Woebot handle this situation?
Woebot deals with crisis really well. There are three clinically accepted levels of care in dealing with people in crisis: providing resources, contracting for a suicide prevention plan with a therapist, and then hospitalization. Woebot is part of the first level, and we tell people in the onboarding that we’re not able to provide the last two levels of care. We tell users, “If you’re stuck at any time, type ‘SOS’ and we’ll give you some resources, including suicide prevention helplines, and/or suggest calling 911.”
So what does Woebot tell ‘Mike’ to do for level-one care in this scenario?
It’ll be something like this:
Woebot: Hey Mike, what’s up, buddy?
Mike: I’m freaking out about an exam tomorrow.
Woebot: I’m here for you. Let’s take some time to think about this together, OK?
Mike: OK.
Woebot: Do you have 10 minutes to work with me on this, Mike?
Mike: Yes.
Woebot: Great. Do you feel the panic in your body (heart racing, sweaty palms, etc.), or is it mostly in your thoughts?
Mike: I’m panicking!
Woebot: I can help. Breathe with me like this.
[Woebot sends an image of Woebot taking deep breaths, which activates Mike’s mirror neurons, and they can count together. Woebot also sends a link to Marconi Union’s “Weightless” video, a proven visual/aural relaxation asset used by many therapists.]
Woebot: How’s your breath now? Getting slower?
Mike: Yeah. I’m not so fracking scared now.
Woebot: Thank you for the “Battlestar Galactica” ref. You did great work tonight. Check in with me tomorrow, OK?
Mike: I will. Thanks.
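The “type ‘SOS’” escape hatch Dr. Darcy describes can be sketched in a few lines. This is an illustrative toy, not Woebot’s actual code: the resource text and function names are invented, and a real crisis feature would be far more careful.

```python
# Illustrative sketch (NOT Woebot's actual code) of a crisis escape hatch:
# the "SOS" keyword is checked before any normal routing, so resources
# are offered no matter where the user is in the scripted conversation.

CRISIS_RESOURCES = ("Here are some resources: a suicide prevention "
                    "helpline, or call 911 if you are in immediate danger.")

def handle_message(text, scripted_reply):
    """Return crisis resources if the user types 'SOS';
    otherwise fall through to the normal scripted reply."""
    if text.strip().lower() == "sos":
        return CRISIS_RESOURCES
    return scripted_reply(text)

# The lambda stands in for the normal dialogue flow shown above.
print(handle_message("SOS", lambda t: "I'm here for you."))
print(handle_message("I'm freaking out about an exam tomorrow.",
                     lambda t: "I'm here for you."))
```

Because the crisis check runs first, it works at any point in the conversation, which matches the onboarding promise that SOS is always available.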
Talk about the clinical research behind Woebot.
We recruited 70 individuals (average age: 22) from a university social media site. The participants all answered a clinical questionnaire and were then randomized: they were either offered 20 sessions with a “text-based conversational agent,” a.k.a. Woebot, over a two-week period, or directed to the National Institute of Mental Health ebook Depression in College Students to create a control group.
We found that those in the Woebot group significantly reduced their symptoms of depression over the study period, while those in the Information Control group did not. But interestingly, participants referred to relationship factors a lot, from remarking that Woebot felt like a friend, to feeling empathized with. From this we concluded that conversational agents appear to be a feasible, engaging, and effective way to deliver Cognitive Behavioral Therapy. The results of the study revealed a statistically significant reduction in anxiety and depression in only two weeks. We’re thrilled with that timeline, especially since traditional face-to-face therapy typically takes much longer to have an impact.
How much does Woebot cost? And what’s your funding situation?
We’re seed funded now and open to looking at health-based incubators and/or other investment sources down the road. Woebot is direct-to-consumer and there are tiered pricing levels: either an annual contract, which costs $6 per week, or — if you have commitment issues — $12 a week for a drop-in style session. Having said that, we provide the first two weeks for free.
That’s a pretty reasonable fee structure.
We wanted to make sure it’s a real business, and we know that when people invest in health, it’s empowering and effective for outcomes. But Woebot still costs at least 40 percent less than a regular insurance co-pay.
Finally, how does the A.I. magic work? Are you using keywords to generate responses? Is Woebot pulling from previous conversations to show learning? Did you do a data dump from decades of anonymized patient-clinician sessions?
Did we train a model on a dataset of therapy sessions? No. Woebot was built from the ground up and draws on everything we know about stellar HCI [human-computer interaction], clinical decision making, and human-centered software design. Our overall structure is based on a decision tree that mimics a complex clinical process, with slots that use token/keyword matching in some places and Natural Language Processing (NLP) in others. Then there’s machine learning and optimization in later stages. Initially it’s quite scripted, to get the relationship going, and it gets smarter and more personalized over time.
Read more: “How Your Relationship Drama Will Train Future Robot Therapists”
Originally published at www.pcmag.com.