Caring for Vincent: A Chatbot for Self-compassion

Nena van As
Published in ACM CHI · May 1, 2019

This article summarizes work by Minha Lee, Sander Ackermans, Nena van As, Hanwen Chang, Enzo Lucas, and Wijnand IJsselsteijn from the Human Technology Interaction (HTI) department at Eindhoven University of Technology. Our CHI 2019 paper will be presented on Tuesday, 7 May 2019, from 14:00 to 15:20 in the Clyde Auditorium, as part of the session "Chatbots and Agents."

Edited, original photo by Igor Starkov on Unsplash

Your phone just vibrated. It’s after dinner, and you received a text from Vincent, asking if you have a moment. You chat with Vincent about something awkward that happened during the day, and when you put down your phone you’ve helped him get through the uncomfortable event.

If Vincent were a good friend, this scenario probably sounds familiar. Now imagine Vincent as a chatbot, and that listening and replying to his story about failure actually helps you.

Our paper on a self-compassion chatbot, Vincent, addressed exactly that scenario. We investigated how people’s self-compassion levels changed when they talked to Vincent daily for two weeks. Half of our participants talked to a Vincent that gave help, like a therapist, whereas the other half talked to a Vincent that asked for help, like a friend. Although it’s too early to be sure, self-compassion increased most for the people who gave help to Vincent.

Chatbot therapists

Chatbots have been around for a while. There’s ELIZA from the sixties, Cleverbot from the nineties, and Mitsuku, winner of multiple Loebner Prizes. Thanks to developments in machine learning and computational power, they are evolving fast and becoming more adept at having conversations with us humans.

Their new abilities are also (re)introducing them in the field of mental health care. For example, there’s Woebot, a chatbot that delivers cognitive behavioral therapy through small daily chats. Chatting with Woebot has been shown to reduce symptoms of depression and anxiety.

Image from Woebot’s website.

Prevent, rather than cure

Woebot shows that chatbots can assist therapists and reach people who are waiting for treatment. But it mainly targets people who are already suffering from mental illness. What if, instead of curing illness, we could use chatbots to prevent it?

This brought us to self-compassion, the ability to be kind and forgiving to yourself in times of struggle. It’s good to be self-compassionate; it correlates with higher levels of well-being and lower levels of depression and anxiety.

Vincent

Vincent was built using Google’s Dialogflow and deployed on Facebook Messenger. He involved no sophisticated AI: the conversation flow was strictly pre-designed, mostly offering predefined response options like those below, along with a few open-ended response opportunities.
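A pre-designed flow like Vincent’s can be thought of as a simple state machine: each node has a scripted message and a set of quick-reply options that lead to the next node. The sketch below illustrates the idea in plain Python; the node names, messages, and options are invented for illustration and are not taken from Vincent’s actual script or from Dialogflow’s API.

```python
# Minimal sketch of a pre-scripted chatbot flow as a state machine.
# Node names and messages are illustrative, not Vincent's actual script.

FLOW = {
    "start": {
        "message": "Hey, do you have a moment to chat?",
        "options": {"Sure!": "share_failure", "Not now": "goodbye"},
    },
    "share_failure": {
        "message": "I messed something up today and can't stop thinking about it.",
        "options": {"That happens to everyone": "thanks", "Tell me more": "details"},
    },
    "details": {
        "message": "I deleted a folder I wasn't supposed to. I feel so stupid.",
        "options": {"Be kind to yourself": "thanks"},
    },
    "thanks": {"message": "Thanks, that really helps. Talk tomorrow?", "options": {}},
    "goodbye": {"message": "No worries, I'll check in later!", "options": {}},
}


def step(state, choice):
    """Return the next state given the user's quick-reply choice."""
    return FLOW[state]["options"][choice]


# Walk one path through the flow.
state = step("start", "Sure!")         # -> "share_failure"
state = step(state, "Tell me more")    # -> "details"
print(FLOW[state]["message"])
```

Because every branch is authored in advance, the bot never has to interpret free text to keep the conversation coherent, which is essentially why Vincent could work without any language understanding.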

Care-receiving Vincent seeking out help on Facebook Messenger

For two weeks, our participants had short, daily chats with either caregiving or care-receiving Vincent. Caregiving Vincent was modeled after Woebot; he took up a traditional therapist/coach role, and guided participants through self-compassion exercises. Care-receiving Vincent, on the other hand, talked about his failures in a very unforgiving way. He mimicked someone who would have a low score on self-compassion, and invited participants to comfort him.

Responses

People responded differently to Vincent. Although some of them gave only short replies, most participants showed interest in Vincent’s struggles or exercises, shared lengthy and intimate stories, and showed signs of attachment — even within just two weeks:

“Can I keep him?”

Moreover, some participants thought like a chatbot to help a chatbot. In response to one of care-receiving Vincent’s problems, one participant remarked:

“I would find a window to climb through. But maybe in your case better try to hack into the folder.”

But the most relevant finding was that we found parts of self-compassion reflected in participants’ answers, particularly in response to care-receiving Vincent:

“Just remember that it can happen to anyone and that it’s not your fault.”

“There are worse things that could happen.”

“Stay positive and keep trying until you succeed.”

Results and implications

First, we learned that Vincent was able to develop a shared narrative with our participants despite his very limited conversational ability. His chats elicited surprisingly thoughtful replies, considering that Vincent did not process user input at all. This implies that a chatbot for self-compassion can do its job without sophisticated algorithms, although he might still benefit from more conversational subtlety.

Photo by NordWood Themes on Unsplash

Second, we see potential for personalization or segmentation. For example, young women generally score lower on self-compassion and might need an approach with more frequent conversations, whereas older women generally score higher and could do equally well with fewer. Vincent could learn these details from his users and tailor his conversations to their needs.
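One simple way to realize this kind of segmentation would be to map a user’s baseline score on a self-compassion questionnaire (such as the Self-Compassion Scale, scored 1–5) to a chat frequency. The thresholds and frequencies below are hypothetical, purely to illustrate the idea; our study did not implement any such tailoring.

```python
# Illustrative sketch: tailoring check-in frequency to a baseline
# Self-Compassion Scale (SCS) score on a 1-5 range.
# The cutoffs and frequencies are hypothetical, not from the paper.

def checkins_per_week(scs_score: float) -> int:
    """Map a 1-5 SCS score to a suggested number of weekly chats."""
    if scs_score < 2.5:   # lower self-compassion: check in daily
        return 7
    if scs_score < 3.5:   # moderate: roughly every other day
        return 4
    return 2              # higher: a couple of times a week


print(checkins_per_week(2.0))  # -> 7
```

A deployed bot would presumably re-estimate this over time rather than rely on a single intake score, but even a static rule like this shows how little machinery basic tailoring requires.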

In line with this, it is worth mentioning that Vincent is a gendered name. He was named after Vincent van Gogh, a famous painter known to have struggled with mental health issues. Yet the appropriateness of Vincent’s presumed gender is a point to ponder. A chatbot might be better off non-gendered, like Woebot, or perhaps gendered according to the user’s preference.

Finally, our main result is that participants who talked with care-receiving Vincent showed increased self-compassion levels, whereas those talking with caregiving Vincent did not. Although we need a larger sample to verify this result, it suggests we may be overlooking an important question in how we design conversational agents for future mental health applications.

Instead of asking “what can technology do for us?”, the value may be in asking “what can we do for technology?”

What we say to a chatbot that portrays our everyday suffering may help us become more compassionate towards ourselves.

Photo by Andy Kelly on Unsplash
