A chat with a bot could help


Written by Adam Tweed, Service Development Manager at AbilityNet

To hear someone say we are facing a crisis in mental health is perhaps common to the point of cliché, but it is a very real issue: demand for mental health services far outweighs the available resources. Those who desperately need support face lengthy delays for appointments, opportunities for early intervention are missed, and problems sadly often progress into more significant mental health issues that require more intensive support, perpetuating a downward spiral.

In their report entitled “Mental health of those suffering with physical or learning disabilities”, the Shaw Foundation highlighted: “UK researchers have found that 30% of those with a long term physical condition also have a mental health problem, and 46% of people with a mental health problem have a physical condition”, and that “25–40% of those in the UK with a learning disability have a dual diagnosis with a mental health disorder.”

Long-term mental health conditions, including anxiety and depression, now account for a significant number of Disabled Students’ Allowance (DSA) applications and for specialist support, both human and tech-based.

But isn’t tech part of the problem?

According to Ofcom, the average British adult checks their phone every 12 minutes, one in five admit to spending upwards of 40 hours a week online, and 64% of adults describe a constant internet connection as an essential part of their life. Most of us keep our phones by our sides just in case, but all too often just-in-case becomes: I’ll quickly do that; I’ll reply now or I’ll only have to do it tomorrow; I’ll comment because I’ll have forgotten by the morning; I’ll like because I need to show I’m keeping up with what’s happening.

It’s now relatively accepted that the light from our devices can suppress melatonin, the hormone that governs both our ability to fall asleep and the quality of that sleep once we eventually do, and a good night’s sleep matters for our mental health because it allows our brains to process, sort and store. Add to this the incessant ping of notifications from people who don’t share our sleep patterns or who live in different time zones, and it’s easy to see the disruptive impact. To address this, our devices offer a night mode, silencing notifications and reducing the blue light, but is this to address the problem, or to ensure we don’t consider the option of leaving our phones outside the bedroom?

Although there is evidence that socialisation has changed, the change may not be as significant as we think, and our worries about how younger generations communicate may say more about our own struggles with the adjustment. For those who have grown up with what we may see as replacement online networks, online is simply another means of communication, and evidence shows that face-to-face interaction still occurs wherever possible. Indeed, online communication has also created a more inclusive environment for many who previously struggled with socialising or were isolated. But that’s not to say there isn’t a downside.

Our online social identities are idealised versions of ourselves; few people post an ‘average day’, so we are left aspiring to the edited highlights of the lives of our vastly extended friendship webs and worrying if our posts do not attract a sufficient number of likes. Social media is the incubator, the petri dish, for perfectionism: peer comparison and the nagging sense that everyone else is doing far better than you.

So how can tech help?

Tech has empowered many disabled people, enabling independence where previously there may have been a reliance on human support. But can tech bridge the gap between the individual and support services when it comes to mental health?

In the first of this series of blogs we look at the ‘chatbot’.

A chatbot is essentially an automated conversation partner; it’s what you increasingly see when you visit a website and something pops up asking if you’d like any help.

So if a chatbot can offer advice on my mobile phone contract or my electricity provider, and if it can understand the many different ways people phrase things (or say “Sorry, I don’t understand” if it doesn’t), is it so strange to consider it as a means of accessing advice and support for mental health? All I would need is something that can understand my general intent, the basics of what I’m having difficulty with: “I’m anxious”, “I’m stressed”, “I’m depressed”, and offer some reasonable advice.
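To make that concrete, a very rough sketch of keyword-based intent matching might look like the snippet below. The intents, keywords and canned responses are invented for illustration; none of the bots discussed later necessarily work this way, and real systems use far more sophisticated natural language understanding.

```python
# Illustrative only: map free text to a crude "intent" and reply with a
# generic suggestion, or admit defeat. All names and phrases are invented.

INTENTS = {
    "anxiety": (["anxious", "anxiety", "panic", "worried"],
                "It sounds like you're feeling anxious. A slow breathing "
                "exercise can sometimes help: in for four counts, out for six."),
    "stress": (["stressed", "overwhelmed", "too much"],
               "Feeling stressed is common. Could you break what's worrying "
               "you into one small step you could take today?"),
    "low_mood": (["depressed", "hopeless", "feeling low", "sad"],
                 "I'm sorry you're feeling low. Would you like to note what "
                 "triggered this feeling so we can look at it together?"),
}

def reply(message: str) -> str:
    text = message.lower()
    for keywords, response in INTENTS.values():
        if any(keyword in text for keyword in keywords):
            return response
    return "Sorry, I don't understand. Could you tell me a bit more?"

if __name__ == "__main__":
    print(reply("I'm anxious about tomorrow"))  # matches the anxiety intent
    print(reply("I just feel a bit odd"))       # falls back to "Sorry..."
```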

CBT — It’s not so hard, is it?

Cognitive Behavioural Therapy (CBT) is a therapy typically used to treat anxiety and depression. Mind summarises: “It combines cognitive therapy (examining the things you think) and behavioural therapy (examining the things you do).” CBT recognises that many thought processes, typically the negative ones, involve a cycle of reinforcement: the notion that “I’m not good enough”, for example, produces negative thoughts that then lead to no longer participating in the activity at all. CBT uses established techniques to identify and challenge these negative thought processes, and on a superficial level much of this is a case of mirroring back what a person is saying, encouraging reflection and offering techniques such as breathing exercises.
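As a purely illustrative aside, that “mirroring back” mechanic can be sketched in a few lines, loosely in the spirit of the classic ELIZA program. The pronoun table below is a toy, and this is emphatically not how any clinical CBT tool is built; it only shows the superficial surface of reflection.

```python
# A toy sketch of reflection: swap pronouns and turn a statement back into a
# question, ELIZA-style. Real CBT involves far more than mirroring.

REFLECTIONS = {
    "i": "you", "i'm": "you're", "my": "your", "me": "you",
    "am": "are", "myself": "yourself",
}

def reflect(statement: str) -> str:
    words = statement.lower().rstrip(".!").split()
    mirrored = " ".join(REFLECTIONS.get(word, word) for word in words)
    return f"Why do you feel that {mirrored}?"

print(reflect("I am not good enough."))
# -> Why do you feel that you are not good enough?
```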

Now I’m not suggesting chatbots replace skilled practitioners; what we are discussing here is addressing the gap, the shortfall of available resource. To use a car analogy: most people know how to top up the oil or check the tyres, or at least feel confident enough to do so armed with a mobile phone and a YouTube video, but few of us would attempt to change a gearbox or replace the brake pads; for that we know more skilled support is required. So why not consider a similar triage for our mental health? Skilled practitioners, the cognitive mechanics, should be free to work with those requiring the most intensive intervention; but for low-level, generic self-help, the type of help needed by most of us from time to time, or where a small amount of supported advice may serve to delay or prevent the onset of more significant issues, chatbots are surely a tool to add to the kit.

All the chatbots featured below make it very clear at the point of initial interaction that they are not replacements for human therapy and are not designed to deal with significant mental health issues. However, it would be naive to think that anything associated with mental health support would never encounter someone in crisis turning to the bot, and there needs to be a recognition on the part of the bot (or rather its designers) that people might use it in this way. It is therefore crucial that the bot is able to recognise its limits, recognise the language associated with significant disclosures, and then signpost appropriate human support without being dismissive. Each bot reviewed below has this capability, with varying levels of effectiveness.
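In code terms, that safeguard amounts to checking every message for crisis language before any other handling and signposting human support when it is found. The sketch below is illustrative only: the phrase list is a placeholder rather than a clinically validated set, and the helpline wording is an assumption, not taken from any of the bots reviewed.

```python
# Hedged sketch of a safeguarding check: scan each message for crisis language
# *before* normal handling and, if found, signpost human support instead of
# continuing the conversation. Phrase list and wording are placeholders only.

CRISIS_PHRASES = ["want to die", "kill myself", "end it all", "suicide", "self harm"]

SIGNPOST = (
    "I'm only a bot and not able to support you with this, but you don't have "
    "to face it alone. Please contact your local crisis line or emergency "
    "services, or a listening service such as Samaritans (116 123 in the UK)."
)

def handle(message: str, normal_handler) -> str:
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return SIGNPOST
    return normal_handler(message)

def small_talk(message: str) -> str:
    return "Thanks for sharing. Tell me more?"

if __name__ == "__main__":
    print(handle("I can't cope, I want to die", small_talk))  # -> signpost
    print(handle("I'm a bit stressed today", small_talk))     # -> normal reply
```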

It’s also worth mentioning that many of these chatbots have been developed by clinicians and therapists across a variety of disciplines, and that a common motive has been recognising both the lack of mental health care available to people and the effectiveness of DIY CBT.

Meet the bots

The first chatbot is ‘Woebot — Your charming robot friend who is ready to listen, 24/7‘. Woebot was developed by Alison Darcy, a clinical psychologist at Stanford University, who in an interview with Business Insider highlights: “Woebot isn’t a replacement for an in-person therapist … nor will it help you find one. Instead, the tool is part of a widening array of approaches to mental health. It’s fundamentally different from any other form of therapy.”

The initial interaction with Woebot is a little laborious: without creating a login you will be led through a detailed explanation of data collection, bot limitations and other disclaimers, all of which have to be agreed to before accessing its services (creating a login allows these agreements to be stored so you only have to do it once). The policies highlight that the bot is not a replacement for human support, but also that human support can be accessed (in that it will provide contact details) if needed or at a point of crisis.

General interaction with the bot is slightly jarring, as responses are forced (potentially in an effort to minimise misunderstanding); you are often given only single-word or short-phrase options such as “interesting” and “go on…”. There is a significant amount of explaining what it is capable of; a bit like trying to bare your soul to that really upbeat friend who has just got a new job and really doesn’t want to hear about your problems just now. That said, there are some great features: regular interactions unlock more content as well as letting you track your mood over time with ‘check-ins’, and the chatty interface and lack of choice in responses does have the effect of halting the continued use of negative language that a free-text input would perhaps allow. Declaring a significant issue triggered an immediate response with details of support numbers and contacts; as the bot is US-based, these were international rather than local, but they were appropriate.

The second bot is ‘Youper — AI Assistant For Emotional Health‘. Youper was developed by a team led by psychiatrist Dr Jose Hamilton, and the name derives from its central idea that you can become the best version of yourself, Super-you: “You” + “Super” = Youper. This bot uses CBT techniques as well as Acceptance and Commitment Therapy (ACT), mindfulness and meditation. The chat interface allows free-text input, but also makes use of the fact that on a smartphone screen you can interact in different ways; on occasion it will offer options for how you feel and then a slider to quantify this. The interface allows a fluid conversation examining thoughts and feelings and provides tips and techniques such as mindfulness and meditations within the chat. When I declared a crisis, Youper made several good attempts at intervention: it suggested considering reasons for living, highlighted the importance of additional support from doctors, counsellors and others, and eventually offered US, UK and rest-of-the-world contact details. It would perhaps have been slightly more reassuring if it had offered these numbers first, before attempting its own intervention.

The final bot is ‘Wysa — your 4am friend and AI lifecoach‘. Wysa was developed by life coach Jo Aggarwal. Like Youper, it uses free-text interaction as well as fixed-response selections. Wysa has a friendly feel to it; for some reason (possibly just the penguin icon) it feels more like chatting to a ‘thing’ than to a bot. The interactions are relatively smooth, but there are occasions where a misunderstanding will trigger a default “shall we just try some relaxation” or similar. Declaring a significant issue produced perhaps the most appropriate response: it highlighted its limitations (“I’m only a bot and not able to deal with such things”) but then offered national and international support. It then asked whether the user would call the numbers it had provided and, when the app was next opened, asked whether they had made the call. No replacement for human support, but at a point of crisis quite possibly just enough to have someone reconsider an alternative.

Although occasional misunderstandings may remind you that your conversation is with a ‘thing’, it is odd how quickly you accept talking with a bot as just another text-based chat. It is an interaction, a ‘voice’, that checks in to see how you are doing, reminds you to consider your wellbeing, listens to your issues and responds with appropriate advice and techniques.

Interacting with a chatbot also means you do not have to admit your problems to another person, and that is often the taboo many people struggle with when recognising they need help.

They may not be suitable for complex mental health issues at this point, and they make no attempt to hide this. But these bots are always available, always listening, never running short of time or low on resources. They may only offer a friendly ear and some basic techniques to improve our general wellbeing, but for many this is enough, and at the very least they offer a place to go and some alleviation of the pressure currently placed on our over-stretched mental health services.


Originally published at digileaders.com on March 8, 2019.
