3 ways AI could help our mental health
Charlotte Stix Programme Assistant, Robotics and Artificial Intelligence, European Commission
Mental health difficulties affect around 1 in 5 adults in the United States at least once in their lifetime. But it’s not just the US: poor mental health is a global issue, with 83 million people affected in Europe alone. What if we could harness Artificial Intelligence (AI) to address this?
AI is increasingly hyped as a silver bullet applicable to almost every area, from driving economic prosperity to solving complex global issues. While its overall impact remains to be seen, the case for using AI in mental health is surprisingly encouraging, backed by medical studies and pilot programmes.
In its current form, AI is still merely a support mechanism. But looking towards the future, its impact could be significant, provided that further research is funded and shortcomings such as unclear data usage, misdiagnosis and privacy concerns are addressed.
Let’s look at three particularly noteworthy benefits of AI.
Early detection
Early detection of mental health difficulties is crucial to the prompt and successful treatment of the patient. AI can already detect markers that indicate a high probability of cancer at very early stages. What if AI could flag up similar warning signs about your mental health, simply by listening to you?
Traditional practice in mental health largely relies on the individual to observe and self-report indicative changes, alongside the observations of mental health professionals. AI could notice relevant symptoms and act as an early detection mechanism, as demonstrated by two recent case studies.
Veterans are considered a typical high-risk group for developing mental health difficulties. To catch these developments early on, Cogito — a company funded by the Defense Advanced Research Projects Agency — teamed up with the US Department of Veterans Affairs to trial an app that monitors veterans’ mental health.
The app itself, called Companion, passively monitored a veteran’s phone 24/7, listening to the sound of the user’s voice and tracking how often they used their phone. Changes in inflection, pitch, energy and the amount spoken, together with phone usage patterns, gave the app a variety of behavioural indicators. The AI system then used these indicators to detect crucial changes in the user’s mental health.
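Companion’s internal models are not public. As a purely hypothetical sketch, however, one simple way to turn behavioural indicators like these into an early-warning signal is to compare each day’s values against the user’s own recent baseline and flag large deviations (the indicator names and threshold below are illustrative assumptions, not Companion’s actual design):

```python
from statistics import mean, stdev

def zscore(value, history):
    """How many standard deviations `value` sits from the user's own history."""
    if len(history) < 2 or stdev(history) == 0:
        return 0.0
    return (value - mean(history)) / stdev(history)

def flag_changes(today, baseline, threshold=2.0):
    """Flag behavioural indicators that deviate strongly from baseline.

    `today` maps hypothetical indicator names (e.g. 'minutes_spoken')
    to today's value; `baseline` maps the same names to a list of
    recent daily values for this user.
    """
    deviations = {name: zscore(today[name], baseline[name]) for name in today}
    return {name: round(z, 2) for name, z in deviations.items()
            if abs(z) >= threshold}

baseline = {"minutes_spoken": [30, 28, 33, 31, 29], "calls_made": [5, 6, 4, 5, 6]}
today = {"minutes_spoken": 4, "calls_made": 5}
print(flag_changes(today, baseline))  # only the sharp drop in speaking is flagged
```

Because each user is compared only against their own history, a naturally quiet person is not flagged simply for being quiet; real systems would of course use far richer features and models than this.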
Similar to Cogito, IBM also harnesses AI as an early detection mechanism for mental health. In two studies, IBM’s Computational Psychiatry and Neuroimaging group, alongside several universities, aimed to predict the onset of psychosis in patients. They built an AI that detected differences in speech patterns between high-risk patients who went on to develop psychosis and those who did not. To do this, they used a method called Natural Language Processing (NLP): it analysed patients’ speech for indicators such as coherence of speech and ideas, then built a predictive model for the onset of psychosis. After training this AI system over two studies, IBM achieved a remarkable 83% retrospective accuracy in the second study group. It was a quantifiable demonstration of the power of listening.
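IBM’s actual models are far more sophisticated, but as a deliberately crude, hypothetical illustration of what a “coherence of speech” indicator can mean, one could measure how much vocabulary each sentence shares with the one before it, where low overlap suggests abrupt topic shifts:

```python
def content_words(sentence):
    """Lowercase a sentence and drop a few very common function words."""
    stopwords = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it", "i"}
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    return words - stopwords

def coherence_score(sentences):
    """Mean word-overlap (Jaccard similarity) between consecutive sentences.

    A low score indicates speech whose topic jumps from sentence to
    sentence -- a toy stand-in for the richer semantic-coherence
    measures used in the actual research.
    """
    if len(sentences) < 2:
        return 1.0
    overlaps = []
    for prev, curr in zip(sentences, sentences[1:]):
        a, b = content_words(prev), content_words(curr)
        overlaps.append(len(a & b) / len(a | b) if a | b else 0.0)
    return sum(overlaps) / len(overlaps)

coherent = ["I walked my dog today.",
            "The dog loves walks in the park.",
            "The park was busy today."]
disjointed = ["I walked my dog today.",
              "Quantum physics fascinates everyone.",
              "Bananas grow quickly."]
print(coherence_score(coherent) > coherence_score(disjointed))
```

The published work used semantic embeddings and syntactic features rather than raw word overlap, but the principle is the same: reduce free speech to numerical indicators that a predictive model can learn from.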
Improved access to treatment
Approximately 45% of the world’s population in 2014 lived in a country with fewer than one psychiatrist for every 100,000 people. Access to treatment is clearly a luxury that many people around the globe do not have or cannot afford.
On top of improving access to mental health treatment, AI can play a big role in personalising it. Ginger.io, for example, covers both: an online platform that uses AI and machine learning alongside a staffed clinical network, it tailors its suggestions to the needs of the user and provides access to a variety of treatments.
The algorithm might, for example, suggest that the most suitable course of action is cognitive behavioural therapy (CBT). CBT is a popular talking therapy that helps reframe the way you think and behave, changing the way you address problems. It usually requires several visits to a professional over an extended period of time, which might be unachievable because of the user’s location. Depending on the severity of symptoms, the user might instead be referred to mindfulness training or resilience training, or escalated to a licensed therapist or board-certified psychiatrist.
All signs indicate that AI is set to become a key driver in lowering barriers of access to advice, services and personalised treatment.
Lowered fear of stigma
Stigma surrounding mental health can act as a strong deterrent to seeking help and speaking out. Some people affected may not wish to discuss their situation with other individuals, including trained professionals, for fear of being judged. In the long run, this can worsen the person’s situation.
Unlike a fellow human, an AI does not necessarily form part of any wider social construct, with its associated cultural norms and expectations. An AI is likely to be perceived as non-judgemental, non-opinionated and, overall, neutral.
The opportunity to confide in an AI system has been within reach for quite some time. ELIZA, a basic NLP program developed in 1966, re-enacted the behaviour and responses of a psychotherapist. An early predecessor of many subsequent chatbots, its creator’s intentions can be seen as particularly aligned with those behind Woebot.
Woebot works in a similar way to an instant messaging app. Created by clinical research psychologist Dr Alison Darcy and integrated on Facebook, Woebot aims to replicate the open ear of a trained professional. It learns about the individual and tailors its questions to their situation through repeated conversations.
Woebot doesn’t tire of lengthy conversation, is always available to listen and, most importantly, it is perceived as non-judgemental, no matter what thoughts and worries the user expresses. In this light, Woebot can contribute to increased well-being by reducing isolation, providing an instant channel of communication, and allowing for anonymous self-expression.
AI may not be a silver bullet for mental health yet, but it has all the indicators of making a significant contribution in the field.
Originally published at www.weforum.org.