Chatbots: Leveraging Artificial Intelligence to Improve Mental Health Services

Zoe Louise
Trends in Data Science
9 min read · May 21, 2020

Why Innovate?

“One in two Australians will experience a mental health disorder during their lifetime. Less than half of those who do will receive appropriate support and treatment” (Hosie et al., 2014).

Ready access to mental health services is essential for individual and community well-being. Unfortunately, for many, such services are out of reach. The three main challenges to providing appropriate support are availability, affordability and acceptability. Meeting these challenges requires innovation in the sector, and a number of mental health websites, apps and chatbots have been developed to address these barriers. However, current solutions lack personalisation and suffer high churn rates. There is an opportunity to improve existing online mental health services via an advanced chatbot that provides highly personalised support. Developing such a platform raises a number of data-related difficulties, including privacy, collection and analysis. Once these issues have been addressed, a highly evolved mental health chatbot could remove many of the obstacles that currently prevent widespread access to mental health services, with significant positive flow-on effects for individuals and the broader community.

Background

Challenges to the effective provision of mental health services

The mental health industry presently faces a myriad of obstacles to ready access, the most significant being:

a) availability;

b) affordability; and

c) acceptability (Corscadden et al., 2019).

Availability refers to whether services can be accessed by patients in terms of both geographical location and timing. For example, rural and remote areas have fewer mental health practitioners per capita than cities, yet the prevalence of mental health disorders there is higher (National Rural Health Alliance Inc., 2017). Long wait lists for appointments can also mean individuals do not receive treatment when they need it, increasing the severity of their conditions.

Affordability refers to whether the patient can pay for the services. In a comprehensive study of the mental health of young people in Australia (see Figure 1 below), cost was identified as the greatest barrier to prospective clients pursuing mental health services (Sawyer et al., 2000).

Acceptability refers to the stigma associated with having a mental health disorder and whether that inhibits an individual from seeking help (Corscadden et al., 2019). This issue is compounded in small communities where confidentiality is more difficult to maintain (National Rural Health Alliance Inc., 2017). Mental health services are drawing on the strengths of other disciplines, such as technology and data science, to address these challenges.

Figure 1: Barriers to obtaining help (extracted from Sawyer et al., 2000)

Current technological interventions

Internet-based platforms and native mobile applications have been developed to combat the challenges identified above. These are essentially self-guided online treatments that aim to alleviate the symptoms of a range of mental health conditions (Truschel, 2020).

Whilst these platforms have had some success in treating mental health conditions, individuals tend to discontinue use before completing their treatment (Karyotaki et al., 2017). Data science is being leveraged to improve the efficacy of self-guided online treatments.

The use of chatbots in the mental health sector

What are chatbots?

Automated conversational agents or ‘chatbots’ are being developed to provide mental health support to individuals. Chatbots are computer programs that are built with the aid of Artificial Intelligence (AI) to simulate human conversation with a user (Khan and Das, 2017).

Chatbots have the potential to be more successful than platforms that do not adopt AI: they can ask relevant questions, remember past conversations to provide a sense of continuity, and more readily help the individual with the issues confronting them at the time of the interaction (Kretzschmar et al., 2019).
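To make the continuity point concrete, the sketch below (in Python, with entirely hypothetical names) shows the simplest possible version of conversational memory: the bot records what a session was about and refers back to it next time. A real service would persist this state securely rather than in memory.

```python
# A toy illustration of conversational continuity. The storage and
# function names are hypothetical, not taken from any existing chatbot.
session_memory = {}

def end_session(user_id, topic):
    """Record what the conversation focused on, for next time."""
    session_memory[user_id] = topic

def greet(user_id):
    """Open a session, referring back to the previous topic if known."""
    last_topic = session_memory.get(user_id)
    if last_topic:
        return f"Welcome back. Last time we talked about {last_topic}. How has that been?"
    return "Hi, I'm here to listen. What's on your mind today?"

end_session("user-42", "trouble sleeping")
print(greet("user-42"))  # -> references "trouble sleeping"
```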

A number of chatbots have been developed to support mental well-being. However, according to Kretzschmar et al. (2019), “as of yet, chatbots cannot grasp the nuances of users’ life history and current circumstances that may be at the root of mental health difficulties.” Their study analysed three popular mental health chatbots and found that conversations with patients offered only a limited level of personalisation. Improved use of AI could address this issue.

Chatbots address many of the obstacles preventing widespread access to mental health services, including affordability, availability and acceptability. They are available to anyone with an internet connection and a computer or smartphone, they have the potential to be cost effective for the user, and they can be used privately, without fear of judgement. However, the successful integration of chatbots into the mental health sector also raises unique challenges that will need to be resolved before their impact can be fully realised.

Chatbot privacy issues: protecting patients

Ensuring user privacy and engendering trust from individuals is critical to the success of chatbots for use in mental health services, where extremely sensitive personal and medical information is being communicated. A perceived lack of privacy already deters some individuals from attending face-to-face services. Mental health services will need to ensure that the use of chatbots does not compromise an individual’s confidentiality and that patients are made aware of the level of privacy protection that is being afforded to them.

There are a number of actions chatbot developers could take to make users feel comfortable using their service. Firstly, the chatbot’s terms of use must include an explicit privacy policy so that the user understands how, and whether, their data will be used; the user should also be given meaningful control to opt in or out of any confidentiality terms. Secondly, any personal information kept by the chatbot should be stored securely. Thirdly, the content of the user’s conversations with the chatbot should be de-identified to preserve their privacy (Kretzschmar et al., 2019).
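As an illustration of the third point, the following is a minimal Python sketch of de-identification using pattern matching. The patterns are illustrative assumptions only; a production system would use a clinical NLP pipeline with named-entity recognition rather than a handful of regular expressions.

```python
import re

# Hypothetical patterns for a minimal de-identification pass over
# conversation transcripts. These only catch obvious identifiers.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b(?:\+?61|0)[\d\s-]{8,12}\b"),
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def de_identify(message: str) -> str:
    """Replace obvious personal identifiers with placeholder tokens."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[{label}]", message)
    return message

print(de_identify("Call me on 0412 345 678 or email jane@example.com"))
# -> "Call me on [PHONE] or email [EMAIL]"
```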

Data collection issues: data availability

The chatbot must be trained from an initial data set. However, conversations that are used to train the chatbot are likely to have been conducted in confidence. Individuals must consent to their conversations being used for chatbot training, before a development team is able to access and use them. Even in circumstances where patients are informed that their data will be de-identified, they may remain unwilling to provide this consent.

In order to circumvent the issue of limited data availability, a chatbot could be trained on a set of questions and answers written by experienced mental health practitioners. However, this could affect the quality of the data, as practitioners’ responses may not reflect those of ‘real’ mental health patients.
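To make this practitioner-seeded approach concrete, here is a minimal retrieval-style sketch in Python using TF-IDF similarity from scikit-learn. The question/answer pairs are invented for illustration; a real chatbot would need far more data and far more careful response design.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical practitioner-authored question/answer pairs used as seed data.
qa_pairs = [
    ("I can't sleep and I feel anxious all the time",
     "Poor sleep and anxiety often go together. A regular wind-down routine can help."),
    ("I don't enjoy things I used to enjoy",
     "Loss of interest is worth discussing. When did you first notice this?"),
    ("I feel fine, just stressed at work",
     "Work stress is very common. What parts of your work feel most overwhelming?"),
]

questions = [q for q, _ in qa_pairs]
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)

def respond(user_message: str) -> str:
    """Return the answer whose seed question best matches the user's message."""
    sims = cosine_similarity(vectorizer.transform([user_message]), question_vectors)
    return qa_pairs[sims.argmax()][1]

print(respond("lately I'm anxious and barely sleeping"))
```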

Data collection issues: volume of data

In order to improve on existing online services, chatbots will need to be able to respond appropriately to a vast number of questions that are asked in a range of different ways. They must also be able to pose relevant questions at opportune times. All of these questions and answers combined represent a large amount of data that needs to be collected and programmed into the chatbot in order for it to have a meaningful, personalised conversation with an individual. It would be difficult to predict all of the potential questions and answers on a topic as diverse as an individual’s mental health.
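One common way to reduce this burden, sketched below under the assumption that a pre-trained sentence-embedding model (here, from the sentence-transformers library) is acceptable, is to map different phrasings of the same question into a shared vector space, so that one trained intent covers many surface wordings rather than each wording being collected separately.

```python
from sentence_transformers import SentenceTransformer, util

# A small pre-trained model; any sentence-embedding model would work here.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Three phrasings of what is essentially the same question.
paraphrases = [
    "How do I stop worrying so much?",
    "I can't switch off my anxious thoughts",
    "What helps with constant worry?",
]
embeddings = model.encode(paraphrases, convert_to_tensor=True)

# Pairwise cosine similarities: high scores mean one trained intent
# can cover many wordings, shrinking the data-collection problem.
print(util.cos_sim(embeddings, embeddings))
```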

Chatbot performance analysis issues

Analysing the effectiveness of a chatbot, both before it goes live and while it is live, is a critical task. A patient’s decision to address their mental health issues can be a difficult one, even if the patient’s chosen therapeutic method is an anonymous chatbot. Therefore, if the chatbot provides information that is unhelpful, irrelevant or trivialises the individual’s emotions or condition, the individual may be reluctant to seek help again.

Testing must be carried out on a representative sample of the intended user group. If the chatbot is not successful in alleviating symptoms, the data analyst must determine whether this resulted from poor chatbot performance or from another factor, such as the user not engaging with the chatbot frequently enough. The analyst must also account for the fact that results from a test group may be skewed: participants may be more willing to use the chatbot, and may be preconditioned to respond in a certain way to its questions, compared with non-test groups.
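As a hypothetical illustration of this kind of analysis, the sketch below compares symptom scores for the same pilot users before and after a trial period, using a paired t-test from SciPy. The scores are invented, and, as noted above, even a significant result would not by itself isolate the chatbot as the cause of the change.

```python
from scipy import stats

# Invented symptom scores (e.g., on a standard questionnaire such as the
# PHQ-9) for ten pilot users, before and after a trial period with the chatbot.
before = [14, 18, 11, 16, 13, 19, 15, 12, 17, 14]
after  = [11, 15, 12, 12, 10, 17, 13, 11, 14, 12]

# Paired t-test: did scores change significantly within the same users?
t_stat, p_value = stats.ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```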

Without high levels of confidence that the chatbot will be successful, it is difficult for a mental health service to decide when it is safe to release such an application. Existing mental health chatbots deal with this issue by encouraging users to seek support from trained mental health professionals, and by building in functions that recognise emergencies when certain keywords are entered by users (Kretzschmar et al., 2019).
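The keyword-based escalation described by Kretzschmar et al. (2019) could, at its very simplest, look like the Python sketch below. The keyword list and message wording are illustrative only; real services combine detection like this with much richer models and clinical oversight.

```python
from typing import Optional

# A deliberately simple sketch of keyword-based emergency escalation.
# The keyword set is illustrative, not taken from any existing chatbot.
CRISIS_KEYWORDS = {"suicide", "kill myself", "self-harm", "overdose"}

HELPLINE_MESSAGE = (
    "It sounds like you may be in distress. Please consider calling "
    "Lifeline on 13 11 14, or 000 if you are in immediate danger."
)

def check_for_crisis(message: str) -> Optional[str]:
    """Return an escalation message if any crisis keyword appears."""
    text = message.lower()
    if any(keyword in text for keyword in CRISIS_KEYWORDS):
        return HELPLINE_MESSAGE
    return None
```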

Impact of chatbots on mental health services

The impact of chatbots on mental health services could be far reaching. Chatbots allow individuals to overcome barriers such as availability, affordability and acceptability and, unlike mental health platforms that don’t leverage AI, they have the potential to provide a highly personalised experience, improving the likelihood of an individual continuing a course of treatment and alleviating their mental health symptoms.

Availability

Chatbots are available to individuals who have an internet connection and a computer or smartphone. Unlike face-to-face therapy, there is no wait list to secure an appointment, and the chatbot can be consulted at any hour of the day. Improved availability leads to earlier intervention, lessening the impact that mental health disorders have on the economy through reduced productivity and absenteeism (Australian Government, 2010). By providing care to people with less severe conditions, mental health chatbots also have the potential to reduce the strain on hospitals and practitioners, freeing up availability for more critical cases.

Acceptability

Chatbots can be accessed in private, allowing those who would ordinarily be uncomfortable visiting a mental health service in person to seek support. Chatbots have the potential to provide even more effective treatment than a face-to-face practitioner, since they provide an anonymous, judgement-free space for the individual to discuss their condition; a recent study found that “interviewing with an automated virtual human makes participants more willing to disclose” (Lucas et al., 2014).

Affordability

The Commonwealth Department of Health has funded the development of self-guided online mental health services in the past, and these are available free of charge to all Australians (Moodgym, n.d.). Chatbots, depending on the organisation developing them, could be available to users for free or at very low cost. If virtual therapists reach a high level of effectiveness, they could even drive down the price of face-to-face therapy, as practitioners compete with their AI counterparts. Better access to treatment, whether online or in person, could alleviate the economic impact of mental ill-health, which is estimated to cost the Australian economy AUD 5.9 billion per year (OECD, 2014).

Conclusion

Mental health services face many challenges in being able to provide adequate support to Australians. Online mental health platforms that attempt to address these challenges will benefit from data science led innovation in the form of highly evolved chatbots. If properly trained and tested, these chatbots could have significant positive impacts on individuals, the mental health industry and the economy.

References

Australian Government, 2010. A national health and hospitals network for Australia’s future delivering better health and better hospitals. Commonwealth of Australia, Capital Hill, ACT.

Corscadden, L., Callander, E.J., Topp, S.M., 2019. Disparities in access to health care in Australia for people with mental health conditions. Aust. Health Rev. 43, 619–627. https://doi.org/10.1071/AH17259

Hosie, A., Vogl, G., Hoddinott, J., Carden, J., Comeau, Y., 2014. Crossroads: rethinking the Australian mental health system [WWW Document]. URL https://apo.org.au/node/38336 (accessed 4.22.20).

Karyotaki, E., Riper, H., Twisk, J., Hoogendoorn, A., Kleiboer, A., Mira, A., Mackinnon, A., Meyer, B., Botella, C., Littlewood, E., Andersson, G., Christensen, H., Klein, J.P., Schröder, J., Bretón-López, J., Scheider, J., Griffiths, K., Farrer, L., Huibers, M.J.H., Phillips, R., Gilbody, S., Moritz, S., Berger, T., Pop, V., Spek, V., Cuijpers, P., 2017. Efficacy of Self-guided Internet-Based Cognitive Behavioral Therapy in the Treatment of Depressive Symptoms: A Meta-analysis of Individual Participant Data. JAMA Psychiatry 74, 351–359. https://doi.org/10.1001/jamapsychiatry.2017.0044

Khan, R., Das, A., 2017. Build Better Chatbots: A Complete Guide to Getting Started with Chatbots [WWW Document]. 1 Introd. Chatbots. URL https://learning.oreilly.com/library/view/build-better-chatbots/9781484231111/A446794_1_En_1_Chapter.html (accessed 4.17.20).

Kretzschmar, K., Tyroll, H., Pavarini, G., Manzini, A., Singh, I., 2019. Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support. Biomed. Inform. Insights 11, 1178222619829083. https://doi.org/10.1177/1178222619829083

Lucas, G.M., Gratch, J., King, A., Morency, L.-P., 2014. It’s only a computer: Virtual humans increase willingness to disclose. Comput. Hum. Behav. 37, 94–100. https://doi.org/10.1016/j.chb.2014.04.043

Matsuzaki, T., 2018. Ethical Issues of Artificial Intelligence in Medicine. Calif. West. Law Rev. 55, 255–274.

Moodgym, n.d. Moodgym Frequently Asked Questions [WWW Document]. URL https://moodgym.com.au/info/faq (accessed 4.16.20).

National Rural Health Alliance Inc., 2017. Mental Health in Rural and Remote Australia [WWW Document]. URL https://www.ruralhealth.org.au/sites/default/files/publications/nrha-mental-health-factsheet-dec-2017.pdf (accessed 4.16.20).

OECD, 2014. Australia at the forefront of mental health care innovation but should remain attentive to population needs, says OECD.

Sawyer, M.G., Arney, F.M., Baghurst, P.A., Clark, J.J., Graetz, B.W., Kosky, R.J., Nurcombe, B., Patton, G.C., Prior, M.R., Raphael, B., Rey, J., Whaites, L.C., Zubrick, S.R., 2000. The mental health of young people in Australia. https://doi.org/10.1037/e676582010-001

Truschel, J., 2020. Top 25 Mental Health Apps for 2020: An Alternative to Therapy? [WWW Document]. URL https://www.psycom.net/25-best-mental-health-apps (accessed 4.16.20).
