AI in Counseling & Spiritual Care

Predictive models are transforming mental health. They might also monitor and shape our moral lives.

by J. Nathan Matias, Lydia Manikonda, Scott Hale, and Kenneth Arnold

This post is the fifth in a series of short introductions to artificial intelligence designed for group discussion in non-technical Christian settings. To follow the series, sign up for our email list, hosted by the Oxford Pastorate.

He heals the brokenhearted and binds up their wounds. He determines the number of the stars; he gives to all of them their names. Great is our Lord, and abundant in power; his understanding is beyond measure. — Psalm 147:3–4

In Psalm 147, the work of God to heal the brokenhearted is placed next to His mathematical and astronomical wonders. Experimental technologies are already applying those mathematical wonders to God’s work of healing in the world. AI is transforming care for people’s mental and spiritual well-being through (a) large-scale data collection about people’s intimate thoughts and actions, (b) technologies designed to interpret people’s emotional and spiritual conditions, and (c) systems that coordinate counseling and emergency response.

In the 1950s, the Anglican priest Chad Varah applied telephone technology to mental health to create The Samaritans, the world’s first crisis hotline. AI systems are now taking on parts of this work.

On social media and in health care systems, machine learning models are being developed to detect mental health risks, including suicide and depression, and to coordinate care. In March 2017, the social network Facebook introduced an AI system that monitors its users to assess each person's risk of suicide from what they write, what they say, and how their friends respond. When the AI ranks the risk high enough, it alerts the person's friends and encourages the person to seek help or to talk with friends. We do not yet have evidence that these systems save lives. Hospitals in the United States are also testing AI that predicts a person's risk of suicide as much as a year in advance, based on the person's prior health record. These predictions grow more accurate as the event approaches, peaking about seven days before a suicide attempt.

Text-based crisis hotlines have prototyped systems that infer a person’s needs and route them to the most relevant counseling service. AI is also transforming care itself, reducing the costs of mental health counseling by intelligently assembling personalised therapies such as Cognitive Behavioral Therapy from the activity of online volunteers and automated systems.

In the next few years, we can expect predictive models to become a basic part of mental health services. In the longer term, machine learning systems may become trusted to coordinate and deliver ongoing care, expanding the reach of individual counselors and broadening the role for moderately-trained peer supporters. As new projects attempt to measure spiritual well-being by tracking people’s mobile phones and communications, we may see parallel efforts to monitor and intervene on a wide range of spiritual and moral issues.

  • What theologies of privacy and accountability could guide an era where our intimate spiritual, emotional, and prayer lives are observed and judged by automated systems?
  • What might Christian teaching on personal transformation offer a society whose members increasingly relate to one another through predictive models?
  • Just as Christians have pioneered crisis telephone lines, how can Christian ministries use AI systems to heal the brokenhearted?

References

Kelion, L., Lee, D., & Cellan-Jones, R. (2017, March 1). Facebook artificial intelligence spots suicidal users. BBC News. Retrieved from http://www.bbc.com/news/technology-39126027

Cep, C. N. (2014, August 5). Big Data for the Spirit. The New Yorker. Retrieved from http://www.newyorker.com/tech/elements/big-data-spirit

De Choudhury, M., Counts, S., Horvitz, E. J., & Hoff, A. (2014). Characterizing and Predicting Postpartum Depression from Shared Facebook Data. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 626–638). New York, NY, USA: ACM. https://doi.org/10.1145/2531602.2531675

Dinakar, K., Chen, J., Lieberman, H., Picard, R., & Filbin, R. (2015). Mixed-initiative real-time topic modeling & visualization for crisis counseling. In Proceedings of the 20th International Conference on Intelligent User Interfaces (pp. 417–426). New York, NY, USA: ACM. Retrieved from http://dl.acm.org/citation.cfm?id=2701395

Molteni, M. (2017, March 17). Facebook’s AI Is Learning to Predict and Prevent Suicide. WIRED. Retrieved from https://www.wired.com/2017/03/artificial-intelligence-learning-predict-prevent-suicide/

Morris, R. R., Schueller, S. M., & Picard, R. W. (2015). Efficacy of a Web-Based, Crowdsourced Peer-To-Peer Cognitive Reappraisal Platform for Depression: Randomized Controlled Trial. Journal of Medical Internet Research, 17(3), e72. https://doi.org/10.2196/jmir.4167

Picard, R. W. (1997). Affective Computing. Cambridge, MA: MIT Press. Retrieved from http://www.cell.com/trends/cognitive-sciences/pdf/S1364-6613(98)01190-5.pdf

Walsh, C. G., Ribeiro, J. D., & Franklin, J. C. (2017). Predicting Risk of Suicide Attempts Over Time Through Machine Learning. Clinical Psychological Science, 5(3), 457–469. https://doi.org/10.1177/2167702617691560