Can Empathy be Bottled in a Digital Health Technology?

Girish Subramanyan
14 min read · Sep 19, 2016


As if having a mental illness weren’t onerous enough in its own right, the subjective experience of living with some of these conditions engenders, almost universally, a desperate need for more human connection than sufferers have at hand. Regrettably, though, this need often cannot be fulfilled by the people in the sufferer’s life. Take depression, for starters. It is arguably the most talked-about mental illness in our society. The experience of severe depression can create a painful, often unbearable sense of hopelessness — or state of joylessness — that makes many a sufferer wish to hit the PAUSE or STOP button on existence. Along the way, before the emergence of these suicidal longings, the afflicted may yearn to give voice to their exquisitely painful suffering. But, finding it either taxing to their relationships or too risky emotionally to test them, they frequently withdraw socially, isolating themselves in the chamber of unceasing pessimism, implacable foreboding and negative rumination that is their hijacked and diseased mind.

Even when one is lucky enough to have friends or family who are willing to listen, talking about it — the miserable mood, the unrelenting eddy of negative thoughts, the hopelessness, the sense of worthlessness, the impossibility of everything — frequently does not make the psychic torment inherent to this condition subside. Not uncommonly — and vexingly enough — opening up about one’s misery also manages to create an emotional burden for those willing to listen, particularly if the depression is long-lasting or not especially responsive to treatment. One technical term for this unfortunate reality is caregiver fatigue. Loved ones can wind up feeling a range of emotions, including helplessness, fear, resentment, and a sense of hopelessness and depression themselves.

At issue are several discrete factors that work in concert to give rise to this phenomenon. One important, but often overlooked, issue is that the majority of Americans do not struggle with clinical depression and, apparently, never will. Understandably, then, it is difficult for most people to genuinely relate to (let alone understand) the experience of someone suffering from this malady of the mind. An even larger percentage of people, presumably, do not know how to “sit” with someone who is significantly depressed, certainly not in a way that is beneficial to the person who is suffering. Moreover, as a society, we are discouraged from sharing our experiences of mental anguish openly with others, particularly with those who do not occupy the inner circle of our close confidants. This seems to be the case, especially, in highly competitive, capitalistic societies in which individualism, the never-ending quest for increasing social power, and financial pressures are pitted against the virtues of social cohesion and group harmony.

Another factor that contributes to caregiver fatigue is that, for most individuals — even those who have lived through depression themselves — it is emotionally trying to keep the company of someone who is significantly depressed. Certainly, while it may be meaningful and important to the relationship to help soothe a friend who is in this situation, I don’t really know anyone who would deem it pleasurable. In these situations, decorum calls for a purposeful shifting away from levity and jocularity — the stuff, perhaps, of friendships in their good times — while true empathy asks you to experience vicariously (or imagine the experience of) the hopelessness that plagues the depressed individual. Yet, in order to remain healthy in the face of another person’s depression, it is vital to possess the ability to recognize one’s own limitations in the matter and to know when to step away mentally from the bleak landscape into which one will inevitably be pulled. And to understand that, in almost all instances, there is a diseased part of the mind at play in the depressed individual that is decidedly impaired in its ability to solve some larger spiritual or psychic problem.

As “sitting” with a depressed person is, for most individuals, neither an inherently pleasurable nor a particularly rewarding activity (meaning, essentially, that one would not want to do more of it without some external source of reward), the process is intrinsically depleting of one’s mental energy. When this is the situation at hand, all of us need ways to recuperate and re-energize. Some individuals, through no fault of their own perhaps, have a strikingly low tolerance for the negative emotions of others. These individuals tend to pull away from the depressed person at the first hint of something being awry, at least until she has recovered enough to give semi-pleasurable company again.

All of this raises the question: where is someone with this sort of mental illness — the kind that is excruciatingly difficult to bear all by oneself — to find the connection and empathy that he craves? Naturally, licensed mental health clinicians are a start. But, in no model of treatment that I’m aware of (with the exception, perhaps, of very high-end residential treatment facilities) is a patient afforded the luxury of spending nearly 100% of his time with trained clinicians. The suffering continues, nonetheless, in between treatment sessions or before medications kick in.

Hence, one deficiency in the current system of care here in the U.S. is the sheer lack of availability of adequate empathy, understanding and connection that relieves, even if only for a moment, the suffering of those with this sort of mental illness.

Naturally, as someone who is enthusiastic about the possibility that digital health technologies could fill gaps in our current system of mental health care delivery, I have wondered about how such technologies could mimic the empathy and understanding that trained mental health clinicians routinely provide to their patients. If such technologies could be developed and refined, it would allow for the possibility that patients could be relieved of their loneliness and sense of isolation, among other things; and, if they were evolved enough, the technologies may even have the capability of helping patients feel understood and cared about.

Empathy then could be bottled, as it were, and delivered equitably to those who suffer from mental illnesses, no matter which stratum of society one occupies. Although the idea may strike some as implausibly futuristic, downright crazy or even creepy, I’m of the opinion that if the notion can be imagined as iterative advancements of existing technologies, then there’s a decent chance that somewhere down the line, it could be actualized. Digital empathy, in other words, need not remain in the realm of wishful fantasy; it could be reified with the right entrepreneurial talent and willpower.

Indeed, the idea of a digital empathic “being” has already been captured in a fashion and introduced to the masses in recent popular cinema. In the 2013 movie “Her” by Spike Jonze, the protagonist, Theodore, played by Joaquin Phoenix, falls in love with his intelligent “operating system”, a “being” of sorts who personifies herself with the name Samantha. This is not your clumsy, prototypical or robotic Siri we are talking about. This intelligent technology mimics human relationships in incredibly complex and nuanced ways. The technology evolves with you, remembers the important and not-so-important details of your life, learns over time who you are, and reciprocates emotionally.

An intelligent technology that could serve as a virtual empathic being would, in addition, be able to imbue its speech with emotional inflection, as a skilled therapist would. And most importantly, perhaps, it would be impervious to taking things personally or feeling frustrated, a quality which would render it unmistakably superhuman, of course, but desirable in the context that we are talking about. You could imagine that the mental health version of this intelligent technology would be something the equivalent of an indefatigable friend, “someone” who, while attempting to mimic a human being’s humanness, would also have a kind of perfect invulnerability to caregiver fatigue, an ideal, ever-available, ever-understanding, ever-reassuring “being” who acts as a sort of Ben Gay for your emotional pain.

Sounds too good to be true? Well, perhaps, right at this moment, it is. But, the commitment by large and influential technology corporations to developing and refining artificial intelligence technologies applicable to human-computer interaction (e.g. natural language processing) is robust and reassuring. Take, for example, Facebook’s DeepText and DeepMind’s (Google’s) WaveNet projects. According to Facebook, DeepText is an artificial intelligence technology that allows for near-human comprehension of (human-generated) text. WaveNet, on the other hand, is an artificial intelligence technology that allows for near-human speech production by a computer. The coalescing of these two technologies (i.e. human language understanding and speech production) along with others (e.g. technologies that comprehend human emotional states through audiovisual channels, technologies that comprehend human psychology and motivation, technologies that store and comprehend relevant relationship narratives, etc.) will allow for empathic digital “beings” to be born to assist in everyday life.

Alas, we are not there yet. What we have instead at this time are online chat rooms and support group forums that bring together peers who ostensibly suffer from similar conditions and, on occasion, trained mental health clinicians who oversee these virtual venues. Additionally, there are online and offline “listener” programs that connect those distressed by their mental illnesses to other human beings, so that the sufferer is able to communicate synchronously or “on demand”. Perhaps the most professionally developed of the former is Big White Wall, a UK-based company that has expanded its service stateside.

Big White Wall offers a 24/7 accessible sounding board where the mentally anguished can talk about their suffering anonymously. You can, for example, start a “talkabout” to explain your situation and symptoms to others and to ask for help. A “wall guide”, a master’s level clinician, will almost always post a response fairly quickly, to thank you, firstly, for sharing your experience and to comment, secondly, on the specifics of your post. Others who are logged into Big White Wall, either through a browser or through the mobile application, can review your post and jump into the conversation, so to speak, to provide understanding, compassion and even advice. Although these individuals are not your real life friends, the experience of having other human beings hear about, “sit” with, and relate to your particular struggles provides some relief from the stark isolation you may otherwise feel. You can also, of course, respond to others’ posts, creating for yourself and other members new emotional connections through digitally-mediated communication. In the best case scenario, a sense of community can be achieved, simply by sharing, hearing out and helping others on the platform.

On Big White Wall, it is also possible to “friend” the anonymous folks you meet through the service, to have private conversations. These could be your kindred spirits in mental suffering, fellow journeyers looking for a way out of the netherworld of illness. When your therapist is not available and you need someone to open up to before your next peer support group, starting a talkabout, following up on one, or having a private conversation with anonymous friends on Big White Wall can be invaluable. Indeed, one testimonial blurb on the U.S. site reads: “Anonymity keeps me safe, so I’m able to tell the world how I’m feeling, and not be a burden to those closest to me.”

In essence, you can think of the service as an online peer support network that also has other useful features. On Big White Wall, you can also create “bricks”, for example, personal artwork essentially that aims to express your inner state. Or, you can comment on other people’s bricks, or take standardized assessments to track your depression over time. The company also utilizes intelligent analytic technology to detect worsening clinical status and suicidality amongst its users and is able to convey this information to a parent organization (e.g. Kaiser Permanente, if the client is a patient in the Kaiser Permanente HMO).
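To make the idea of such analytics concrete, here is a deliberately naive sketch of how a platform might flag posts for clinician review. This is purely illustrative: the phrase list, threshold logic and routing are my own hypothetical inventions, not Big White Wall’s actual (and surely far more sophisticated, validated) technology.

```python
# Hypothetical sketch of post-level risk flagging for clinician review.
# The RISK_TERMS list and the flag_post() helper are invented for
# illustration; they do not describe any real product's method.
RISK_TERMS = {"hopeless", "worthless", "suicide", "end it all", "can't go on"}

def flag_post(text: str) -> bool:
    """Return True if the post contains any high-risk phrase."""
    lowered = text.lower()
    return any(term in lowered for term in RISK_TERMS)

# Posts that trip the flag could be queued for a wall guide to review.
posts = ["I feel hopeless today", "Had a decent walk this morning"]
review_queue = [p for p in posts if flag_post(p)]
```

A production system would, of course, rely on trained models, clinical validation and careful escalation protocols rather than keyword matching; the sketch only conveys the general shape of the pipeline: score each post, then route flagged ones to a human.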

None of these concepts is new, of course. The Internet already had its share of thriving online forums for the melancholic prior to the advent of Big White Wall. Some that come to mind are: Depression Forums, Depression Understood, Depression Sanctuary, Beating the Beast and various Facebook groups dedicated to depression. Heck, even Reddit has a thriving depression support group. However, what distinguishes Big White Wall from the majority of these forums is the presence of trained clinicians overseeing the community. Moreover, the service is professionally developed and designed and offers a range of features that are not typically found on online mental health forums, such as self-assessment tools, educational materials on various mental illnesses, video courses and intervention in the case of detected suicidality. It also appears to be the case that the company offers tele-behavioral health services through its platform.

The main drawback to Big White Wall, in my opinion, is that it is not available to all comers. The company only contracts with large healthcare providers, insurers and HMOs here in the U.S. Thus, whether you have the means or not, you may not be able to access their services. In fact, at this time, you most likely will not be able to. (On the other hand, if you reside in England, you would have a much better chance of being able to access Big White Wall through the National Health Service (NHS), for free. The service is also available for free to university students at select universities in the U.K. It is free for all U.K. military personnel, veterans and their families. You can also opt for a subscription service with Big White Wall in the U.K. for £25 per month.)

Enter 7 Cups of Tea and Docz, two free services which provide alternatives to Big White Wall. 7 Cups of Tea is unique among all three of these services in that it provides an active listener program. Thus, if you are in need of communicating with someone straight away about your problems or distress and need the undivided attention of a single person, 7 Cups of Tea allows you to access any number of active listeners who happen to be logged into their site for synchronous, on demand communication. (On Big White Wall and Docz, you can post messages, but you cannot really carry on a conversation in real time). The active listeners on 7 Cups of Tea are not trained clinicians, certainly, but they do receive training in active listening and give you the opportunity to communicate with a human being on demand. Moreover, their performance is rated by members who have experienced their active listening skills, so you can determine who among them is generally doing a good job.

Like Big White Wall, 7 Cups of Tea also has forums in which you can post messages and expect supportive responses from other members. These forums, however, are not overseen by mental health professionals. As with Big White Wall, there is a real sense of community in the forums on 7 Cups of Tea.

7 Cups of Tea also has live chat rooms, another feature that distinguishes it from Big White Wall and Docz. These are categorized by illness, identity and age, among other things. There are group support rooms for people identifying with anxiety, depression, family problems, relationship problems, and LGBTQ issues, for example. So, if it suits your fancy, you can jump into an active conversation with people who identify with one of these categories. These chat rooms can have any number of users engaged at any particular time. While participating in such a chat room may give you a sense of connection to others who may have similar issues, it may not be the ideal setting to receive individualized attention to your specific needs.

Finally, if you are in need of more professional help and you do not have a local therapist, 7 Cups of Tea allows you to access online “therapy” with a licensed mental health professional. Unlike other tele-behavioral health platforms, this service matches you to a licensed clinician to engage in text message-based therapy. According to the site, you are allowed to communicate with your therapist any number of times per day, but the therapist with whom you are matched will only communicate with you once per day, Monday-Friday.

Docz, the second of these Big White Wall alternatives, is a free mobile application that provides some of the functions available on Big White Wall. Unlike Big White Wall, however, client communications occur exclusively through the mobile application; there is no option to use the services with a laptop or desktop via browser. Similar to Big White Wall, Docz offers the equivalent of online peer support, connecting individuals experiencing mental illness with others who are experiencing similar situations. However, a major limitation of Docz is that you cannot respond to people’s responses to your post, to keep a conversational thread going. Rather, you post a message or question and receive individual responses from people who may have a similar background or experience to yours. You can then thank them for their responses, but there is no way to write an individualized response back to them or offer additional information about your situation that may invite further responses. Thus, in my opinion, it is more difficult to achieve a cohesive sense of community on Docz than it is on Big White Wall or 7 Cups of Tea. Additionally, as there is no way to type posts or responses using a browser-based interface on a desktop or laptop, it is difficult on Docz to compose long, detailed messages that might offer more of your story.

Fortunately, the Docz application does allow you to carry on private chats with other members with whom you may wish to stay in touch. This feature is also available on Big White Wall and creates the impression of an online friendship, which may be invaluable to some for their recovery.

Other startups have entered this space (digitally-mediated peer support), including Pacifica Labs, out of San Francisco, and Quartet Health, headquartered in New York City. Pacifica Labs offers a smartphone application that primarily delivers meditation and cognitive behavioral therapy-based strategies to combat stress, anxiety and worry. However, there are also online communities with various foci that can be accessed through the application, as well as group chat. Quartet Health’s solution is more comprehensive in scope. Patients can participate in online treatment programs, access educational materials for various diagnoses, receive peer support and communicate with psychiatrists and therapists securely through its platform. However, the platform also allows clinicians to communicate with one another. Primary care physicians can refer patients to behavioral health providers through this solution. They can also, apparently, receive curbside consults from psychiatrists. Through Quartet’s solution, behavioral health providers can receive referrals, collaborate on treatment planning with primary care physicians, and access practice management features such as scheduling and billing.

For the time being then, it appears to be the case that entrepreneurial technologists are developing solutions to allow human beings suffering from mental illnesses to be more accessible to one another and to have access, in real time, to trained active listeners, all in the hopes of delivering empathy. This is a logical step in the evolution of digital mental health that uses existing technologies to democratize the delivery of empathy. However, one is hard pressed to deem any of it truly innovative or groundbreaking.

My sense is that it will take some time for artificial intelligence technologies, deep learning, psychology and the science of human-computer interaction to intersect in such a way as to create the experience of an empathic virtual companion. And when they do intersect, the result will likely cater to the needs of the general consumer, initially, before the novel technology is refined and repurposed as a digital mental health tool. When this does occur, the world might have an opportunity to finally dispense empathy equitably and virtually to those who suffer from mental illnesses. Until then, we will still need human intermediaries to listen and to speak compassionately to those amongst us who suffer from mental illnesses.


Girish Subramanyan

Psychiatrist. Hypothesist. Digital Mental Health Enthusiast