We all need some-botty to lean on

Jessica Hoban · Published in Digital Shroud · Apr 19, 2021

Technology is often the antagonist when we discuss mental health; it’s our crutch, our addiction, the blue light in our faces making us blind to the world.

But it doesn’t have to be. The original intent of a crutch was to provide support, not to promote dependency. With some ingenuity (and healthy boundaries), we could get back to that original definition. A recent paper in the IEEE Pervasive Computing journal, titled “A Decade of Ubiquitous Computing Research in Mental Health”, might just show us how.

Background.

Some technology is designed strictly for engagement, for getting its user base to scroll endlessly in a self-perpetuating cycle of insecurity and temporary validation. It treats the user as a commodity rather than a customer. And although a causal link between social media and mental illness has not been established, most of us can relate to this dysfunctional form of symbiosis.

These platforms have become behemoths, monopolizing our time — and yet, a different kind of technology has evolved in their wake. Something more compassionate. Caring. Something that we can bring with us, on us, an ever-present support system for our growing repertoire of mental illnesses.

It’s these kinds of systems that Drs. Jakob Bardram and Aleksandar Matic compiled in their 2019 paper. In total, they identified 46 peer-reviewed technologies designed specifically to address mental illness of varying severity, ranging from clinical diagnoses to general well-being. These technologies are transforming mental healthcare into a continuous, integrated part of our lives, rather than the traditional appointments that we haphazardly schedule (if we schedule them at all).

Bardram and Matic applied three criteria when selecting these systems: 1) they must be wearable or mobile, 2) they must be published in a peer-reviewed paper, and 3) they must be designed either for non-clinical mental well-being or for clinical severe mental illnesses (SMIs).

They searched for a broad scope of SMIs, which fell under 10 general categories:

  • Organic mental disorders (e.g., Alzheimer’s and dementia);
  • Disorders due to substance abuse;
  • Schizophrenia and delusional disorders;
  • Mood disorders (e.g., depression and bipolar disorder);
  • Neurotic or stress-related disorders (e.g., anxiety, PTSD, and phobias);
  • Behavioral disorders (e.g., eating or sleeping disorders);
  • Mental retardation;
  • Psychological development disorders (e.g., autism and ADHD);
  • Adult personality disorders (e.g., borderline personality); and
  • Behavioral and emotional disorders in children or adolescents.

Results.

FOCUS. Of the 46 technologies, 33 of them were clinical in nature (i.e., addressing one of the above categories). The majority of these addressed mood disorders (N = 26), whereas none were designed for behavioral disorders, mental retardation, or disorders in children. This is especially shocking since eating disorders affect at least 9% of the global population, yet they weren’t represented in this study. Why?

As for the other 13 systems, these addressed non-clinical concerns such as emotion, stress, or sleep. One of these systems, called EmotionSense, used a wristband to recognize the wearer’s emotions, activity levels, and interactions with others.

TIMELINE. The authors note two “waves” of publication: the first from 2010 to 2012, the second from 2013 to 2016 (see Fig. 2). By far the most prolific year in this area was 2016, though they don’t speculate as to why.

FEATURES. In terms of the software itself, Figure 1 provides more information. We can see that 85% of these systems collected data, either passively through various sensors or actively through surveys and questionnaires. A third of the systems allowed for clinical assessment, inferring disorder stages, symptoms, and medication compliance from patient-reported outcomes. Slightly fewer (30%) performed some kind of prediction (e.g., mood forecasting for depression), while even fewer (24%) actively intervened and guided the patient in their care. The user interface was also addressed in 35% of systems, such as MOBERO, whose UI was designed for children with ADHD.
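To make “mood forecasting” a little more concrete, here is a minimal sketch of the idea in Python. Everything in it is invented for illustration: the 1–10 mood scale, the made-up week of check-ins, and the recency-weighted average are assumptions of mine, not the method of any system in the survey (real systems combine far richer sensor and self-report features).

```python
# Hypothetical illustration of mood forecasting: predict tomorrow's
# self-reported mood (1-10 scale) from a short history of daily
# check-ins, using a simple recency-weighted moving average.
# The data and weighting scheme are invented, not from the paper.

def forecast_mood(history, window=3):
    """Predict the next mood score as a recency-weighted average
    of the last `window` entries (newer entries count more)."""
    recent = history[-window:]
    # Weights 1, 2, ..., n so the most recent entry weighs most.
    weights = range(1, len(recent) + 1)
    total = sum(w * m for w, m in zip(weights, recent))
    return total / sum(weights)

daily_moods = [6, 5, 4, 4, 3]  # a made-up week of check-ins
print(round(forecast_mood(daily_moods), 2))  # -> 3.5 (downward trend)
```

Even a trivial model like this hints at why prediction is appealing: a sustained downward trend could trigger an early check-in or intervention before a crisis point.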

FLAWS. It seems convenient and sensible to capitalize on the technologies that we already have in abundance. However, the sensors in smartphones and other wearable devices weren’t designed with accuracy as a top priority, leading to data gaps and noisy or imprecise readings. Furthermore, heightened interest in user privacy has restricted access to many of these sensors, a bittersweet trend: in trying to protect you, it also limits the use of your own data.

There is also the problem of reproducibility. Half of the clinical systems had 20 or fewer participants, and half ran for under 2 months. With such small-scale studies, can we really generalize any results to a larger population? When you consider the confounding factors that haven’t been accounted for (age, culture, economic status, etc.), as well as the lack of validation studies (something endemic to computer science), it gives you pause. Without clinical evidence of their efficacy, these technologies will never reach widespread adoption in the healthcare setting.

Systems not mentioned.

This paper aggregated a robust set of peer-reviewed technologies. Here are some that may not have met their criteria, but that are still worth mentioning (at least to me).

SuperBetter. I discovered this tool from the podcast Ologies, during an episode on video games. SuperBetter is an app developed by Dr. Jane McGonigal which essentially gamifies mental health, with alleged benefits including greater emotional control, sense of purpose, self-efficacy, social connectedness, and optimism. The company’s website links a randomized controlled trial studying depression, a clinical trial studying concussion symptoms in teenagers, and two meta-analyses, all of which showed reduced symptoms in their respective samples when using SuperBetter. While this isn’t enough to prove clinical efficacy, it shows the potential of this theory and methodology.

Forest. Forest is an app focused more on reducing screen time. Open the app and plant a virtual tree: it grows while you stay in the app, and wilts if you leave. You accumulate your own “forest” of these trees, a record of your past successes. The company also partners with Trees for the Future, donating money to plant physical trees. The total so far? Over one million.

Reflection.

Here are some of my personal musings on this paper:

  1. While this paper aggregated and categorized these technologies in an informative way, I was still left curious about one thing: the preliminary results of each system. Yes, the sample sizes were too small to be considered “clinical evidence” — among other issues — but I would have liked a general sense of usefulness. For the participants with depression, did any system reduce the frequency or severity of any of their episodes? (Asking for a friend.)
  2. Here’s a particular scenario that I couldn’t get out of my head:

     “Alex experiences panic attacks, so they optimistically agree to try a tool designed to predict those attacks before they happen. This could help me, they think. About a week in, the system informs them of an impending attack. A wave of panic rushes over them, even though they were feeling fine a moment before the alert…”

     Could predictive systems become self-fulfilling prophecies for certain disorders? This is a hypothetical unintended consequence to keep in mind.
  3. The authors mentioned high drop-out rates due to the daily effort of surveys and questionnaires. Of course, when any activity is framed as a task, it becomes tedious and difficult to stick to. Even for simple yes/no answers, just defining your mental state can be deceptively hard. Maybe a new approach could reframe these active data collections as journal prompts, the results of which could then be analyzed with natural language processing techniques to assess overall tone and mood.

     Journaling is a common cognitive behavioral therapy technique, but it only becomes enjoyable when framed as a form of self-expression and release. This way it wouldn’t be on the user to define their mental state, but rather to describe it.
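As a toy sketch of that journaling idea, here is the simplest possible version in Python: scoring a free-text entry against a tiny hand-made sentiment lexicon. The word lists and the example entry are my own invented illustrations; a real system would use a proper NLP library and a clinically validated approach, not a five-word lexicon.

```python
# Toy sketch of analyzing a journal entry for tone. The lexicons are
# invented for illustration only; real systems would use validated
# NLP sentiment models rather than hand-picked word lists.

POSITIVE = {"calm", "happy", "hopeful", "rested", "grateful"}
NEGATIVE = {"anxious", "tired", "sad", "overwhelmed", "panicked"}

def mood_score(entry):
    """Return (score, matched_words): positive minus negative hits,
    normalized by total word count. Score > 0 leans positive."""
    words = [w.strip(".,!?").lower() for w in entry.split()]
    hits = [w for w in words if w in POSITIVE | NEGATIVE]
    score = sum(1 if w in POSITIVE else -1 for w in hits)
    return score / max(len(words), 1), hits

entry = "Felt anxious this morning but calm and hopeful by evening."
score, hits = mood_score(entry)
print(hits, round(score, 2))
```

The appeal of this framing is that the user just writes; the system quietly extracts a signal, so describing a mental state replaces the harder task of defining it.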

Because therapy is expensive, comment your favorite coping mechanism below! Mine is videos of people seeing color for the first time.
