Where Tech Meets and Misses the Goals of Therapy

Therapist Marcus Brittain Fleming discusses the digital turn in mental health care

Iretiolu Akinrinade
Data & Society: Points
May 17, 2023



As people increasingly turn to digital mental health services like online therapy, their interactions with these platforms yield reams of data. Some mental health technology companies use a portion of that data to match patients with practitioners or to improve the clinical encounter. Meanwhile, to develop targeted treatment strategies, some researchers are using mobile device data to identify behavioral markers of mental health states — what they are calling digital phenotypes. But while improving or personalizing mental health is a worthy goal, it is only one of many potential uses of this cognitive-behavioral data, and not all of those uses are beneficial to patients or attractive to therapists.

Broadly speaking, therapists have not shown eagerness to access “better data” about their patients’ lived experience. Yet the promise of naturalistic data and computational therapy continues to hold allure. In this series of “micro-interviews,” we’ll explore the applications and sites of cognitive-behavioral assessment technologies, and whether they make sense. Does the proclaimed value or promise of improved care offset the unwieldy nature of collecting more and more data about individuals?

In this first installment, I talk with Marcus Brittain Fleming, the psychotherapist founder of Bandwidth Care, which describes itself as “a therapy practice and think tank that uses psychotherapy and digital activism to support individuals, families, and organizations with 21st century mental health needs.” Over the past few years, he has also performed virtual and in-person therapy at a community-based clinic in Brooklyn. I first encountered Fleming’s therapeutic orientation in an online workshop on digital harm reduction and data healing hosted by Neema Githere in 2021. The workshop brought together users and wellbeing practitioners to discuss life with technologies and strategies for identifying and evading psychologically coercive and harmful design. After connecting in the Zoom chat, Fleming and I spoke about how therapists understand patients’ experiences with technology, the violence of pathologizing mental health, and his own use of technology in therapy.

Therapists like Fleming make it their work to take a human-first approach to the delivery of care, valuing the relationship between two people and the patient’s knowledge of themself — things that can’t be meaningfully aided by AI and machine learning at the point of care. But technology has proved to be a useful tool in some of his therapy sessions, too, as seen in his use of games like Minecraft and Roblox with some younger clients. For this series, I was interested to hear more from him about how digital technology and categorization typically sit within therapeutic practice, and what he’s doing differently.

Iretiolu Akinrinade: Bandwidth Care works with clients to address how the time they spend online and in digital spaces impacts their lives and their health. Where did the idea for this practice come from? What kinds of tech-related mental health needs do you see, or expect to see?

Marcus Brittain Fleming: While in training, I was taught to meet psychotherapy clients “where they are at”: If someone is anxious, empathize with that anxiety. If someone hates their job but doesn’t want to leave it, sit with them in that ambivalence. If someone collects baseball cards, show interest in their favorites. This therapy MO is the most important ingredient in developing a close-knit therapeutic bond. In short, without rapport there is no treatment.

With this in mind, I started my on-the-job training. But instead of hearing about baseball card collectors and people wanting to leave their jobs, I heard about people investing in cryptocurrency, worrying about which selfie to post, and spiraling after seeing violent content on their Instagram feeds. With the average amount of screen time exceeding eight hours a day for many of us, “meeting people where they are at” has turned digital.

Despite this shift, I’ve noticed that many mental health workers have not turned digital with our clients. This makes sense, as we were never formally taught how to deal with 21st-century clients (avatars, usernames, and personal brands) in fully digital environments like Zoom. I was never trained to ask people about their tech use. I wasn’t taught about how different platforms affect our mental health differently. No one told me about doomscrolling. Instead, I was left in the dark and advised to tell people to stop using social media if it bothered them. I was fed the same pop psychology warnings that most of us receive: Social media is toxic. Smartphones are ruining our brains.

Bandwidth Care seeks to meet people where they are at now, in today’s world. Half nascent psychotherapy practice, half think tank, my practice is all about developing new therapeutic tools and research for modern-day mental health needs. As therapists, we can no longer meet people where they’re at if we don’t ask them about how they use Twitter, Discord, VRChat, Coinbase, and Candy Crush. So we shift our focus.

Akinrinade: Can tech-related mental health triggers have tech-related solutions? How would you imagine those working in conjunction with therapy?

Fleming: I’m immediately wary when I hear about people trying to “solve” mental health “problems” or “triggers” with technological tools. Any kind of techno-solutionism suggests a quick cure, and that contradicts my work as a psychotherapist, which involves a long process. This said, I’m trying to keep more of an open mind about technological tools that help people cope with mental health needs on a daily basis. My line in the sand is clear: if a wellness app or wearable device helps my clients and does not harm others, I’ll affirm their motivation to use technology to support themselves.

I think the best app for mental health continues to be any notes app. As depicted in many an Instagram photo dump, lots of us use our notes apps to chronicle the daily flotsam and jetsam of thoughts and feelings. This is great! Journaling about thoughts, feelings, and body sensations continues to be the number one suggestion I give to my clients. A little self-awareness can go a long way. And not only are our devices more available to us nowadays than pens and paper, notes apps also add the utility of being able to post links, copy messages from online text, and paste photos. With a little psychoeducation and dedication, your notes app can quickly become a garden for daily self-care cultivation and psychological revelation.

Akinrinade: You’ve talked about people grappling with their interactions with devices and apps. At the same time, people and their data are being studied by those devices and platforms. For marketing and user experience design purposes, this data is understood as user analytics, but I’m interested in it specifically as a human-centered, personalized research strategy. What interests or epistemological concerns do you have about building digital phenotypes for common mental illnesses?

Fleming: My problem with using digital phenotyping to diagnose, develop therapies, and provide advertisements for people with mental illness has less to do with the violence of surveillance technocapitalism and more to do with the violence of pathologizing mental health in general. The idea of a “mental illness” is something to balk at, in my opinion. This being said, because many of us are acculturated in American traditions of modern psychiatry and psychotherapy nudged by white cis men in the mid 1800s (Freud, Dr. Benjamin Rush, to name a few), we spend our lives seeing diagnoses as facts. And while many of us find solace in knowing that we have moderate depressive disorder (F33.1) or generalized anxiety disorder (F41.1) — because then our problems exist outside of us in ways we can solve — mental health is less about problems that need to be fixed and more about our emotional needs that haven’t been met. Truth is, diagnoses are just made-up categories based on observed symptoms: “symptoms” that are forged in the same oppressive fires that had a major role in setting diagnostic standards in the first place. America’s 400-year history of colonization and racial/gender/class-based violence is at the root of so much of our collective trauma, which sadly now includes widespread psychiatric trauma rooted in a long history of biased and violent mental health care.

And who is currently defining categories and observing the symptoms, you ask? Today the American Psychiatric Association’s (APA) 36,000 members are still mostly white and male. And though much is being done to try to acknowledge the APA’s violent past, systemic discrimination and unchecked bias continue to deeply affect who is labeled “mentally ill” and who isn’t. My main concern with digital phenotyping for mental illness is that it will be used by tech companies to perpetuate and further legitimize the harms of diagnosing under the guise of supporting public mental health. Before building any technology to support mental health, we need to unravel the centuries of algorithmic injustice that has been done in the mental health field.

For more information and to connect with Bandwidth Care, visit bandwidth.care.
