Prathyusha Chilagani
Jan 28 · 4 min read

Over the past few years, there has been an increasingly aggressive effort to establish a relationship between AI and Mental Health. At the moment, most approaches are scattered, without a specific direction; very few groups are focusing on areas that effectively converge toward the right objectives.

Photo by Possessed Photography

The main problem in this discipline, as opposed to other disciplines in medicine, is the subjectivity of its treatments. Most medical disciplines are objective in that they require hard skills to solve problems [1]. Take Oncology: it requires end-to-end deterministic approaches to treatment. While the discipline of Mental Health is rigorous and its literature as strict as it can be, the nature of its treatments is abstract in that they depend heavily on human-human interactions. The methods used by clinicians and psychiatrists may be theoretically defined (cognitive re-attribution etc.), but the exercising of these is highly subjective to the patient, the type of illness, and so on. As a logical extension of this understanding, automated tech interventions shouldn't be deployed (yet, and maybe for many years to come) at a level that replaces human-human interaction.

Let's get a sense of how expansive this subjectivity is. Take a patient who is suffering from OCD. His profile will be unique: his coping mechanisms could vary drastically in strength from others', the disorder could have been triggered by a very specific life event, its degree could fall anywhere on the spectrum from mild to dysfunctional, or it could have been left undiagnosed for a duration that changes the treatment approach. Trying to plug in a tool built by processing a corpus of data (most times largely irrelevant and biased) to help him in day-to-day recovery is a groundless approach, and one with dire consequences. A chilling example: GPT-3 telling a fake patient to kill himself [2].

Photo by Ehimetalor Akhere Unuabona

Isolate one mental illness, and there is further meticulous classification to be done, depending on variables so complex and so inter-dependent that quantifying them is far from achievable.

While humans (in this case clinicians, psychiatrists, even friends and the like), with their empathy and reasoning, are the crux of solving the problem, they are limited by their capacity for computation and for the objective classification of large volumes of data. They cannot, for example, compare volumes of their notes across patients and isolate similarities, dissimilarities, and outliers to streamline their process and make better decisions. They cannot combine data from various parallels (tests, notes, biometric data etc.), map them, and model them for analysis. Technology, that way, can be a huge aid in augmenting the efforts of these professionals.
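The kind of cross-patient note comparison described above could, in principle, be sketched very simply. The following is a minimal, illustrative sketch (not a clinical tool): it uses plain bag-of-words vectors and cosine similarity over invented, anonymised note snippets, and flags the note least similar to the rest as a candidate outlier for a clinician's review. All patient IDs and text are hypothetical.

```python
import math
from collections import Counter

def vectorise(text):
    """Bag-of-words frequency vector for one note."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-frequency vectors."""
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical, anonymised note snippets (illustrative only)
notes = {
    "patient_a": "intrusive thoughts checking rituals anxiety checking",
    "patient_b": "checking rituals compulsions intrusive thoughts",
    "patient_c": "low mood sleep disturbance appetite loss",
}

vectors = {pid: vectorise(text) for pid, text in notes.items()}

# Mean similarity of each patient's note to every other note;
# the lowest score flags a potential outlier worth a clinician's review.
for pid, vec in vectors.items():
    others = [cosine(vec, v) for q, v in vectors.items() if q != pid]
    print(pid, round(sum(others) / len(others), 2))
```

A real system would need far richer representations than word counts, but the point stands: this kind of bulk comparison is trivial for software and impossible for a human reading hundreds of files.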

So much is possible computationally: analysis of counsellors' written text via NLP; mood paucity studies with certain markers, by probing video data (computer vision) and audio data; tracking activities on social media; stamping biometric data against these or any other; temporal inspection of a combination of them. It would be good to interface these systematically, all while staying true to pre-defined value-addition strategies.
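The "temporal inspection of a combination of these" mentioned above is, at its core, an alignment problem: different signals arrive at different times and must be placed on a common timeline before any joint analysis. A minimal sketch, with entirely hypothetical data (heart-rate readings and a text-derived mood score), might look like this:

```python
from datetime import date
from collections import defaultdict

# Hypothetical streams from different sources: (day, value) pairs.
heart_rate = [(date(2021, 1, 1), 72), (date(2021, 1, 1), 80), (date(2021, 1, 2), 90)]
mood_score = [(date(2021, 1, 1), 0.6), (date(2021, 1, 2), 0.2)]

def daily_mean(stream):
    """Bucket a timestamped stream by day and average each bucket."""
    buckets = defaultdict(list)
    for day, value in stream:
        buckets[day].append(value)
    return {day: sum(vals) / len(vals) for day, vals in buckets.items()}

hr, mood = daily_mean(heart_rate), daily_mean(mood_score)

# Align the two modalities on shared days for temporal inspection.
for day in sorted(hr.keys() & mood.keys()):
    print(day, hr[day], mood[day])
```

Once aligned this way, the paired series can be fed into whatever downstream analysis the pre-defined strategy calls for.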

Research shows that mental states (happiness, sadness, anxiety) and mental traits (resilience, optimism, social engagement, which are directly proportional to mental health and longevity) are interlinked with cardiovascular, physiological, and cancer outcomes, among others. Experts have expressed a massive need for behaviour-based digital phenotypes [1]. (Phenotype is a term used in genetics for the composite observable characteristics or traits of an organism; it covers the organism's morphology or physical form and structure, its developmental processes, its biochemical and physiological properties, its behaviour, and the products of behaviour [3].)

It is important to chalk out feedback loops with specificity, so that the observations made are more in-context and qualitatively contributive.

For example, let's consider dementia. To improve clinical practices, psychosocial functioning needs to be assessed across patients who suffer from it differently. Structured activity monitoring, audiotapes of qualitative interviews, sleep data, EMR, blood-based biomarkers etc. could be sources of data. These could be fed into appropriate machine learning systems for pattern recognition [1].
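To make "pattern recognition" concrete: once each patient is summarised as a feature vector drawn from those data sources, even a basic clustering method can surface groups that function differently. Below is a toy k-means sketch on invented per-patient features (sleep hours, daily steps in thousands, a biomarker level); every number and patient ID is hypothetical, and a real study would use validated features and a proper ML library.

```python
import math

# Hypothetical per-patient features: (sleep_hours, daily_steps_k, biomarker_level)
patients = {
    "p1": (7.5, 5.0, 0.8),
    "p2": (7.0, 4.5, 0.9),
    "p3": (4.0, 1.0, 2.1),
    "p4": (4.5, 1.2, 2.3),
}

def kmeans(points, centroids, steps=10):
    """Tiny k-means: assign each point to its nearest centroid, then re-centre."""
    for _ in range(steps):
        clusters = {c: [] for c in range(len(centroids))}
        for p in points:
            nearest = min(range(len(centroids)), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(x) / len(pts) for x in zip(*pts)) if pts else centroids[c]
            for c, pts in clusters.items()
        ]
    return centroids, clusters

centroids, clusters = kmeans(list(patients.values()), [(7.0, 5.0, 1.0), (4.0, 1.0, 2.0)])
print(clusters)
```

Here the toy data splits cleanly into a "sleeping and active" group and a "low-sleep, high-biomarker" group; in practice the value would come from letting such structure emerge from data too large for a clinician to eyeball.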

There could be many more such parallels in play, depending on requirements.

It is more important in the Mental Health space than in any other that experts from various parallels (tech, psychiatry, engineering, pharmacy, genetics, management etc.) work in congruence to sketch these processes and define problem statements that help them establish closed loops. Such streamlining is imperative and should be the first of the steps that lay out procedures for effective research and method placement, enabling early detection, prevention, effective diagnosis, and expedited treatment, all scalable and growing in reach. It is important not to place technologies where they don't fit in the Mental Health space: where sentience and reasoning are required. It is important not to place them within ambiguous problem statements, lest they lead to calamitous consequences: ill deductions, costing lives.





Quantum Voice

Digging deeper into the world of AI.