Psychoinformatics & Digital Phenotyping: ethical complexities of emerging digital surveillance practices

Navigating a new era of digital surveillance for psychiatric insight generation

Eden Shaveet
Jun 5, 2023 · 13 min read
Made by Eden Shaveet on Canva

Digital Surveillance in Health Contexts

Digital surveillance methodologies, broadly defined as the use of digital technology to collect, store, process, and instrumentalize information about an individual, population, entity, or phenomenon [1, 2], have become increasingly popular in public and personal health care contexts [3, 4].

In the realm of public health, the use of digital syndromic surveillance is often referred to as “infoveillance,” an important aspect of the emerging field of infodemiology [5]. Notably, infodemiological studies leveraging infoveillance have demonstrated how aggregated search engine trends and free-text social media posts can be analyzed to predict the emergence and spread of communicable diseases [6]. By combining these methods with other forms of disease monitoring (like wastewater surveillance), public health agencies have been able to prepare and respond more effectively to potential disease outbreaks, including those caused by highly infectious strains of influenza and SARS-CoV-2 (the type of coronavirus responsible for COVID-19) [7–9].
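
To make the analytic idea concrete, here is a minimal sketch of an infoveillance-style comparison between a weekly search-interest series and reported case counts at several lags. The numbers, series, and choice of lags are illustrative assumptions, not data or methods from the studies cited above.

```python
# Toy infoveillance sketch: does search interest lead reported cases?
# All values are made up for illustration; real analyses would pull query
# volumes from a trends service and case counts from a health agency.
from statistics import mean

search_interest = [12, 18, 25, 40, 38, 30, 22, 15]  # hypothetical weekly query volume
reported_cases = [3, 5, 9, 14, 22, 21, 16, 10]      # hypothetical weekly case counts

def pearson(x, y):
    """Pearson correlation coefficient for two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sd_x = sum((a - mx) ** 2 for a in x) ** 0.5
    sd_y = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Shift the search series forward by 0-2 weeks to see whether it leads cases.
for lag in range(3):
    r = pearson(search_interest[: len(search_interest) - lag], reported_cases[lag:])
    print(f"lag = {lag} week(s): r = {r:.2f}")
```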

While digital public health surveillance techniques focus on analyzing digitally-sourced information in the aggregate to gain community- or population-level insights, personal digital health surveillance strategies are geared towards generating insights specific to an individual [10], including psychiatric insights.

Within the formal healthcare system, digital surveillance typically comes under the purview of remote patient monitoring [11]. This involves the use of mobile medical devices to gather data that were traditionally only obtainable in clinical settings, such as electrocardiogram readings, respiration rate, blood pressure, blood glucose levels, and neural system activity. In clinical psychiatric research contexts, remote, contact-based biosensors have been used to monitor heart rate, breath carbon monoxide and alcohol levels, chest expansion, hand movement, electrodermal activity, and skin temperature [12]. These data, combined with self-reported experiences and outcomes, are believed to constitute “behavioral biomarkers” indicative of psychiatric disorders [12].
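
As a rough illustration of how such streams might be bundled before analysis, the sketch below defines a single observation record that pairs a few of the biosensor readings named above with a self-reported mood rating. The field names, units, and rating scale are assumptions made for illustration, not the schema of any cited study.

```python
# Illustrative record combining remote biosensor readings with a self-report;
# field names, units, and the 1-5 mood scale are assumptions, not a real schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RemoteObservation:
    participant_id: str
    recorded_at: datetime
    heart_rate_bpm: float              # contact-based biosensor
    electrodermal_activity_us: float   # skin conductance, microsiemens
    skin_temp_c: float                 # skin temperature, Celsius
    breath_co_ppm: float               # breath carbon monoxide, parts per million
    self_reported_mood: int            # e.g., 1 (low) to 5 (high)

obs = RemoteObservation(
    participant_id="P-001",
    recorded_at=datetime(2023, 6, 5, 9, 30),
    heart_rate_bpm=72.0,
    electrodermal_activity_us=4.1,
    skin_temp_c=33.2,
    breath_co_ppm=3.0,
    self_reported_mood=4,
)
print(obs)
```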

To be used for clinical purposes in the U.S., these devices must be cleared or approved by the U.S. Food and Drug Administration (FDA) [13], and the data they collect and generate are considered protected health information (PHI) under the U.S. Health Insurance Portability and Accountability Act (HIPAA) [14].

There is a growing trend, however, towards using biometrics collected via personal devices for health purposes, including the use of commercial wearable technology like smartwatches and mobile health applications that are not affiliated with a healthcare entity covered under HIPAA, and therefore not subject to the same federal protections [15]. Similarly, it’s been suggested that user-generated content on digitally connected platforms (such as social media) may be used to infer an individual’s health status [16].

Alongside this conjecture is the idea that user-generated data on commercial platforms may be used to infer a person’s psychological state, which has given rise to a field known as psychoinformatics.

Psychoinformatics & Digital Phenotyping

Psychoinformatics is a subfield of generalized health informatics centered around the collection, organization, and synthesis of user-generated data to glean information about a person’s psychological characteristics and traits [17]. An interdisciplinary domain between computer science and psychology, psychoinformatics embraces a variety of digital surveillance methodologies enabled by the emergence of large amounts of user-generated data, collected both actively and passively [17, 18]. One of the most prominent and controversial techniques within the realm of psychoinformatics is digital phenotyping [19].

Digital phenotyping refers to the collection, quantification, and analysis of human interactions with personal devices to inform psychiatric diagnosis or treatment [20]. One example of digital phenotyping involves tracking changes in individuals’ typing speed and frequency of spelling errors on their smartphones, which its creators suggest can indicate changes in mood or cognitive ability [21]. Other examples include the use of biometric data from wearable technology to monitor sleep patterns, physical activity, and mobility in attempts to predict various mental health outcomes [22, 23]. Another example is the use of textual analysis of social media data to identify language markers predictive of mental health conditions [24], drawing from prior work suggesting that individuals diagnosed with depression tend to use more first-person pronouns in their writing [25].
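
To illustrate just how thin these proxy metrics can be, the sketch below computes two of them in toy form: typing speed and correction rate from keystroke timestamps, and the share of first-person-singular pronouns in a snippet of text. These are simplified stand-ins for the published approaches cited above, not reproductions of them.

```python
# Toy digital-phenotyping proxy metrics; simplified for illustration only.
import re

def keystroke_features(timestamps_s, corrections):
    """Keys per second and correction rate from one typing session.
    `timestamps_s`: key-press times in seconds; `corrections`: backspace count."""
    n_keys = len(timestamps_s)
    duration = timestamps_s[-1] - timestamps_s[0]
    return {
        "keys_per_second": n_keys / duration if duration > 0 else 0.0,
        "correction_rate": corrections / n_keys if n_keys else 0.0,
    }

def first_person_rate(text):
    """Share of tokens that are first-person-singular pronouns."""
    tokens = re.findall(r"[a-z']+", text.lower())
    fp = {"i", "me", "my", "mine", "myself"}
    return sum(t in fp for t in tokens) / len(tokens) if tokens else 0.0

print(keystroke_features([0.0, 0.4, 0.9, 1.5, 2.6], corrections=2))
print(f"{first_person_rate('I feel like I let myself down again.'):.2f}")
```

The point of the toy is how little these numbers carry on their own: a handful of decontextualized metrics is being asked to stand in for someone’s inner state, which is exactly the concern raised below.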

Some academics have raised questions regarding the scientific validity, reliability, and implications of using digital phenotyping as an approach outside of controlled research contexts [9]. Furthermore, some community activists and scholars have focused on the ethical dilemmas posed by such approaches in psychiatric contexts [26, 27]. Specifically, concerns have been raised about digital privacy and surveillance, as well as the policy landscape that permits the use of these data for commercial purposes.

Complexities, Considerations, and Concerns

Health Data Protections

The level of protection afforded to personal health data collected by commercial mobile health entities and applications can be confusing when compared with the protections applied to data collected by hospitals and other healthcare settings. The difference stems from a critical distinction between information held by healthcare entities, such as hospitals, and information held by other commercial entities, like mobile health applications and wearable technology companies.

Protected health information (PHI) is defined specifically within HIPAA and is subject to federal protections enforced by the U.S. Department of Health and Human Services (HHS) through its Office for Civil Rights (OCR) [28]. This definition of PHI only applies to data within the stewardship of healthcare entities covered under HIPAA, which must meet a specific set of criteria.

Mobile health companies are often not covered entities under HIPAA and therefore the data within their stewardship do not constitute PHI [15]. In the U.S., the expanded health data ecosystem includes both data held by covered healthcare entities, which are subject to HIPAA protections, as well as commercial health data, which are not entitled to the same level of protections. Consequently, digital phenotyping techniques used outside of research settings can currently be implemented more easily outside the jurisdiction of the HHS. Nevertheless, large-scale clinical trials of digital phenotyping methods for psychiatric diagnosis and treatment are being conducted for clinical validation and applications [29].

Passive Data Collection

Passive digital surveillance often takes place for commercial purposes without an individual’s full awareness or their informed, explicit consent to digital monitoring [30]. For public data (such as posts on a public forum or social media platform), aggregation at an appropriate spatial or numeric threshold may offer some sense of user anonymity; the use of publicly available data becomes far more ethically questionable when digital surveillance is employed to obtain insights about specific individuals.
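
For a sense of what thresholded aggregation looks like in practice, the sketch below applies a simple small-cell suppression rule, withholding any group smaller than a minimum size before counts are reported. The threshold of 10, the field names, and the example records are illustrative assumptions, not a regulatory standard.

```python
# Minimal small-cell suppression sketch: groups below a minimum size are
# withheld so that published aggregates are less likely to single people out.
from collections import Counter

MIN_CELL_SIZE = 10  # illustrative threshold, not a standard

def aggregate_with_suppression(records, key):
    """Count records per group, withholding groups smaller than MIN_CELL_SIZE."""
    counts = Counter(r[key] for r in records)
    return {
        group: (count if count >= MIN_CELL_SIZE else "suppressed")
        for group, count in counts.items()
    }

# Hypothetical records keyed by 3-digit ZIP prefix.
records = [{"zip3": "021"}] * 14 + [{"zip3": "935"}] * 3
print(aggregate_with_suppression(records, "zip3"))
# -> {'021': 14, '935': 'suppressed'}
```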

Data that are not publicly available, but are within the ownership and stewardship of a commercial entity, such as a mobile mental health application, can be utilized by the company for any purpose they see fit [31], including making health-related inferences based on digital phenotyping approaches and selling these insights to interested parties.

A recent study out of Duke University revealed that data brokers in the open data market are legally selling sensitive data related to individuals’ mental health [32]. These data include both self-reported information and “predicted ailments” based on other collected metrics from mobile health apps and personal wearable technology (i.e., “digital phenotypes”). The study also revealed minimal vetting of buyers and limited regulation of how the purchased data are used. This implies that, in addition to data already collected about individuals from their mobile devices, such as demographic information, real-time GPS locations, and projected political ideologies [33], “digital phenotypes” and other relevant data could be utilized for targeted advertising or opinion manipulation, or shared with other parties who stand to benefit from knowing individuals’ and populations’ perceived psychiatric conditions. The prospect of selling unsound diagnostic insights based on decontextualized interactions with technology should prompt concern, particularly in light of the extensive history of psychiatric diagnoses being exploited for malicious purposes in social contexts.

Psychoinformatics in Context

Psychiatry is a dynamic field that continues to evolve through research and practice, which are vital to enhancing the discipline’s ability to support individuals experiencing mental and behavioral health challenges. In recent years, a variety of perspectives and approaches have enriched the discipline, deepening our understanding of the complex interplay between biological, psychological, and social factors that shape mental health. Unfortunately, there persists a historically-rooted tendency to misuse pathological concepts in social contexts.

Throughout history, psychiatric practices and diagnoses have been influenced by the prevailing attitudes of their time, which have sometimes led to the use of psychiatric diagnoses as a means to justify discrimination under the privilege of medical expertise [34, 35].

For instance, Hysteria was long believed to be a gynecological ailment affecting female people who experienced emotional symptoms that could not be explained by another medical condition [36]. Diagnoses of Hysteria were often used to justify the idea that women and female people were irrational, unreasonable, and unstable, and therefore incapable of participating fully in society [37]. Homosexuality was considered a psychiatric disorder for many years, and the diagnosis was used to justify discrimination against LGBTQ+ individuals, such as denying them access to employment, housing, and healthcare [35]. And in the 1960s, baseless diagnoses of Schizophrenia were used to justify the detention of Civil Rights Movement activists and to invalidate their claims to civil liberties and advocacy for social change [38].

Though many psychiatric diagnoses have since been revised or removed from the Diagnostic and Statistical Manual of Mental Disorders (DSM) owing to changes in societal attitudes and improved scientific understanding of emotional distress [39, 40], the field of psychiatry’s legacy of pathology-guised discrimination remains.

Efforts to reshape the mental health system have made it clear that person-centered approaches, rooted in an understanding of complex trauma, are needed to understand and support individuals’ well-being [41]. Relying excessively on decontextualized digitally-sourced data for individual-level psychiatric insights, diagnoses, and treatment decisions contradicts these efforts by prioritizing computational algorithms over individualized, trauma-centered approaches facilitated by other human beings.

In keeping with the field of psychiatry’s long history of misattributing people’s complex relationships with their environments to psychiatric disorders, digital phenotyping has the potential to oversimplify a person’s interactions with technology and inappropriately attribute them to mental health conditions, particularly when important contextual variables are ignored. Digital phenotyping may be only the latest chapter in the field’s long struggle to capture the complexity of real-world situations in neat taxonomies.

A Way Forward

As our world becomes more technologically mediated and dependent, it is crucial that we stay watchful for possible misuses of digital tools that may take advantage of policy gaps for commercial gain and suffer from limited understanding of diverse human experiences, such as those related to mental health and emotional distress.

While the appeal of digital phenotyping is its ability to aid in diagnosis by more rapidly identifying and categorizing human behavior using technology, this is done via proxy metrics that may create a sense of pseudo-objectivity without due consideration of contextual variation and the complexities of human-device interaction.

The notion that we can rely on devices to expedite the identification of disorders using standardized proxy metrics is contrary to the work of clinicians, community activists, and other respected experts who have emphasized the need to embrace holistic approaches to mental health assessment that recognize the intricate and diverse nature of human experiences, particularly when it comes to comprehending complex emotional trauma, which can manifest in various ways contingent on context [42–44].

Although technology can play a valuable role in the mental health space, it is essential to approach its integration ethically and with caution. This involves advocating for more robust data protection regulations and carefully considering the data management policies and practices of commercial health applications before using them. Additionally, it is crucial to prioritize the perspectives of those who have firsthand experience with mental and emotional health challenges when developing digital mental health projects and products. This includes uplifting computational citizen science and participatory projects in the mental health space, and creating more opportunities for individuals with lived experience in the psychiatric system to take part in, lead, and audit such initiatives.

By adopting a critical perspective on personal surveillance technology and placing a stronger emphasis on prioritizing the viewpoints of individuals with firsthand experience in the psychiatric system, we can mitigate the negative effects of disseminating detached digital representations in clinical and commercial environments.

Suggested citation: Shaveet, E. (2023) Psychoinformatics & Digital Phenotyping: ethical complexities of emerging digital surveillance practices. Medium [Personal Blog].

References

1. Wang V, Tucker JV. ‘I am not a number’: Conceptualising identity in digital surveillance. Technology in Society 2021 Nov 1;67:101772. doi: 10.1016/j.techsoc.2021.101772

2. Cover R. Digital Surveillance, Archives, and Google Earth: Identities in/of the Digital World. Digital Identities. Elsevier; 2016. p. 243–265. doi: 10.1016/B978-0-12-420083-8.00008-0

3. Shakeri Hossein Abad Z, Kline A, Sultana M, Noaeen M, Nurmambetova E, Lucini F, Al-Jefri M, Lee J. Digital public health surveillance: a systematic scoping review. npj Digit Med 2021 Mar 3;4(1):41. doi: 10.1038/s41746-021-00407-6

4. Gupta A, Katarya R. Social media based surveillance systems for healthcare using machine learning: A systematic review. Journal of Biomedical Informatics 2020 Aug;108:103500. doi: 10.1016/j.jbi.2020.103500

5. Mavragani A. Infodemiology and Infoveillance: Scoping Review. J Med Internet Res 2020 Apr 28;22(4):e16206. PMID:32310818

6. Beckhaus J, Becher H, Belau MH. The use and applicability of Internet search queries for infectious disease surveillance in low- to middle-income countries. One Health Implement Res 2022; doi: 10.20517/ohir.2022.01

7. Alsudias L, Rayson P. Social Media Monitoring of the COVID-19 Pandemic and Influenza Epidemic With Adaptation for Informal Language in Arabic Twitter Data: Qualitative Study. JMIR Med Inform 2021 Sep 17;9(9):e27670. doi: 10.2196/27670

8. Hadi TA, Fleshler K. Integrating Social Media Monitoring Into Public Health Emergency Response Operations. Disaster med public health prep 2016 Oct;10(5):775–780. doi: 10.1017/dmp.2016.39

9. Santillana M, Nguyen AT, Dredze M, Paul MJ, Nsoesie EO, Brownstein JS. Combining Search, Social Media, and Traditional Data Sources to Improve Influenza Surveillance. Salathé M, editor. PLoS Comput Biol 2015 Oct 29;11(10):e1004513. doi: 10.1371/journal.pcbi.1004513

10. Chén OY, Roberts B. Personalized Health Care and Public Health in the Digital Age. Front Digit Health 2021 Mar 30;3:595704. PMID:34713084

11. Farias FAC de, Dagostini CM, Bicca Y de A, Falavigna VF, Falavigna A. Remote Patient Monitoring: A Systematic Review. Telemedicine and e-Health 2020 May 1;26(5):576–583. doi: 10.1089/tmj.2019.0066

12. Adams ZW, McClure EA, Gray KM, Danielson CK, Treiber FA, Ruggiero KJ. Mobile devices for the remote acquisition of physiological and behavioral biomarkers in psychiatric clinical research. Journal of Psychiatric Research 2017 Feb 1;85:1–14. doi: 10.1016/j.jpsychires.2016.10.019

13. Center for Devices and Radiological Health. Device Approvals, Denials and Clearances. FDA; 2022. Available from: https://www.fda.gov/medical-devices/products-and-medical-procedures/device-approvals-denials-and-clearances [accessed Mar 10, 2023]

14. Getting started with telehealth | Telehealth.HHS.gov. Available from: https://telehealth.hhs.gov/providers/getting-started [accessed Mar 10, 2023]

15. Marlowe S. LibGuides: Medical and Health Data Privacy: HIPAA and Beyond: Health Data Not Covered by HIPAA. Available from: https://fclawlib.libguides.com/HIPAA/notHIPPAA [accessed Mar 10, 2023]

16. Merchant RM, Asch DA, Crutchley P, Ungar LH, Guntuku SC, Eichstaedt JC, Hill S, Padrez K, Smith RJ, Schwartz HA. Evaluating the predictability of medical conditions from social media posts. Ramagopalan SV, editor. PLoS ONE 2019 Jun 17;14(6):e0215476. doi: 10.1371/journal.pone.0215476

17. Yarkoni T. Psychoinformatics: New Horizons at the Interface of the Psychological and Computing Sciences. Curr Dir Psychol Sci 2012 Dec;21(6):391–397. doi: 10.1177/0963721412457362

18. Jacobson NC, Summers B, Wilhelm S. Digital Biomarkers of Social Anxiety Severity: Digital Phenotyping Using Passive Smartphone Sensors. J Med Internet Res 2020 May 29;22(5):e16875. PMID:32348284

19. Onnela J-P, Rauch SL. Harnessing Smartphone-Based Digital Phenotyping to Enhance Behavioral and Mental Health. Neuropsychopharmacol 2016 Jun;41(7):1691–1696. doi: 10.1038/npp.2016.7

20. Huckvale K, Venkatesh S, Christensen H. Toward clinical digital phenotyping: a timely opportunity to consider purpose, quality, and safety. npj Digit Med 2019 Sep 6;2(1):88. doi: 10.1038/s41746-019-0166-1

21. Zulueta J, Piscitello A, Rasic M, Easter R, Babu P, Langenecker SA, McInnis M, Ajilore O, Nelson PC, Ryan K, Leow A. Predicting Mood Disturbance Severity with Mobile Phone Keystroke Metadata: A BiAffect Digital Phenotyping Study. J Med Internet Res 2018 Jul 20;20(7):e241. doi: 10.2196/jmir.9775

22. Dlima SD, Shevade S, Menezes SR, Ganju A. Digital Phenotyping in Health Using Machine Learning Approaches: Scoping Review. JMIR Bioinformatics and Biotechnology 2022 Jul 18;3(1):e39618. doi: 10.2196/39618

23. Abdullah S, Choudhury T. Sensing Technologies for Monitoring Serious Mental Illnesses. IEEE MultiMedia 2018 Jan;25(1):61–75. doi: 10.1109/MMUL.2018.011921236

24. Liu T, Ungar LH, Curtis B, Sherman G, Yadeta K, Tay L, Eichstaedt JC, Guntuku SC. Head versus heart: social media reveals differential language of loneliness from depression. npj Mental Health Res 2022 Oct 18;1(1):16. doi: 10.1038/s44184-022-00014-7

25. Rude S, Gortner E-M, Pennebaker J. Language use of depressed and depression-vulnerable college students. Cognition & Emotion 2004 Dec;18(8):1121–1133. doi: 10.1080/02699930441000030

26. Logan J. Regulations Needed to Protect Privacy and Autonomy from Digitalized Psychiatric Tools. Mad In America. 2021. Available from: https://www.madinamerica.com/2021/09/regulations-needed-to-protect-privacy-and-autonomy-from-digitalized-psychiatric-tools/ [accessed Mar 10, 2023]

27. Digital phenotyping: A revolution or a privacy breach? Available from: https://distilgovhealth.com/2019/01/22/digital-phenotyping-a-revolution-or-a-privacy-breach/ [accessed Mar 10, 2023]

28. Office for Civil Rights (OCR). Summary of the HIPAA Privacy Rule. HHS.gov. 2008. Available from: https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html [accessed Mar 10, 2023]

29. A Study to Evaluate Smartphone-based Digital Phenotyping for Relapse Prediction in Alcohol-associated Liver Disease. Mayo Clinic. Available from: https://www.mayo.edu/research/clinical-trials/cls-20516355 [accessed Mar 10, 2023]

30. Macnish K. Informed Consent. Center for the Governance of Change, IE University; 2019. Available from: www.ie.edu/cgc/research/data-privacy-individual

31. Vold K, Whittlestone J. Privacy, Autonomy and Personalised Targeting. Data, Privacy, and the Individual. Madrid: Center for the Governance of Change, IE University; 2019. Available from: www.ie.edu/cgc/research/data-privacy-individual

32. Kim J. Data Brokers and the Sale of Americans’ Mental Health Data: The Exchange of Our Most Sensitive Data and What It Means for Personal Privacy. Duke University Sanford School of Public Policy; 2023 Feb. Available from: https://archive.is/4K03p [accessed Mar 13, 2023]

33. Sherman J. Data Brokers and Sensitive Data on U.S. Individuals. Duke University Sanford School of Public Policy; 2021.

34. Shim RS. Dismantling Structural Racism in Psychiatry: A Path to Mental Health Equity. AJP 2021 Jul;178(7):592–598. doi: 10.1176/appi.ajp.2021.21060558

35. McHenry SE. “Gay Is Good”: History of Homosexuality in the DSM and Modern Psychiatry. American Journal of Psychiatry Residents’ Journal 2022 Sep 8;18(1):4–5. doi: 10.1176/appi.ajp-rj.2022.180103

36. Tasca C, Rapetti M, Carta MG, Fadda B. Women and hysteria in the history of mental health. Clin Pract Epidemiol Ment Health 2012;8:110–119. PMID:23115576

37. Bankey R. La Donna é Mobile: Constructing the irrational woman. Gender, Place & Culture 2001 Mar 1;8(1):37–54. doi: 10.1080/09663690120026316

38. Bell CC. Review of: Metzl JM. The Protest Psychosis: How Schizophrenia Became a Black Disease. Boston: Beacon Press; 2009. Psychiatric Services 2011 Aug;62(8):979–980. doi: 10.1176/ps.62.8.pss6208_0979a

39. Kuhl EA, Kupfer DJ, Regier DA. Patient-Centered Revisions to the DSM-5. AMA Journal of Ethics 2011 Dec 1;13(12):873–879. doi: 10.1001/virtualmentor.2011.13.12.stas1-1112

40. Clegg JW. Teaching about mental health and illness through the history of the DSM. History of Psychology 2012;15:364–370. doi: 10.1037/a0027249

41. Ranjbar N, Erb M, Mohammad O, Moreno FA. Trauma-Informed Care and Cultural Humility in the Mental Health Care of People From Minoritized Communities. FOC 2020 Jan;18(1):8–15. doi: 10.1176/appi.focus.20190027

42. Mirmont Treatment Center. What is a holistic approach to mental health? Main Line Health. Published August 2, 2021. https://www.mainlinehealth.org/blog/what-is-a-holistic-approach-to-mental-health

43. Conner KO. Why Historical Trauma Is Critical to Understanding Black Mental Health. https://www.psychologytoday.com/us/blog/achieving-health-equity/202010/why-historical-trauma-is-critical-understanding-black-mental

44. Levy R. Beyond the Buzzwords: What Does Trauma-Informed Care Truly Mean? Mad In America. Published May 20, 2020. https://www.madinamerica.com/2020/05/beyond-buzzwords-trauma-informed-care/


Eden Shaveet

Eden is a Bridge to PhD Scholar in the Computer Science Dept. at Columbia. She holds an MS in health informatics & analytics from Tufts.