The Ethics of Digital Social Prescription

What are the ethical considerations around future digital social prescription? Recapping one of the lightning talks at HackMentalHealth 2019.

Shivani Patel
HackMentalHealth
3 min read · Apr 16, 2019

--

Dr. Craigen giving his lightning talk on Digital Social Prescribing at HackMentalHealth 2019 x UCSF

The HackMentalHealth Hackathon was the perfect opportunity for innovators and clinicians to come together and exchange their ideas. This year, Dr. Gerry Craigen delivered a lecture as part of the ‘lightning talk’ series on a topic that is attracting growing attention — digital social prescription. This lecture was based on a piece that we are currently working on, alongside our colleague Dr. Becky Inkster.

Here is a quick recap from the lightning talk:

Loneliness and Digital Social Prescription

In 2018, Cigna surveyed 20,000 US adults to explore the impact of loneliness in the US. Nearly half of those surveyed reported sometimes or always feeling alone. Interestingly, Generation Z had the highest loneliness scores on the UCLA Loneliness Scale. We know that the impact of loneliness is significant, with one study indicating that loneliness has the same impact on health as smoking 15 cigarettes a day.

Social Prescription is a means of enabling primary care doctors and other frontline healthcare professionals to refer people to ‘services’ in the community instead of offering only medicalised solutions. The concept of ‘digital social prescription’ refers to a social prescription that has been facilitated through the use of technology, such as mobile phone applications and online platforms.

Appropriateness of an Activity for a Patient

Patients often have varying needs, and this should be reflected in the activities made available to them. For example, a patient with limited mobility may not be able to participate in a high-intensity sporting activity. Cultural factors are also likely to play an important role in determining whether a social prescription will be effective. Stigma, language barriers and attitudes to mixed-sex activities are all important considerations.

Bias in the System

Algorithms are central to the applications we use today and are likely to be used in the development of a digital social prescription tool. Digital platforms including Airbnb and Uber have had to consider the impact of bias within their services, and similar issues relating to racial bias may also arise in the context of digital social prescription. It is important that these issues are carefully addressed; tools such as the Data Ethics Canvas may help to ensure effective assessment and transparency.

Data Protection and Information Sharing

Data protection and information sharing are clearly important factors to consider in digital social prescription. They also appear to be a concern for consumers: in a 2017 survey, confidence in the data security of technology companies had declined from 31% in 2016 to 24% in 2017. Clear guidelines explaining how data is used and stored would be required for patient consent to be valid. It would also be necessary to consider how these security rules would be enforced, and what remedies should be offered to those affected by security breaches.

Patient Safety and Other Ethical Considerations

Patient safety is paramount, and it is important to consider this from the outset. Several of the social programmes on offer may rely on individuals committing to a service, such as helping at a local coffee shop. In some cases this commitment may become very stressful for a patient, perhaps resulting in an exacerbation of symptoms. Furthermore, the agendas of healthcare providers may differ from those of patients: for example, if the metric of success is the number of users, the application may focus on increasing new usage rather than ensuring clinical benefit.

Originally published at https://www.hackmentalhealth.care on April 16, 2019.
