Health data. Who owns it?

Sandy Wright
7 min read · Oct 14, 2018


The Royal Society of Medicine’s Digital Health Section holds regular meetings on the latest developments in digital health. On September 24th 2018, attendees convened at the RSM’s home in central London to hear from prominent speakers on the subject of health data. Entitled ‘Health data: Who owns it and how to keep it safe’, the meeting explored the current issues and ethics surrounding the use and misuse of health data.

The Royal Society of Medicine — 1 Wimpole Street, London

Digital health events do not normally feature kettles. Let alone smart kettles. But the attention of the audience was trained on this piece of stainless steel perched precariously on the speaker’s podium. Whilst many may have questioned the value of a kettle you can control from your phone, Ken Munro, ethical hacker and security entrepreneur, was demonstrating the risk of bringing devices like this into your home.

Minimising his presentation slides, Ken swiftly pulled up a command line on his computer, accessed the kettle’s operating system, and retrieved the unencrypted password for the wi-fi network to which the kettle was connected. He explained how the ‘smart’ kettle was an easy way for any hacker to gain access to a home network and the devices and web traffic within it. Ken’s message was blunt: medical IoT devices are no different from consumer-focussed ‘smart’ products and suffer from the same security vulnerabilities. Theoretically, the same techniques could be used to take control of someone’s cardiac pacemaker or insulin pump.

It was a sobering start to a day of talks that were to focus on the topic of health data and ownership. But before we could broach the subject of ‘ownership’, David Ferbrache, technical director at KPMG, had revealed the extent to which our health data had already been stolen. According to Axios, over 175 million healthcare records have been hacked since 2010. There is currently a glut of health data on the dark web, driving down the price of individual health records for those who want to use them for illicit purposes. As David went on to explain, health records are traded and sold just like any other commodity, with discounts for bulk purchases and try-before-you-buy agreements.

So what is the key to preventing further hacks? According to Professor Paul Dorey, visiting Professor at Royal Holloway and founder of CSO Confidential, it’s about empowering people and creating a ‘just’ culture. During his presentation, he argued that people are wrongly labelled as the weakest link in data security. In fact, it is poor processes and unreasonable expectations that drive people towards workarounds that ultimately put data at risk. If companies can create an environment where workers feel comfortable talking about security openly and assisting with the development of cybersecurity protocols, data will ultimately be better protected.

The citizens of Estonia are a cohort who have been co-opted into the development of their digital democracy. Almost all public services in Estonia are administered or accessible online, making it one of the most advanced digital societies in the world. In the 2015 general election, over 30% of votes were cast online. Dr Ain Aaviksoo, Digital Health Evangelist and former CIIO of the Ministry of Social Affairs in Estonia, told the conference how Estonia had implemented blockchain as an electronic health record solution before blockchain became the heralded technology it is today. Estonian companies like Guardtime have helped the country develop sophisticated blockchain technologies such as KSI (Keyless Signature Infrastructure), which ensures data is authentic, private and safe.
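
At its core, KSI is a hash-tree (Merkle tree) scheme: the hashes of individual records are aggregated into a single root, and each record keeps a short proof path that lets anyone verify it has not been altered, without the data itself being revealed. The following is a minimal, illustrative Python sketch of that idea; the function names are my own and this is not Guardtime’s implementation, which operates at far greater scale and publishes its aggregated roots so they can be independently verified.

```python
# Illustrative sketch only: a toy Merkle-tree integrity check in the spirit of
# KSI-style keyless signatures. Function names are hypothetical, not Guardtime's API.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(records: list[bytes]) -> list[list[bytes]]:
    """Return all levels of a Merkle tree, leaves first."""
    level = [h(r) for r in records]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2 == 1:              # duplicate last node on odd-sized levels
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def prove(levels: list[list[bytes]], index: int) -> list[tuple[bytes, bool]]:
    """Collect the sibling hashes needed to re-derive the root from one leaf."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2 == 1:
            level = level + [level[-1]]
        sibling = index ^ 1                  # neighbour within the pair
        proof.append((level[sibling], sibling > index))
        index //= 2
    return proof

def verify(record: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = h(record)
    for sibling, sibling_is_right in proof:
        node = h(node + sibling) if sibling_is_right else h(sibling + node)
    return node == root

records = [b"patient-42: eGFR 55", b"patient-42: eGFR 48", b"patient-7: HbA1c 41"]
levels = build_tree(records)
root = levels[-1][0]                          # this root is what would be published/anchored
proof = prove(levels, 1)
print(verify(records[1], proof, root))        # True: record is intact
print(verify(b"patient-42: eGFR 90", proof, root))  # False: tampering detected
```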

Having explored the current vulnerabilities of health data, the conference turned to focus on the question of ‘who owns data’. No discussion on this topic would be complete without a healthy recap of the EU General Data Protection Regulation (GDPR) and a review of the DeepMind Royal Free ‘Streams’ app controversy. Helpfully, Linklaters, the law firm that carried out the third-party audit of ‘Streams’, were on hand at the conference to explore issues around data compliance.

To summarise: in 2015 the Royal Free hospital began working with DeepMind on a project to help with the detection of acute kidney injury. As the project started to gain public attention, the hospital was investigated by the Information Commissioner’s Office, which raised concerns about the agreement between the two institutions and the degree to which patients had been informed about the joint project. The independent review by Linklaters found that the Royal Free’s use of Streams was lawful, and concluded that DeepMind had only used patient data for the Streams project. Specifically, patient data was not used to train an AI algorithm (in fact the Streams project used a comparatively simple decision tree tool), and both institutions were deemed to have taken appropriate measures to protect sensitive patient data.

The issue, then, was one of communication: the hospital had communicated poorly with patients about how their data would be used. And in many ways, this is what the GDPR has set out to improve. As our inboxes filled up in May with emails from every company we’d ever given our email address to, we were bombarded with their new-found policies on how our data was going to be collected and processed. We as consumers now had the right to portability and erasure of data, and companies had new requirements to brush up on their privacy notices and appoint data protection officers or else face significant fines.

The driving force behind the GDPR is to empower consumers to take control of their data and ensure those who act as custodians of it are subject to the appropriate checks and balances. But that doesn’t help us understand who actually ‘owns’ this data. John Rumbold, Senior Research Fellow at Nottingham Trent University, highlighted the case of Oxford v Moss during his presentation on data ownership in biomedical research. In this seminal case, a university student stole a forthcoming exam paper with the intention of returning the document having used the information to cheat on his exam. The court ruled that the confidential information contained within the exam paper could not be classified as property and was therefore incapable of being stolen under the Theft Act 1968. John concluded that if data is not legally classed as property, it cannot, strictly speaking, be owned.

The conference put forward the idea of individuals having ‘quasi-ownership’ over their data. The GDPR has given us certain rights (to portability and erasure, among others), whilst those who control and process our data retain certain rights to use that data in appropriate ways. This was certainly the argument put forward by Luk Arbuckle, Chief Methodologist at IQVIA. In the realm of health research, it falls to data controllers to put appropriate risk-based anonymisation in place, satisfying the GDPR whilst keeping the data useful for research.
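
In practice, risk-based anonymisation means coarsening or suppressing quasi-identifiers until the risk of re-identification falls below an agreed threshold. The sketch below shows one of the simplest such checks, k-anonymity through generalisation; the field names, generalisation rules and value of k are illustrative assumptions rather than IQVIA’s methodology.

```python
# Illustrative sketch of k-anonymity, one building block of risk-based anonymisation.
# Field names, generalisation rules and the k threshold are hypothetical.
from collections import Counter

def generalise(record: dict) -> tuple:
    """Coarsen quasi-identifiers: exact age -> 10-year band, full postcode -> outward code."""
    decade = (record["age"] // 10) * 10
    age_band = f"{decade}-{decade + 9}"
    postcode_area = record["postcode"].split(" ")[0]   # e.g. "W1G 0AE" -> "W1G"
    return (age_band, postcode_area, record["sex"])

def is_k_anonymous(records: list[dict], k: int = 5) -> bool:
    """True if every combination of generalised quasi-identifiers appears at least k times."""
    counts = Counter(generalise(r) for r in records)
    return all(count >= k for count in counts.values())

patients = [
    {"age": 34, "postcode": "W1G 0AE", "sex": "F", "diagnosis": "AKI stage 1"},
    {"age": 37, "postcode": "W1G 8YS", "sex": "F", "diagnosis": "CKD stage 3"},
    # ... more records; small or unusual groups would still need further suppression
]
print(is_k_anonymous(patients, k=2))   # True: each generalised group has at least 2 members
```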

When we talk about health data, many may think about specific health parameters captured at specific time points: blood test results recorded when we go to hospital, blood pressure measurements taken when we visit our family doctor. But what about less often contemplated forms of data? Data gathered from voice or facial recognition software, for example. Or the data obtained from new types of medication that can track when we’ve actually ingested a tablet? These are not problems of the future. They are issues that are with us now.

Lienkie Diedericks from King’s College London discussed the ethical implications of the emerging field of ‘digiceuticals’. Proteus Digital Health in the US have developed a novel version of the antipsychotic medication — aripiprazole. Using their proprietary technology, the ingestion of the medication can be tracked via a smart patch on the patient’s skin and the data relayed to a mobile application. In effect, the patient’s compliance with their medication can be tracked and quantified.
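
To make the ‘quantified’ part concrete, here is a minimal sketch of how ingestion events relayed from such a patch might be turned into an adherence figure. The event format and the once-daily dosing schedule are assumptions for illustration, not Proteus’s actual data model.

```python
# Illustrative only: computing a simple adherence rate from ingestion events.
# The event format and the once-daily dosing assumption are hypothetical.
from datetime import date, timedelta

def adherence_rate(ingestion_dates: set[date], start: date, end: date) -> float:
    """Fraction of days in [start, end] on which an ingestion event was recorded."""
    days = (end - start).days + 1
    taken = sum(1 for i in range(days) if start + timedelta(days=i) in ingestion_dates)
    return taken / days

events = {date(2018, 9, 1), date(2018, 9, 2), date(2018, 9, 4)}   # patch-reported ingestions
print(f"{adherence_rate(events, date(2018, 9, 1), date(2018, 9, 7)):.0%}")  # 43%
```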

This novel intervention digitises pharmacology. It shifts the focus away from the product’s interaction with the human body towards the consumer’s behaviour in relation to medication adherence and compliance. Although technological developments like this are being mooted as a way of giving patients more ‘control’, they raise the uncomfortable question of how pharmaceutical companies will use this data. Will they intervene if compliance is poor? Will health insurers have access to the compliance data? In the specific case of aripiprazole: is it ethical to digitally track drug compliance in a population taking a medication for a condition that already predisposes them to suspicion and paranoia? Lienkie’s conclusion was that regulation lags far behind current technological developments.

As the conference wrapped up, the prevailing conclusion was that regulatory and ethical considerations are not keeping pace with technological advances in healthcare. Although many of the speakers highlighted the value of patient data in developing novel technologies and ultimately improving patient outcomes, ‘personal data’ is a term that carries significant emotive connotations for a large proportion of the general public. Whether this scepticism and trepidation will impede further advances or act as a healthy check on rogue players is still unclear. The debate over data ownership is one that is likely to continue for many years to come.

If you enjoyed this article and want to get in touch, find me on Twitter @SandyCEWright


Sandy Wright

NHS Doctor | NHS England Clinical Entrepreneur | Royal Society of Medicine Digital Health Council Member