Health and happiness: Respect and trust are vital when it comes to our most personal data

Simon Carroll
Internet of Me
Aug 1, 2016
Our everyday tech can plug medics and researchers into rich data about our health and lifestyles, but innovation must be built on a foundation of trust and respect for our rights

A little knowledge can be a dangerous thing, especially when it comes to your health. Google a symptom and you might well be confronted by the prospect of that minor rash or ache being some horrific, rare, possibly fatal condition rather than the minor common ailment it really is. But then, quick fixes for problems are seldom the best fixes. Think of the famous Einstein quote “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and five minutes thinking about solutions”.

The more informed we are, the better our chances of finding the right answers and solving problems. The more information that’s available, the less we have to rely on assumptions and hunches.

John Snow famously discovered that a contaminated water pump in London’s Soho, rather than the ‘bad air’ many believed to be the cause, was responsible for a major cholera outbreak in 1854. He did so by mapping where victims lived, which revealed a pattern that led back to the water source, helping to prove ‘germ theory’ in the process.

He used data to solve the problem.

Today, the amount of data readily available to help us solve problems is vast and ever growing. The opportunities for innovation in healthcare are limitless, but the speed of progress is far from consistent. The private sector is busy creating platforms, APIs, services and products to feed growing consumer and business markets. Parts of the public sector are similarly enthused, while others are stuck in a time warp.

As individuals, we are generating huge amounts of valuable data on our health, fitness and lifestyles. Our smartphones — and, increasingly, our wearable tech — can constantly monitor everything from steps walked to hours slept. The market for apps that help us live better is crowded and fast-moving. Our devices offer similar opportunities for medics, healthcare providers and researchers, delivering quality data at low cost.

Consider just a few examples of what’s happening at the medical end of things. Omada offers health coaching to tackle chronic diseases that are mostly lifestyle related. Users undertake a 16-week plan featuring a real human coach plus online lessons, games and support, and tech including a digital scale, pedometer and exercise bands — all synced and ready to go.

AliveCor’s Kardia allows DIY heart monitoring through a medical-grade ECG monitor that works with a smartphone app. It can detect an abnormal heart rhythm known as atrial fibrillation, which affects 835,000 people in the UK, and can relay results directly to a doctor for analysis. Human API provides a platform for businesses to access their users’ clinical and wellness data, from medical records to fitness tracker information.

Apple HealthKit provides APIs for iOS developers to integrate their products with the Health app dashboard which, as well as creating a complete picture of a user’s health, lets them share data with their doctor. Apple also has CareKit for developers building wellbeing partner apps and ResearchKit to open all this rich health and lifestyle data to researchers.

The side effects

Apple isn’t the only major tech firm eyeing the potential of these accurate, real-time, mass data sets. Having triumphed over the human world champion of the Chinese board game Go, DeepMind, the artificial intelligence company acquired by Google, has shifted focus to collaborating with several NHS trusts to pilot tech-driven healthcare apps. Streams, an app being developed with the Royal Free Hospital London, detects acute kidney injury. Hark, which the company acquired, offers planning tools that drive efficiency and effectiveness. Both aim to tackle “a reliance on outdated healthcare IT, like pagers and fax machines”. DeepMind has also recently announced a partnership with Moorfields Eye Hospital to tackle common causes of blindness related to diabetes and ageing.

The potential benefits from such collaborations — from better medical outcomes to lives saved — are clear to see. However, they are seldom without a degree of controversy. In its work with the Royal Free Hospital NHS Trust, DeepMind came under fire for a data-sharing agreement that allowed it access to a massive sweep of patient information, not limited to the kidney condition that is the focus of the project. DeepMind has given assurances that the data is not being shared with other parts of the Google empire and insists that when Streams moves out of its prototype phase it will comply with the necessary regulations. Questions have also been raised over Apple’s ResearchKit and whether anonymised subjects could be re-identified when third parties share the information.

More recently, the UK Government has pulled the plug on care.data, a scheme to share NHS patient data with the aim of improving care and research. After a number of false starts, the project was shut down after two reports called for tougher data governance. While there were clear shortcomings, care.data was ultimately scrapped for fear of a data disaster rather than because of one.

It is interesting to note, though, that much of the blame for the failure of care.data has been pinned on inadequate communication — its architects failed to sell it to a public suspicious of what was being done with their data. More than a million people opted out of participation.

In a way, it is encouraging that people approach innovation that is driven by their personal data with this degree of scepticism. We should care who has our information and what is done with it. However, this was a golden opportunity to get people on board with a major public data initiative and the benefits to healthcare — not to mention the potential cost savings — should have made a compelling case if they had been communicated properly. It is a cautionary example of how personal data can very quickly make people recoil if there is a hint of impropriety or risk.

A patient data taskforce has now been set up by charitable foundation Wellcome, whose director, Jeremy Farrar, said: “We will only unlock the immense value of patient data if we have open and honest discussions about how and why data can be used for care and research, what’s allowed and not allowed, and how personal information is safeguarded.”

The “immense value” of patient data is clearly front of mind for NHS chiefs. Professor Sir Bruce Keogh, NHS England’s Medical Director, believes wearable devices that can unobtrusively measure health indicators such as heart rate, calories consumed, exercise taken and even detailed physiological changes will revolutionise the health service. He said: “This monitoring will help keep people safe in their own homes rather than just waiting for serious deterioration necessitating an ambulance or GP call, followed by admission to hospital for several days.”

However, the problem of who does what with our data becomes greater when major players such as Google are involved. There are those who feel strongly about the motives of large corporations operating in public services. There are many cases in which big businesses have done themselves few favours in the way they have behaved and performed on government contracts. Then there is the wider negative issue of how consumer data is used to target and market in the commercial world.

The pressures on our health service and on public finances, though, mean there is too much at stake for the debate to become polarised on ideological grounds.

Profit emotive

Making a profit from public health services is surely only wrong if it is at the expense of patients. If those patients are getting a better service that is more efficient and cost-effective, where’s the problem in a private-sector provider making money?

Without a bottomless pit of public funding, the alternative is poorer services and worse outcomes — which will still make huge demands on scarce resources.

Sacrificing our personal information, no questions asked, should not be the price we have to pay for such innovation. But nor should we allow unfounded fear to strangle progress. To accuse tech giants of profiteering opportunism is short-sighted.

Making the case for DeepMind’s work with the NHS, co-founder Mustafa Suleyman sought to reassure doubters by insisting: “We want to earn public trust for this work, and we don’t take that for granted.”

But we don’t have to take his word for it. It is worth noting that it was a kidney specialist at the Royal Free Hospital, Dr Chris Lang, who approached DeepMind, not the other way round. Doubtless like many medics, he saw DeepMind as the most effective means of making progress — working with a company that has the knowledge, technology and money to make things happen. If the result is lives saved and health improved, that is a compelling argument for such collaboration in itself.

It would be nice to think that fabulously wealthy corporations might want to pursue social and health innovation for the good of society regardless of profitability. There is a moral argument for doing so, but it is also in the interests of business to operate and trade within functioning, healthy and fair societies.

It is also in the interest of the innovators such as DeepMind — and its parent Google — to understand both the importance of respecting people’s information and privacy and the real danger to future ambitions if they fail to do so. Take liberties with personal data and the well could be poisoned for a very long time. Lay solid foundations of trust and transparency now and it will be possible to do far more in the future.



Internet of Me is supported and sponsored by digi.me


Editor of Internet of Me, a forum exploring the issues surrounding personal data. Journalist and writer for businesses and brands. simon@internetofme.net