Tech is coming after your health records

Dorien Luyckx
Published in CuriousRobot
Apr 3, 2018 · 5 min read

If our personal data is already handled so poorly by tech companies, then why are they also pushing to get their hands on our medical and genetic information?

This Medium post first appeared as a newsletter. If you want to get future posts straight in your inbox, you can subscribe here.

Photo by rawpixel.com on Unsplash

Why tech companies are pushing is clear: the healthcare industry is worth 3 trillion dollars in the US alone (and 2,080 billion dollars in Europe). But does that mean we should allow it unconditionally?

Passing almost unnoticed, Apple announced in January that it will work with 13 health organisations to add medical information to the iPhone’s HealthKit. The goal is to give people control over their own data and to make it easier to share that data with whomever they choose.

Medical data kept by the iPhone’s HealthKit (Photo by Apple)

There are many valid arguments for why giving people control over their data is a good thing. Harvard Business Review sums up the upsides really well here. But if we give people the responsibility to keep their own medical data safe, who says that shady companies won’t find backdoors to squeeze this information out of people, like they already do with personal information?

Can shitty UX/UI be a good thing?

Technology is designed to work effortlessly. Buying things with one click at online retailer Amazon, ordering a Lyft taxi in seconds, tracking your health with a smartwatch and so on. The user experience has to be seamless and quick.

But there is something to be said for a less seamless experience when it comes to sharing your medical data. If approving access is as easy as pressing your thumb on a button or scanning your face, how carefully have we thought about that permission?

Reform Terms and Conditions

Stephanie Alys of MysteryVibe, a company that builds sex toys, argues for a different kind of Terms and Conditions (T&C) and Privacy Statements, the documents people need to read and sign when using an app or technology (does anyone here actually read those? I’ve tried and failed many times).

Stephanie Alys (MysteryVibe) on SexTech, Data & Privacy at TNW conference 2017

No time to watch? In short, Alys says that companies have to take more steps to make sure people really understand these T&C. In the sextech industry, where companies have a very intimate relationship with their customers, a quick ‘read and agreed’ won’t cut it, so companies have had to improve their privacy policies to build trust.

She argues that not only sextech but every tech company should strive for informed, enthusiastic consent and make sure customers really understand what they are signing up for.

Can you give informed consent if you first need to read through thousands and thousands of words of legal jargon?

Tech companies also haven’t given people much confidence lately that they are capable of handling our data correctly. Like Facebook with Cambridge Analytica (read more here), Google’s artificial intelligence lab DeepMind got in trouble last year when it illegally used patient records to test its app Streams. The app was built to help care for people diagnosed with acute kidney injuries, but DeepMind used the data to test the app first (link). Afterwards the company said it had ‘underestimated the complexity of the NHS (the British National Health Service) and of the rules around patient data’.

Big Tech doesn’t tend to understand highly regulated industries that well. To think they’ll ‘disrupt’ a sector like healthcare… well, that feels a little naive, no?

Tech as a disguise to avoid responsibility

In her book Technically Wrong, Sara Wachter-Boettcher argues that hiding behind technology has given companies a pass on following the rules of the sectors they are disrupting. Uber doesn’t present itself as a taxi company, and social media platforms Facebook and Twitter avoid at all costs being called media companies, because those labels would force them to follow the rules of those sectors.

However, in the health industry it is dangerous to flee responsibility. Rather than hide, tech companies should champion existing data laws and put their own rules in place to prevent misuse. Apple is treading carefully and has put safeguards in place to protect the health data in HealthKit. Apple guards the gate to the App Store, but I don’t know if and how Apple checks that developers keep following the rules once they’re in: for example, whether they are selling the extracted data to advertisers or using it for purposes other than health and fitness. Moreover, laws about selling, keeping or sharing this data are not yet fully in place (read more).

Snippet of Apple’s documentation for developers
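To give a feel for what those safeguards look like in practice, here is a minimal sketch, not taken from Apple’s documentation, of how a third-party iOS app has to ask the user’s permission before it can read anything from HealthKit. The specific data types (step count, heart rate) are just illustrative.

```swift
import HealthKit

// Each health data type must be requested explicitly; the user can grant or
// deny them one by one, and a denial is not even visible to the app.
let healthStore = HKHealthStore()
let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.quantityType(forIdentifier: .heartRate)!
]

if HKHealthStore.isHealthDataAvailable() {
    healthStore.requestAuthorization(toShare: nil, read: readTypes) { success, error in
        if success {
            // The app may now query only the types the user approved.
        } else {
            // No permission granted: the health data stays locked away from the app.
        }
    }
}
```

The permission prompt is one safeguard, but as noted above, it says nothing about what a developer does with the data once it has been read out.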

Next level: Genetic data harvest

To go a step further, some tech companies are gathering genetic information. The American company 23andMe sells your anonymised (up for debate) genetic data to third-party companies, research institutions and nonprofits. Its privacy statement runs to over 6,000 words, and according to an article from Gizmodo the buyers of the data don’t want to sell you things (yet), but only use the data for research. Sound familiar? The researcher working for Cambridge Analytica said he collected the data for scientific research as well, so I hope 23andMe has a stronger vetting process in place than Facebook did, or people will soon get stalked online by ads targeted at their genetic code.

Talk to us!

This newsletter exists for you. Do you have your own story about how technology has affected your life, or are you wondering about something? Share it with us and we’ll look into it for you.

I am also building a tool for high school teachers to talk with teenagers about data, privacy and cybersecurity. If you have ideas, want to help or want to test it out, get in touch by replying to this email!


Dorien Luyckx
Tech reporter. Millennial. Founder of Curious Robot, a publication focused on the impact of technology on us as human beings and our society.