Voice Assistants: NHS & Alexa

No, the NHS hasn’t sold your health data to Amazon.

Tessa Darbyshire
Voice Tech Podcast
5 min read · Jul 19, 2019

--

Anyone could be forgiven for thinking that a partnership between a precarious UK National Health Service and the behemoth that is Amazon sounds like the beginning of a truly terrifying Black Mirror episode.

The day may come soon when the world careens into hell in a handcart, but it is (probably) not this day.

In the wake of the Facebook-Cambridge Analytica scandal and the hype around the first fines issued under the EU General Data Protection Regulation (GDPR), scepticism about private sector corporations having access to sensitive personal data is entirely justified.

A few questions floating around in the digital ether include:

Has the NHS paid Amazon for this partnership?

Is my health information now owned by Amazon?

Does this mean Amazon knows what pills I’m taking?

The answers are, respectively, no, no and not unless you’ve told Alexa.

It’s important to know that sharing data with Alexa isn’t necessarily a bad thing (though the tone of many of my articles may suggest otherwise). For example, a skill for the virtual assistant called My Carer is designed to support people living with dementia by reminding them which pills to take, when to take them and where they are stored. It can also help them remember appointments, answer questions they may need to ask multiple times and recall facts about family and friends.

The skill was the product of a collaboration between the Alzheimer’s Society, McCann Worldgroup and Alexa application specialists at Skilled. The team hopes to offer support to the 1 million people predicted to be living with the disease in the UK by 2025. There’s a good case to be made that other health-focused Alexa skills might improve quality of life for patients with a wide range of medical difficulties.
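For readers curious what a skill like this looks like under the hood, here is a minimal sketch of a custom Alexa intent handler built with Amazon’s ask-sdk-core for Node.js. The intent name, the hard-coded reminder text and the overall structure are illustrative assumptions only, not My Carer’s actual implementation, which would presumably pull reminders from a data store maintained by the person’s carer.

```typescript
import * as Alexa from 'ask-sdk-core';

// Illustrative only: a hypothetical "MedicationReminderIntent" that a skill
// like My Carer might expose. A real skill would look the reminder up in a
// carer-maintained data store rather than hard-coding the answer.
const MedicationReminderIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return (
      Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest' &&
      Alexa.getIntentName(handlerInput.requestEnvelope) === 'MedicationReminderIntent'
    );
  },
  handle(handlerInput) {
    const speech =
      'Your blood pressure tablet is in the kitchen drawer. Take one now, with a glass of water.';
    return handlerInput.responseBuilder
      .speak(speech)                                  // spoken reply
      .withSimpleCard('Medication reminder', speech)  // card shown in the Alexa app
      .getResponse();
  },
};

// Wire the handler into a Lambda-hosted skill.
export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(MedicationReminderIntentHandler)
  .lambda();
```

An utterance along the lines of “which pills do I need to take?” would be resolved to that intent by Alexa’s language model and routed to a handler like this one.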

The deal isn’t that Amazon has access to your health data; it’s that Amazon can use approved NHS sources to enable Alexa to answer the health questions people ask their personal virtual assistant.

So, this partnership might not be as bad as the headlines imply.

As OneZero reported, some health care providers are taking to social media in an attempt to fight the spread of medical misinformation. The rise of the anti-vaccination movement is the case with the most press coverage, but each specialism is facing its own war against oppositional communities in the digital sphere. In 2022, the 11th revision of the World Health Organisation’s influential International Classification of Diseases (ICD-11) comes into effect. The publication has been a source of controversy because of the decision to include a chapter on Traditional Chinese Medicine.

It isn’t necessarily the decision itself that is a cause for concern, but the impact it may have on global demand for substances associated with TCM, legitimate or otherwise, including products made from the body parts of increasingly rare wild animals such as tigers, pangolins, bears and rhinos. The problem isn’t the inclusion of TCM in official documents; it’s the spread of misinformation, which is extremely difficult to tackle. David Gorski, a surgical oncologist at the Barbara Ann Karmanos Cancer Institute and managing editor of the popular medical blog Science-Based Medicine, took to Twitter to argue this point:


“We frequently assume the answer to bad information is good information, in greater quantity than the bad information, but we’ve known for some time that this is not how human minds work. There just aren’t enough doctors and scientists who are interested enough in and good enough at social media to execute this strategy, at least not compared to the armies of cranks, trolls, and bots spreading the misinformation.”

If this is the case, if there really aren’t enough qualified individuals to take on the medical misinformation war and win, the partnership between the NHS and Alexa might be the kind of intervention that starts to turn the tide.

Want to know what to take for that headache? Ask Alexa. Want to know if you should talk to someone about that rash? Ask Alexa.

Some experts predict that by 2020 half of all searches will be made by voice (enjoy the wikihole on that one, here). If that’s the case, ensuring that searches relating to personal health are answered using sources approved by health care providers, rather than delivering the results of popular internet searches, seems like a good way to begin tackling misinformation.

This partnership also looks set to improve accessibility. Enabling voice search could support people with a range of disabilities, including those who are blind or unable to use their hands.

A third benefit is the potential to reduce pressure on emergency services. If people can be persuaded to wait out a common cold rather than turning up at A&E on a hectic Saturday night, that might free up staff who would otherwise spend time processing patients with minor ailments. As the gov.uk blog on the subject states:

“Many searches on the NHS website are for common minor ailments, and for information on symptom control for those living with long-term conditions. Often — with the right information — people are better able to self care or ask a pharmacist’s advice.”

Sounds ideal. So, what’s the problem?

The potential problem is that, whilst the NHS isn’t selling the health data it already holds on you, asking Alexa about your medical woes means that each time you do, Alexa stores a recording and a transcript of the conversation. These are encrypted in transit and stored securely. You can log into your account to view both the recordings and the transcripts, but you can only delete the sound file.

What about the transcripts? Well, there’s no option to delete these, and the privacy policy only states that “the voice recordings associated with your account are used to improve the accuracy of the results.” Whilst Amazon has made statements to the effect that it isn’t in the business of selling personal data to third parties, profiling you based on your use of Alexa is exactly what ‘improving the accuracy of the results’ means.

There are multiple open-ended questions here about whether searches can be aggregated over time to give Amazon a detailed health profile of an individual, and whether this digital health persona, which is inaccessible to the user, could have detrimental consequences down the line.

Going forwards: Caution advised.

Voice assistants are here, and they can help in a myriad of previously unimaginable ways. So, if you need a hand, ask Alexa. However, if you’re worried, ring 111, and if you’re still worried, make a GP appointment. If you think you’re at risk, head to a walk-in centre or A&E. If you can’t get to A&E, ring 999. If you’re worried about your data privacy, ask Amazon.


Tessa Darbyshire
Voice Tech Podcast

Scientific Editor @ Patterns (Cell Press Data Science). Fascinated by cross-sector applications of data science.