Given that I run a messaging platform, Hospify, specifically designed to offer people a data-compliant alternative to tools like WhatsApp, Messenger and Telegram when chatting in a health care context, it’s no surprise that I’m often asked: “What’s the matter with WhatsApp?”
So here it is: my cut-out-n-keep guide to the subject, in eight easy lessons.
1. Where it’s at
Under the EU’s General Data Protection Regulation, which came into force in UK law back in May (just in case you’ve had your head under a rock all year), personally identifiable data that you hold about other people as a user of a technology platform should be stored, physically, somewhere in Europe. That means the servers have to be in Europe and only in Europe, not spread all around the planet, as WhatsApp’s are.
Why does this matter in health care? Because users of a health care messaging platform are likely to include doctors and nurses, and doctors and nurses tend to talk about patients. As soon as you mention a patient by name in a text message and add any details about their condition, then you’re holding personally identifiable data about them — and data of the most personal kind.
If you worked in insurance or marketing, you’d have to ask (and get a record of) the patient’s permission before you could send or store that information on the internet. But thankfully GDPR contains an exemption for those who work in health and care: they are allowed to communicate and store details about patients without asking express permission, as long as they’re doing it in the course of delivering their care.
To take advantage of this exemption, though, UK & EU-based health care professionals need to use a communications system that handles data in a way that is otherwise compliant with both GDPR and the information governance rules of their health care employer. WhatsApp is compliant with neither, purely on the basis of the geographical location of its servers.
2. Hand it over
The second problem concerns accessing the information once it exists. WhatsApp messages are encrypted both in transit, as they ping around the internet, and at rest on WhatsApp’s servers. But storing them like this creates big problems. If you’re a doctor and you’ve chatted with another doctor about one of your patients (to get some advice or a second opinion about their condition, for example) then you don’t own that data. Your employer, i.e. the hospital or surgery where you work, owns it instead, even if it’s on your phone. Your employer is therefore ultimately responsible for it and, by law, has to be able to hand it over to the patient if the patient asks for it, which patients can do by issuing a fairly straightforward subject access request.
As we saw in the aftermath of the 2017 Westminster knife attack, when WhatsApp refused to hand over the content of the attacker’s messages to the Home Office on the grounds that even it couldn’t decrypt them, getting access to WhatsApp messages is tricky. This creates a paradox. In the case of the patient, the law says that the hospital has to hand the messages over. But if they’re on WhatsApp it cannot, because without decrypting them it can’t work out which ones they are. So because a doctor talked about a patient on WhatsApp, and that patient issued a subject access request, the hospital is now in data breach twice over: the messages are being stored on a server outside Europe (most likely at a WhatsApp server farm on the Eastern seaboard of the US), and the hospital cannot decrypt them in order to hand them over.
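The shape of this paradox can be sketched in a few lines of Python. This is a deliberately toy illustration (XOR with a random one-time key, not WhatsApp’s actual Signal-protocol encryption): the point is simply that in an end-to-end scheme the keys live only on the users’ devices, so the party in the middle holds data it can never read, however lawful the request for it.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- illustration only, not real cryptography."""
    # The key must be at least as long as the message for this toy scheme.
    return bytes(b ^ k for b, k in zip(data, key))

# The key is generated and held only by the two chatting endpoints;
# the relay/server never sees it.
shared_key = secrets.token_bytes(64)

plaintext = b"Patient J. Smith, ward 4: bloods look abnormal"
ciphertext = xor_bytes(plaintext, shared_key)

# The server stores only the ciphertext. With no key, it cannot answer
# a subject access request with readable content -- it cannot even tell
# which messages mention which patient.
server_copy = ciphertext
assert server_copy != plaintext

# Only an endpoint holding the key can recover the message.
assert xor_bytes(server_copy, shared_key) == plaintext
```

The hospital in the example above is in the position of the server: it is legally responsible for content it has no technical means of reading or even locating.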
3. Snap happy
Another issue is photos. Have you ever received a picture on WhatsApp? Have a look in your phone’s main photo gallery. The picture will most likely appear there as well as in WhatsApp itself. This is because nearly everyone’s devices automatically back such pictures up to cloud services that are likely to be located outside Europe, and are often shared with other members of your family. Even if you switch this feature off, gaffes by Apple and others can mean it gets switched back on without your knowledge.
4. Notify me
Another inadvertent source of data breach is the home screen notification. You can switch notifications off for WhatsApp, but almost no one does — you want to know when you’ve got a new message, after all. The trouble is that the notification contains a snippet of that message, available for anyone within viewing distance of your phone to see. This potentially exposes sensitive patient data to prying eyes, breaks most employers’ “clean screen” policies, and is therefore another reason that WhatsApp doesn’t pass muster when it comes to health care information governance.
5. UnPINned access
It’s also not possible to set a separate PIN code or fingerprint lock on the WhatsApp app itself, which means it relies solely on your phone’s security lock to keep intruders out. If your phone is stolen or left on the train while unlocked (increasingly likely now that many phones offer to stay unlocked for convenience when connected to wireless devices like keyboards or headphones) then there’s nothing to stop someone getting access to your entire message history.
6. Conspiracy theories
Then there’s the question of what WhatsApp is really doing with your data. Earlier this year Google struck a deal with WhatsApp (which is itself owned by Facebook) to allow WhatsApp users to back up all their chats and photos to their Google Drive accounts without counting against the 15GB free storage limit set on those accounts.
Now, this seems quite an odd thing for Google to agree to, given that Google and Facebook are major-league competitors for online advertising spend. Would Google do such a deal out of the goodness of its heart? Call me paranoid, but I don’t believe it would. Presumably it’s getting some kind of value out of storing all that content, which, despite being encrypted, would still be rich with associated metadata that the search giant could use to sharpen its profiling and advertising of (yes, dear reader) you.
7. Secure doesn’t mean secure
All of which brings us to the thorny issue of security. People think that WhatsApp is really secure because all its messages are encrypted. But it turns out that it’s not that secure at all. Security researchers at Check Point Research recently found that WhatsApp’s QR-code feature, which allows users to route their accounts via a laptop or desktop computer for ease of access, contains a vulnerability that lets an attacker intercept group messages, change the identity of a sender, alter the text of replies to a group, and send private messages that become public to a group when responded to. All of these open the app to abuse and compromise privacy.
8. WhatsApp is changing
Finally, did I mention that WhatsApp is now owned by Facebook? Back in June WhatsApp’s original founders, Jan Koum and Brian Acton, resigned from the company in protest at Facebook’s plans to introduce marketing and advertising into their chat app, something they had faithfully promised from the service’s inception would never be allowed. (They were serious, too: their resignations reportedly cost them around $1.5bn in forfeited share options, a hefty price to pay for sticking to your principles.) What does this mean? It means that Facebook is coming after the data you expose through WhatsApp in order to allow businesses to target you. And if the data you’re exposing is information about someone else’s health, that’s a major problem.
Don’t get me wrong. WhatsApp is a great tool that delivers 65 billion messages a day to its 1.5bn users around the world with incredible efficiency. I use it to keep in touch with family and friends, and you probably use it too. But that utility does not make it appropriate for communicating in situations where one user has a legal and social responsibility to safeguard another user’s privacy, and that’s the case in health care.
Which is exactly why we built Hospify: a chat app with the utility of WhatsApp but without the vulnerabilities outlined above, one that health care professionals and patients can use without worrying that they are inadvertently going to fall foul of the increasingly stringent data protection laws now in place in the UK and EU.
If you work in health care check it out — the basic service is free because there’s a premium version that people pay for, not because we sell your data!