Why Privacy Matters In Contact Tracing

Ian Varley
Published in The Startup
Apr 21, 2020

Contact tracing is a ray of hope in this pandemic.

The basic idea of contact tracing, if you’re not already familiar, is that when someone is infected, we identify everyone they might have passed the virus on to, and tell those people to quarantine before they spread it to the next hop. This works because there’s a delay between when you contract the virus and when you start spreading it to others; but it also means contact tracing has to happen quickly to be effective.
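To make that concrete, here’s a minimal sketch of what one “hop” of tracing looks like. This is purely illustrative: the in-memory contact log, the names, and the 14-day look-back window are all assumptions I’m making for the example, not how any real app works.

```python
from datetime import datetime, timedelta

# Hypothetical in-memory contact log: (person_a, person_b, time of contact).
# A real system would build this from Bluetooth proximity events or location
# overlaps; the names and times here are made up for illustration.
contact_log = [
    ("alice", "bob",   datetime(2020, 4, 14, 12, 30)),
    ("alice", "carol", datetime(2020, 4, 1, 9, 0)),
    ("bob",   "dave",  datetime(2020, 4, 15, 18, 45)),
]

LOOK_BACK = timedelta(days=14)  # assumed window in which a contact still matters

def contacts_to_notify(infected_person, diagnosis_time):
    """One 'hop' of tracing: everyone who was near the infected person
    recently enough that they might have caught the virus."""
    exposed = set()
    for a, b, when in contact_log:
        if infected_person in (a, b) and diagnosis_time - when <= LOOK_BACK:
            exposed.add(b if a == infected_person else a)
    exposed.discard(infected_person)
    return exposed

# If Alice is diagnosed on April 16th, Bob gets a notification (recent
# contact), but Carol doesn't (their contact was outside the window).
print(contacts_to_notify("alice", datetime(2020, 4, 16)))  # {'bob'}
```

The detail that matters is the window: notifications only help if they reach recent contacts while they can still quarantine before spreading it further, which is why speed matters so much.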

Contact tracing is high stakes. As we begin to reopen society, if we don’t keep the rate of spread low, we’ll face multiple waves of re-infection (and return to social distancing). Digital contact tracing, using mobile apps, offers a promising opportunity for us to do so. (For more on this, read my previous article, Contact Tracing: One Path Out Of Social Distancing).

But digital contact tracing is also a privacy risk. In this article, I’m going to lay out (in gory detail) what risks we face if contact tracing is done in a way that isn’t careful to preserve the privacy of people’s identity, location, and contacts. (There are plenty of other things that also need to be kept private in general, like someone’s online behavior, etc; but I won’t talk about those because they’re not specifically jeopardized by digital contact tracing.)

To be clear: there are absolutely ways to do contact tracing that minimize privacy risk–that use location and contact data without revealing it. And there are important ways to further safeguard privacy, through policy (as described in Data Rights for Digital Contact Tracing and Alerting). I’ll cover those ideas in an upcoming article; for now, though, let’s just focus on what goes wrong if you mess up and fail to protect privacy.

(The points in this article aren’t just my random thoughts; they are primarily sourced from the MIT white paper Apps Gone Rogue, as well as a few other papers, which are cited at the end.)

A Rundown Of Risks

Risks If You’re Infected

Ok, the first set of risks to talk about is for people who are already infected with COVID-19: diagnosed carriers. We’ll start here because without data from this group, there’s no contact tracing in the first place.

Let’s start with the worst case scenario. What would happen if data about infected people (including their location history) was just shared openly with the public?

We don’t have to speculate about this, because it’s exactly what South Korea did! They sent daily text messages to the entire population with specific details on infected individuals, and while they didn’t name names, in some cases they sent enough information for people to figure out who was meant. A few examples of who was implicated:

  • “a man in his 50s who returned from Wuhan province in China with his 30-year-old secretary”
  • “a man contracted the virus from an instructor during a sexual harassment class”
  • “a man in his 30s was in an area known for prostitution, and was accused of paying for sex”

In mob-mentality style, information like this quickly turns into witch hunts. This can have a huge impact, especially in cultures where social stigma is a powerful force. Another article on South Korea points out that malicious comments online have long been a problem there, and in some cases have even led to suicide.

If social stigma like this is hard to relate to, other kinds of persecution might hit closer to home. If you’re in a politically polarized country like the US, and you identify as liberal, imagine a far-right group deciding to target you and your family because you’re sick and posted something left-leaning on Twitter. Or, vice versa, if you identify as conservative or a person of faith, imagine if a far-left group started targeting infected churchgoers by revealing their locations. It doesn’t take much to imagine this getting very ugly, very quickly.

And before you think “I have nothing to fear because I haven’t done anything wrong!”, remember that being actually innocent doesn’t matter here; people will jump to conclusions and there’s no accounting for coincidence or even technical glitches. (When was the last time your driving GPS momentarily thought you were in the wrong place?) And the risk doesn’t just touch people who install contact tracing apps. If your friend or family member’s private data is disclosed, it might implicate you as a target as well.

For all these reasons, it’s extremely important to never publicly disclose the identity (or home location) of someone who’s infected. They might face retaliation, or even physical danger. Remember: this isn’t about what well-adjusted, calm people will do with the information; it’s about what people on the brink of disaster will do with it.

Risks If You’re Not Infected

Now let’s talk about risks to people who aren’t known to be infected (i.e. the general population). Let’s start with individual risks (i.e. to specific people). What happens if your private data (about identity, location, contacts, etc) is made public without your consent?

Chances are pretty good that if you’re reading this, you’re in a comfortable position, relatively speaking: you’re probably in a developed nation, with high literacy rates, strong public services, and a democratic government. A large percentage of you (statistically speaking) are in the economic middle class, and likely privileged in other ways too.

I mention this because if any of these things are true, then you might have a hard time putting yourself in the shoes of someone who is acutely at risk, when their personal information isn’t kept private. For example:

  • If you don’t live in an authoritarian country, it might be hard to understand why sharing your location data is dangerous.
  • If you’re not in a minority group, it might be hard to see why public knowledge about your infection status is risky.
  • If you’re not in an abusive relationship, it might be hard to see why people finding out about your contact history could put you in danger.

If you have some level of comfort and safety, just remember that it is not shared with everyone equally. Privacy matters to all of us, but for some of us, it can be a question of life and death.

And for the fortunate among us who don’t feel directly at risk in these ways, remember that anyone’s location and contact data can be profitably used by criminals. What times are you typically away from home? How much money do you make? What can someone blackmail you with? These are all things that criminals can learn a lot about if your private location and contact data is made public.

So, let’s be blunt: if any app collects identifying information, location data, or contact data and fails to keep it strictly private, that will be directly harmful to individuals.

Risks To Businesses

Next up, let’s think about businesses. What do they have to lose from contact tracing without proper privacy?

Imagine you’re a small neighborhood Italian restaurant. You scraped through during the height of social distancing in March and April by furloughing some servers and doing some takeout business. But once social distancing starts to relax, you open your doors again for sit-down business, and things are back on track. Your regular customers, like Bob, are overjoyed and come back every day for a week.

Then, Bob catches COVID-19. Let’s say he finds out early (thanks to a contact tracing app!) and self-isolates, so he doesn’t actually infect anyone at your restaurant. But then, his location data is made public, and it comes out on Twitter that he spent time at your restaurant every day. How’s that for business?

(This brings up a really good point: even if you are personally unafraid to share your private location data, remember that it’s not just your privacy that’s at stake.)

Again, we don’t have to rely on hypotheticals here. Examples from China and South Korea suggest that this is exactly what happens: businesses connected with infected people faced significant losses. (In fact, if you allow systems to share location data with that level of accuracy, you open the door to all kinds of bad behavior; take, for example, the person in China who blackmailed businesses by threatening to release location data showing he had visited their restaurants while infected, unless they paid him.)

The same mob mentality that applies to individuals can also apply to businesses, and so while it’s not exactly right to say that businesses can have “privacy” in the same way a person can, they do depend on the privacy of their customers.

Risks To Society

So far, the risks we’ve talked about only impact specific individuals or businesses. But tracing tools also pose a collective risk, to our entire society. There’s an acute risk that powerful data collection tools for defeating COVID-19 might end up contributing to a surveillance state, where individual liberties and privacy rights are no longer respected by our governments.

Every society, regardless of its espoused values, will have forces that pull in the direction of more central surveillance, often for seemingly good reasons. Law enforcement agencies, for example, would love to be able to catch more criminals, and make life safer for the rest of us. Government agencies would love to distribute services more equitably by getting more data about what everyone is really doing. The intent in these cases is not to trample privacy, and it’s theoretically possible to create surveillance technologies that are tightly hemmed in by policy and never used in ways that disturb individuals’ right to privacy and anonymity.

But if history is any guide, this theoretical possibility is rarely what actually happens. Instead, we get “mission creep”: a system built to do X ends up also doing Y. As the revelations from Edward Snowden about the NSA have shown, personal data dragnets are all too real, and how “benign” a data repository like this is depends entirely on who is currently in power. There are few checks and balances on governmental overreach, and what we are told is not always the full truth (as this exposed memo about de-anonymization in the British NHS COVID-tracking app shows).

So if you value basic liberty, as most people in modern democracies certainly do, then it’s absolutely crucial to look very carefully at the technologies of tracing, and ensure that we build things in a future-proof way that not only sings the right tune about privacy, but backs it up with guarantees (preferably technical ones). As a group of concerned scientists say in this recent open letter, “solutions which allow reconstructing invasive information about the population should be rejected without further discussion.”

Is this theoretical? As in our above examples, no. To quote from the MIT paper Apps Gone Rogue:

“In China, users suspect an app developed to help citizens identify symptoms and their risk of carrying a pathogen spies on them and reports personal data to the police. The Google Play store also pulled the Iranian government’s app amid similar fears and South Korea’s app to track those in self-quarantine automatically notifies the user’s case worker if they leave their quarantine zone.”

Many international organizations, including the EFF and Amnesty International, have issued statements about the importance of this. There has never been a more critical time for us all to stand up and demand it. Because, to quote the introduction from The End Of Trust (the EFF’s curated issue of McSweeney’s in 2018), “Next to go will be our rights to speak truth to power and to express our uncensored selves, anonymously or otherwise.”

Bonus: Privacy Increases Adoption

So that’s a broad (but certainly not exhaustive) survey of the risks if we aren’t careful with our digital contact tracing technologies.

There’s one more important point about why privacy matters: people care about it, and thus will be more likely to install an app that respects their privacy.

To state the obvious: for digital contact tracing to be effective, it has to be widely adopted. If Alice has the virus and she infected Bob, it’s not enough for just one of them to have a contact tracing app on their phone; they both need it for Bob to be notified about the contact. But if people fear for their privacy, they’ll be less likely to install one. The more privacy people perceive in contact tracing apps, the more adoption there will be, and the more virus transmissions get stopped before they happen.
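Here’s a rough back-of-the-envelope way to see why adoption matters so much (assuming, purely for illustration, that people install the app independently of one another):

```python
# Back-of-the-envelope: if each person independently has the app with
# probability p, a contact involves two people, so it's traceable only
# when BOTH have it -- roughly p * p of contacts are covered.
for p in (0.2, 0.4, 0.6, 0.8):
    print(f"{p:.0%} adoption -> ~{p * p:.0%} of contacts covered")
# 20% -> ~4%, 40% -> ~16%, 60% -> ~36%, 80% -> ~64%
```

Coverage falls off roughly with the square of adoption, so even a modest hit to trust takes a big bite out of effectiveness.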

So in this sense, regardless of the actual benefits to our privacy, it’s also important to preserve privacy (in a very transparent and obvious way) so more people use the apps in the first place. As the ACLU’s Jay Stanley put it recently, “This is a situation where privacy and public health are very aligned.”

I’m Ian Varley, a software architect living in Austin, TX. I am not a trained epidemiologist, nor do I play one on TV; I’m just a concerned layperson trying to help people sort through a mass of information. The statements in this article are entirely my opinion, not the opinion of my employer. As much as possible, I’ve tried to include only statements that are backed up by other sources, including the inline linked articles, as well as the following references:

Big thanks to Shivan Sahib for contributions to this paper!

If you see any misleading or inaccurate information, please comment!
