When Google and Apple get privacy right, is there still something wrong?

As the tech giants develop privacy-friendly technologies to help tackle the COVID-19 pandemic, we need to focus on the bigger picture of what’s at stake

As the understanding settles in that we are in this for the long run, the world is increasingly turning its attention to technological solutions for tackling the devastating COVID-19 pandemic. Contact-tracing apps in particular seem to hold much promise. Using Bluetooth to communicate between users’ smartphones, these apps could map contacts between infected individuals and alert people who have been in proximity to an infected person. Some countries, including China, Singapore, South Korea and Israel, deployed such apps early on. Health authorities in the UK, France, Germany, the Netherlands, Iceland, the US and other countries are currently considering implementing such apps as a means of easing lockdown measures.

There are some open questions. Do the apps work? Their effectiveness has not been evaluated, either in isolation or as part of an integrated strategy. How many people would need to use them? Not everyone has a smartphone; even in rich countries, the most vulnerable group, those aged over 80, is the least likely to have one. Then there is the question of fundamental rights and liberties, first and foremost privacy and data protection. Will contact-tracing become part of a permanent surveillance structure in the prolonged “state of exception” we are sleep-walking into?


Of course, Europeans and civil society are not the only ones who care about privacy. Apple and Google do too. On April 10, the two tech giants revealed they were collaborating on their own version of contact-tracing technology — one they claim takes privacy very seriously. While privacy-friendliness may be something we’re getting used to hearing Apple promote, precisely as a means of distinguishing itself from other data corporations, it’s certainly not the first design principle we associate with Google. And yet, the specifications of the Apple/Google proposal seem very close to what top privacy advocates are laying down as necessary conditions for privacy-friendly contact-tracing. Non-traceable identifiers? Check. Transparency? Check. Opt-in? Check. De-centralised? Check. Indeed, the European Data Protection Supervisor offered an initial endorsement of the initiative, while the authors of the ultra-privacy-aware DP-3T protocol welcomed the proposal as “very similar” to their own.
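To make the “non-traceable identifiers” and “de-centralised” checkboxes concrete: the core idea behind such designs is that phones broadcast short-lived, pseudo-random identifiers derived from a secret key that never leaves the device unless its owner tests positive, at which point other phones can re-derive and match those identifiers locally. The toy sketch below illustrates that general idea only; the key sizes, derivation function and interval scheme are illustrative assumptions, not the actual Apple/Google or DP-3T specification.

```python
import secrets
import hashlib

def new_daily_key() -> bytes:
    # Secret key generated on the phone each day; it is never
    # uploaded unless the user reports a positive test.
    return secrets.token_bytes(16)

def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    # Derive a short-lived broadcast identifier for one time interval
    # (e.g. ~10 minutes). Observers see only rotating pseudo-random
    # bytes and cannot link them without the daily key.
    return hashlib.sha256(daily_key + interval.to_bytes(4, "big")).digest()[:16]

def match(observed: set, published_key: bytes, intervals: range) -> bool:
    # Decentralised matching: once an infected user's daily key is
    # published, each phone re-derives its identifiers and checks for
    # overlap with what it overheard — no central contact graph needed.
    return any(rolling_identifier(published_key, i) in observed
               for i in intervals)
```

A phone that overheard an identifier during interval 42 would find a match once that day’s key is published, while keys of users who never crossed its path would yield none — which is what makes the scheme both decentralised and unlinkable to bystanders.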


Not all privacy experts agree about the level of privacy protection the Apple/Google proposal will deliver. And some are calling for the implementation of additional safeguards, such as careful auditing and mechanisms for ensuring the technology can be uninstalled once the pandemic is over. But these discussions risk losing sight of the bigger question at hand. Even if the Apple/Google proposal does get the privacy issue right, what other trade-offs are involved in letting these companies contribute to the development and deployment of what might be the largest scale crisis management measure for the pandemic so far? Privacy is certainly an important issue, but it cannot address the bigger picture.

The Apple/Google contact-tracing proposal is hardly the first contribution that the tech giants on either side of the globe are making to help tackle the COVID-19 pandemic. Since the beginning of the pandemic, they have been involved in developing (somewhat less privacy-friendly) surveillance tools using location data (Google, Facebook, Alibaba, Palantir), setting up screening services (Verily, Apple), giving away laptops to facilitate distance-learning (Google), developing AI for diagnostics (Alibaba, Baidu), and funding COVID-19-related research (Microsoft, Facebook), among other efforts.

They effectively move from having a seat at the drawing table, where inclusion is (and should be) determined by technical expertise, to having a seat at the decision-making table, where inclusion should be (but hardly is) determined by democratic values.

Function creep does not only apply to technologies. If rolling back an app may prove difficult once the pandemic is over, try rolling back the presence of such powerful actors.

Associate professor of philosophy of technology, Radboud University. Co-director at @ihub_ru. Politics and ethics of digital (health) tech.