When Google and Apple get privacy right, is there still something wrong?
As the tech giants develop privacy-friendly technologies to help tackle the COVID-19 pandemic, we need to focus on the bigger picture of what’s at stake
As the understanding that we are in this for the long run settles in, the world is increasingly turning its attention to technological solutions for tackling the devastating COVID-19 pandemic. Contact-tracing apps in particular seem to hold much promise. Using Bluetooth to communicate between users’ smartphones, these apps could map contacts between infected individuals and alert people who have been in proximity to an infected person. Some countries, including China, Singapore, South Korea and Israel, deployed such apps early on. Health authorities in the UK, France, Germany, the Netherlands, Iceland, the US and elsewhere are currently considering them as a means of easing lockdown measures.
There are, however, open questions. Do they work? The effectiveness of these applications has not been evaluated, either in isolation or as part of an integrated strategy. How many people would need to use them? Not everyone has a smartphone. Even in rich countries, the most vulnerable group, those aged over 80, is the least likely to have one. Then there is the question of fundamental rights and liberties, first and foremost privacy and data protection. Will contact-tracing become part of a permanent surveillance structure in the prolonged “state of exception” we are sleep-walking into?
Prompted by public discussion of this last concern, a number of European governments have indicated the need to develop such apps in a privacy-preserving way, while independent efforts by technologists and scientists to deliver privacy-centric solutions have been cropping up. The Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) initiative, and in particular the Decentralised Privacy-Preserving Proximity Tracing (DP-3T) protocol, which provides an outline for a decentralised system, are notable forerunners. Somewhat late in the game, the European Commission last week issued a Recommendation for a pan-European approach to the adoption of contact-tracing apps that would respect fundamental rights such as privacy and data protection.
Of course, Europeans and civil society are not the only ones who care about privacy. Apple and Google do too. On April 10, the two tech giants revealed they were collaborating on their own version of contact-tracing technology, one they claim takes privacy very seriously. While privacy-friendliness may be something we are used to hearing Apple promote, precisely as a means of distinguishing itself from other data corporations, it is certainly not the first design principle we associate with Google. And yet the specifications of the Apple/Google proposal come very close to what top privacy advocates have laid down as necessary conditions for privacy-friendly contact-tracing. Non-traceable identifiers? Check. Transparency? Check. Opt-in? Check. Decentralised? Check. Indeed, the European Data Protection Supervisor offered an initial endorsement of the initiative, while the authors of the ultra-privacy-aware DP-3T welcomed the proposal as “very similar” to their own.
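To see why decentralised, non-traceable identifiers matter, consider a minimal sketch of the general idea behind such protocols. This is illustrative only and not the actual Apple/Google or DP-3T key-derivation scheme: each phone derives short-lived broadcast IDs from a secret daily key, and only if a user tests positive is the daily key published, letting other phones check for matches locally rather than reporting contacts to a central database.

```python
import hashlib
import hmac
import os

def rolling_ids(daily_key: bytes, intervals: int = 144) -> list:
    """Derive short-lived proximity IDs from a device-local daily key.
    Illustrative only; the real protocols use their own HKDF-based schemes."""
    return [
        hmac.new(daily_key, f"CT-ID:{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]

# Each phone broadcasts its current ID over Bluetooth and logs IDs it hears.
alice_key = os.urandom(32)          # secret, never leaves Alice's phone while healthy
alice_ids = rolling_ids(alice_key)
bob_heard = {alice_ids[42], os.urandom(16)}  # Bob's log of overheard IDs

# If Alice tests positive, she publishes only her daily key. Bob re-derives
# her IDs locally and checks his own log; no central contact database is needed,
# and the rotating IDs cannot be linked to Alice before she chooses to disclose.
matches = bob_heard.intersection(rolling_ids(alice_key))
print(len(matches))  # one overheard ID matches
```

The privacy properties praised in the proposal fall out of this structure: the server (if any) only relays keys of confirmed cases, and matching happens on users’ own devices.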
Not all privacy experts agree about the level of protection the Apple/Google proposal will deliver, and some are calling for additional safeguards, such as careful auditing and mechanisms for ensuring the technology can be uninstalled once the pandemic is over. But these discussions risk losing sight of the bigger question at hand. Even if the Apple/Google proposal does get the privacy issue right, what other trade-offs are involved in letting these companies contribute to the development and deployment of what might be the largest-scale crisis-management measure for the pandemic so far? Privacy is certainly an important issue, but it cannot address the bigger picture.
The Apple/Google contact-tracing proposal is hardly the first contribution that the tech giants on either side of the globe are making to help tackle the COVID-19 pandemic. Since the beginning of the pandemic, they have been involved in developing (somewhat less privacy friendly) surveillance tools using location data (Google, Facebook, Alibaba, Palantir), setting up screening services (Verily, Apple), giving away laptops to facilitate distance-learning (Google), developing AI for diagnostics (Alibaba, Baidu), and funding COVID-19 related research (Microsoft, Facebook), amongst others.
Together, these efforts point to a trend we have been witnessing over the past decade: the growing involvement of a handful of data corporations in ever-expanding sectors of life, from information retrieval and communication all the way to transportation, urban planning, education, biomedical research, and now public health crisis management. To be sure, governments and decades of cuts in public spending helped create the vacuum that data corporations have begun filling. But as these companies move into these sectors with what might be very useful, quick and low-cost solutions, they also increase our dependency on them for the provision of (public) services, and they make themselves necessary passage points for the adequate functioning of these sectors. Ultimately, this will allow them to reshape these sectors to align with their own values and interests, which may or may not be those of citizens. In other words, they effectively move from having a seat at the drawing table, where inclusion is (and should be) determined by technical expertise, to having a seat at the decision-making table, where inclusion should be (but hardly is) determined by democratic values.
Function creep does not apply only to technologies. If rolling back an app may prove difficult once the pandemic is over, try rolling back the presence of such powerful actors.