Europe cannot afford to “shelter-in-place” on digital regulation

Frederike Kaltheuner
Data & Society: Points
8 min read · May 19, 2020

By Frederike Kaltheuner, Mozilla Foundation and Corinne Cath-Speth, Oxford Internet Institute & Alan Turing Institute


The COVID-19 pandemic temporarily upended the European Union’s (EU) most defining principle: free movement across the continent. To stop the rapid spread of the virus, most Member States closed their borders and reinstated stringent border checks. It is hard to overstate how unprecedented and historic this decision has been. It is all the more surprising, then, that Member States are struggling to coordinate on a Europe-wide tech system to support contact tracing, which many believe could play at least a small part in loosening restrictions and reopening borders. Some, like the UK and France, are developing centralized systems in-house, which are likely to be incompatible with the systems used in other countries. Others, like Germany and Italy, will build on Apple-Google technologies. Belgium, as things stand, will not be developing an app at all.

One explanation for this fractured approach is that EU Member States hold primary responsibility for organizing and delivering health services and medical care. But Europe’s uncoordinated tech response to the pandemic also reveals a deeper truth about the continent’s digital strategy: it’s ridden with contradictions and doesn’t quite live up to its own ambition.

Europe wants to position itself as a third way between the US and Chinese approaches to regulating tech: a responsible alternative to US-style corporate surveillance capitalism and authoritarian state control. Take, for instance, the European Commission’s proposal for guidelines on Artificial Intelligence (AI), published in early 2020 as part of the new Commission’s Digital Strategy. The proposal has flaws, which we have discussed elsewhere, but overall it is an important step towards realizing Europe’s tech ambitions. These ambitions, initially articulated in privacy and personal data protection legislation such as the General Data Protection Regulation (GDPR), contributed to the expectation that Europe would be the world’s leading tech watchdog.

The COVID-19 pandemic halted momentum on many policy initiatives around the world. The ongoing consultation on the Commission’s AI white paper was pushed back from May to June. It is likely that a similar fate awaits planned discussions about new platform regulations and revisions of the ePrivacy Directive (commonly known as “the Cookie Law,” although it covers a whole raft of tracking technologies, not just cookies).

The pandemic might temporarily halt the Commission’s grand strategy, but it is also surfacing problems that were present all along. The first of these is a persistent susceptibility to corporate pressure. The tech industry is clearly seizing on the “Great Pause.” Warnings of “overburdening regulation” are becoming louder as the economy contracts, and so are claims that more stringent rules on AI could hinder the development of cutting-edge AI to help fight the pandemic. Today, a Brussels think tank is hosting Commissioner Thierry Breton and Mark Zuckerberg to discuss “what the collaboration between tech and government could look like post COVID-19” — a framing that would have been unthinkable just a few weeks ago.

Yet, even without the backdrop of a global pandemic, industry has been able to shape the language, narrative, and principles used to discuss tech regulation in Europe, especially when it comes to AI. With the pandemic putting immense pressure on academic and civil society funding, COVID-19 will have an undeniable impact on their ability to contain the negative influence of a tech sector that will likely emerge stronger.

A different concern, which has come into focus in the past few weeks, is the gaps and contradictions within Europe’s existing privacy and data protection framework. This is significant, since the Commission’s ambitious new digital strategy is supposed to build on and supplement existing rules. Yet the response to the pandemic has made the shortcomings of those rules painfully obvious.

At first glance, Europe’s existing laws, especially on privacy and data protection, make it well-positioned to lead on responsible uses of technology and rights-respecting solutions. Contrary to many initial reports, there is no general conflict between data protection (especially the GDPR) and the use of personal data in the fight against an epidemic. GDPR principles and concepts such as purpose limitation, data minimization, and ‘privacy by design’ limit the use of data to what is strictly necessary and help design products and services that come with a certain level of protection by default. It is no coincidence that a consortium of European academics and technologists spearheaded attempts to develop privacy-friendly protocols for contact tracing apps.

The GDPR’s main advantage over the US regime is that it is comprehensive and general rather than sector-specific. In contrast to HIPAA (the Health Insurance Portability and Accountability Act of 1996), which covers protected health information that can be used to identify a patient, the GDPR covers all personal data, including data that can identify someone indirectly, while designating health-related data as a special category that receives additional protection. What may sound like a technicality has far-reaching consequences.

Apps and websites that are about health but don’t offer health services don’t necessarily fall under HIPAA, but the data they process can easily qualify as special category data under the GDPR. It also means that data revealing information about someone’s health, including their browsing history or financial transactions, can be treated as health data under the GDPR but not under HIPAA. Enforceable data rights and more stringent requirements for anyone who processes personal data also mean that Europeans are much better protected from the exploitation of their health data than their American counterparts, especially in sensitive contexts like the workplace.

That is, in theory. In practice, current regulatory frameworks, like the GDPR and the ePrivacy Directive, leave gaps and suffer from under-enforcement. Nearly two years in, there has been limited enforcement of the GDPR, and the law has thus far largely failed to rein in “surveillance capitalism,” or at least challenge the internet’s dominant business model. Last year, for instance, two separate investigations by Privacy International and the Financial Times revealed how countless European health websites routinely share sensitive personal information with online advertisers.

Now, in the midst of a global pandemic, some of the apps developed by health care providers and public health authorities are also falling short of their GDPR obligations. MPs and rights groups have warned that a lack of data protection could make the UK’s contact tracing app illegal.* An analysis of 46 coronavirus apps worldwide by the French security company Defensive Lab Agency showed that many make use of third-party trackers, including for advertising.

Again, this is a theme that has been present for a while. Although one of the main aims of the GDPR is to harmonize data protection law across the EU, Member States have introduced vast, and sometimes open-ended, GDPR derogations, for instance for national security or immigration purposes (the former EU member, the UK, being a case in point). And while Europe has been praised for implementing the world’s toughest privacy laws, countries like France, Germany, and the UK have passed laws granting their surveillance agencies virtually unfettered power to conduct bulk interception of communications across Europe and beyond, with limited to no effective oversight or procedural safeguards against abuse.

All of these issues come on top of the visible tensions between European visions, the tech industry’s ambitions, and national-level practices. Hungary, whose leader Viktor Orban has systematically dismantled the country’s constitutional checks and balances since his return to power in 2010, has implemented an indefinite state of emergency in response to the pandemic. The Hungarian government is also suspending GDPR rights for COVID-19 related data processing. (The Commission is examining Hungary’s emergency regime.)

Hungary might be an outlier, but in the face of a global pandemic, individual Member States have preferred a rhetoric of digital sovereignty over one of European unity. “States should be able to make their own choices on such a critical matter — it’s a question of sovereignty,” Cédric O, France’s junior minister for digital affairs, told the Financial Times as he promoted the country’s COVID-19 app. These words were meant as a rebuke to Apple-Google efforts to develop a single underlying infrastructure for such apps. The fact that Apple and Google can de facto dictate which contact tracing apps European governments can build is indeed nothing short of remarkable. If anything, it further illustrates the urgent need for more stringent antitrust and competition action at the European level.

This rhetoric comes at a great cost. The growing trend of digital sovereignty, in its tech-lash zeal, can hamper legitimate pan-European efforts. Both the European Parliament and the European Data Protection Board took a stance in favor of decentralised COVID-19 tracking apps. But when it mattered, Europe’s institutions were unable to put this ambition into practice. What began in March as a modest but collaborative effort between academics and developers to create a single European contact tracing protocol was derailed by divergent national interests (and internal disagreements) rather than rallied behind.

In many ways, the pandemic resurfaced and amplified some of the most pernicious effects of technology on society. Data brokers and advertisers are targeting consumers who feel anxious and overwhelmed by the virus. There is a new wave of workplace surveillance, from corporate spyware installed to monitor people working from home, to facial recognition and temperature checks rolled out in factories. Some of the surveillance industry’s worst offenders are successfully pitching their products to governments and employers around the world.

The pandemic emphasized existing political, legal, and technical fault lines in Europe. Achieving the EU’s digital ambitions requires reinvigorating the politics of unity that led to its creation as much as it requires new regulatory frameworks. Several initiatives are underway to develop standards that ensure interoperability between different European contact tracing apps, and Margrethe Vestager, an EU commissioner, has encouraged countries to use “decentralized storage” to ensure the apps are compatible, noting that “more and more countries are coming to have that approach.” Unity also needs to be a driving factor in national debates.

To fully address the disparate impact of technology on socio-economically marginalised populations, a broader political project is necessary. Prof. Lilian Edwards and others have proposed such legislation in the UK. While the draft bill is obviously UK-specific and concerns contact tracing apps in particular, it contains important provisions that are equally relevant for EU countries and technologies. For instance, it maintains that “no one shall be penalised for not having a phone (or another device), leaving house without a phone, failing to charge phone etc.” This reassures those who are already digitally excluded that new technologies will not further entrench their exclusion.

The EU should go beyond patching existing frameworks and bolstering effective enforcement. The increased use of AI systems in particular suggests that the EU’s regulatory vision must expand to include tech’s impact on discrimination and socioeconomic inequities. Rather than retreating or “sheltering in place,” now is the moment for the EU to translate “the new possible” of a post-pandemic world into its digital ambitions.

*Now that the UK has a Withdrawal Agreement with the EU, the country is in a transition period, currently expected to last until the end of 2020, during which the GDPR will continue to apply in the UK.

Frederike Kaltheuner is a public interest tech policy fellow with the Mozilla Foundation.

Corinne Cath-Speth is a PhD researcher at the Oxford Internet Institute and the Alan Turing Institute.