The “Great Difficulty” of Mass Data Retention
In late July, the EU Advocate General (AG) released his opinion in the Watson et al. case pending before the Court of Justice of the EU (CJEU). The case challenges the UK’s Data Retention and Investigatory Powers Act 2014 (DRIPA), which requires communication service providers to retain all communications data passing through their networks. The AG’s opinion concludes that general data retention powers of this kind must be accompanied by certain strict safeguards if they are not to violate EU law.
The opinion of an AG, given before the judges themselves deliberate on a case, is advisory and has no legal effect (though in practice, the Court’s judgment tends to be consistent with the AG’s recommendations). As such, the opinion is often able to more fully explore a debate than the judgment that follows it. AG Henrik Saugmandsgaard Øe’s opinion in this case is no exception. Citing US President and founding father James Madison’s “great difficulty” — the tension between giving government the powers it needs to protect citizens, and preventing it from abusing those powers — the opinion addresses frankly the issues arising out of general data retention.
In 2006, the EU institutions passed the Data Retention Directive, a measure sponsored by the UK that called for states to retain all communications data in their countries (for between six months and two years) and for public authorities to have access to that data on request. Communications data — variously termed ‘traffic data’ or metadata — is everything but the content of communications. It includes the senders and recipients of emails and calls, the timing, duration and methods of communications, and device location data. Despite being likened by the Government to itemised telephone bills, or addresses written on postal envelopes, it has the potential to reveal a huge amount about the life of an individual. The 2006 Directive was challenged by the NGO Digital Rights Ireland, and was ultimately invalidated by the CJEU as a disproportionate interference with the rights to privacy and protection of personal data under the EU Charter of Fundamental Rights (Arts 7 and 8 respectively). In response to the Directive being struck down — and the resultant invalidity of the UK regulations implementing it — the UK Parliament quickly passed DRIPA to preserve its power to order communications service providers to retain data.
Both DRIPA and a Swedish law being challenged alongside it impose a duty of general data retention; in other words, blanket storage of all communications data passing through the networks of communications providers. The AG concludes that general retention is probably compatible with the Charter rights (though he emphasises that national courts should decide whether there might be less intrusive measures that are as effective, and whether a retention obligation could be narrowed in scope, for instance by geography or persons). In Digital Rights Ireland, the untargeted nature of data retention — the lack of any connection between the purpose of retention (crime prevention) and the people whose data was captured — was one of the factors that led the Court to invalidate the Directive. To the disappointment of many privacy advocates, the AG does not read this as meaning general data retention is per se incompatible. Rather, he takes a narrower view: that general retention is potentially lawful if accompanied by certain safeguards.
In reaching that view, he grapples directly with the costs and benefits of general retention. The difference between general retention and authorities having to request capture of data for an identified target is the ability to “see into the past,” because authorities can look retrospectively at the behaviour and connections of people they come to suspect. Conversely, retaining the data of an entire population presents risks of abuse — for example, using the data to identify critics of government policy or people attending anti-government protests (and the AG cites the example of Ukrainian police ominously texting the phones they tracked at a protest, letting their owners know they had been registered at a “mass disturbance”). This capability, the AG suggests, could actually make bulk access to communications data more intrusive than the ability to look at content.
It is these concerns that render safeguards vital. Crucially, the AG opines that only the prevention, detection or prosecution of serious crime can justify the authorities having access to data. If adopted by the Court, this would put paid to the UK’s long list of justifications for accessing data, which includes tax collection and vague concepts such as public health. Echoing recent decisions of the European Court of Human Rights, the AG insists on independent (ideally judicial) authorisation of any government request to access data, stemming from the valid concern that self-authorisation by the executive is a recipe for abuse; as evidence, he cites the extremely high number of authorisations currently granted. Moreover, while the AG does not specify any time limits for retention, he reiterates the CJEU’s point that data must be destroyed when no longer necessary. Overall, the AG declares that the safeguards enumerated in the earlier Digital Rights Ireland decision are all mandatory, and must all be followed: nations cannot pick and choose which they will implement.
The great difficulty that James Madison identified remains as pressing as ever. The Snowden revelations ignited fervent debate in the US about the right way to resolve that tension, and even led to some action to limit expansive surveillance powers, with Congress voting last year to curtail bulk data collection by the NSA. By contrast, on this side of the Atlantic the debate has been far more muted. As the Investigatory Powers Bill, which seeks to entrench and expand intelligence agencies’ bulk powers, rapidly makes its way through Parliament, we ought to think carefully about Madison’s words and demand that our representatives thoroughly grapple with the competing values at stake.