An Open Letter to Prime Minister Cameron

20th-century solutions won’t help 21st-century surveillance

Jonathan Zittrain
The Message

--

Why a seemingly sensible proposal to compel back doors in Internet communications apps is a bad idea.

Dear Prime Minister Cameron:

You recently proposed that Internet apps be required to make users’ communications accessible by state authorities. I want to explain why this is a very bad idea even though it might seem like a no-brainer.

You said:

I have a very simple principle which will be the heart of the new legislation that will be necessary. In our country, do we want to allow a means of communication between people which even in extremis, with a signed warrant from the home secretary personally, that we cannot read? Up until now, governments have said: ‘No, we must not’.

That is why in extremis it has been possible to read someone’s letter, to listen to someone’s telephone, to mobile communications. …

But the question is: are we going to allow a means of communications which it simply isn’t possible to read. My answer to that question is: ‘No we must not’.

President Obama appears to agree with you.

Heads of government bear the burden of keeping their populaces safe. That’s a crushing responsibility. Police solve violent crimes — and intelligence agencies predict and avert them — largely by intercepting the conversations of people conspiring to get away with them.

For at least thirty years democracies have kept eavesdropping within bounds by requiring a warrant or some other form of meaningful review before setting up something like a wiretap. As telephone companies upgraded to digital (but still not Internet-based) networks in the 1990s, governments around the world began to require that the new networks still allow authorities to listen in on calls. The rationale was simple and generally uncontroversial: so long as the government respected the rule of law, its demands for information shouldn’t be trumped by new technological facts on the ground.

Why, then, you reasonably ask, should that long-established balance between security and privacy be disturbed simply because the Internet has replaced telephony? The answer, it turns out, is that baking government access into all Internet apps will in fact not extend the long-established balance between security and privacy to all media of communication. It will upend it.

Here are four reasons why:

1. The Internet’s open ecosystem is fundamentally different from the closed world of telephony — so you can’t copy and paste the old order.

First, the landscape of Internet communications apps is profoundly different from telephony, where lawful intercept’s habits were honed. Traditional telephone systems were run by a single large company or by governments themselves. They overwhelmingly served the single purpose of letting people talk to each other at a distance, and the experience of using a phone in 1990 was hardly different from that of using one in 1950. A stable service run by a big company is susceptible to government regulation with little friction. Supporting lawful eavesdropping was done with no impact on telephony’s basic model — and often governments would pay to offset any costs incurred in keeping phone lines open to tapping.

Credit: reynermedia (flickr) — CC-BY

The Internet evolved in a wildly different way. It supports applications written by anyone, and a new application can become popular in a heartbeat. Some people write and share apps for fun rather than money. To restrict how one might build an Internet application that enables person-to-person communication — that is, nearly all of the hundreds of thousands of apps out there — would require that software developers be professionals who can hire compliance attorneys, or else risk breaking the law.

In the worst case, software development would be relegated to a handful of incumbents ready to strike the kind of partnerships with governments that sophisticated phone companies do. Facebook, Google, and Microsoft could cope (if unhappily) with that, and software authors and service providers the next tier down would be hugely disadvantaged. The best case from the pro-government-access point of view would be one where app authors across the spectrum give up on encryption entirely. Instead of orchestrating a complex scheme of scrambling communications to all but the parties and a government, there’d be no scrambling at all. That best case is a nightmare for the public’s — and therefore national — security: it exposes everyone’s communications to anyone ready to hack them. Lawful telephone eavesdropping wouldn’t have come about if it meant that it would be easy for others — even those at a distance — to also listen in on a conversation.

2. Comprehensive app regulation is either self-defeatingly leaky or unacceptably intrusive.

Second, again unlike telephony, Internet users who don’t like the way an app works can choose to use another.

Credit: blakespot (flickr) CC-BY

As a practical matter, WhatsApp could be successfully required to change the way it encrypts its users’ communications since it’s owned by Facebook, a behemoth that can’t readily gainsay what a government wants of it. Facebook has executives and engineers in London who can be arrested, and bank accounts that can be seized. But that’s the exception, not the rule. While you may be looking at large companies like Facebook and thinking that regulation will be easy, you may not be considering the millions of other sources of code. Despite WhatsApp’s $19 billion price tag to Facebook, its basic functionality could be reproduced in a weekend by two caffeine-fueled university sophomores. Such people are notoriously difficult to regulate, especially if they’re not attending school in the UK, or haven’t bothered to claim credit for their code, making them tricky to find.

The speed with which the public could migrate to a new coder’s NextApp would raise the stakes for the kind of enforcement you’d have to undertake for your proposed requirement to have any impact. Indeed, you’d have to constrain the application ecosystem itself. To be sure, you might have a better chance at it today — in a world of app marketplaces run by Apple and Google, who can impose their own restrictions on software — than ten years ago, when software traveled directly from one machine to another with a simple click.

But that would accelerate a profound and undesirable flip from software that flows freely except in the most unusual of circumstances, to software that can only move once it meets government standards. PCs would have to become like iPhones, running only what their originators — Microsoft and Apple — permit. Seriously: this isn’t just telling British Telecom to go ahead and tweak its software. Rather, it would place a handful of companies into the role of gatekeepers for the vast and colorful universe of code that flows from millions of sources. And these gatekeepers turn out to be the very companies whose market dominance has so deeply troubled European competition authorities.

3. Compelled exposure to a state with a warrant means vulnerability to a state without one.

Credit: http://opennet.net

A third problem: a requirement to make encryption breakable by the prevailing legal authority would be a gift to states that do not embrace the rule of law. Billions of people live in such countries, and Western technology has represented one of their best shots at the freedom to communicate enshrined as a universal human right. Their governments have had to invest enormous amounts of effort to try to allow the economic benefits of connecting with the rest of the world while still permitting censorship and surveillance. If you succeed in shaping our software so that we can’t keep secrets from authorities bearing valid warrants, you will also make it so that people can’t keep secrets from regimes that don’t bother to get one.

4. Building systems to secure communications against all but the communicating parties and the government is not an easy task.

All of these reasons are grounded in the fundamentals of the way the Internet has evolved — and the nearly unthinkable costs of trying to push it to a place where a proposal approximating telephone eavesdropping could be enforced across all Internet applications. None relies on a fourth, more practical point that others have made when decryption mandates have come up before: building systems to secure communications against all but the communicating parties and the government is really, really difficult, and entails its own risk of catastrophic failure, rendering the communications worse off than if they hadn’t been encrypted at all.

Key escrow for the original Clipper Chip of the ’90s, designed to build a government back door — a “law enforcement access field,” or LEAF — into encrypted communications systems. Source: http://www.cryptomuseum.com/crypto/usa/clipper.htm
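To make that difficulty concrete, here is a minimal sketch of the escrow idea in Python, using the PyNaCl library. The names (encrypt_with_escrow, escrow_pk, and so on) are hypothetical illustrations, not any real system’s design, and real schemes like the Clipper Chip’s were far more elaborate. The essential move is the same, though: each message’s key is wrapped once for the recipient and a second time for an escrow authority.

# A minimal sketch of an escrowed ("LEAF"-style) message format using PyNaCl.
# All names here are hypothetical, for illustration only.
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

def encrypt_with_escrow(plaintext: bytes, recipient_pk, escrow_pk):
    # A fresh symmetric key for this one message.
    session_key = random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(session_key).encrypt(plaintext)
    # The session key is wrapped twice: once so the recipient can read the
    # message, and once more for the escrow authority. The second copy is the
    # extra "access field" -- and the extra thing that can be stolen or abused.
    key_for_recipient = SealedBox(recipient_pk).encrypt(session_key)
    key_for_escrow = SealedBox(escrow_pk).encrypt(session_key)
    return ciphertext, key_for_recipient, key_for_escrow

# Whoever holds the escrow private key -- lawfully or not -- can recover the
# session key for every message sent this way.
recipient, escrow = PrivateKey.generate(), PrivateKey.generate()
ct, k_r, k_e = encrypt_with_escrow(b"hello", recipient.public_key, escrow.public_key)
assert SealedBox(escrow).decrypt(k_e) == SealedBox(recipient).decrypt(k_r)

The point of the sketch is not the code but its shape: every client must generate the escrowed copy correctly, the escrow keys must be guarded indefinitely, and access to them must be confined to warranted use. A failure at any one of those steps compromises everyone at once.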

I understand the imperative to provide security, and the way in which it makes sense that democratically enacted, constitutionally sound law, rather than the cat-and-mouse of technological hacks and counter-hacks, should determine the boundary between state and citizen. I realize that it can seem that entire sectors of formerly surveillable communications are “going dark.” But a simple technological mandate to prevent the use of strong encryption is not, in fact, simple. The toolkit for law enforcement and intelligence agencies to do their necessary work is deep and growing, and thanks to that, the fact that some apps encrypt need not stymie investigations of the large-scale terrorism that makes the decryption proposal appear so urgent to begin with.

Prime Minister Cameron, I do not envy you your job. There is only the solace that a choice about whether to pursue a proposal like this is not, in the end, a close one: you shouldn’t do it. The Internet we have has been a force for modernity and openness. That’s exactly what those who believe in indiscriminate violence despise. Let’s not try to build them a more agreeable network, in the name of the short-term imperative to uncover and prevent their worst.

Jonathan Zittrain is the George Bemis Professor of Law and Professor of Computer Science at Harvard University, and co-founder of its Berkman Center for Internet & Society.

This essay is published under a CC BY-ND license. A shorter op-ed based on its ideas can be found in the Financial Times. Thanks to Maria Balinska, Julie Dickerson, Virginia Heffernan, Gillian Morris, Ben Sobel, and Shailin Thomas for suggestions.

Cover photo credit: gengish (flickr) — CC-BY-NC
