We Cannot Have A Master Key
Why Apple and Tim Cook Have It Right
As Tim Cook describes, backdoors are vulnerabilities — they are a technical feature, not a controllable policy. Once introduced, they can be exploited not only by our vendors and government, but also by others: criminal hackers, foreign governments, and rogue individuals inside organizations with legitimate access.
For more than twenty years, security experts have come out against this inbuilt vulnerability: “Key recovery systems [aka systems with backdoors] are inherently less secure, more costly, and more difficult to use than similar systems without a recovery feature.”
There is overwhelming evidence that actors good and bad, public and private, will have access to ever more data about us all. Despite concerns and frustrations about law enforcement investigations “going dark,” the reality is the opposite: instead of sparse breadcrumbs for law enforcement to follow, there are now endless troves of new, rich communications and behavior data. Even encryption (whose adoption remains slow and incomplete) only partially obfuscates it.
How would the FBI have tracked someone years ago? The rise of smartphones brings only more opportunities for heavy-handed surveillance, not fewer. One could argue that even the metadata (the data about the data) of encrypted communications in the mobile age gives richer visibility than the full, unencrypted content of the early internet age.
Today, if you had unencumbered access to everyone’s smartphone, you could find out where I’ve been, who I’ve spoken to, texted, and emailed, what I asked Google, what I bought, where I bank, whether I’ve run, what media I read and watched, where I live, and who my family, friends, and colleagues are. The more internet services I use, the more digital exhaust I leave behind. Depending on what other connected devices I have (many of which interact with my phone), you might be able to find out what I eat, and look into my house by remote video. The potential granular visibility into individual lives is startling.
I have confidence that law enforcement can continue to innovate in its defense of citizens, and can use the ever-growing digital footprint we all leave to effectively surveil terrorists and other bad actors. Even our best systems and services — and we use so many — are not that secure. I have confidence that a highly motivated, sophisticated, and well-funded actor — and law enforcement is one — intent on learning about an individual can succeed in more cases than not. The trend is ever in their favor.
Unfortunately, even when intent is good, I do not have confidence that enabling easier bulk surveillance will be a net positive. History suggests that “surveillance creep” is the norm: tools get used for inappropriate purposes, to silence dissident voices, or by an ever-broader set of agencies (see the use of Stingray devices by local law enforcement). Data that is so easily available is too attractive not to abuse.
We all have a right to privacy. We should applaud Tim Cook for trying to uphold that right (and enabling it by default).