Apple is (still) wrong, and tech needs to grow up.

Blair Reeves
Feb 22, 2016

--

(This is a follow-up to my post last week, “Apple is wrong. Your iPhone is not a black box.”)

It seems that the storm that has been gathering for several years between the government and the tech industry around privacy, encryption, and the proper role of law enforcement is upon us. Apple has chosen its ground, and has now been joined — at least in spirit — by many of the other heavyweights of Silicon Valley, including Google, Facebook (and WhatsApp), Twitter, Microsoft and many more. Broadly, the Valley has closed ranks behind Apple’s contention that it should not be compelled to cooperate with the FBI’s request to unlock an iPhone recovered from one of the San Bernardino terrorists.

Since writing last week about why Apple is wrong, I have continued to follow this issue closely, and a few observations are well worth making about how the debate has begun to evolve.

This is not a technical debate. It’s really about policy.

Specialists in many fields have long used smokescreens of technical jargon to present their personal ideologies as scientific truths that others must simply accept as fact. Tech is no exception. A small but very vocal army of activists has joined Tim Cook himself in insisting that, due to the very nature of how cryptography works, Apple cannot possibly allow law enforcement access to encrypted devices without complete chaos ensuing, QED.

Activists are using technical jargon as a smokescreen to present their personal ideologies as scientific truths that others must simply accept as fact.

The Brits have a term for arguments like this that I particularly like, because it very neatly encapsulates the self-serving misdirection involved: bollocks.

Make no mistake: this case is absolutely not, fundamentally, about the technical details of encryption. Rather, it is about where our public consensus now stands between protection of vital civil liberties, like privacy, and the needs of law enforcement and maintenance of public safety in the 21st century. That is a debate that every American who has read the Fourth Amendment has a right to hold an informed opinion about, not just the specialists who know the difference between a 256-bit UID key and the Secure Enclave firmware.

Apple has deliberately framed the debate as a technical matter of the company being forced to “break” its encryption and provide “backdoors” to government, both as a scare tactic and in an attempt to turn the issue into a referendum on encryption itself. Perhaps the company genuinely sees it this way; perhaps it does not. Either way, it’s fairly irrelevant. I do not recall the American public ever deciding to outsource to a wildly profitable public company — the world’s most valuable, by the way — the right to decide on our behalf where the boundaries should lie between law enforcement’s ability to investigate crimes and the citizenry’s expectations of privacy. Those are public policy decisions made by the voters’ representatives, not by industry.

What the FBI is asking for in this particular case is fairly narrowly tailored, but what it needs is wholly consistent with how American law enforcement has approached expectations of privacy for decades. Just as the police may obtain a warrant, based on probable cause and signed by a judge, to search your home, car, or bank and health records, tap your phone, or even detain you, it is entirely reasonable that they have the same access to electronic data. Apple may call its devices “magical,” but no amount of magic somehow immunizes them against being equally subject to search in a criminal investigation.

I don’t remember voting to outsource to a for-profit company the right to decide where the boundaries between law enforcement and citizens’ expectations of privacy should be drawn. Do you?

It was not long ago that people dismissed the idea of mobile payments as impossibly insecure. So quickly we forget!

It is an understatement to say that boatloads of credulity are required to believe that Apple is simply incapable of making this possible. The most valuable public company in the world, the one that has defined the smartphone era, spends over $8 billion each year on R&D; built its own custom chip-design unit; created a new secure chip that lets users store their credit cards securely on their phones and pay for things with one-time tokens by — magically! — waving their phones through the air; and is widely known to have embedded numerous features into the phone’s operating system that are kept secret (for example, the recent news that some iPhones now disable themselves if non-Apple technicians attempt to repair them (!)). Recall that Apple also knows exactly which phone you have and where it is, and can access it, at least at the OS layer, at pretty much any time. Anyone who has used the “Find My iPhone” feature knows this. Apple has long been capable of disabling or even wiping your device remotely.

Given all this, I find extremely specious any argument that Apple cannot provide a straightforward way for law enforcement to gain access to suspect device data.

Since Apple posted its now-famous Customer Letter, there has been a scrum of specialists rushing to speculate about the technical details at stake. Many of the hottest takes were soon contradicted by clarifications from Apple engineers themselves. (Here is a pretty good rundown of the conventional wisdom today.) It has quickly become clear that, however strongly anyone may feel about the case at hand, almost no one outside a small group of engineers at Apple actually knows what is really possible and what is not.

Law enforcement policy is ours to decide as a democracy. Not Apple’s. Technology, remember, exists to serve society, not the other way around. Just as you don’t need to know exactly how an internal combustion engine works to grasp perfectly well how a Ford F-350 hurtling down your neighborhood street at 80 miles per hour might affect public safety, technical mastery of cryptography is utterly irrelevant to the choices to be made here.

Apple’s problems are of its own making

Let us review some recent history.

In 2012, Apple decided to begin encrypting communications end-to-end by default and to retain no key to recover them. After the Snowden affair, Apple began loudly boasting about its efforts in this area, and much was written about how the company began to explicitly use “privacy” as a marketing weapon against competitors like Google. The company expanded the categories of data beyond its own reach (and that of law enforcement), despite early and repeated warnings from the government about the problems that would create. And in a 2014 iOS update, Apple removed its ability to retrieve data from locked devices.

In other words, Apple has specifically and deliberately engineered its products for years to make complying with valid government requests for assistance harder, and now complains that it is being asked to create software to restore that capability.

Apple has deliberately engineered its products for years to make complying with government requests for assistance harder, and now complains that it is being asked to restore that capability.

None of this is because Apple is evil. Apple is just a for-profit company that serves its customers. Exactly like Google, Facebook, Microsoft or any other tech company, Apple relates to its customers as users, who have a range of needs, pain points and desires it can fulfill with hardware and software. In this way, Apple sees the world as a field of existing or potential customers, and it wants to serve their perceived demands better than its competitors do. Sort of like McDonald’s does with all-day breakfast.

Yet before we are Apple customers, we are citizens, whose broad range of needs supersedes our preferences as consumers. (This, for example, is why I am not allowed to own a howitzer.)

Remember, too, that Apple’s security standards are anything but uniform. Few seem to recall, for example, that over a year ago the company began submitting to “security checks” by Chinese authorities on all its devices as a condition for continued access to that market. This was widely interpreted as giving Chinese security services access to iOS source code. As Apple encrypts more device data by default, you can count on its cooperation with Beijing expanding.

A more cynical observer might suggest that Apple, knowing it will face even more stringent requirements from Chinese authorities in the future and predicting a backlash here in the West, chose to pick this fight with American authorities first. That way, it could revel in the public theater as its hands were tied by bad-guy government officials. Later, when faced with inevitable similar requirements in other jurisdictions, the company would be seen as a reluctant, rather than willing, participant.

I must confess that I am perplexed by the argument that it would be bad if Apple were “forced” to follow the law in whatever jurisdiction it does business. If numerous other countries, including most Western liberal democracies, require Apple to cooperate more forthrightly with law enforcement, I suspect that says much more about the role of digital technology in modern life than about any erosion of privacy.

Tech needs to grow up

Do not pretend that the balance we strike between the protection of civil liberties and our ability to investigate crimes is a made-up or abstract one. This week, BuzzFeed reported that the Manhattan District Attorney’s office alone holds 175 Apple devices it cannot access because of encryption, directly affecting its ability to prosecute cases. Examples of all manner of crimes whose critical evidence is tied to smartphones, from petty theft to murder, rape and kidnapping, emerge regularly, and they will only increase in number. Terrorism is only the scariest tip of this iceberg.

Given how deeply smartphones have become embedded in our lives, it is naive in the extreme to suppose that allowing them to “go dark” to law enforcement would not have major negative implications. Imagine, for example, if other consumer technologies that have become the infrastructure of our lives — like the car, telephony or email — had been legally declared immune to police scrutiny in their infancy. This is not an idle analogy. If anything, smartphones are a more critical vector for communication and organization than any of those, making them disproportionately valuable to ordinary people and criminals alike, and equally so to law enforcement.

There is much about our criminal justice system and intelligence apparatus in the United States that needs improving — of that, we should have no doubt. But fixing them will require improving the public institutions our society relies upon through the old-fashioned democratic process — not lashing law enforcement to the technology of 1995. This route is harder, and it relies on organization, advocacy, and voters demanding better from their leaders. But we must take it. We have no choice. Throwing away the keys to modern life and hoping criminals won’t notice would be a classic triumph of hope over experience.

I’m on the twitterwebs.

--

Blair Reeves

Product at Salesforce. Holder of strong opinions. I mostly write on my blog, BlairReeves.me, but occasionally post here too.