Mr. Fart’s Favorite Colors
Why your phone’s security is unlike any other
For a country fighting over security, we seem to know very little about it. I’m reminded of this every time I fly out of LAX, since they’ve got a security hole big enough for Timothy McVeigh to drive a truck through.
LAX has two security lines. The “9/10” line gives you the classic metal detector of days gone by. The “9/12” line gives you the billion-dollar body scans we bought after 9/11. Only passengers in the “precheck” program can go through 9/10 security. At the apex of the lines sits a TSA magician who ensures everyone’s in their proper place.
But here’s what actually happens when I arrive at the seventh largest airport in the world:
- My TSA magician, whom I call The Great Tagliatelle (I’m usually starving by the time I reach him), meticulously runs through a series of checks.
- If I pass, he waves me through.
- The 9/10 line then immediately merges with the 9/12 line, completely negating everything he just did.
- He begins to weep at the existential futility of it all, whereupon his boss promptly fires him for exceeding the 3-ounce teardrop limit.
A few minutes later, the now-blended line reaches the magician’s assistant, Ziti. This is the real gatekeeper. Ziti decides who’s tall enough to ride the metal detector and who must submit to the explosives machine. He has zero information from the prior screening. He’s usually baked. Ziti’s security test is always the same: Can you show me a piece of paper that smells anything like a boarding pass and has the word “PRECHECK” on it?
Unlike his boss, Ziti doesn’t have even the pretense of high technology. At most, he wields blue latex. But unless these gloves are designed to authenticate boarding passes, I can only conclude that airport security — like most security we tolerate — is based on the idea that a terrorist will never get access to one of these:
I’ve been to other airports that don’t want to take this chance. Instead, they’ll have The Great Tagliatelle hand you an Asymmetric Cryptographic Polycarbonate Cipher to pass to Ziti. That’s industry lingo for “laminated clip-art”:
But those are maximum security airports: Their terrorists must have the emotional maturity to walk into Kinko’s.
At LAX, you don’t even need toner. If you can just manage to make the word “PRECHECK” appear on your phone screen, in anything vaguely resembling an electronic boarding pass, you’re in. If you can’t figure this out, ask the 9-year-old in line behind you…once they finish airbrushing the love handles out of their latest Instagram.
How Programmers View Security
To understand the real tradeoff we’re debating in Apple v. FBI, we must think like programmers, not like the TSA.
For starters: If an engineer interviewing at Google proposed LAX-style security for your Gmail, he wouldn’t get hired. “Hopelessly naive; weird pasta-magician fetish,” the feedback would say.
Although we think of technologists as bright-eyed optimists — A computer in every home! Cars that drive themselves! — programmers are actually pessimistic, paranoid lunatics. I say this with love, because I respect the hell out of the mindset.
It starts when we’re 8 and coding our very first program. “What’s your favorite color?” it asks, sweetly, twirling a lock of Visual Basic around its finger. You type in your answer, the screen changes color accordingly, and boom — time to show off to family.
Then Aunt Jody calls.
“Honey, it froze on me. ‘Color.exe has crashed.’ I don’t know what that means.”
You take a look at her entry. She entered: 2.
“I thought it asked how many favorite colors I had?”
But how could you…but what does it even mean to have more than one favori…ok, fine. No big deal. You add a sliver of code to stop people from typing numbers into the box.
Next you post your program to the Internet. Thirty seconds later, you receive another crash report. That user entered: fart.
You can patch this, too, but you’d really like to understand it first. Was this just, somehow, another honest mishap? You send the user an e-mail: “Why? Why would you enter ‘fart’?”
He writes back: “blue.” This is the moment you realize that some people just want to watch the world burn. And nothing is ever the same again.
Seasons change, skills grow. The color war is over; now you’re programming your high school newspaper. But the lessons linger. You can’t guess everything people might do wrong, so you must ensure they only do what’s right. Before anyone can hit Sign Up, for instance, your code verifies that the name field contains a real name.
The next day, a signup comes in that technically comports with your rules: It contains the name “John”. But it’s…quite a lot of Johns, with some other weird stuff at the end. Johnjohnjohnjohnjohnjohnjohnjohn… You only programmed the system to expect one. With a whorehouse’s worth of johns, the text overshoots the space you set aside for the name. The leftover begins spilling into other fields that should’ve been off limits to meddlers. Like the paper’s headline.
And that’s how Monday’s top story comes to be: “BUTTCHEEKS.”
Mercifully, you graduate high school to bigger and better things: quant trading at Renaissance; digital co-chair for Ross Perot ’92. The sharper you get, the more important the work. But the more valuable the work, the craftier — and more determined — your adversaries. Every attack is more novel than the last.
By the time you land an engineering gig at Apple, you are a twitchy, tinfoily mess. When your giggling date pokes you in the side and asks your favorite color, you shout: “WHO WANTS TO KNOW?!”
But at work, you’re in good company. Your peer code reviews look like a meeting of the Flat Earth Society:
The innocence of color.exe is long gone. Nobody asks why or whether anymore; you take it for granted that someone, somewhere is breaking everything he possibly can. And it is in this spirit that you develop one of the most secure systems the world has ever known.
- It is not “secure” as the Coke recipe is secure. Coca-Cola has the key to its vault, but you don’t have the key to yours.
- It is not “secure” as the Pentagon is secure. Those blueprints are closely guarded, but your plans — even much of your security code — are known to all.
- It is not “secure” like the planes or the trains, like your tax returns or your copy of The Sandlot or Hillary Clinton’s emails.
It is secure in that your worst enemy can steal your phone and the engineer who built it and flee to a cabin in the woods, and 50 years hence — if our understanding of physics and mathematics is correct — a hunter will stumble on two skeletons cradling a still-locked iPhone.
So adversaries be damned: You finally win on the merits. But who said anything about meritocracy? During the champagne toast, Mr. Fart steps from behind the curtain and pulls the pistol of last resort:
“Don’t ship this. Or else.”
To be clear: Contrary to a lot of shoddy reporting, and some rather “generous” language from Apple, this is not the world we’re living in yet. The fact that Apple can break into your phone demonstrates that we’re still firmly in “Coke recipe” territory. But this is not hard to fix, and Apple intends to fix it.
Despite the best efforts of government and non-government hackers around the world, then, there may come a time when the only person who would be able to enter your locked phone — come hell or high water, hearing or warrant or bullet — is you.
That means the debate will ultimately end here: Should a private company be allowed to sell unbreakable security?
— APPLE: This security protects everything from your father’s health records to the President’s communications. We’ve even repurposed the stuff to build decentralized, non-counterfeitable money. This is good news. We’re finally building real security. Based not on our fragile ability to keep secrets and behave well, but on potentially inviolable natural laws. Why do you want to bend those back with the barrel of a gun?
— GOV’T: Our job is to follow every lead to keep the country safe.
— APPLE: Yet you’re building other security systems using the worst technology we’ve ever made: consumer printers.
— GOV’T: Please respect the subpoena.
— APPLE: You didn’t issue one.
— GOV’T: Well, there was a paper jam…“PC Load Letter”? What the fuck does that mean?
In short, Apple argues that this isn’t privacy versus security; it’s security versus security. That’s a much harder question than either side is willing to admit. We saw this viscerally a year ago…
What happened on March 24th, 2015?
For as much money and time as we’ve wasted on printer-powered air security, only one innovation has prevented another 9/11: Locked, reinforced cockpit doors. These doors can withstand gunfire and even small grenades.
But sometimes, 6 hours into a Cancun flight, 3 helpings into Delta’s Cargo-Class Seafood, a pilot needs to deposit a few small grenades of his own. So there’s a handshake protocol:
- When the pooping pilot wants to reenter the cockpit, he calls the flying pilot on the intercom to buzz him in.
- If there’s no answer, the outside pilot enters an emergency keycode. If the flying pilot doesn’t deny the request within 30 seconds, the door unlocks.
- The flying pilot can flip a switch to disable the emergency keypad for 5 to 20 minutes (repeatedly).
Like Asimov’s three laws, these checks and balances try to approximate safety while accounting for contingencies. If the flying pilot risked Delta’s gefilte fish and passed out, you want to make sure the other pilot can still re-enter. But add all the delays and overrides and backstops you want; you still have to make a fundamental decision. Who controls entry: the people on the inside, or the people on the outside?
Governments decided that allowing crew members to fully override the flying pilot using a key code would be insecure, since it would be too easy for that code to leak. Thus, there is nothing the outside pilot can do — whether electronically or violently — to open the door if the flying pilot is both conscious and malicious.
This design came back to bite us on March 24th. The very system meant to keep passengers safe enabled a mentally ill Germanwings co-pilot to lock himself in the cockpit and slam into a mountain at the speed of sound. The locked-out captain was armed with full knowledge of the system, a fire axe, and a motivation that dwarfs any hacker’s — and still he could not penetrate the door.
History shows that there is no perfect level of accessibility when it comes to airplanes: sometimes it’s important to keep people out; sometimes it’s important to get inside.
What’s striking is that this incident did not prompt any change in cockpit protocol in the United States. The FAA is improving mental health checks, but at 30,000 feet, we still have a security system whose parameters are widely known to criminals, whose method of abuse is clear, and which offers people outside the cockpit no way to stop it. And we’ve still decided the public is best served by keeping the people in the cockpit in charge of the lock.
This is the right choice — there are far more potential suicidal bombers in the cabin than in the cockpit.
Still, I’d hate to be the government official who has to explain this tradeoff to the mother of someone on Germanwings 9525.
Black or white
The security we encounter every day — when it works at all — is usually built out of shades of gray: Lock your door. Need more? Arm your alarm. Even more? Don’t feed Fido for a day. Marginal benefits, marginal costs.
It’s easy to assume that digital security is just another spectrum, and politicians love to reinforce that — gray’s their favorite color. Every presidential candidate is offering the same Michael Scott solution: Let’s preserve everyone’s security at once! Give a little here, take a little there, half-pregnancies for all.
Unfortunately it’s not that complicated, which means it’s not that simple. Unbreakable phones are coming. We’ll have to decide who controls the cockpit: The captain? Or the cabin? Either choice has problems, but — I’m sorry, Aunt Congress — you crash if you pick 2.