Apple vs the FBI — Why Comparing the iPhone to a House Gets It Wrong

I have seen and heard many articles and podcasts discussing the Apple v. FBI case that has been all over the media lately. A common defense of the government’s position seems to be, “well, the government has the right to search your house.”

There’s a fundamental problem with this analogy, though: the iPhone is not a house. Stick with me, here…

The FBI could, most definitely, get a search warrant signed by a judge and enter my home to search for something. Ideally, something rather specific, because that’s how warrants are supposed to work. They could, for instance, be looking for evidence of plans they believe I might be making to commit a crime. Plans that may reside on a USB stick. But there’s nothing stopping me from encrypting the data on that stick. I might even have written the encryption software myself, to make sure that no one else can possibly have a back door. This is where the comparison falls apart, in my opinion: the iPhone is a lot more analogous to the USB stick than it is to my home.
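To make that concrete, here is a minimal sketch of the kind of do-it-yourself file encryption I mean, using Python’s third-party `cryptography` package. The file names are purely illustrative:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate a secret key; as long as this stays off the USB stick,
# the ciphertext on the stick is useless to anyone who seizes it.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt the file before copying it to the USB stick.
with open("plans.txt", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("plans.txt.enc", "wb") as f:
    f.write(ciphertext)

# Only the key holder can reverse this; there is no back door to demand.
plaintext = fernet.decrypt(ciphertext)
```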

I believe what the FBI really wants from this case isn’t even the precedent that it would set. Not directly, anyway. I’ll get into that, but first, let’s make sure we’re all on the same page.

I have seen a metric digital ton of discussion over the past week about this case, from journalists, bloggers, and everyday people who barely understand what encryption is. The facts and details have been scattered and often misstated, or just plain wrong. It’s a complicated issue, and I believe many people are rushing to judgment without even trying to understand the minutiae, given the emotional nature of a case so fresh in everyone’s minds. That, of course, is exactly why the FBI is making such a big deal about it.

So, let’s do a quick primer on the facts:

1. Apple can create software that works on only this one iPhone

On this past week’s episode of the wonderful Security Now podcast, Steve Gibson cleared up a few of the common pieces of FUD that have been floating around the ’net. The first is that iPhones will accept software updates even while encrypted and locked. The second is that each individual update to an iPhone is not only signed with Apple’s private key, but also includes a verification piece in the form of a “nonce” generated by the device when checking for an update. This is a random number that is generated and used only once, ensuring that iOS devices cannot be targeted with downgrade attacks. Apple can also tie an update to a single device’s unique identifiers, which is why software built for this one phone would work only on this one phone, and only once.
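If you want to see why the nonce matters, here is a hedged sketch of the idea in Python. To keep it self-contained it uses a symmetric HMAC where Apple actually uses an asymmetric signature, and every name in it is mine, not Apple’s:

```python
import hashlib
import hmac
import os

SIGNING_KEY = os.urandom(32)  # stand-in for Apple's private signing key

def sign_update(image: bytes, nonce: bytes) -> bytes:
    # Signer side: the signature covers the update image AND the
    # device's one-time nonce, binding the response to this request.
    return hmac.new(SIGNING_KEY, image + nonce, hashlib.sha256).digest()

def device_accepts(image: bytes, nonce: bytes, signature: bytes) -> bool:
    # Device side: recompute and compare in constant time.
    expected = hmac.new(SIGNING_KEY, image + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

nonce = os.urandom(16)  # fresh random value for this update check only
good = sign_update(b"new-ios-image", nonce)
assert device_accepts(b"new-ios-image", nonce, good)

# A signature captured from an earlier update check used a different
# nonce, so replaying it (e.g., to push an older iOS version) fails:
stale = sign_update(b"old-ios-image", os.urandom(16))
assert not device_accepts(b"old-ios-image", nonce, stale)
```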

2. The FBI wants a back door

I liked how Steve put it on Security Now this week: it’s not so much a back door as “weakening the front door”. The software that the FBI wants would allow a device attached to the phone’s Lightning port to push a sequence of possible PIN codes until it finds the correct one. With a hardware-enforced delay of 80 milliseconds (0.08 seconds) between attempts, that means it would take a maximum of about 13 minutes to guess a four-digit passcode. Which is great for the FBI, but what happens next time?
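The worst-case math is easy to check for yourself: 10,000 possible four-digit codes at 80 ms apiece.

```python
attempts = 10 ** 4   # every code from 0000 through 9999
delay_s = 0.080      # the 80 ms hardware-enforced delay per attempt

worst_case = attempts * delay_s
print(worst_case, "seconds, or about", round(worst_case / 60, 1), "minutes")
# -> 800.0 seconds, or about 13.3 minutes
```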

3. This software is only possible because it’s an iPhone 5c

Apple made full-disk encryption standard with iOS 8, but older models of iPhone enforced the PIN code security in software. With the introduction of the fingerprint sensor on the iPhone 5s, Apple started including what it calls the Secure Enclave. This is a dedicated hardware component whose sole purpose is to store and protect the private key used to access the device’s data. On a newer iPhone, the Secure Enclave itself would enforce both the ten-attempt limit and the delay between attempts. In the event of more than ten wrong codes being entered, the Secure Enclave erases the private key, rendering the encrypted data on the device useless.
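To illustrate the policy, here is a toy model of hardware-enforced attempt limiting. This is emphatically not Apple’s actual firmware; the class, names, and key handling are hypothetical, just enough to show why software updates can’t talk their way past it:

```python
import time

class ToySecureEnclave:
    """Hypothetical model of the attempt-limit policy, not Apple's design."""

    MAX_ATTEMPTS = 10
    DELAY_SECONDS = 0.08  # enforced by the hardware, not by iOS

    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self._key = b"device-private-key"  # illustrative stand-in
        self._failures = 0

    def try_pin(self, pin: str):
        if self._key is None:
            raise RuntimeError("key erased; ciphertext is permanently unreadable")
        time.sleep(self.DELAY_SECONDS)  # the delay applies to every attempt
        if pin == self._pin:
            self._failures = 0
            return self._key  # releases the key that unwraps the device data
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._key = None  # wipe the key: brute force is now pointless
        return None
```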

Here is where we get to the heart of my concern. If the FBI wins this case, it will set a precedent. A precedent that will be used to put Apple back in the business of bypassing PIN codes. At least, for devices that use a PIN code. What happens when the FBI, having this precedent established, demands that Apple bypass a device protected by a fingerprint and a Secure Enclave?

I believe that is the situation the FBI is really hoping for: a scenario they can point to and say that strong encryption prevents Apple from complying with a lawful court order. It is at that point that they will be in a position to push for a government-mandated back door. A way for the encryption to be “bypassed” with a warrant.

But there’s a problem with this concept: the idea that weak encryption can be enforced. Make no mistake, anytime you discuss banning encryption, you’re really talking about banning convenient encryption. Bad guys with something to hide will simply switch to inconvenient encryption, and the only people left with weak encryption will be the good guys.

In the discussions about this, I am frequently reminded of the saying “do not make permanent decisions based on temporary emotions.” I fear encryption laws being based on fear.