How Your iPhone Can Thwart the FBI
Apple has several security features in iOS that make it very difficult to access data without the consent of the device owner. These iPhone security features are notable for being both easy to use and powerful. In fact, I’d say Apple’s success in making great cryptography easy to use is pretty much unprecedented. Millions of people get this level of security without even trying.
There are some great technical articles about the debate and a ton of non-technical articles, so I thought I’d write up something that’s in-between.
Here are those security features:
- Encryption: Data cannot be read from the device because it’s encrypted. Encrypted data is virtually gibberish: it’s stored in a form that cannot be read unless you have the decryption key, which is a very long random number. Guessing the raw key is effectively impossible, even with very powerful computers, because it’s so long. Trying to guess a key, password, or PIN is called “brute forcing” it.
- Passcode: Part of the encryption key is a passcode that is probably known only to the owner of the device; in this case, that person is dead. That part of the key is probably a relatively short PIN. For instance, if it’s only 4 digits, there are just 10,000 combinations. That’s a very easy number of combinations for a computer to “brute force”; it would probably take just a few seconds. That’s where the other security features come in…
- Delay: There is a delay between passcode guesses, so brute-forcing 10,000 combinations is still doable, but now relatively slow. Plus the user might actually have a long, strong passphrase, which makes brute force virtually impossible again. In practice, though, most people don’t use long, strong passphrases.
- Screen entry: The passcode can only be entered by tapping it onto the screen of the device. That means a human being has to do it, and trying 10,000 combinations suddenly becomes very time consuming.
- Self Destruct: The data “self destructs” after some number of failed attempts to guess the code, so an attacker never gets anywhere near 10,000 guesses. That’s usually enough to stop someone from cracking even a bad 4-digit PIN.
- Signatures: The iPhone software that enforces security features 3, 4, and 5 can only be changed if Apple creates an “approved” version that bypasses these features. To approve such a version, Apple has to “sign” it with their own secret cryptographic key, and no one but Apple has that key.
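Features 2, 3, and 5 can be sketched in a few lines of Python. This is a toy model, not Apple’s actual scheme: the PIN, the hash-based key derivation, and the attempt limit are all illustrative stand-ins (real iOS entangles the passcode with a hardware key inside the Secure Enclave, so none of this can be run off-device).

```python
import hashlib
import itertools

def derive_key(pin):
    # Toy key derivation: just a hash of the PIN. Illustrative only;
    # iOS mixes the passcode with a per-device hardware key.
    return hashlib.sha256(pin.encode()).digest()

SECRET_PIN = "7294"                 # hypothetical 4-digit passcode
TARGET_KEY = derive_key(SECRET_PIN)

def brute_force(target, delay_per_guess=0.0, wipe_after=None):
    """Try all 10,000 4-digit PINs, modeling feature 3 (delay) and 5 (wipe)."""
    attempts = 0
    estimated_time = 0.0            # we tally the delay rather than sleep
    for digits in itertools.product("0123456789", repeat=4):
        pin = "".join(digits)
        attempts += 1
        estimated_time += delay_per_guess
        if wipe_after is not None and attempts > wipe_after:
            return None, attempts, estimated_time   # data wiped; game over
        if derive_key(pin) == target:
            return pin, attempts, estimated_time
    return None, attempts, estimated_time

# No protections: the PIN falls in well under a second of real compute.
pin, tries, _ = brute_force(TARGET_KEY)

# With an enforced delay (say, 1 second per guess) the worst case is
# ~10,000 seconds; with a wipe after 10 tries, the attack fails outright.
_, _, t_delayed = brute_force(TARGET_KEY, delay_per_guess=1.0)
wiped, _, _ = brute_force(TARGET_KEY, wipe_after=10)
print(pin, t_delayed, wiped)
```

The point of the sketch: the PIN itself offers almost no protection, and everything rides on the delay and wipe logic, which is exactly the logic the requested firmware would remove.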
So what’s the debate? The FBI is asking Apple to create a signed version of the iPhone software (6) that will only work for this specific device and will bypass security features 3, 4, and 5. That will allow the FBI to brute-force the passcode, defeating (2), and then decrypt the device, defeating (1).
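The signing check in (6) can be sketched roughly as follows. This is a toy: real iOS secure boot verifies an asymmetric signature chained from a key burned into the boot ROM, so the device can verify firmware but never sign it. Here a stdlib HMAC stands in for the signature, and the key and firmware strings are invented for illustration.

```python
import hmac
import hashlib

APPLE_SIGNING_KEY = b"hypothetical-secret-key"   # only "Apple" has this

def sign_firmware(firmware):
    # Stand-in for Apple's real (asymmetric) code signature.
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_will_boot(firmware, signature):
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"iOS with delay and wipe enforced"
backdoored = b"iOS with delay and wipe removed"

sig = sign_firmware(official)
print(device_will_boot(official, sig))     # True: the signed build loads
print(device_will_boot(backdoored, sig))   # False: modified code is rejected
# The FBI's request is, in effect: run sign_firmware() on the backdoored build.
```

This is why the dispute centers on Apple rather than on the phone: without a valid signature, the modified software simply won’t load.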
Beyond the Technology
The FBI in this case isn’t requesting a general purpose backdoor to get into any iPhone, but wants one that only works for this specific iPhone.
In practice, however, if Apple agrees to this, they will get similar requests in the future and will probably have to create a method of easily producing these single-device backdoors. This is most likely the scenario Apple is trying to avoid when CEO Tim Cook talks about setting a precedent.
It’s quite possible that the single-device backdoor that they create for the FBI could be made to work on other devices (turned into a general purpose backdoor), so I agree with Apple that creating it is dangerous. (See below.)
Instead of having Apple produce the signed firmware, the FBI could potentially produce the firmware itself and compel Apple to sign it. It would probably be difficult, but not impossible.
The nuclear option would be for the FBI to demand the code and the private part of Apple’s signing key. I doubt that will happen, but the FBI has been hinting that this could be its next step.
It also may be possible for the FBI or other hackers to defeat the secure boot sequence of the iPhone by finding a vulnerability in it. Secure boot is notoriously difficult to get right. That is, an unsigned or improperly signed version of the iPhone software could possibly be made to load.
An interesting aspect of the argument is that folks occasionally argue that the government is not requesting a backdoor, some even claiming that the term is “pejorative and misleading”. But as that same author points out, “a backdoor is a flaw or access point intentionally introduced into software to allow access to unencrypted text.”
I’d argue that the term “backdoor” is a neutral technical term that both sides of the argument should freely use. Maybe some feel that it sounds nefarious, but it shouldn’t. Metaphorically, there’s nothing particularly wrong with a backdoor. Many homes have them, in fact. ;)
Technically, a backdoor means bypassing normal authentication. Logging into a user account without a password, accessing the user’s plain text without their private key, etc. There can be legitimate reasons for doing this. Administrative access is the most normal reason. For instance, routers have sometimes had a backdoor to allow the vendor to log in and fix issues.
Unfortunately, a backdoor does increase the attack surface of any system, both metaphorically (in the case of your home) and technically. That’s one of the major points that the security community has been making about why it’s dangerous to create them. Indeed, even non-malicious backdoors are dangerous.
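To make the “two doors” point concrete, here’s a hypothetical login routine with a vendor backdoor of the router variety mentioned above. Every name and credential in it is invented for illustration.

```python
import hmac
import hashlib

USER_PASSWORD_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()

# Hypothetical vendor backdoor: a second, fixed credential for "support".
VENDOR_MASTER_HASH = hashlib.sha256(b"vendor-support-2016").hexdigest()

def login(password):
    supplied = hashlib.sha256(password.encode()).hexdigest()
    if hmac.compare_digest(supplied, USER_PASSWORD_HASH):
        return True                 # the normal front door
    if hmac.compare_digest(supplied, VENDOR_MASTER_HASH):
        return True                 # the backdoor: a second door to defend
    return False

print(login("correct horse battery staple"))   # True
print(login("vendor-support-2016"))            # True, via the backdoor
print(login("wrong"))                          # False
```

An attacker no longer has to beat a particular user’s password: leaking or guessing the one shared vendor credential opens every device at once, which is the attack-surface problem in miniature.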
Apple’s iPhone encryption is robust and very secure, particularly for being so easy to use. The FBI wants a backdoor into the system, but anyone designing a backdoor now has to protect two doors: the normal system and the backdoor. In fact, attackers very often target backdoors and account-recovery mechanisms precisely because they’re usually not as well designed as the normal security path.