Your iPhone already has a backdoor.

Liam Kirsh
4 min read · Feb 20, 2016


You may have heard in the news yesterday that Apple took a public stance against the FBI’s court order for a ‘backdoor’ for the iPhone. The Department of Justice is interested in using this ‘backdoor’ to help read data off an iPhone used by one of the shooters in December’s terrorist attack in San Bernardino, CA.

The media headlines and Apple’s statement have blurred some of the details, which has caused a great deal of misunderstanding. To clarify things, the FBI court order demands that Apple do the following:

  • produce and sign a custom version of the iPhone operating system (iOS) that will:
      • disable the auto-erase feature that wipes the phone’s data after 10 incorrect passcode attempts,
      • remove any purposefully introduced delay between passcode attempts, and
      • enable the FBI to submit passcodes to the phone programmatically, from a separate device; and
  • code this software with the unique identifier of the phone, so that the operating system can only be loaded and executed on the device in question.

What people are getting wrong

In its letter, Apple cites a traditional argument against encryption backdoors:

this software […] would have the potential to unlock any iPhone in someone’s physical possession. […] In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks.

This is simply not true.

By definition, the software that the FBI is requesting is not a backdoor. The Linux Information Project defines a backdoor as:

any hidden method for obtaining remote access to a computer or other system.

The software the FBI is requesting does not inherently grant access to an iPhone’s data. (In fact, it’s impossible for Apple to directly grant access, but more on that later.) Rather, it only makes it easier for the FBI to submit passcodes and conduct a brute force attack. A brute force attack is an attempt to discover the password to a system by systematically guessing every possible password until the attacker finds one that works.
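A brute force attack can be sketched in a few lines. The passcode check below is a hypothetical stand-in for the phone’s real verification routine:

```python
from itertools import product

def brute_force(check_passcode, length=4, digits="0123456789"):
    """Systematically try every possible numeric passcode of the
    given length until one is accepted (a brute force attack)."""
    for attempt in product(digits, repeat=length):
        guess = "".join(attempt)
        if check_passcode(guess):
            return guess
    return None  # exhausted the whole key space without a match

# Hypothetical lock that accepts a single secret passcode.
secret = "7291"
found = brute_force(lambda guess: guess == secret)
print(found)  # 7291
```

The FBI’s requested software would remove the two defenses that make this attack impractical in the real world: the wipe after 10 failures and the growing delay between guesses.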

Many are concerned that giving in to this request would open up the possibility for this version of iOS to be used on other devices (the ‘one key opens millions of locks’ argument). That argument does not apply here. Because iPhones refuse to load an operating system that is not verified with Apple’s private signing key, re-coding this exploit for a different device would break the signature, and the phone would reject it.
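The device-binding requirement can be illustrated with a simplified sketch. Real iPhones verify firmware with asymmetric signatures; here an HMAC stands in for that check so the example stays self-contained, and the key, function names, and device IDs are all illustrative:

```python
import hmac, hashlib

# Stand-in for Apple's private signing key (real iOS verification uses
# asymmetric signatures; a keyed HMAC keeps this sketch runnable).
APPLE_KEY = b"apple-private-signing-key"

def sign_firmware(image: bytes, device_id: str) -> bytes:
    # The phone's unique identifier is mixed into the signed payload,
    # binding the firmware image to one specific device.
    return hmac.new(APPLE_KEY, image + device_id.encode(), hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, device_id: str) -> bool:
    expected = sign_firmware(image, device_id)
    return hmac.compare_digest(expected, signature)

fw = b"custom iOS image"
sig = sign_firmware(fw, device_id="DEVICE-A")

print(device_accepts(fw, sig, "DEVICE-A"))  # True: loads only on the target phone
print(device_accepts(fw, sig, "DEVICE-B"))  # False: re-targeting breaks the signature
```

Changing either the image or the device ID invalidates the signature, which is why the court-ordered software could not simply be re-pointed at another phone.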

Why should you be worried?

In its customer letter, Apple emphasizes its choice not to comply with the demand, but it does not deny that the exploit is feasible. According to New York Times journalist Farhad Manjoo, Apple has acknowledged that all iPhones are at risk.

Security blogger Dan Guido predicts that the software security features could be overridden in fewer than two updates to the device.

If this is true, in order to develop the custom operating system the FBI describes, one would only need a copy of the private signing key and extensive knowledge of iOS. Thus, in the sense that the word is being used in the media, the ‘backdoor’ already exists. The ‘key to all locks’ is Apple’s private signing key. Through hacking or otherwise obtaining unauthorized access to Apple’s systems, any of the following could retrieve it:

  • a malicious Apple employee
  • a hacker group
  • a foreign government
  • the US government

Apple’s statement exposes an already present vulnerability. Whether or not Apple complies with the order, your data privacy is at risk: the key exists, and it’s out there.

What can you do?

Because we can’t guarantee that no one will ever gain unauthorized access to Apple’s systems, users must put their trust in the math behind the encryption. Securing your device is as simple as creating a longer passcode.

Recent versions of iOS encrypt data with the AES cipher using 256-bit keys (AES-256). As of now, researchers haven’t discovered any practical attack on this cipher that does better than random guessing (brute force).
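To see why attackers target the passcode rather than the cipher itself, consider the size of a 256-bit key space. A back-of-the-envelope calculation, assuming a very generous trillion guesses per second:

```python
# Brute forcing AES-256 directly: count the keys, divide by guess rate.
# The 10**12 guesses/second figure is an assumption, far beyond any
# real hardware attacking a single key.
keys = 2 ** 256
rate = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365
years = keys / rate / seconds_per_year
print(f"{years:.2e} years")  # on the order of 10^57 years
```

The weak link is therefore never the cipher: it is the short, human-memorable passcode that unlocks the key.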

The essential weakness in a brute force attack is this: the time it takes to guess all possible passcodes increases exponentially with the passcode length. If you use a passcode entirely made up of digits (numeric), each additional digit increases the time required tenfold. For one made up of letters, digits, and symbols (alphanumeric), the time required increases even more quickly.

A 2012 Apple security whitepaper estimates that recent iPhones take about 80 milliseconds to process a single passcode attempt. Assuming an attacker limited only by that per-attempt delay, we can estimate the worst-case cracking time for different passcode lengths.
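Those estimates can be computed directly. A sketch under the 80 ms assumption; the 94-symbol printable-ASCII alphabet below is an assumption standing in for “letters, digits, and symbols”:

```python
# Worst-case brute force time at 80 ms per attempt (the figure from
# Apple's 2012 security whitepaper). Guesses must be processed on the
# device itself, so this delay is the bottleneck.
SECONDS_PER_ATTEMPT = 0.080
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def worst_case_years(alphabet_size: int, length: int) -> float:
    """Time to try every passcode of the given length and alphabet."""
    return alphabet_size ** length * SECONDS_PER_ATTEMPT / SECONDS_PER_YEAR

# A 6-digit numeric passcode falls in under a day...
print(f"{worst_case_years(10, 6) * 365:.1f} days")  # 0.9 days
# ...but each extra digit multiplies the time tenfold.
print(f"{worst_case_years(10, 11):.0f} years")      # 254 years
# A 6-character passcode drawn from all 94 printable ASCII symbols:
print(f"{worst_case_years(94, 6):.0f} years")       # 1750 years
```

The tenfold jump per digit is the exponential growth described above: the attack time is the alphabet size raised to the passcode length.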

These estimates demonstrate that even if Apple were to comply with the FBI order, anyone using a good passcode at least 11 digits long (numeric) or 6 characters long (alphanumeric) shouldn’t be worried. Your passcode won’t be cracked within your lifetime.


Liam Kirsh

Information Security grad student at CMU and Internet enthusiast.