Apple’s fight with the FBI has nothing to do with encryption, back doors, or hackers.

Lots of people are talking about how Apple has been ordered to help the FBI’s investigation into the San Bernardino shooters by building a back door into iOS’s security system, the one that protects your files from unwanted intruders; how it might not even be possible to do so (it is); and how everybody is rallying in support because if Apple caves, it’ll set a dangerous precedent that will ruin all the wonderful security that keeps your data safe, cuz, you know, ‘hacker gangs.’

Take it from someone who makes iOS apps for a living — it’s not true.

First of all, let’s be clear. Apple was not asked to build a back door into their phones’ encryption in order to create a master key that unlocks any device at any time, as is being reported by many. Apple was ordered to turn off the feature that deletes an iOS device’s contents after 10 incorrect passcode guesses. With that feature disabled, the FBI can guess the passcode as many times as it wants.

The judge’s decision does not force Apple to create a new version of its operating system that gives the FBI the power to decrypt iOS devices whenever it wants. The FBI isn’t even asking Apple to do that for the particular device in question. Encryption has nothing to do with the case, yet it’s what everybody is talking about.

By disabling the feature that wipes the phone’s data (it actually throws away the device’s stored encryption keys, which makes the data practically impossible to recover), Apple is just giving the FBI the ability to guess the device’s passcode as many times as it wants. This technique is known as “brute forcing”: the FBI can simply write a script that tries every possible PIN until it gets it right.

For example, since an iOS device’s passcode is typically a 4-digit PIN, they’ll probably start with 0000. Then they’ll move on to 0001. You get the point.

Given that there are only 10,000 combinations of a 4-digit PIN, they can theoretically gain access to the device in no more than about 6.9 days (assuming roughly a minute per guess, since the operating system imposes escalating delays after failed attempts), and in maybe only 13 minutes if they can strip those delays and guess at full speed.
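For the curious, here’s what that amounts to in code. A minimal Swift sketch, where `submit` is a hypothetical stand-in, the one-minute delay approximates iOS’s escalating lockouts, and the 80-millisecond floor is the per-guess key-derivation cost Apple cites in its iOS Security guide:

```swift
import Foundation

// Enumerating every 4-digit PIN: this is all "brute forcing" is.
let guesses = (0...9_999).map { String(format: "%04d", $0) }  // "0000" ... "9999"
// submit(guesses[i]) -- hypothetical stand-in for whatever enters each attempt

// Back-of-envelope worst cases for all 10,000 guesses:
let throttledDays = 10_000.0 * 60 / 86_400      // ~1 min per guess: ≈ 6.9 days
let unthrottledMinutes = 10_000.0 * 0.08 / 60   // ~80 ms per guess: ≈ 13.3 minutes
print(throttledDays, unthrottledMinutes)
```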

Now let’s say Apple has to create a new version of the operating system that, when compelled by a court order, disables this specific feature. Could a hacker or other malicious actor unlawfully gain access to your data because Apple does so for this particular phone in this particular case?

If Apple complies with the court order, actually no. The order specifically says that the software (it calls it a “SIF,” a software image file) will not load on any device but the San Bernardino shooter’s:

The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.

But let’s say the FBI somehow put the SIF on a thumb drive, lost it, and it magically got into the hands of ‘hacker gangs.’ Could they reverse engineer the image file Apple sent to the FBI and use it to unlock other devices? Or could the FBI do the same? Yes and no. Yes, because it’s technically possible; no, because the software won’t run on any device unless it has been electronically signed by Apple. Without that signature, it’s completely useless.

This is what keeps the FBI, Chinese intelligence agencies, and hacker collectives like Anonymous from doing something like this right now. An iPhone will not run any software update that has not been signed by Apple. Period. And nobody has the signing keys but Apple. Getting around those signature checks is the whole point of jailbreaking.
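Conceptually, the check a device runs before accepting an update looks something like this sketch, written with Apple’s CryptoKit framework and invented names; the real boot chain is considerably more involved:

```swift
import CryptoKit
import Foundation

// A device will only execute an update whose signature verifies against
// a public key it already trusts (Apple's, baked in at the factory).
func deviceWillRun(image: Data, signature: Data,
                   trustedAppleKey: P256.Signing.PublicKey) -> Bool {
    guard let sig = try? P256.Signing.ECDSASignature(rawRepresentation: signature) else {
        return false  // malformed signature: refuse to run it
    }
    // Only Apple's private key can produce a signature that passes this
    // check, so a leaked SIF is inert: change one byte and it fails.
    return trustedAppleKey.isValidSignature(sig, for: image)
}
```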

Should Apple lose control of its keys, there are bigger things to worry about than the 10-try-wipe feature, because hackers could completely change the operating system to do any number of much scarier things: turn on your microphone, camera, or location services at any point, collect all of your passwords and credit card information as you type, or just completely brick your device.

Don’t panic though, Apple will never lose control of its signing keys. It’s not a valid concern.

Here’s my main point: Apple can easily comply with the order without subverting the security of their devices, or making it easier for your device to be hacked.

It just doesn’t want to.

Why not? They probably want to set a legal precedent, for political reasons. If Apple appeals the decision and wins, and especially if the case goes to the Supreme Court and Apple wins there, then Apple and every other technology company never has to unlock an encrypted device (at least in the US) for anybody, ever again.

Why does it want this? We can’t really know what’s in the mind of Tim Cook. I truly believe that he, and the leadership of many other Silicon Valley companies, thinks it’s a conflict of interest for his company to hand user data to government organizations, no matter who they are, and that it’s a moral imperative not to comply.

But even if that’s true, they have financial motives as well. If they comply with the court order, they have to keep staff on hand to regularly comply with warrants from hundreds of law enforcement agencies. But more importantly, complying may hurt their public image as a company that cares about its customers’ privacy, and then they might not sell as many iPhones … you get the point.

Yes, that. It’s possible that Apple is fighting an investigation into terrorists who murdered 14 people as a publicity stunt, so it can keep making money hand over fist selling iPhones and iPads.

That’s probably not true. I’m just saying it’s a possibility.


Whatever their reasons, though, let’s be clear about one thing. No matter what version of iOS it runs, jailbroken or not, hacked by a custom FBI operating system or not, the security of your average iOS device isn’t that great in the first place.

Even if Apple loses the case, is later forced to create a piece of software that disables the 10-guess-wipe feature on any device, and that software becomes publicly available, the feature itself is a Band-Aid.

4-digit numeric PINs are incredibly insecure, even when an attacker only gets 10 guesses. Most hackers prefer social engineering (guessing passcodes by trolling social media for things like birth dates or major events in your life) over brute-forcing, because it takes less effort and less time.

Why? Because if you’re reading this, and you have an iPhone, your PIN is most likely a) the PIN to your debit card, b) your birth year or the birth year of a loved one, c) the PIN to a safe, or d) all of the above.

How did I know that? No, I’m not a magic sorcerer. 4-digit numeric PINs are just incredibly difficult to remember (let alone the 6-digit numeric PINs Apple recently introduced in iOS 9 to increase security), so people fall back on numbers that already mean something to them. That’s why they’re so incredibly easy to crack.

This is why John McAfee can reliably say ‘I’ll decrypt the phone without Apple.’ He’s not saying he’ll make a new version of the OS; he’s saying he’ll use social engineering to crack the device. He’ll go deep into the life of the shooter, find relevant 4-digit information, and then use that to guess the passcode. And 99% of the time, he’d be right within 10 guesses. So would your average hacker.
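To make that concrete, here’s a toy Swift sketch of the approach: turn a few personal dates into a candidate PIN list. The dates are invented, and real tooling would be fancier, but the idea genuinely is this simple.

```swift
import Foundation

// Hypothetical personal dates dug up from social media.
let dates: [(month: Int, day: Int, year: Int)] = [
    (month: 7, day: 14, year: 1987),   // a birthday
    (month: 6, day: 21, year: 2013),   // an anniversary
]

var candidates = Set<String>()
for d in dates {
    candidates.insert(String(format: "%02d%02d", d.month, d.day))  // MMDD
    candidates.insert(String(format: "%02d%02d", d.day, d.month))  // DDMM
    candidates.insert(String(d.year))                              // YYYY
}
print(candidates)  // a handful of guesses: comfortably inside the 10-try limit
```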

And that’s sort of my point. For most people, many of whom don’t even use a PIN, Apple’s security system is worthless.

And although the iPhone and other iOS devices have many incredibly secure features, such as 256-bit AES encryption, a Secure Enclave that protects the keys derived from your PIN, and features like the 10-mistakes-wipe, these are meaningless, because Apple’s entire security philosophy relies on users caring enough about device security to actually use it.

Apple cares more about ease of use than security, and in their mind, that means hiding these security features from users behind multiple screens in the Settings app. That means not forcing users to create a 6-character alphanumeric passcode (which would take years, or even centuries, to brute force, depending on the character set).
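Rough math backs that up. Assuming the ~80 milliseconds of key-derivation work per guess that Apple’s security guide describes, the size of the passcode’s keyspace is everything:

```swift
import Foundation

// Worst-case brute-force time at ~80 ms per guess (hardware-enforced).
func years(toSearch keyspace: Double) -> Double {
    keyspace * 0.08 / 31_557_600  // seconds in a year
}

print(years(toSearch: pow(10, 4)))  // 4-digit PIN: fractions of a year (minutes)
print(years(toSearch: pow(10, 6)))  // 6-digit PIN: about a day
print(years(toSearch: pow(36, 6)))  // 6-char lowercase + digits: ~5.5 years
print(years(toSearch: pow(62, 6)))  // 6-char mixed case + digits: ~144 years
```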

But don’t newer iOS devices have Touch ID? Doesn’t that make the device more secure?

Nope, hackers can simply bypass the feature and guess the PIN. Or just find your fingerprint and copy it. Or just put your phone up to your finger when you’re unconscious, or use sleight of hand.

If the shooter’s device were a 6 or later with Touch ID enabled, the FBI wouldn’t need Apple to do anything. They’d just push the phone up to the hand of the shooter’s corpse.

Or, in your case, a hacker can simply take something you’ve touched, 3-D print an artificial fingerprint from it, and put it up to the scanner on your phone.

So secure.


As a software or hardware developer, you learn that security is tricky. You always have to balance security against the user experience. If Apple really cared only about security, it would make you jump through at least 3 different hoops every time you opened your phone. But nobody would use it, because nobody wants to deal with all that just to open their phone and text their bestie.

But that doesn’t excuse the mixed bag that is Apple’s iOS security. There are these wonderful things called passphrases, which are incredibly secure and easy to remember. Much easier to remember than a 6-character alphanumeric code. And many of these 12-to-20-character passphrases would take trillions of years to brute force.
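Same back-of-envelope arithmetic as before, with the same assumed 80 ms per guess, using the common 7,776-word diceware list; length simply dominates:

```swift
import Foundation

// Random-word passphrases: keyspace = (wordlist size) ^ (word count).
let yearsPerGuess = 0.08 / 31_557_600

print(pow(7_776, 4) * yearsPerGuess)  // 4 words: ~9 million years
print(pow(7_776, 5) * yearsPerGuess)  // 5 words: ~72 billion years
print(pow(7_776, 6) * yearsPerGuess)  // 6 words: ~560 trillion years
```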

There’s an even cooler thing called multi-factor authentication. What if your phone opened instantly because it detected your fingerprint AND was connected to your Apple Watch over Bluetooth AND your GPS location was in a familiar place, and only if all three of those checks failed would it ask for a secure passphrase? That’s incredibly secure, and also the easiest UX imaginable.
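Here’s a sketch of that policy in Swift. Every name is invented; this is not a real iOS API, just the decision logic:

```swift
// Invented types: a sketch of the policy, not a real iOS API.
struct UnlockSignals {
    let fingerprintMatched: Bool   // Touch ID result
    let pairedWatchNearby: Bool    // Bluetooth proximity to your Apple Watch
    let inTrustedLocation: Bool    // GPS inside a geofence you chose
}

// Instant unlock only when all three independent factors agree;
// any single miss falls back to the secure passphrase.
func unlocksInstantly(_ signals: UnlockSignals) -> Bool {
    signals.fingerprintMatched
        && signals.pairedWatchNearby
        && signals.inTrustedLocation
}
```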

This is my problem with Apple crying foul about the FBI case on the grounds that it’ll make their devices less secure: your iOS devices are probably already insecure.

You can actually go online and buy a black box built to break into iPhones. Plug it in, and it takes care of everything. Right now, even though Apple hasn’t built its special OS. Even if the judge had denied the FBI’s request, that would still be true.

Your phone’s security is shit and anybody can get into it, but you don’t care about that, because Apple fought the FBI so you can have a ‘secure’ device.

That’s some good marketing, there Apple. Well played.


Once again, Apple could really be doing this for the right reasons. I just don’t buy it 100%. And it’s harmful for you to believe your iPhone is secure because of it. If you have an iPhone, go create a 6-character alphanumeric passcode and disable Touch ID right now. You know you should.