Perspectives on Apple’s response to the FBI court order, part 1

Mobile Makers Academy · Making Mobile · Feb 23, 2016

Apple. FBI. We know Tim Cook’s open letter has stirred up quite a discussion. However, the issue of privacy on mobile devices isn’t a new one, and with the globalization of technology, we wanted to see whether thoughts across the pond might differ from those here in the U.S. To explore the topic, we reached out to Jeff Goldberg, Security Architect for AgileBits (makers of 1Password), and Jorge Ortiz Fuentes, a developer and security expert from Madrid, Spain.

Here’s what Jeff had to say:

MMA: What are the implications of the tool the FBI is asking Apple to build?

Jeff: At one level, the FBI demand and accompanying court order look innocuous. There is absolutely no question that the government has the right to the data on that phone. But we need to understand that what is being asked is not a search warrant. From what I understand, Apple has fully complied with all search warrants and handed over any data that they possess.

What is happening here is that Apple is being ordered to build a version of iOS specifically for that phone and sign that version as an update. One of the security measures of iPhones (and many other devices) is that they will not install updates unless those updates are cryptographically signed by the right entities.
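To make that concrete, here is a minimal Swift sketch of the general pattern (using CryptoKit, which is more recent than this story): the device holds the vendor’s public key and refuses any update whose signature doesn’t verify against it. All of the names and the key type here are illustrative, not Apple’s actual update machinery.

```swift
import CryptoKit
import Foundation

// Hypothetical update package: the binary plus a signature made with
// the vendor's private signing key.
struct UpdatePackage {
    let payload: Data
    let signature: Data
}

// The device ships with the vendor's public key baked in and installs
// only updates whose signatures verify against it.
func shouldInstall(_ update: UpdatePackage,
                   trustedKey: Curve25519.Signing.PublicKey) -> Bool {
    trustedKey.isValidSignature(update.signature, for: update.payload)
}

// Demo: the "vendor" signs a payload; the "device" verifies it.
let vendorKey = Curve25519.Signing.PrivateKey()
let payload = Data("OS update image".utf8)
let update = UpdatePackage(payload: payload,
                           signature: try! vendorKey.signature(for: payload))
print(shouldInstall(update, trustedKey: vendorKey.publicKey))   // true

// A tampered payload fails verification and is refused.
let tampered = UpdatePackage(payload: Data("altered image".utf8),
                             signature: update.signature)
print(shouldInstall(tampered, trustedKey: vendorKey.publicKey)) // false
```

The crux of the case follows from this pattern: only the holder of the private key can produce an update the device will accept, which is why the FBI needs Apple, and Apple alone, to sign the weakened build.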

So Apple isn’t being asked to hand over information Apple holds about the San Bernardino terrorists (they presumably have already complied with that kind of request), nor are they being asked to hand over some key to that phone (Apple doesn’t have one). Apple is being asked to build a security-impaired version of iOS and sign it so that it can be installed on that device.

So the question is whether software producers can be compelled to produce and deliver systems with deliberately weakened security.

MMA: Is the FBI’s request ethical and/or reasonable?

The FBI is right to want access to data that it legally has a right to. If the FBI were not trying to break into that phone, they wouldn’t be doing their job.

So it is right for the FBI to request this. But I think it is wrong for the courts to grant the request. The FBI is right to want it, but they are wrong to expect it.

MMA: Why is Apple defending its users’ privacy so vehemently when other companies like Facebook don’t seem as concerned?

I’m going to give two separate answers to this. The obvious one is that businesses like Facebook are in the business of holding, analyzing, and using data about the people who use their services. Apple, on the other hand, has taken the position that they don’t want to have data about people. Apple has worked very hard to build their systems exactly so that Apple doesn’t have any data about people.

But this is far more than a customer privacy issue, and I expect we’ll see Facebook say that they support Apple in this case. This is because there is a difference between turning over the information you have about people versus being ordered to create weakened versions of your own system. Even Facebook has something to lose here.

This matter is coming to a head for Apple because the *only* way for the FBI to get the data it seeks from the device is to have Apple produce and sign a weakened version of iOS for that specific device. If the FBI wants someone’s Facebook data, a simple search warrant to Facebook will suffice, because Facebook actually holds the data the FBI is interested in. Apple does not hold the data, or at least not in a form that it can decrypt.

MMA: What does this mean for the future of security and privacy?

Well, it depends on how things play out in court. It also depends on how future courts look at the ruling as a precedent.

If Apple loses, I am terrified. At AgileBits, we have worked extremely hard on 1Password to simply not be in a position to learn our customers’ secrets. This is not because we want to protect criminals from prosecution; it’s because we want to protect our customers from criminals. If we don’t have the ability to learn our customers’ secrets, there is nothing for criminals to steal from us. By designing systems so that we don’t have and can’t acquire your secrets, our customers enjoy security that protects them both from us and from anyone who might compromise us.
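As a sketch of what that design principle looks like in practice (illustrative Swift, not 1Password’s actual scheme, which among other things mixes in a per-account Secret Key and uses a slow password-hashing KDF rather than plain HKDF): the key is derived and used entirely on the customer’s device, so the service only ever stores ciphertext it cannot decrypt.

```swift
import CryptoKit
import Foundation

// Hypothetical client-side vault: the encryption key is derived from the
// master password on-device and never leaves it.
func vaultKey(masterPassword: String, salt: Data) -> SymmetricKey {
    HKDF<SHA256>.deriveKey(
        inputKeyMaterial: SymmetricKey(data: Data(masterPassword.utf8)),
        salt: salt,
        outputByteCount: 32)
}

let salt = Data((0..<16).map { _ in UInt8.random(in: .min ... .max) })
let key = vaultKey(masterPassword: "correct horse battery staple", salt: salt)

// Encrypt locally; only `ciphertext` (and `salt`) would ever reach the server.
let secret = Data("my banking password".utf8)
let ciphertext = try! AES.GCM.seal(secret, using: key).combined!

// The server holds ciphertext it cannot open. The client, re-deriving the
// same key from the same password and salt, can:
let box = try! AES.GCM.SealedBox(combined: ciphertext)
let recovered = try! AES.GCM.open(box, using: key)
print(String(data: recovered, encoding: .utf8)!)  // "my banking password"
```

Under this arrangement a server breach yields only ciphertext, which is exactly the point: there is nothing for criminals to steal.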

However, there is one weakness. The security of this system depends on the software we deliver to people. That software running on people’s own computers and devices must do the right thing. If we were to establish a mechanism for delivering weakened versions of our software, there would be no way to guarantee that the mechanism was only usable by ‘the good guys.’

Creating a mechanism to deliver weakened apps would undermine the security architecture that we have built. Even if that mechanism is never used, its existence creates a nasty weak spot in our security. Nasty weak spots get exploited by bad guys more often than by good guys.

And this isn’t just about us and Apple. This is about pretty much everything that you use to maintain basic security.

MMA: And of course, anything else that you’d like to add.

It is important to note that the phone in question is an iPhone 5C, which is the last iPhone that doesn’t have the “Secure Enclave”. The Secure Enclave is a bit of very-firm-ware on dedicated hardware that adds stricter limits on attempts to guess the device passcode. Had this been an iPhone 5S (or later), the FBI would have had to find a way around the Secure Enclave in addition to what it is asking Apple to do.

In a sense, the Secure Enclave is there to maintain device security even when the operating system fails to do so.
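For a rough picture of what that enforcement buys, here is an illustrative Swift sketch of escalating passcode-attempt delays. The delay tiers mirror the ones Apple has published for iOS, but the type and its behavior are hypothetical; the real enclave enforces this in dedicated silicon that the operating system cannot reset.

```swift
import Foundation

// Hypothetical gate modeling what the Secure Enclave's rate limiting
// accomplishes: a failure counter with escalating lockout delays.
struct PasscodeGate {
    private(set) var failedAttempts = 0
    private var lockedUntil = Date.distantPast

    // Delay (in seconds) imposed after a given number of failures,
    // mirroring the tiers Apple has published for iOS.
    private func delay(after failures: Int) -> TimeInterval {
        switch failures {
        case ..<5: return 0          // first four failures: no delay
        case 5:    return 60         // fifth failure: 1 minute
        case 6:    return 5 * 60     // sixth: 5 minutes
        case 7, 8: return 15 * 60    // seventh and eighth: 15 minutes
        default:   return 60 * 60    // ninth onward: 1 hour
        }
    }

    mutating func tryPasscode(_ guess: String, actual: String) -> Bool {
        guard Date() >= lockedUntil else { return false } // still locked out
        if guess == actual {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        lockedUntil = Date().addingTimeInterval(delay(after: failedAttempts))
        return false
    }
}

// Demo: brute-forcing quickly stalls once the delays kick in.
var gate = PasscodeGate()
for guess in ["0000", "1111", "2222", "3333", "4444", "5555"] {
    print(guess, gate.tryPasscode(guess, actual: "7531"))
}
// After the fifth failure, the gate refuses further attempts for a minute.
```

Even these short delays make brute-forcing a passcode impractical, and because they live below the operating system, a replacement build of iOS alone cannot switch them off.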

Originally published at mobilemakers.co.
