Perspectives on Apple’s response to the FBI court order, part 2

Mobile Makers Academy · Published in Making Mobile · Feb 24, 2016

Apple. FBI. We know the letter has stirred up quite a discussion. However, the issue of privacy on mobile devices isn’t a new one, and with the globalization of technology, we wanted to see whether thoughts across the pond differ from those closer to home. So, we reached out to Jeff Goldberg, head of cryptography for AgileBits (makers of 1Password), in part one of this series. For part two, we spoke to Jorge Ortiz Fuentes, a developer and security advocate and expert from Madrid, Spain.

The FBI Request

If I had to summarize the request, it boils down to “make a modified version of the iOS operating system so that a particular device becomes accessible to forensic analysis.” Digital forensic analysis is the application of scientific and analytic techniques to technological infrastructure in order to identify, preserve, and analyze data, so as to reconstruct past events or simply to confirm or refute hypotheses about them.

Basically, it’s the ability to see what you’ve been doing. This means the data available on your device (your pictures, contacts, browser history, and messages) can be the target of this process. It also means anything you have attempted to delete could be recovered. Ring a bell? Let me remind you it has happened before.

Device makers like Apple and Samsung are aware of this risk and take it seriously enough to devote real resources to protecting us from some of these threats. The basic strategy is to encrypt the data not only while it is in transit through communication networks, but also when it is stored on each device.

On Apple devices, once the user has authenticated with their PIN or fingerprint, iOS provides decrypted access to a partition of the device’s internal storage that contains the application data. That partition remains available, i.e. decrypted, until the device is locked (when the user presses the power button). The most sensitive data (whatever the application that owns it has declared as sensitive) is decrypted on demand, the first time it is accessed. This process, transparent to the user, is the authentication that grants access to the key used to decrypt the data. The key is stored in a secure part of the device’s hardware and cannot be exported. Without that key, which gets deleted when the device is told to erase itself, the data cannot be decrypted.[1]
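For app developers, this protection surfaces as a simple option when writing files. Here is a minimal sketch, assuming an iOS app (the file name is made up for the example), that asks the system to keep a file encrypted whenever the device is locked:

```swift
import Foundation

// Ask iOS to store this file under "complete" data protection:
// the system encrypts it with a key that is only available while
// the device is unlocked.
let documentsURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
let fileURL = documentsURL.appendingPathComponent("secrets.txt") // hypothetical file

let data = Data("not for prying eyes".utf8)
try data.write(to: fileURL, options: .completeFileProtection)
```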

In the real world, that becomes a huge problem for the law enforcement teams that must prosecute crimes involving the use of an iPhone. Thus, law enforcement agencies and politicians have asked device makers to include a means of access, or some other solution, for when the situation requires it and a judge authorizes it.

Squaring the Circle

The problem with the court order is that there is no safe way to achieve what the FBI wants without seriously increasing the risk that every user’s data becomes available to other parties against their will. In this case, the data that law enforcement needs to access is on an already existing device without such a backdoor. But let me spend a minute thinking about how this could be done for future devices.

iOS encrypts your data with a symmetric[2] cryptographic algorithm called AES. There are three possible ways to introduce the backdoor that law enforcement would like to have (a short sketch of the symmetric idea follows the list):

  1. Change the algorithm for a weaker one. This would reduce the time required to decrypt the data by brute force (systematically trying every possible key or passcode in rapid succession). But the drawback is exactly that: the protection would be weaker for the users of the devices.
  2. Introduce changes in the implementation of the AES algorithm to make it weaker. Those changes would have to remain secret and be available only to an authorized group of people. But this is a bad idea, because there are people very skilled in reverse engineering. If you believe that hardware can keep the secrets, think again and try to explain why jailbreaking or credit card cloning are real problems. Finally, there is the problem of controlling the rightful use of that power.
  3. Introduce a way for a preauthorized user to access the key stored in the device. Whatever password or token is required to authenticate and gain access to the key would have to be one of the best-kept secrets on earth, because anybody who passes that authentication gets unlimited access to your data. Again, there is still the problem of watching the watchers, but there is also the issue of keeping that ultra-powerful information secret.

My point is that there is no way to make the encryption weaker for law enforcement without undermining the protections put in place for the safety of our data. And even if we accept that, there are huge problems left to solve. For example, does remote wiping of the data still work? If the bad guys can delete the key, none of the previous solutions would work.

In the FBI case Apple discusses in its open letter, the mechanism is similar to the third option above: the FBI is targeting the data of an already existing device that doesn’t have those backdoors in place. Apple would have to modify the operating system to circumvent other mechanisms, like the authentication that grants access to the key. But the same principles apply. That version of the operating system provides unrestricted power. Access to it must be restricted and controlled for as long as such a version of the operating system can run on a supported device.

The Right of Privacy

Let’s agree that privacy is not just something the bad guys need in order to perform their evil acts; it’s also good for everybody. The need for privacy is obvious when we talk about targets of harassment, public figures, or even business owners. Now the question becomes: how important is our privacy?

As far as I know, the U.S. Constitution doesn’t explicitly protect privacy. However, some other constitutions in the world, like that of my country, Spain, contain a paragraph that explicitly protects every citizen’s right to privacy.

Obviously, there is some political component to this discussion. But in any case, even fundamental rights can be suspended when required and authorized by a judge. I don’t want to make bold statements about privacy here, but I can see the reasoning: let’s make the data private, but let’s introduce a mechanism that allows intervention when a judge authorizes it. Let’s reconcile both interests: privacy and the need to collaborate with the law.

This is where the Gordian knot lies: there is no way to weaken privacy that cannot be misused.

The Digital Implications

Over the years, I have heard many times how the analog world and the digital one are at the same time so different and so similar. “Do you want to create a successful business? Then build up trust.” As usual, the devil is in the details.

In the analog world, you build trust with your customers by being respectful: honest, sincere, open, empathetic, competent. You want to do exactly the same in the digital world, except you need tools and technologies to replace what you would perceive directly in the analog world. For example, in the analog world, you know that you are buying from your grocery store because it is in the same place it has been for the last ten years. It’s harder to know where the bits are coming from in the digital world, so we need things like certificates to prove that a site is the one it claims to be.
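To give a flavor of those tools, here is a minimal sketch of certificate pinning in an iOS app, where the app only trusts the exact server certificate it ships with (the file my-server.cer and the whole setup are hypothetical):

```swift
import Foundation
import Security

// A sketch of certificate pinning: the app bundles a copy of the
// server's certificate and refuses to talk to anyone who presents
// a different one, on top of the system's normal validation.
final class PinningDelegate: NSObject, URLSessionDelegate {
    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition,
                                                  URLCredential?) -> Void) {
        guard let trust = challenge.protectionSpace.serverTrust,
              let serverCert = SecTrustGetCertificateAtIndex(trust, 0),
              let pinnedURL = Bundle.main.url(forResource: "my-server",
                                              withExtension: "cer"),
              let pinnedData = try? Data(contentsOf: pinnedURL),
              SecCertificateCopyData(serverCert) as Data == pinnedData else {
            // The certificate doesn't match the one we expect: hang up.
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }
        completionHandler(.useCredential, URLCredential(trust: trust))
    }
}

// Usage: hand the delegate to a URLSession.
let session = URLSession(configuration: .default,
                         delegate: PinningDelegate(),
                         delegateQueue: nil)
```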

At this point, some cynical people may ask why we should make the digital world stronger than the analog world. After all, it has always been illegal to snoop on private communications unless authorized by a judge, yet intelligence agencies, unethical people, and who knows who else have acted otherwise. If somebody (Apple, in this case) implements a backdoor that allows access to our data when required, it shouldn’t be that different. Or should it?

I believe it is significantly different, in three specific ways. If you are a U.S. citizen, I hope that you will agree with the first two and at least understand that the third is also important to your country. If you come from anywhere else in the world, I hope you agree with all three.

  1. The bar to exploiting such access unlawfully is much lower in the digital world. Most of the time, malicious hacking is done remotely, and I don’t mean from the office next door, but from the other side of the world. I am not only referring to reading the data on a device, but to gaining access to the authentication secrets. Add to that the ability to make an infinite number of identical copies of the accessed information, and the result is a huge problem waiting for you in the darkest alleys of your future.
  2. It is impossible[3] to keep these backdoors secret, even temporarily. No matter how smart the people who create these mechanisms are, or how well the system is conceived, it will always have flaws. Do you believe that the people who created Enigma expected it to be cracked?
  3. If you are a citizen of any other country and you buy a product with such a backdoor, what is your situation? How can you trust that device? Everybody knows that U.S. laws protect U.S. citizens, but the purpose in life of intelligence agencies is to gather information, and some of them are devoted to gathering that information from other countries. Please don’t consider me naive. I understand that every country does this to the extent of its skills and dedicated resources. My argument is a more pragmatic one. If the U.S. government forces American companies to sell devices with a backdoor, it justifies the need for everybody else to use another device. I could easily foresee, and even back, a European strategic project to create a company that builds competitive devices that cannot be controlled by the American government. Do you think it’s fair to force one of your most relevant industries into that situation?

So I understand the frustration of the people trying to do their best to make a better world, to get rid of organized crime and terrorism. I share it. But backdoors are simply not the way to go.

1. To be fair, I have to say that the data cannot be decrypted in a reasonable amount of time using publicly known techniques, even with a fair amount of resources. ↩

2. Symmetric means that the same key is required to encrypt and decrypt the data. ↩

3. Read “highly unlikely” if you are skeptical about the power of science and, in particular, reverse engineering. ↩

Originally published at mobilemakers.co.

