Here’s How Apple CAN Make iPhones That Are BOTH Secure And Accessible By Search Warrant, And Why It Should Do That


By David Grace
www.DavidGraceAuthor.com

Up front:

1) I don’t think the FBI can legally make Apple build special software for them;

2) I don’t think that any government agency should have a passkey or backdoor to unlock all iPhones;

3) I do think that, without unreasonably endangering the security of the data on all iPhones, Apple can create a secure mechanism so that, if it is served with a search warrant, it will be able to provide law enforcement with the password to the specific, locked iPhone named in that warrant.

In this post I will suggest one way Apple could do that.

Apple’s Justification For Choosing To Make Uncrackable Phones

Apple has made two claims with regard to locked iPhones:

(A) that the only way to protect the reasonable privacy of iPhone data is to make a totally, 100% unbreakable phone that not even Apple itself can access and

(B) that when balancing (1) the value of totally, 100% guaranteeing the privacy of all iPhone users’ data against (2) the value to crime victims of law enforcement having search-warrant access to a specific iPhone connected to a crime, less harm will be done to society, and more good, by totally locking up all data on all iPhones than by very strongly protecting the privacy of iPhone data while also creating a secure process for search-warrant access to the specific iPhone named in a warrant.

I think Apple is wrong on both counts.

My Original Post Disputing Apple’s Claims

On February 28, 2016 I posted this article on Medium: “Do iPhone-Owning Criminals Have A Right To Be Immune To A Search Warrant?” in which I disputed Apple’s claim that it was impossible to have both very secure iPhones and also provide law enforcement with access to a specific iPhone pursuant to a search warrant.

I argued that Apple could and should have a “tool” or a mechanism that would allow Apple itself to provide the password to a specific iPhone if Apple is served with a valid warrant for that specific phone.

The Response Supporting Apple’s Claims

Several people who, I think, don’t want law enforcement to have any access to any data on any phone claimed that it couldn’t be done.

First, when someone who doesn’t want something to happen tells you that it’s impossible, you have to view their opinion with skepticism.

If you don’t want to do something and you believe that you can’t do it, then you never will do it. That doesn’t mean that other people without your negative mindset can’t succeed.

The history of technology is studded with inventions that were made by people who didn’t know that what they were doing was “impossible.”

Second, the people who claimed that it was impossible for Apple to have a tool or mechanism that would allow Apple itself to access a specific iPhone without creating an unreasonable risk to the integrity of other iPhones unjustifiably assumed that such access could be accomplished only by either (1) weakening the general encryption scheme or (2) building a backdoor into every iPhone that could be opened with a magic key, giving access to anyone who learned the key’s secret.

That assumption was wrong.

How Apple Could Have BOTH Very Secure Phones AND ALSO Provide Search Warrant Access

[NOTE: Here’s the link to a short post about how public/private key encryption works: Password Security Explained In A Simple-Minded Way]
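For readers who don’t want to click through, here is a minimal sketch of the core idea in Python (using the open-source cryptography package, which is my choice for illustration, not anything Apple uses): whatever is encrypted with a public key can be decrypted only by the holder of the matching private key.

```python
# Minimal illustration of public/private key encryption (illustrative only).
# Requires the third-party package:  pip install cryptography
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a key pair: the public key can be handed out freely,
# the private key stays secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

secret = b"my-iphone-passcode"

# Anyone holding the public key can encrypt...
ciphertext = public_key.encrypt(secret, oaep)

# ...but only the holder of the private key can decrypt.
assert private_key.decrypt(ciphertext, oaep) == secret
```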

I had an idea about how Apple, and only Apple, could provide search-warrant access to a specific iPhone without unreasonably weakening the strong security of all iPhones.

I wanted some knowledgeable input so I consulted with two individuals on the condition that I would not drag them into this public and sometimes acrimonious controversy.

I took my idea first to a friend who received advanced degrees in computer science and a postdoctoral award from two premier universities in the Northeast and is currently doing research and design work relating to large commercial software systems.

The second person I contacted was a friend who had been the Director of the cyber security division of a major branch of the United States government.

Both agreed that the mechanism I propose below would provide very high security for iPhone data while also allowing Apple to unlock a specific iPhone pursuant to a search warrant.

Here’s that mechanism (a rough code sketch of the whole flow appears after the numbered steps):

1) Apple maintains two air-gapped computers (stand-alone machines not connected to any network) in a secure location under Apple’s sole control. Machine #1 has one USB port which has been modified so that it can read a flash drive but cannot write to a flash drive.

Machine #2 has USB port #1 which has been modified so that it can read a flash drive but cannot write to a flash drive. Machine #2 also has a second USB port, port #2, that has been modified so that it can write data to a flash drive only at the very slow rate of 1 Kbyte per second.

2) When a new iPhone is activated and the user picks his/her password, the password is strongly encrypted inside the phone using the public encryption key Apple has randomly assigned to that phone at the time of manufacture. When the phone is activated that strongly encrypted password is sent to the iCloud along with the phone’s serial number.

If the user elects to change his/her password at any time then the changed password will not take effect on that iPhone unless and until the iCloud has confirmed back to the phone that the new encrypted password has been received. Until the phone receives that iCloud confirmation that the new password has been received the previous password will remain in force.

3) The iCloud will store the serial numbers and associated strongly encrypted passwords of newly activated phones and phones with changed passwords only for a short period of time (a few hours) until an Apple employee in the room with air-gapped Machine #1 copies the encrypted passwords and associated serial numbers from the iCloud to a flash drive, walks that drive across the room, and copies the encrypted passwords and associated serial numbers to a database in Machine #1 via the read-only USB Port #1.

The database in Machine #1 has records that have only two fields, (A) each iPhone’s serial number and (B) the highly encrypted password for that particular iPhone.

4) When the copying is complete the Apple employee notifies the iCloud that the encrypted passwords and associated serial numbers have been successfully copied to Machine #1. At that point the iCloud securely erases its temporary file of encrypted passwords and serial numbers.*

5) Machine #2 has a database where each record has two fields: (A) an iPhone serial number and (B) the password’s private decryption key chosen by Apple for that serial number phone at the time of manufacture.

I would assume that there would be thousands or millions of different encryption/decryption key pairs, each of which would be randomly assigned to a particular phone at the time of manufacture. The private, password decryption key would not be stored on the cloud but rather stored on the air-gapped Machine #2.

On a daily or an hourly basis during the manufacturing process the decryption keys and their associated serial numbers would be written to air-gapped Machine #2 and upon confirmation of successful entry on Machine #2 they would be erased from other Apple locations.*

6) When a law-enforcement agency has a search warrant, it sends the warrant and the target phone’s serial number to Apple as a PDF file. An Apple employee verifies that the warrant is valid and then issues an authorization to provide the password.

7) The authorization is delivered to the secure room holding the air-gapped machines. An Apple technician manually enters the target phone’s serial number into Machine #2. Software in Machine #2 copies that phone’s password-decryption key to a flash drive over the very low-speed USB port #2.

8) The technician removes the flash drive from Machine #2’s port #2 and plugs it into the read-only USB port on Machine #1, then enters the serial number of the target iPhone named in the warrant. Software on Machine #1 reads the decryption key for that particular iPhone from the flash drive, decrypts the stored password for that serial number, and displays the in-the-clear password on Machine #1’s monitor.

9) The Apple employee then takes a very sophisticated pencil and writes the password on a high-tech piece of paper which he/she then delivers to the Apple employee who gave the access authorization in the first place.

10) The Apple employee who initially reviewed the warrant then contacts the law enforcement agency that served the warrant and gives it the in-the-clear password to that specific phone named in the warrant so that the police agency can then unlock that iPhone with that password.

*[Apple security professionals would, of course, need to design a secure backup procedure to cover catastrophes such as an earthquake, a fire, etc. I’m sure that something like that is already in place for Apple’s iOS source code and other highly confidential Apple records and data.]
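Putting the pieces together, here is a rough, simplified Python sketch of the data flow described above. It is an illustration, not Apple’s code: the function names, the use of RSA via the open-source cryptography package, and the generation of a fresh key pair for every phone (rather than a large pre-generated pool of pairs) are my own simplifying assumptions, and the air gaps, flash drives and human hand-offs obviously can’t be captured in a few dozen lines.

```python
# Simplified, illustrative model of the proposed escrow mechanism.
# All names and the choice of RSA are illustrative assumptions, not Apple's design.
# Requires:  pip install cryptography
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Machine #1 (air-gapped): serial number -> strongly encrypted password (step 3).
machine1_encrypted_passwords = {}
# Machine #2 (air-gapped): serial number -> private decryption key (step 5).
machine2_private_keys = {}


def manufacture_phone(serial):
    """Randomly assign a key pair to this serial number at manufacture.
    The public key ships inside the phone; the private key is stored
    only on air-gapped Machine #2."""
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    machine2_private_keys[serial] = private_key
    return private_key.public_key()  # burned into the phone


def activate_phone(serial, phone_public_key, password):
    """Step 2: the phone encrypts the user's password with its assigned
    public key and uploads (serial, encrypted password) to the iCloud;
    steps 3-4 then move that record onto Machine #1 by flash drive."""
    encrypted = phone_public_key.encrypt(password.encode(), OAEP)
    machine1_encrypted_passwords[serial] = encrypted


def serve_warrant(serial):
    """Steps 6-9: after the warrant is verified, Machine #2 exports the one
    private key for that one serial number; Machine #1 uses it to decrypt
    the one stored password, which is then handed to law enforcement."""
    private_key = machine2_private_keys[serial]        # from Machine #2
    encrypted = machine1_encrypted_passwords[serial]   # held on Machine #1
    return private_key.decrypt(encrypted, OAEP).decode()


# One phone's lifecycle, then a warrant for that one phone (made-up values).
pub = manufacture_phone("F2LXD000EXAMPLE")
activate_phone("F2LXD000EXAMPLE", pub, "correct-horse-battery")
print(serve_warrant("F2LXD000EXAMPLE"))  # -> "correct-horse-battery"
```

The point of the sketch is the separation of duties: the machine that holds the encrypted passwords never holds the decryption keys, the keys never touch a network, and a key leaves Machine #2 only one serial number at a time.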

Under this plan that single iPhone and only that one iPhone could be opened only by Apple pursuant to a search warrant.

The government would have no access to any other iPhones.

There would be no back door.

Only Apple itself would hold the highly encrypted passwords. The passwords’ decryption keys would be held only by Apple in a secure Apple facility with no network connections.

Apple’s password access would only be on air-gapped machines which would not have any hardware that would allow any mass copying of the encrypted passwords or the decryption keys.

Apple would have neither a copy of an iPhone’s data nor the decryption key to the phone’s data.

Taken together that’s a far more secure environment than banks, hospitals, VISA, and pretty much everything else in our society has.

But, What About Ninjas?

Now, someone is bound to say, “But wait, the Mission Impossible team or perhaps a half dozen ninjas could break into Apple headquarters and get the password to a particular phone,” or perhaps, “But wait, the CIA could kidnap the spouse of an Apple employee and blackmail him/her into stealing a password.”

If the CIA, NSA, FSB, Dr. Evil or whoever wants to get into your iPhone badly enough to be willing to stage a criminal attack on Apple headquarters or its employees, then it will be infinitely simpler for them to just grab you, put a gun to your kneecap and say, “Type in your password.”

So, can we let the ninja argument go and get back to planet earth?

But is this scheme absolutely, perfectly, entirely unbreakable?

Of course not. Nothing is.

Is it so very, very safe as to make the risk of the theft of an iPhone’s data extremely, exceedingly infinitesimally tiny?

Yes.

Why Should Apple Have To Go Through All This Inconvenience?

According to a March 20, 2016 article on Bloomberg Business, Apple received approximately 5,000 law enforcement requests for iPhone access in the first six months of 2015, or about 30 per day. If the process I outlined above took an hour per request, that’s thirty human-hours per day, or about 3 1/2 full-time employees. Double that to seven. Out of about 66,000 Apple U.S. employees. Seven more employees. Out of 66,000.

What would be the cost of those seven employees? Maybe $500,000/year. Plus the set-up costs for this system.
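As a back-of-envelope check on those numbers (the 5,000-requests-per-six-months figure is from the Bloomberg article; the one hour per request and the eight-hour workday are my own assumptions):

```python
# Back-of-envelope staffing check; the request count is from the Bloomberg
# article, the one-hour-per-request and 8-hour-workday figures are assumptions.
requests_per_half_year = 5_000
requests_per_day = requests_per_half_year / 182          # ~27.5, i.e. "about 30"
staff_hours_per_day = requests_per_day * 1                # one hour per request
full_time_staff = staff_hours_per_day / 8                 # ~3.4 people
with_safety_margin = full_time_staff * 2                  # ~7 people

print(round(requests_per_day, 1),
      round(full_time_staff, 1),
      round(with_safety_margin, 1))                       # 27.5  3.4  6.9
```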

How much is Apple spending to fight the FBI? My guess is that its attorney’s fees will end up well north of $20,000,000. When you think about it, Apple could fund this entire proposed warrant-access program for several years with just what it’s paying its lawyers to fight the FBI today. It would be far cheaper for Apple to provide search-warrant access under this plan than to fight search-warrant access in court.

But there’s a larger issue than merely dollars.

Every business has legal obligations to the society in which it sells its products.

The auto industry has to keep track of all kinds of accidents involving its vehicles, file reports, deliver data, etc. When you sell cars in any country you have to do many things that relate to how the cars are used, how they perform, what parts you have to make available, failure rates, etc. That’s one of the costs of being in the car business.

If you’re a bank you have to report all kinds of transactions and you’re legally required to deliver all kinds of data both to the government and to your customers and maintain all kinds of records. That’s very costly for the financial industry, but it’s a cost of doing business.

You can pick any industry you want — energy, food, pharma, communications, etc. Companies are legally required to do lots of things that cost them money that they don’t want to do because those activities are deemed necessary or beneficial to the society in which that company sells its products and makes its money.

The Bank of America doesn’t like having to spend its time or money responding to search warrants and subpoenas, but that’s a cost of its doing business. The same with Verizon and AT&T. The same with Walmart. The same with every business.

Apple has chosen to make a device that will very often hold evidence of a crime or evidence that can be vital to solving a crime, preventing a crime, or convicting or acquitting an accused.

Apple has now chosen to make that device inaccessible to law enforcement. Apple has chosen to make billions of dollars selling that device in the United States. It can make those choices, but with choices come responsibility.

There is a cost of doing business associated with those choices, namely setting up a mechanism that provides law enforcement with search-warrant access to evidence contained in those Apple products.

Don’t Search Warrants Violate My Rights?

Every time you exercise some right of yours, to some degree you potentially infringe some right of mine. Rights don’t exist in a vacuum nor do they exist only for you and not for me.

Someone steals my diary, gives it to you, and you want to publish it under your right of freedom of the press. I don’t want you to publish it, citing my right of privacy. Our rights are in conflict.

Your right of freedom of the press is not unlimited. In that conflict, your right of freedom of the press loses to my right of privacy.

You have the right of freedom of speech, but if you want to stand up at a public meeting and say, “John Smith has said bad things about our Glorious Leader. Smith is a danger to right-thinking Americans. He lives at 328 Main Street. I want you people to track him down, kill him, fire-bomb his house and teach him and everyone who agrees with him that he can’t get away with saying the kinds of things he’s been saying about us” your right of freedom of speech loses to my right of freedom of speech (to criticize your Leader) and my right not to be killed.

Similarly, when you say, “I have a right to keep my data private” you’re right, you do, but only up to the point that your right to privacy begins to unreasonably infringe some of my rights.

Your right to keep your data private is not now and never has been unlimited.

If there is a search warrant, you don’t have a privacy right to the child pornography saved on your home computer or on your phone.

If there is a search warrant, you don’t have a privacy right to the map locations or notes, saved on your home computer or on your phone, detailing your plot to help your girlfriend murder her husband or showing where you dumped his body.

If there is a search warrant, you don’t have a privacy right that shields your house, your car, your home computer or your phone’s memory from a search for evidence of a crime.

The balancing point between the likelihood of and the severity of damage to your right of privacy versus the likelihood of and the severity of damage to the property and lives of others is determined by a judge and is expressed via his/her denying or granting a search warrant. That’s how it has worked for over two hundred years.

So, no, you do not now have and never did have an unlimited right to a total guarantee that your data will be 100.000% safe instead of 99.999% safe no matter what damage your reducing that risk by an additional .001% will do to the rights and health and safety of other people who live in the world with you. If reducing your risk by .001% results in creating a material risk to the lives and property of others then you don’t get to do it.

It’s not all about you. It’s all about us.

How Does All This Theory Relate To The Real World?

Case One — No Search Warrant Access: I store lots of personal emails, credit card purchases, photos and the like on my iPhone. What if I happen to run afoul of the CIA and it decides to engage in a massive effort to get me? What if the CIA turns all the resources of the government to the task of stealing my iPhone photos of me having sex with my girlfriend? Couldn’t they infiltrate Apple with a mole employee and get my password and then get the pictures of me and my girlfriend?

That possibility is why all iPhones should be completely locked no matter how many criminals get away as a result.

Case Two — Search Warrant Access: A pedophile gets a job in an elementary school, nursery school, day-care center, youth athletic program, or the like. Not at all an unusual event. Knowing that Apple has specifically designed the iPhone to be totally immune to search warrants he buys an iPhone which he uses to take inappropriate pictures of some of the children to whom he has access and then he uses the iPhone’s camera to photograph himself abusing some of the children.

The police get information about his activities and they get a warrant but the iPhone is unbreakable.

The individual quits his job and moves to another city or state, maybe he changes his name, then he gets a job at another school.

That sort of risk is why law enforcement should have search warrant access to iPhones.

I would argue that comparing, for each of the two cases, (1) the likelihood of the bad event happening multiplied by (2) the level of harm done by that bad event leads to the inescapable conclusion that there should be search-warrant access.
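Put as a formula, the comparison is: expected harm = (likelihood of the bad event) × (severity of the harm if it happens). In Case One both factors are vanishingly small; in Case Two neither factor is, and the two cases are not close.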

Why Should I Care About Anyone Else?

One of my friends recently said to me, “If the police can’t get inside a particular phone I’m not going to worry about it.”

Well, first, it’s not one phone. At the rate reported in the Bloomberg article it’s about 10,000 phones, 10,000 crimes and 10,000 victims each year.

Second, if someone I care about is robbed, shot, molested, raped, kidnapped or sex-trafficked I am going to worry about it. That’s important to me, and you don’t get to say, “I want what I want and I don’t care what my getting it does to other people. Too bad for them.”

That’s not your call.

At some point we all have to act reasonably and we have to balance the risks that our conduct creates for ourselves against the risks it creates for others.

Balancing those risks is why we have judges and court orders.

Apple’s Risk

Erin Andrews sued a hotel for giving her room number to another guest, apparently there on business, at that same hotel. That guest secretly and illegally replaced the peephole on her room’s door with one that allowed him to take pictures of her while she was undressed. A jury decided that the hotel must pay her approximately $27 million to compensate her for its having responded to the other guest’s request for her room number. Juries respond to victims’ pleas and tears. They do.

Suppose someone disappears, the victim’s location could be discovered from a locked iPhone for which a search warrant has been issued, but Apple’s response to pleas for help is, “We’ve deliberately designed our system so that we can’t help you. Sorry, you’re out of luck.” And then the victim’s body is found.

In that event Apple is likely to find itself in court facing a grieving family that perhaps never owned an iPhone, and it will have to sit there and watch as the sobbing parents give pictures of their murdered child to the jury.

Apple might win the first such case with the argument that it chose to adopt its “no access” policy because it was good for iPhone owners and that therefore Apple shouldn’t be held responsible for injuries that its policy allowed to happen to other people who don’t own iPhones.

It might win the first ten cases where lives or property were lost that would not have been lost but for Apple’s choice to make their phones immune to a search warrant. But eventually Apple will lose one. And then things will get very interesting for Apple.

When Admiral Mike Rogers, the Director of the National Security Agency, was recently interviewed by Charlie Rose he made an interesting comment on the risks of Apple’s position. He said that he feared what extreme things Congress might do if a terrorist attack occurred and law enforcement could show that it could have prevented the deaths had Apple not elected to make its phones immune to search warrants.

Consider this scenario: There is a domestic terrorist event like the mass shootings in Paris or the London bombings with a material loss of life. The next day the Director of the FBI holds a press conference in which he says:

“FBI agents obtained a warrant for the iPhone of one of the participants in the plot. We secretly gained possession of the phone for a few hours. We begged Apple to help us gain access to the data. Apple refused, claiming that it had deliberately designed the phone with the specific intent that law enforcement could not gain access to it under any circumstances. Had we been able to access that phone we would have discovered the details of the plot and we would have been able to stop it in time.

“Because of Apple’s voluntary choice to make its phones immune to a search warrant the attack took place as planned and eighty-seven people were killed and one-hundred fifty-six more were wounded. We are asking Congress to immediately pass emergency legislation to make sure that this disaster never happens again.”

What do you think Congress is going to do then? Remember the provisions that were put into the Patriot Act when it was originally passed shortly after 9/11?

The word “draconian” springs to mind.

In adopting its hard-line “no search-warrant access under any circumstances” policy, Apple is not playing with fire.

Apple is playing with plutonium.

–David Grace (www.DavidGraceAuthor.com)

To see a searchable list of all David Grace’s columns in chronological order, CLICK HERE

To see a list of David Grace’s columns sorted by topic/subject matter, CLICK HERE.

Graduate of Stanford University & U.C. Berkeley Law School. Author of 16 novels and over 400 Medium columns on Economics, Politics, Law, Humor & Satire.