Brute Force

Privacy & Security in the Age of Cryptography

SecureSet
Command Line
19 min read · Jan 6, 2017

Alex Kreilein, Co-founder & MP, SecureSet Accelerator, & Austin Chambers, Attorney, Lewis, Bess, Williams & Weese

On February 16, 2016, the United States District Court for the Central District of California issued an order requiring Apple to assist the FBI in the search of the iPhone used by one of the San Bernardino shooters. The Order requires Apple to comply with an FBI demand that Apple build and install software disabling the countermeasure on the iPhone 5c running iOS 9 that wipes the iPhone after 10 failed passcode attempts. The device belonged to the San Bernardino County Department of Public Health, which has given the government permission to search the phone.

The FBI in this specific circumstance is not requiring that Apple unlock this iPhone. Nor is the FBI requiring that Apple apply a new form of cryptography to this or other iPhones (we’ll get to this later). Rather, the FBI is requiring that Apple develop and install software on this iPhone that would allow the Bureau to run possible passcode combinations until the phone unlocks, without fear of triggering the security countermeasures that automatically erase data on the device.
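To make the stakes concrete, here is a minimal sketch of the strategy the Order would enable, in Python. The try_passcode() oracle is hypothetical — no such public API exists — and stands in for the device's passcode check. A 4-digit passcode space holds only 10,000 combinations; without the 10-attempt wipe and the escalating delays, exhausting it is trivial.

```python
# Minimal brute-force sketch. try_passcode() is a hypothetical oracle
# standing in for the device's passcode check; it is not a real API.
from itertools import product

def brute_force(try_passcode, length=4):
    """Try every numeric passcode of the given length, in order."""
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess  # found the passcode
    return None  # keyspace exhausted without a hit
```

The wipe-after-10 countermeasure defeats exactly this loop, which is why the Order targets it.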

Legal Standing

The legal arguments behind Apple v. FBI¹ are the subject of intense scrutiny, yet despite that scrutiny, the case itself is regularly mischaracterized. One side alleges the FBI is desperately seeking the precedent necessary to unlock millions of devices through any number of doors, using only the antiquated All Writs Act of 1789 (“AWA”) as the key. Meanwhile, the other side alleges Apple is preventing access to just a single phone — one belonging to a dead ISIS terrorist, no less — in what amounts to nothing more than a twisted marketing ploy. We believe both characterizations factually miss the mark, and in so doing, obscure the ramifications of the case itself.

The outcome of this matter depends heavily on how the court interprets the powers and limits of the AWA. Apple raised interesting First and Fifth Amendment arguments; however, these are less likely to be deciding factors in the case, so we will leave them for others to assess. Ultimately, the court must decide two fundamental questions: Is the AWA applicable to this case, and if so, would the AWA allow a court to order Apple to create a new, but insecure, version of iOS?

Does the AWA apply?

The AWA was passed as part of the Judiciary Act of 1789, the same act that created the U.S. federal court system. The AWA provides that courts may issue all writs — or orders — that are necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law. In other words, courts can issue writs to “fill gaps” as necessary to give effect to their judicial duties. The AWA appeared early in U.S. judicial history, even playing a part in Marbury v. Madison, the landmark case forming the basis for the exercise of judicial review. In the time since, it has been used in cases ranging from requiring custodians to bring a prisoner to court for the prisoner’s own appeal (filling a gap in the traditional writ of habeas corpus) to compelling phone companies to place a pen register on a phone line. Thus, the AWA has been both hero and villain, vindicating the constitutional rights of defendants while also giving rise to contentious expansions of courts’ constitutional authority.

Fundamentally, the AWA applies when the court is acting within its jurisdiction. Courts are generally not empowered to act under the AWA when restricted by statute or by common law principles. The court can weigh two key factors to assess whether a matter is within the scope of the AWA: first, whether Congress has spoken to the issue and limited the actions of the judiciary (a law may, for example, set forth standards and limits for a court’s actions); and second, whether the action the court seeks to compel fits within the scope of traditional judicial authority and complies with general legal principles.

The main exception to whether the AWA applies occurs when Congress addresses the issue at hand by enacting a particular law. Apple argues the Communications Assistance for Law Enforcement Act (“CALEA”) does just that by addressing when a private party may be compelled to aid law enforcement in cases of encryption, wiretapping and other similar issues. CALEA ensures “that law enforcement [has] a narrowly focused capability to carry out lawfully authorized surveillance on public switched and cellular networks, but imposing certain privacy protections and limitations,” said Albert Gidari, Director of Privacy at Stanford Law School’s Center for Internet and Society.

CALEA requires telecommunications carriers and manufacturers of telecommunications equipment to make certain accommodations for legal wiretapping by authorized law enforcement, such as built-in surveillance. The Act states the authorities under CALEA should not be used to “imped[e] the development of new communications services and technologies.” Moreover, while the statute covers broadband Internet and VoIP traffic, Gidari notes that “Congress also determined that carriers would have no responsibility to decrypt encrypted communications unless the carrier provided the encryption and could in fact decrypt it. CALEA did not prohibit a carrier from deploying an encryption service for which it did not retain the ability to decrypt communications for law enforcement access, period.”

Although CALEA applies to telecommunications providers and sets forth clear obligations regarding wiretaps and surveillance, CALEA applies less clearly to companies such as Apple. Apple is not a telecommunications provider, but instead an ‘electronic communications services provider’ or ‘information services provider.’ Further, and central to the case, CALEA states it “does not authorize [the government] to require any specific design or equipment, facilities, services, features, or system configurations to be adopted by … communications services providers.”

We should note CALEA was passed into law in 1994 after years of debate and compromise. At that time, mobile devices and the application economy were not yet a reality. But the FBI historically used CALEA to meet the needs of law enforcement because, until recently, it governed the infrastructure germane to law enforcement’s requirements in the area of communication. Culturally, law enforcement believes it ought to have access — and the law has historically been on its side. But where, as here, the law does not apply, that cultural expectation collides with the lack of access, and the collision breeds conflict between tech companies and law enforcement.

Ultimately, the crux of the issue is whether CALEA’s wording merely declines to authorize certain law enforcement actions or affirmatively prohibits what the courts and law enforcement may require of communications services providers, including Apple. The wording suggests, and the FBI argues convincingly, that CALEA does not apply to Apple, thereby leaving the door open to alternative means of compelling Apple to provide assistance, potentially including the AWA.

Opponents reply that if the AWA already empowered law enforcement to seek a court order compelling a provider or manufacturer to build in back-door access to communications, Congress would not have needed to pass CALEA in 1994. CALEA is specific in the manner in which it applies — and does not apply — to the assistance technology companies must provide to law enforcement. Courts could easily find Congress only sought to establish predictable limits, rather than rely on ad hoc rules under the AWA, during a period of intense development in telecommunications. Consequently, CALEA may do little to alter the long history of compelled assistance under the AWA as applied to Apple.

The final issue, then, is whether the Order Apple faces complies with general legal principles. Apple argues it cannot be compelled to provide the requested services. Indeed, the law is historically opposed to ordering relief that requires a party to provide services to another against its will. However, the extent to which an individual may be ordered to provide services to the government is fact-specific, and such arguments will likely hold more weight when considering whether those actions are ‘reasonable’ rather than simply within the bounds of general legal principles.

If the AWA applies, is the order allowed?

The court ordered Apple to take several actions in order to give effect to a warrant to search the iPhone belonging to Syed Rizwan Farook, whom we will refer to as the San Bernardino shooter. Specifically, the court’s order required Apple to allow passcodes to be input via hardware ports rather than only the screen, to allow more than 10 passcode attempts, and to reduce the time delay between passcode attempts (the “Order”). To comply, Apple would be forced to develop a new version of iOS, provide its digital signature assuring its safety and authenticity, and install it — a process requiring several developers and at least four weeks to complete. The new iOS could be installed only on the device subject to investigation — assuming Apple would control installation and eventual deletion. Together, and for the other reasons we discuss, Apple argues principally that this requirement is burdensome and unreasonable.

Apple’s case is framed by the seminal case allowing surveillance under the AWA: United States v. NY Telephone Co. In that case, which predated CALEA, a suspect was using NY Telephone Co.’s service to conduct illegal gambling operations. The government sought to install a pen register to record the numbers dialed by the suspect. However, the FBI could not place the tap itself because it did not own the facilities, and doing so without the aid of NY Telephone Co. might compromise the investigation. The Supreme Court ultimately ruled for the government, and as clarified in NY Telephone Co. and derivative cases (many post-CALEA), three factors determine whether a party can be compelled to provide surveillance under the AWA: (1) the party must not be too far removed from the controversy; (2) the assistance must be necessary; and (3) the assistance must not be unreasonably burdensome.

Apple argues it is too far removed from the controversy to be forced to comply because (1) it does not own the phone in question, (2) it does not hold the encryption key, and (3) it is unrelated to the past criminal conduct. In previously decided cases, courts compelled third parties only to help prevent ongoing crimes, which generally took place either using that third party’s products, on its property, or using its facilities. The government argues here that Apple is closely connected to the case because it owns and licenses the OS, provides regular updates, and regularly receives data from the phones after the sale. Apple makes compelling countervailing arguments. Moreover, the government’s reliance on Apple’s ownership is strikingly opportunistic considering the software is licensed primarily due to limitations of copyright law and the first-sale doctrine. Advancing such a holding would have considerable and far-reaching implications in the law, and this issue speaks directly to the inaptitude of the AWA, as presently interpreted, to frame this case.

As to the second factor, the government argues that Apple’s assistance is necessary for several reasons. Most importantly, the government argues that Apple must sign all updates to iOS using its confidential private signing key. Without that key, there can be no update; without the update, there can be no change to the security settings. Apple retorts that while its signing key is required to install updates, that key cannot recover the (separate) encryption key needed to unlock the phone.
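A short sketch may help here, using the third-party Python cryptography package. Ed25519 stands in for Apple’s actual signing scheme, which differs in its details; the point is structural: the device trusts only the public key baked into it, so no update installs without a signature from the matching private key.

```python
# Code-signing sketch: the vendor's private key signs the firmware; the
# device, holding only the public key, verifies before installing.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()      # held secret by the vendor
device_trusted_key = vendor_key.public_key()   # baked into the device

firmware = b"stand-in bytes for an iOS update image"
signature = vendor_key.sign(firmware)          # only the vendor can produce this

try:
    device_trusted_key.verify(signature, firmware)
    print("valid signature: device would accept the update")
except InvalidSignature:
    print("invalid signature: device refuses the update")
```

This is why the government cannot simply write the insecure iOS itself: without Apple’s private key, no device would accept it.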

As we discuss below, technologists believe the NSA is capable of determining the encryption key on its own (although it could require significant effort). If that were the case, however, Apple’s help would be unnecessary. Hence, yet again, we are faced with a troublesome choice: direct a company to create vulnerable software against its will (or worse, commandeer its private key and source code) or expend significant effort to find, if it is even possible, the device’s encryption key through alternative means — means that cannot easily scale to meet the needs of law enforcement writ large.

For the final factor, Apple argues the request is burdensome. This factor is perhaps the most flexible and the most difficult to resolve. Compared with past AWA cases, Apple’s obligations appear relatively more complex and labor intensive, but objectively, they are not difficult or costly relative to Apple’s gross revenue. So what’s the issue? Apple cites the weight of potential future burdens — namely, that the compromised software developed in response to this Order will be a perpetual target for hackers, will violate the company’s ethics, and will leave the door open for totalitarian governments to request similar services. Each of these outcomes imposes serious costs on Apple in the form of software development, and also increases the risk to regular users should the software fall into the wrong hands. Once modified to run on any identified device, the software could be used to compromise the security of any Apple device. Apple’s arguments are forceful and convincing, especially when you consider that the FBI’s demands bear a striking resemblance to those CALEA expressly prohibits the FBI from making of telecommunications providers.

That said, the court may decide that Apple’s argument is simply irrelevant in this specific case. The government argues that ‘hypothetical’ future burdens are irrelevant to the analysis of the burden in a single case. In other words, although the long-term burden is likely significant, it is speculative, and thus may not be relevant when resolving this case. Further, telecommunications providers are fundamentally different from Apple. The Justice Department further argues that CALEA only describes measures that are per se unreasonable in the telecommunications context, but that does not stop the court from asking whether such measures are reasonable in this particular case.

Stepping back, it becomes clear the AWA analysis complicates the matter significantly — and provides no prospects for a long-term, balanced solution. The factors the court must consider do not suit the intricacies of the facts at hand. Each side seeks precedent, but either precedent — that platform-level software developers must help the government when their software is at issue, or that governments faced with encryption must wait for congressional action or lose forever all encrypted evidence — will be unacceptable to the other.

Is the AWA really the best option?

Fundamentally, this is a case in search of a limiting principle. Apple faces relentless and increasingly sophisticated cybersecurity attacks, as well as an inexorable rise in law enforcement requests to access and decrypt seized devices. To protect itself, Apple has chosen to encrypt everything and throw away the key. The government, on the other hand, must do its job despite a paralyzed Congress and a growing number of encrypted devices holding important evidence. To press forward, the FBI resorted to one of its oldest but most venerable tools: the AWA.

The case of Apple v. FBI must determine whether that broad and indispensable legal tool can square the demands of law enforcement with the modern technology and cybersecurity landscape without compromising basic legal and mathematical principles. Any policy must focus first on the mathematics and engineering implementations as they cannot be willed away by the force of law.

It is difficult to say whether the AWA, if it is applicable in this case, is up to this task. Indeed, the AWA may be the last and worst option. In their briefs, neither Apple nor the FBI has hinted at a viable limiting principle for how the AWA can balance the deep technical, political and social issues in play. As the case moves forward, all eyes should be on what limits the court places on the AWA, and whether those limits, if any, can provide a workable solution. We have our doubts whether the lens of the AWA will provide clarity or resolution, but we must hope that somewhere in its 227-year history lies an answer.

Policy implications of an FBI win

While the FBI attempts to scope this matter to a specific iPhone, others hope it extends far beyond southern California. In Manhattan, District Attorney Cyrus Vance has stated that his office alone has more than 175 iPhones it currently cannot access for the purposes of law enforcement due to device encryption. It is logical to expect that the Manhattan District Attorney would seek a similar court order to unlock those iPhones, and all future iPhones implicated in a law enforcement action. This won’t be constrained to the borough of Manhattan.

While the “slippery slope” argument is overused in the analysis of technology policy and the law, it applies here.

When the NSA bulk data collection programs were first designed, they were built to capture certain traffic from foreign sources. Then they were modified to include foreign nationals operating domestically and communicating with foreign sources. Later they were permuted again to include the bulk collection of all communications by foreigners and Americans operating both domestically and internationally, though encryption was introduced to ensure the privacy of U.S. citizens’ communications unless a warrant was presented. Finally, the encryption processes were removed, allowing for bulk data collection of all people, at all times. New law and current policy have addressed these processes to an extent, but according to many technologists, legal scholars and privacy advocates, the collection now exists en masse without adequate privacy controls. And while we pass no judgment on these practices, the slope is undeniably slippery.

One not-so-subtle implication of the current iPhone debate merits noting: the FBI is attempting to compel a private company to alter its product, without limitation, and against its will. The FBI’s motives are legitimate but the long-term impact is hugely problematic for an increasingly connected society.

These are not the Droids…

The political and rhetorical debate rests on one critical flaw: the claim that no iPhone can be accessed without the assistance of Apple is incorrect.

The newest iPhones (5s and higher) include a “Secure Enclave,” a dedicated security coprocessor that manages the encryption key protecting the encrypted portion of flash memory — the portion holding the OS and user partitions. This key — and not the data itself — is what iOS automatically erases when the limit on passcode attempts is exceeded. Removing the key merely renders the data inaccessible, because the system denies access to data for which no key is present. This is similar to deleting a file pointer in the file allocation table on a partition: without the pointer, the data cannot be accessed, but it is still there. In principle, this makes it possible to copy the stored key material, try 10 passcodes, let the key be erased, rewrite the copied contents back onto the device, and try again.

A similar mechanism exists on the iPhone 5c running iOS 9. Apple notes, “The file system key is stored in Effaceable Storage. Since it’s stored on the device, this key is not used to maintain the confidentiality of data; instead, it’s designed to be quickly erased on demand (by the user, with the “Erase all content and settings” option, or by a user or administrator issuing a remote wipe command from a mobile device management (MDM) server, Exchange ActiveSync, or iCloud).” However, the data itself is not destroyed. “Erasing the key in this manner renders all files cryptographically inaccessible.”
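Here is a toy illustration of cryptographic erasure, using the third-party Python cryptography package: destroy the key and the ciphertext survives untouched, yet no replacement key recovers it.

```python
# Crypto-erase in miniature: losing the key, not the data, is the wipe.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                    # plays the file system key
ciphertext = Fernet(key).encrypt(b"user data")

key = None                                     # "effacing" the key; bytes remain

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # a wrong key fails
except InvalidToken:
    print("ciphertext intact but cryptographically inaccessible")
```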

Imagine that you had a filing cabinet filled with important data. To prevent unwanted access to that information, you placed the key in your desk drawer. When someone walks up to the locked filing cabinet and jiggles the handle without the key, the cabinet does not implode. Even if the lock is picked, you can change the lock and keep the data. Beyond this, if the key is stolen from the desk, the lock can still be changed and the contents will remain intact. Similarly, when 10 failed passcode attempts are made on an iPhone the data remains intact. But a key is needed to access it.

Rather than a desk drawer, the file system key for the iPhone is stored in the Effaceable Storage in the “NAND” flash memory. The FBI can preserve its ability to continuously brute-force the device passcode by copying the file system key from the Effaceable Storage in flash memory prior to making 10 attempts. Restoring the NAND flash memory from that backup copy, over and over again, should allow an indefinite number of attempts at unlocking the phone.
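Sketched as a loop, with hypothetical helpers dump_nand(), restore_nand(), and try_passcode() standing in for the actual hardware work of copying and rewriting the flash, the approach — often called NAND mirroring — looks like this:

```python
# NAND-mirroring sketch. All three helpers are hypothetical stand-ins for
# hardware procedures; this only shows the shape of the attack.
def mirrored_brute_force(dump_nand, restore_nand, try_passcode):
    backup = dump_nand()                    # snapshot flash, key included
    for n in range(10_000):                 # every 4-digit passcode
        guess = f"{n:04d}"
        if try_passcode(guess):
            return guess
        if (n + 1) % 10 == 0:               # wipe threshold reached
            restore_nand(backup)            # put the key (and counter) back
    return None
```

The countermeasure assumes the key store cannot be copied; once it can be, the 10-attempt limit only sets the cadence of the restores.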

[Diagram omitted: the iPhone key system mechanism.]

Daniel Kahn Gillmor, Technology Fellow at the ACLU Speech, Privacy, and Technology Project, discusses the hardware implementation in excellent detail.

But if brute-forcing of the device is possible, then why has the FBI not done it? We believe there are four possible reasons the FBI has not hardware-hacked the device:

  1. The FBI does not want to physically tamper with evidence;
  2. The FBI does not know how to do this work;
  3. The FBI is not certain this approach will work; or,
  4. The FBI does not want it to work.

This train of thought leads to another important question.

Why has the NSA been absent from this event? It’s important to understand this absence given the significant technical capabilities of the NSA. We believe that there are four possible reasons for it:

  1. The NSA does not want to help the FBI;
  2. The NSA tried to help the FBI but failed;
  3. The NSA is not allowed to help the FBI; or,
  4. The FBI does not want the NSA’s help.

Whether by hardware-hacking the iPhone or by working with experts at the NSA, the FBI doesn’t get what any law enforcement body really wants: precedent.

And this debate is not about one particular iPhone. The debate is much larger than that.

Return of the Crypto Wars

This case is about more than one phone, and in truth, it represents yet another chapter in a long-running saga. This renewed controversy is symptomatic of a much larger issue: law enforcement is increasingly placed in the impossible position of needing real-time access to actionable information critical to public safety and national security, while modern cryptography impedes that access. This is not a new issue. Abelson et al. wrote about it in their 2015 paper, Keys Under Doormats:

“Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels “going dark,” these attempts to regulate the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional access mechanisms. In this report, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates.”

The strong cryptography that impedes the FBI is the same cryptography that impedes hackers as they try to gain illegitimate access to devices, data, applications and networks. Those threat actors specifically target companies like Apple with people, processes and technology that can be used to access information for billions of consumers. And this goes far beyond Apple, Google, Facebook and WhatsApp. Cryptography is used everywhere — all the time — to protect all connected people.

When you input the password to your Amazon account, cryptography secures the information. When you communicate with your team over Slack, cryptography keeps it safe. When your bank routes money to your buddy, cryptography covers the trail. And when your doctor updates your electronic medical records, cryptography keeps it private.

If we assume the FBI can use the AWA to overcome Apple and its countermeasures on a specific iPhone, what is stopping the FBI from approaching RSA in an effort to compel it to magically overcome the cryptographic implementations used on billions of devices, data sets, applications and networks?

If ever there were a single server to which law enforcement required access — and that access were impeded by RSA’s asymmetric cryptographic algorithms — the outcome of the current debate over the San Bernardino shooter’s iPhone would govern the answer. This would put at risk the billions of cryptographic transactions made every day of which we are blissfully unaware.

The trouble with implementation

The most efficient way for the more than 17,985 law enforcement agencies in the United States to access data is not for each of them to approach developers of cryptography with individual requests. The FBI has instead implored Congress to force a fundamental change in how cryptography is deployed. Under asymmetric key exchange, the sender and receiver of a message are able to establish a shared secret — and thus exchange information privately — without ever transmitting that secret. The FBI believes it should be guaranteed exceptional access to the information being exchanged.
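A minimal sketch of such an exchange, using X25519 from the third-party Python cryptography package: each party combines its own private key with the other’s public key and derives the same shared secret, which an eavesdropper who sees only the public keys cannot reproduce.

```python
# Asymmetric key agreement: no secret ever crosses the wire.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Each side uses its own private key plus the peer's public key.
alice_secret = alice.exchange(bob.public_key())
bob_secret = bob.exchange(alice.public_key())

assert alice_secret == bob_secret  # identical shared key on both ends
```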

In order to grant exceptional access, a separate key that never existed before would have to be generated: one for Alice, one for Bob, and another for law enforcement. Key escrow management of those billions of keys would be a tremendous and budgetarily explosive undertaking. It would also create a consolidated or, at worst, single point of failure that would be under constant attack by hackers and insider threats. But more than that, it would also take a long time to retrieve the unique keys that match a specific asymmetric key exchange. So the implementation would have to be simple. It would have to be flat across all 17,985-plus law enforcement agencies in America. One key would have to rule them all.
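In code, exceptional access amounts to wrapping every message key a second time — once for the recipient, once for the escrow agent. A hedged sketch with RSA-OAEP from the Python cryptography package (the key names are ours, purely illustrative):

```python
# Key-escrow sketch: the message key gets a second door.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

bob_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message_key = os.urandom(32)  # symmetric key that encrypts the message

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

wrapped_for_bob = bob_priv.public_key().encrypt(message_key, oaep)
wrapped_for_escrow = escrow_priv.public_key().encrypt(message_key, oaep)

# Whoever holds escrow_priv can recover every message key ever wrapped.
assert escrow_priv.decrypt(wrapped_for_escrow, oaep) == message_key
```

The single assert at the bottom is the policy problem in miniature: one private key opens everything.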

If we look at the field of epidemiology, we can see the dangers in creating monocultures. In the milestone work on plant disease epidemiology, Gareth Jones states that “whichever solution we choose to try to control pests and pathogens within the monoculture framework, that solution has to operate against the inherent advantage to the pest organism of selection on a massive and uniform medium.” If the keys are not as fantastically dissimilar to each other as they are in the current implementations of asymmetric cryptographic algorithms, then they may be as easily brute-forced with modern computing infrastructure as a 10-digit passcode is with a finger.
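Back-of-the-envelope arithmetic makes the monoculture point concrete. At an assumed, illustrative rate of one trillion guesses per second:

```python
# Time to exhaust a keyspace at an assumed 10^12 guesses per second.
GUESSES_PER_SECOND = 1e12
SECONDS_PER_YEAR = 3.15e7

for bits in (24, 64, 128):
    years = 2**bits / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit keyspace: ~{years:.1e} years to exhaust")

# 128 bits: ~1e19 years. 64 bits: about 7 months. 24 bits: instantaneous.
# Structure that shrinks the effective keyspace shrinks those numbers fast.
```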

In either circumstance, we’ve created a new and more easily exploited vulnerability in the very thing our society is built on: privacy.

Closing Shots

Cryptography is the culmination of thousands of years of human intellectual evolution. It is an application of mathematics not well understood by many, and this is precisely what worries us. The resolution of this case must respect those core mathematical principles. While we must not allow those principles to be violated, we must also not lose sight of the potential costs, and very real problems, facing law enforcement in the modern era.

Civil discourse is important and this issue is worthy of it. It is also important to recognize that the device in question may have important tactical and operational information to which law enforcement should have access. We cannot belittle the request itself as it too reflects an important societal value: keeping Americans safe from enemies foreign and domestic. However, we believe that it is fully outside the scope of thoughtful security and privacy policy for this issue to be resolved in the view of our enemies. Therefore, we believe that the only optimal outcome will be found in a private solution.

A solution must be found to ensure the proper outcome in this matter. But that solution will not be arrived at in the court of public opinion, and it should not be reached in the courts under the AWA.

This is an important technical, legal, and civic issue. We side with the FBI in that law enforcement requires access to critical information. We side with Apple in that what is being required may not comport with the law and, in any case, may not be in the best interest of society. But regardless of sides, we do support one important perspective: this is an issue for engineers and mathematicians. The rhetoric around this issue must subside so that a technical solution can be arrived at in the absence of marketing, media and politics. As security practitioners, we believe that encryption rights are too important to be gambled in a political process, or on the whims of an individual court. It is possible that a party wins the court battle but loses either liberty or security for all of us.

¹ In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 5KGD203, ED No. CM 16–10 (SP) (C.D. Cal. 2016).

Originally published at blog.secureset.com.
