Isaac Potoczny-Jones
Mar 23, 2016

FBI vs Apple: how did we get here?

The Encryption Dance…

Apple and the FBI have been headlining recently in a debate about cryptography. Crypto is one of those fields that very few people actually understand well; it’s nuanced and complex, just like the current debate, and small mistakes can have big consequences.

I recently found myself picturing this debate in a series of over-simplified cartoons. I wonder if these can help illuminate a complex issue, even if just a little bit.

So let’s start with where we are today. Apple doesn’t want to help the FBI unlock the iPhone of a terrorist. Well, they do want to help; they just don’t want to help in the way the FBI is asking, which is to create a one-off backdoor: a version of iOS with several vital security features disabled. The FBI suddenly backed off this request on Monday because they might have found another method of accessing the phone:

[Cartoon: 2016]

After a lot of public outcry and much debate, the FBI has said that “an outside party demonstrated to the FBI a possible method for unlocking [the] iPhone” and so they want to hold off on the hearings. As of today, we don’t know who the outside party is, but there have been some unconfirmed reports.

[Cartoon: March 2016]

Setting a precedent: Why Apple is worried

Apple has claimed all along that the FBI is trying to set a precedent for surveillance of all Americans, not just serve a warrant on a single device. To understand Apple’s concerns, we have to go back in time a bit to 2004, when the US government published a cryptographic random number generator with a serious security flaw that seemed to include a backdoor.

Most of the security community didn’t bother with the new algorithm, and in 2007, Microsoft security researchers realized it was deeply flawed: whoever chose its built-in constants could be holding a second, secret key that lets them predict every value the generator produces:

[Cartoon: 2004–2007]
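
To see how a “second key” can hide inside an algorithm, here’s a toy Python model of that kind of generator. The real algorithm, widely reported to be Dual_EC_DRBG, uses elliptic-curve points; the plain modular arithmetic below is only an illustration of the same trapdoor shape, with made-up numbers throughout.

```python
# Toy model of a backdoored random number generator. P and Q are
# public constants; d is the secret relationship between them that
# only the algorithm's designer knows.
p = 2_147_483_647        # public prime modulus (2**31 - 1)
d = 123_456_789          # SECRET: P = d * Q (mod p)
Q = 987_654_321
P = (d * Q) % p          # P and Q are published; d is not

def step(state):
    output = (state * Q) % p      # "random" value given to the user
    next_state = (state * P) % p  # internal state update
    return output, next_state

# A victim draws two "random" numbers:
state = 424_242
out1, state = step(state)
out2, state = step(state)

# Knowing d, an attacker recovers the next state from out1 alone:
# out1 * d = s*Q*d = s*P = the generator's next state.
predicted_state = (out1 * d) % p
predicted_out2 = (predicted_state * Q) % p
assert predicted_out2 == out2
print("attacker predicted the next output:", predicted_out2)
```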

Then in 2013, Edward Snowden claimed the algorithm was intentionally backdoored and that the US government did the backdooring.

This, coupled with other Snowden claims and advanced attacks by foreign governments, caused many tech companies, including Apple, to massively improve the security of their products over the last few years. Encryption is a core aspect of those improvements. The Snowden claims also created distrust between the security community and the government, and that is what we’re seeing play out in the Apple-vs-FBI case.

[Cartoon: 2013]

But wait a minute, where did those nice locks come from in the first place? The short answer is that the government and security people have been working together to make and break codes for quite some time. Many people trace the roots of modern cryptography to World War II, so if you’ll permit a brief interlude, that era is absolutely fascinating.

Some History

During the war, electromechanical (and, later, electronic) computers were used to make and break codes, and early computer scientists like Alan Turing were an important part of the war effort. This work eventually evolved into modern cryptography, but more importantly, it evolved into modern computers!

[Cartoon: 1938–1945]

In the 1970s, several people, both inside and outside the government, independently invented core aspects of modern cryptography that are still in use today. The inventors of one very widely used algorithm, RSA, founded a company of the same name to commercialize it. This kind of “asymmetric” crypto was a major advance in protecting everyone’s secrets.

[Cartoon: 1970s]
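
To make “asymmetric” concrete: one key is published so that anyone can lock up a message, and a different, private key unlocks it. Here’s a toy sketch of textbook RSA in Python, with deliberately tiny, insecure numbers (real deployments use 2048-bit keys and padding schemes):

```python
# Textbook RSA with toy numbers. Do not use this for real secrets.
p, q = 61, 53                # two secret primes
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: e*d = 1 (mod phi);
                             # modular-inverse pow needs Python 3.8+

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
print(ciphertext, "->", recovered)
```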

Since that period, the US government has tried, with varying degrees of success, to limit access to encryption algorithms in order to maintain legal access for wiretapping. Sometimes that happened via export restrictions (some of which are still in place), sometimes via laws and standards that limit key size, and sometimes by giving the government a special access key:

[Cartoon: 1980s–1990s]

Things came to a head when the government tried to create a hardware cryptography system called the Clipper chip, which included a legal backdoor so the government could maintain its ability to wiretap phone conversations. Like many backdoors, this one contained a serious vulnerability, and the security community and consumers rejected both the premise and the implementation of the technology:

[Cartoon: 1994]
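
For the curious, the vulnerability, found by researcher Matt Blaze in 1994, boiled down to a too-short integrity check on the chip’s key-escrow field (the “LEAF”). The Python toy below is a loose model of that finding, not the actual Clipper design: with only 16 bits of checksum, a forged field that passes verification can be brute-forced in about 65,000 attempts.

```python
# Toy model of the Clipper chip's LEAF weakness: a 16-bit checksum
# is far too small to stop brute force.
import hashlib
import os

def checksum16(data: bytes) -> int:
    # Stand-in for Clipper's real (classified) checksum function.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

real_leaf = os.urandom(16)      # field carrying the escrowed key
target = checksum16(real_leaf)  # the value the receiving chip verifies

tries = 0
while True:
    tries += 1
    bogus = os.urandom(16)      # contains no usable escrowed key
    if checksum16(bogus) == target:
        break

print(f"forged a passing LEAF in {tries:,} tries")  # ~65,536 expected
```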

It’s worth noting that, back in the 90s, before the Patriot Act, the government had reduced surveillance powers due to public and congressional outcry over alleged abuses in the 1970s. Eventually, because of the rise of the Internet, encryption became vitally important to the US economy.

Working Together to Improve Security

Since then, the relationship between the security community and the government has steadily improved, with the government continuing to participate in the creation and standardization of cryptographic algorithms that are largely the ones we use today as the foundation of cybersecurity:

[Cartoon: 2000s]

Unfortunately, encryption technology has been extremely hard for normal people (and even developers!) to use, leading to a lack of adoption and enormous security vulnerabilities:

[Cartoon: Always]

Cryptography was instrumental in getting the Internet widely adopted as a platform for buying and selling stuff, not to mention just communicating with one another. Before e-commerce could become a thing, developers had to find a way to transmit sensitive information securely over the Internet. Of course, we had these wonderful cryptographic security tools already developed, and eventually they were built into web browsers in a way that most people can (sorta, sometimes) manage to use securely:

[Cartoon: 1990s–2000s]
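
That padlock in the browser is those same cryptographic tools wrapped in a protocol (SSL, now TLS). Here’s a minimal Python sketch of the certificate-verified connection a browser sets up; example.com is just a placeholder host:

```python
# Open a TCP connection and wrap it in TLS with certificate
# verification, the same dance a browser does for every https:// page.
import socket
import ssl

hostname = "example.com"  # placeholder; any HTTPS site works
context = ssl.create_default_context()  # verifies against system CAs

with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls:
        print("negotiated:", tls.version())  # e.g. 'TLSv1.2'
        print("server certificate subject:", tls.getpeercert()["subject"])
```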

As an aside, “hackers” have long played both sides of the fence, pointing out weaknesses (white hat hackers) and exploiting them (black hat hackers), and have even testified before Congress under their hacker nicknames. Eventually, some of those same people would fill important cybersecurity positions within the US Department of Defense.

In fact, around this era, cybersecurity was considered critical to the adoption of new communications technologies like health care data exchange, where the government actually mandates security and encryption by law:

[Cartoon: 1998]

But the fact that crypto is so hard to use doesn’t just hurt consumer end users. The US government has also been the victim of attacks.

In 2002, Congress passed a law basically saying that government agencies need to use security, and the Office of Inspector General (OIG) released public reports over the years about security vulnerabilities at various agencies (2013, 2014). This includes the Office of Personnel Management (OPM), which holds sensitive information about people who have security clearances. Unfortunately, OPM didn’t have any professional security staff until 2013 and in many cases didn’t use two-factor authentication, not even the really cool crypto “smart cards” that everyone in the government is supposed to use:

[Cartoon: 2002–2012]
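
For a sense of what those smart cards do, here’s a rough sketch of the challenge-response idea behind them, using software keys from the third-party cryptography package as a stand-in for the card’s tamper-resistant hardware:

```python
# Challenge-response with a key pair, the core idea behind government
# smart cards (PIV/CAC): the private key never leaves the card, which
# proves possession by signing a fresh challenge.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Done once, inside the card: generate a key pair. The public half
# is registered with the server; the private half stays on the card.
card_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = card_key.public_key()

# At login, the server sends a random challenge...
challenge = os.urandom(32)

# ...the card signs it (after the user enters a PIN, in real life)...
signature = card_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# ...and the server checks the signature against the registered key.
# verify() raises InvalidSignature if the proof doesn't hold.
registered_public_key.verify(signature, challenge,
                             ec.ECDSA(hashes.SHA256()))
print("second factor verified")
```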

Eventually, the OPM got hacked, and extremely personal information about US clearance holders was taken, reportedly by the Chinese intelligence services:

[Cartoon: 2013–2014]

This hack bolstered the arguments of those who worry that a government backdoor to crypto would be impossible to keep secret, and that our enemies would use it against us.

Warrants and Surveillance

In 2011, US Senator Ron Wyden claimed the Patriot Act was being used to justify widespread surveillance, but he couldn’t say much about it because it was classified:

[Cartoon: 2011]

Then came the 2013 Snowden claims about the mechanism of surveillance: accessing the computer networks of US technology companies. Some companies reacted by increasing the use of encryption and security.

Everyone knew that crypto was a good solution, but it had always been difficult to use, even for computer programmers, not to mention journalists trying their hardest. At this point, the technical communities started getting a lot more interested in building much more robust cybersecurity into their systems:

[Cartoon: 2013]

Of course, Apple found a way to make iOS extremely secure and shockingly easy to use, so that even a somewhat weak passcode is robust against advanced attacks:

[Cartoon: 2013–2016]
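
Why does a weak passcode hold up? Because it’s never used directly. iOS reportedly entangles the passcode with a key baked into the device’s hardware and runs it through a deliberately slow derivation, while the device itself limits guessing. The Python below is a simplified software analogy of the key-stretching part, not Apple’s actual implementation:

```python
# Key stretching: mix the passcode with a device-bound secret and run
# it through a deliberately expensive derivation function, so every
# single guess costs an attacker real time, on this device only.
import hashlib
import os

device_secret = os.urandom(32)  # stand-in for a hardware-fused key
salt = os.urandom(16)

def derive_key(passcode: str) -> bytes:
    # 1,000,000 PBKDF2 iterations: a blink for one legitimate unlock,
    # crushing when multiplied across millions of guesses.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode() + device_secret,
        salt,
        1_000_000,
    )

key = derive_key("1234")  # even a weak passcode is now slow to guess
print(key.hex())
```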

Since most people accept most “defaults,” making things secure by default is an important part of actually making things secure; but if everyone uses strong crypto all the time, it becomes very hard for law enforcement to get access to the underlying data.

Those surprisingly secure devices are widely adopted now, and although they probably weren’t originally introduced to thwart law enforcement, they are apparently having that effect. While anyone could have used strong encryption at any time since the 1970s, it wasn’t really practical for the average person until the last few years; so the issue isn’t really strong crypto, it’s easy-to-use crypto.

Since then, the public has been engaged in a lively debate about whether the government should have backdoors into crypto systems. In late 2015, the FBI said they wouldn’t seek a legislative solution, and President Obama echoed this.

[Cartoon: 2015]

New York and California introduced bills making encryption-by-default illegal, but it’s not clear those bills will go anywhere. The whole thing is starting to remind the security community of the “crypto wars” of the 1990s:

[Cartoon: 2015]

That pretty much brings us up to speed. The FBI and Apple are continuing their legal sparring, with the FBI making thinly veiled threats about solutions that would be even worse for Apple:

[Cartoon: 2016]

So while the FBI in 2016 is asking for access to a single device, the security and technical communities are weighing the historical context of the request.

Conclusion

Since World War II, if not earlier, cybersecurity has been an intricate dance between the code makers and the code breakers. There’s a lot of distrust, even though we all know we’re on the same team. The distrust engendered by the Snowden claims makes people worry that the purpose of the request is actually mass surveillance, not one phone.

At the same time, law enforcement organizations know they need to get their jobs done, and it will be harder to perform investigations without the data that they have until recently been able to get from phones and back-end servers. Strong encryption makes “physical” evidence inaccessible in a way that it rarely, if ever, has been in the past.

The vulnerability in the Clipper chip’s backdoor and the OPM hack make the security community worry that the government won’t be able to protect a backdoor if one is ever created.

Even with the best crypto money can buy, cybersecurity is not a solved problem; in fact, many companies are barely keeping ahead of the threats, as John Oliver obscenely expresses it. On top of that, some claim that adding backdoors will only make it harder to build secure systems because they increase the attack surface that security folks need to defend.

Isaac Potoczny-Jones is an authentication and privacy specialist. He is the CEO of Tozny, a security startup that offers two-factor authentication services. Follow Isaac on Twitter. Credit for the beautiful cartoons goes to Shpat Morina of Galois.
