Do Terrorists Love Silicon Valley?

I recently read an opinion piece in the Wall Street Journal that got me so riled up I decided to respond with a letter to the editor. The piece struck me as representative of today's widespread misunderstanding of encryption, “hackers,” and surveillance. I haven’t gotten a response back yet, but that’s fine; I really enjoyed writing it, and I figured posting it online might at least reach a few more people. Anyways, here is what I wrote to the editor:

Regarding “Why Terrorists Love Silicon Valley” Opinion by L. Gordon Crovitz

Crovitz wrote his essay out of legitimate concern for our nation’s access to the communications of people suspected of terrorism. He describes the increasing difficulty our security agencies face in examining suspicious digital activity, even when such surveillance is legal. He quotes Benjamin Wittes, an editor of the Lawfare blog at the Brookings Institution: “As a practical matter, that means there are people in the U.S. whom authorities reasonably believe to be in contact with ISIS for whom surveillance is lawful and appropriate but for whom useful signals interception is not technically feasible.” Crovitz’s response is to take aim at the tech industry’s growing push for ubiquitous encryption of user information and communications. There is, however, a tension in Wittes’s own assertion: he says the U.S. has already developed reasonable suspicion about certain individuals, yet claims authorities are unable to gather evidence in the first place. If agencies can still identify these suspects, then giving every user the privacy they have a right to doesn’t seem to be slowing down the FBI or NSA at all!

Crovitz then suggests a solution by pointing to ICANN, the Internet Corporation for Assigned Names and Numbers. ICANN distributes seven physical keys to individuals spread around the world that together allow access to data critical to the internet as we know it. Crovitz believes a similar arrangement would be a viable and secure alternative to complete encryption of users’ data: only authorized government officials could access the data of those warranting suspicion.

The problem here is one of scale. It’s not that a key used for data encryption, or ‘private’ key, couldn’t be written out and placed in a safe for later use; it’s that such a scheme leaves only two options. We could write down an individual private key for every user who signs up, or we could use a single private key to encrypt everyone’s information and put that one key in a safe. The first option gets out of hand very quickly, as no company could secure enough safe-deposit boxes for each individual user. The second is equally impractical, if less obviously so. If our government wanted information on one individual, it would have to obtain this global ‘master’ key; but it is impossible to obtain that key without also obtaining the means to unlock everyone else’s information. The result is the incidental yet unavoidable exposure of all users’ private data. Even when investigating a single individual under reasonable suspicion, the government could not avoid illegally viewing the information of other users, and the arrangement creates far more room for illegal surveillance. Worse, should the key leak through other means, every user whose data was encrypted under it would have that data leaked along with it. This isn’t as far-fetched as one might first conclude: the master key must be installed on an internet-connected server to be useful at all, and that is where any number of things can go wrong.
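The master-key problem can be sketched in a few lines of code. This is a deliberately simplified toy (the XOR-with-hash “cipher,” key, and user data are all invented for illustration and are not real cryptography): once a single escrowed key protects every account, retrieving that key to investigate one suspect necessarily decrypts everyone.

```python
# Toy illustration (NOT real cryptography): one shared "master" key
# seals every user's data, so whoever obtains that key -- an agency,
# or a thief -- can read all of it, not just one suspect's messages.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a deterministic pseudorandom stream from the key
    # by hashing the key with an incrementing counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

master_key = b"escrowed-in-a-safe"

# Every user's data is sealed under the one master key.
vault = {name: xor_cipher(master_key, msg) for name, msg in {
    "suspect": b"meet at noon",
    "alice":   b"dentist appointment tuesday",
    "bob":     b"grandma's cookie recipe",
}.items()}

# Pulling the key out of the safe to read one user's data
# mechanically exposes every other user as well:
for name, blob in vault.items():
    print(name, xor_cipher(master_key, blob))
```

The point of the sketch is structural, not cryptographic: nothing in the scheme distinguishes the suspect’s ciphertext from anyone else’s, so possession of the key is possession of everyone’s data.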

Framing the issue as security versus privacy is an oversimplification. It is impossible to give a government access to data without also creating a great many opportunities for unauthorized individuals to access that same data. Backdoors, no matter how exclusive their intended users believe them to be, always introduce security flaws: by design, they provide access not only to the government but to anyone willing to hunt for the exploit. These so-called exclusive arrangements have almost always left users far more vulnerable to data theft. Bruce Schneier, an American cryptographer and writer, put it this way on his blog: “those who would give up privacy for security are likely to end up with neither.” [1] The point is that privacy is integral to securing our data; without it, you end up with indiscriminate access to information that was supposed to be encrypted. In fact, experts have been saying this for nearly a decade; Schneier wrote that particular blog post in 2008.

So even though FBI Director Comey found it a little surprising that “Technical people say it’s too hard,” it’s important to remember that the security of data is compromised the instant you introduce a back door, front door, cat door, or any other kind of entrance. In fact, if a company intentionally grants another company, government, or individual access to private data, it might as well remove the encryption entirely.


Here’s a link to the original article on the WSJ website; unfortunately it’s behind a paywall. Oh, and don’t forget to check out Bruce Schneier’s blog and watch this talk by him, both of which were good inspiration for my response.