An Open Letter to Glenn Greenwald

Full Disclosure and NSA-induced vulnerabilities

Brandon Downey
Sep 8, 2013

[Disclaimer: The views expressed in this letter are my own, and do not represent those of my employer.]

Dear Mr. Greenwald,

Thank you for your ongoing coverage of the Snowden files. As a citizen of the United States, I think you are doing our republic a great service by exposing behavior which is potentially criminal, definitely unethical, and certainly demanding of open and public debate. Many of the questions raised by these stories strike at the heart of what form a constitutionally restrained government should take in the modern era: One where information networks have become ubiquitous, data can be copied at almost no cost, and asymmetric threats like terrorism loom large in the public consciousness. Dealing with this world in an informed way requires that we know what our government is doing in the name of safety.

The most recent round of stories, concerning the NSA’s success in compromising the security of the world’s encryption, is in some ways among the most troubling. It did not surprise me to learn that the NSA (or their brothers in arms, the GCHQ) are adept at exploiting vulnerabilities in existing computer systems or implementations of cryptographic schemes. What did surprise me is that our intelligence agencies are engaged in a campaign of purposefully weakening the tools people and businesses use to protect themselves.

According to the documents you have published, these attacks seem to take many forms. The NSA has attempted to subvert the standards process to make certain cryptographic standards weak by default:

http://www.wired.com/politics/security/commentary/securitymatters/2007/11/securitymatters_1115

The NSA has also convinced hardware security vendors to include back doors:

https://twitter.com/ashk4n/status/375758189444493312/photo/1

And they have evidently induced software vendors to introduce subtle weaknesses, in things like random number generators, or to include outright back doors, some more obvious than others:

http://en.wikipedia.org/wiki/IBM_Notes#Security
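To see why a weakened random number generator is such a devastating back door, consider a toy sketch (this is a hypothetical illustration, not any real vendor's code): a generator that emits what looks like a 128-bit key but is secretly driven by only 16 bits of entropy. Anyone who knows the flaw can recover the key by brute force in a fraction of a second.

```python
import random

KEY_BITS = 128

def weak_keygen(seed: int) -> int:
    # Hypothetical backdoored generator: it emits a 128-bit "random"
    # key, but internally only 16 bits of the seed actually matter.
    rng = random.Random(seed % 2**16)
    return rng.getrandbits(KEY_BITS)

def brute_force(target_key: int) -> int:
    # An attacker who knows about the flaw simply tries all
    # 65,536 effective seeds until the victim's key is reproduced.
    for seed in range(2**16):
        if weak_keygen(seed) == target_key:
            return seed
    raise ValueError("key not found")

# A victim generates a key believing it has 128 bits of strength.
victim_key = weak_keygen(0xDEADBEEF)

# The attacker recovers an equivalent seed, and thus the key itself.
recovered_seed = brute_force(victim_key)
assert weak_keygen(recovered_seed) == victim_key
```

The point of the sketch: the weakness is invisible to the user (the key "looks" random), but it reduces the attacker's work from an impossible 2^128 guesses to a trivial 2^16. And crucially, the back door serves anyone who discovers it, not just the agency that planted it.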

So far in your (and I assume your editors’) publications regarding the Snowden leaks, you’ve gone the extra mile to practice “ethical” disclosure, frequently redacting information that might be damaging to individuals’ lives, or to the integrity of ongoing operations. However, in the case of this leak, you redacted every reference to the identities of the companies and standards which have been compromised.

I think this is the wrong approach. Many people, hundreds of millions of people in fact, use cryptography on a day-to-day basis to protect the mundane details of their financial transactions online. Vitally important infrastructure, both virtual and actual, also uses cryptography to protect itself. The smallest and most important coterie of people using encryption, though, are the dissidents who use cryptography in various ways to protect themselves from oppressive regimes.

But the flaws and back doors introduced by the NSA into standards, software, and hardware to gain access to user data are unfortunately not usable by the NSA alone. A flaw in software can potentially be exploited by anyone, and it is very likely that if these flaws are there, adversaries with large amounts of time and resources have found them already.

We don’t have to go far to find examples of back doors getting intelligence agencies into trouble:

http://spectrum.ieee.org/telecom/security/the-athens-affair

Or we can simply recognize that Edward Snowden walking out of the NSA with these documents is prima facie evidence that their back doors will not necessarily remain their back doors. Even the most charitable reading of these documents indicates that we must accept that there is commercial security software out there which has unpublished vulnerabilities in it. So what can we do about it?

This debate over what to do about ‘unpublished’ vulnerabilities is at the heart of an ongoing discussion in the security community about when and where disclosure is appropriate. I know you’re probably very busy, but this article is worth reading:

http://en.wikipedia.org/wiki/Full_disclosure

Here’s the money quote from Bruce Schneier:

“Full disclosure — the practice of making the details of security vulnerabilities public — is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure”.

What you have on your hands with this most recent set of stories is a problem of disclosure. There are flaws in some portion of the world’s security systems, and our government is actually endangering the privacy, assets, and sometimes even the lives of the people depending on them. Hiding this information about these vulnerabilities from terrorists also necessarily hides it from the most vulnerable people.

Withholding this information out of a fear that terrorists might stop using the compromised systems is, I think, an argument rooted in the same fears that have dogged our society since 9/11. Modern society requires the internet in order to function, and part of that functioning is that it be secure. By publishing the names of the companies and standards which have been compromised, you will be giving individuals the tools they need to protect themselves.

Thanks for reading this.

Update: Quick response on Twitter — https://twitter.com/bdowney/status/376818109719404545
