
Privacy isn’t free

Don’t take your rights for granted. If you care, defend them.

Zane Pocock
Apr 9, 2020

Governments the world over have long sought to cripple the assurances of consumer encryption. They back it up with strong emotional reasoning. They pull on the heartstrings by illustrating how encryption tools allow global pedophile rings to organize their operations and distribute their content. A disgusting thought. Or they imbue the populace with fear, illustrating how a terrorist cell might use this technology to organize an attack, safe from the prying eyes of the brave men and women tasked with “protecting our freedoms”.

Leveraging a world thoroughly distracted and frightened by the Coronavirus, the US Federal Government is now using the shock doctrine to try its hand once more at undermining this technology. Called the EARN IT Act, a piece of fittingly dystopian doublespeak, the proposed law relies on a sleight of hand to form a law-enforcement-run committee responsible for regulating online platforms, ultimately granting it the authority to unilaterally undermine encryption (while never explicitly saying so). The legislation had been circulating for a while with apparently little chance of passing. That is, until the pandemic opened the legislative buffet. Now it looks like it might just pass after all.

So, what’s the big deal?

The Long Game

Governments have a history of attempting to undermine the encryption tools harnessed by everyday people. Clearly visible attempts reached a crescendo with the Crypto Wars of the early 1990s, before the efforts were forced underground into the controversial operations of today’s NSA. But governments have never lost sight of the prize: a window into the activities and thoughts of everyone in the world.

In 1991, Phil Zimmermann invented an encryption program called Pretty Good Privacy (PGP). We will explore how this works in a little more detail later, but briefly, it allowed for two things: end-to-end encryption, meaning individuals could obscure messages to each other such that the intended recipient was the only person able to read them; and digital signatures, meaning the author of a message could certify that they themselves created it. Governments, particularly the US, had been keeping close control of encryption programs because of their military usefulness: if they met certain criteria they were legally defined as munitions. Zimmermann had written quite the program, the first major consumer challenge to these controls, and soon he was the target of a criminal investigation for “munitions export without a license”.

Zimmermann was never formally charged, but his creative defense laid the groundwork for a vital legal precedent for our online lives. He published the PGP source code as a book, meaning that its global distribution was protected by the First Amendment to the United States Constitution (the one about free speech, for those unfamiliar):

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Tying the use of encryption programs to the US Constitution ensured that it was upheld as a basic legal right. While Zimmermann never had to defend against formal charges himself, his defense was integral to two later court cases that established the precedent.

Since then, the US government has continued to take pot-shots at encryption, but has largely focused its energy on the clandestine activities of the Deep State. Attempts to change the status quo have been largely unsuccessful, with the Fifth Amendment (the right not to self-incriminate) used to set the precedent that a suspect can’t even be forced to reveal a password. But privacy advocates shouldn’t be taking victory laps: the CIA was recently revealed to have owned a major distributor of government encryption software, Crypto AG, which it had used for decades to spy on foreign governments by inserting vulnerabilities into its products. Meanwhile, the NSA has illegally conducted widespread mass surveillance of the American population after leveraging new capabilities granted to it under the Patriot Act. For exposing these crimes, Edward Snowden is forced to hide in Russia. On the bright side, he also revealed that these agencies still have no way to break PGP.

The Principled Objection

It was around the time of the Crypto Wars that the cypherpunks came to prominence. Timothy C. May was early, writing The Crypto Anarchist Manifesto in 1988. He predicted nothing short of a revolution arising from the technologies he could see being developed. He foresaw that a technology like PGP was just around the corner, and that this would herald the rise of a sovereign online individual: one able to act entirely anonymously, in a world where reputation was tied to verifiable pseudonymity rather than government identity. He foresaw that the state would try in vain to stop such progress, but championed a principle that needs no argument among free people: liberty.

“Just as the technology of printing altered and reduced the power of medieval guilds and the social power structure, so too will cryptologic methods fundamentally alter the nature of corporations and of government interference in economic transactions. Combined with emerging information markets, crypto anarchy will create a liquid market for any and all material which can be put into words and pictures.” — Timothy C. May, The Crypto Anarchist Manifesto

More directly concerned with privacy, Eric Hughes wrote A Cypherpunk’s Manifesto in 1993. He made the case that “privacy is necessary for an open society in the electronic age,” arguing that privacy is something humans have always been able to rely on finding. Before the electronic age, “people have been defending their own privacy for centuries with whispers, darkness, envelopes, closed doors, secret handshakes, and couriers.” But as new communications technologies gained traction while he wrote, it was foreseeable that, without defending the assurances encryption offered, this important element of society was in danger.

Meanwhile, Hal Finney, who would go on to receive the first ever Bitcoin transaction 15 years later, identified in Politics vs Technology a tendency among the cypherpunks to prematurely celebrate victory and avoid the political. He astutely drew attention to the fact that cypherpunks harnessing such simple tools as text and communication protocols could end up in jail if they didn’t engage with the political sphere. And indeed, this is what we see now in the high-profile cases of Julian Assange and Ross Ulbricht.

“The notion that we can just fade into cypherspace and ignore the unpleasant political realities is unrealistic … Have people forgotten the PGP export investigation? Phil Zimmermann hasn’t. He and others may be facing the prospect of ten years in prison if they were found guilty of illegal export. If anyone has any suggestions for how to escape from jail into cyberspace I’d like to hear about them.” — Hal Finney, Politics vs Technology

It is with Hal Finney in mind that we push back, both in principle and in regulation, against encroachments on these hard-fought liberties.

“Fundamentally, I believe we will have the kind of society that most people want. If we want freedom and privacy, we must persuade others that these are worth having. There are no shortcuts. Withdrawing into technology is like pulling the blankets over your head. It feels good for a while, until reality catches up.” — Hal Finney, Politics vs Technology

You’ve got something to hide

People have been trained to see privacy as malicious. But as Eric Hughes put it so simply in A Cypherpunk’s Manifesto, “privacy is the power to selectively reveal oneself to the world.” If one tries to make the argument for privacy to someone who isn’t particularly concerned by it, the common response is to observe that they “have nothing to hide”. While this speaks volumes about the comforts and security people have been offered in life, it betrays a lack of concern for the multitude of scenarios where someone might have something to hide. What’s legal is not necessarily what’s moral; indeed, laws are often changed by the activities of people doing that which they are not supposed to be doing.

Take same-sex relationships, for example. There is an increasingly clear consensus in most Western societies that these relationships should not be oppressed by the state (even while there is disagreement on whether they are culturally acceptable). While recent legal victories to this end have been celebrated as examples of the system allowing for real freedoms, it’s often forgotten that these victories would not have come to pass if people did not have the ability to privately break the law. What’s also forgotten is that these relationships and religious freedoms remain forbidden in large swathes of the world. A successful attack on encryption would move societies closer to that hypothetical world where every single law is 100% perpetually enforceable. Would-be offenders would know they have a guaranteed jail term for crossing the line, so in the present example, same-sex relationships would rarely happen in the first place. This is the Big Brother argument: the very act of mass surveillance changes behavior and suppresses the human will. Some might like this idea, but many would see a tragedy in it: how would these relationships come to be legal if so few had ever experienced one? It would be astonishingly naive to suppose that no other such cases are out there, still pending the official stamp of approval.

“I have nothing to hide” is also an attitude that betrays an assumption of guilt: the opposite of a legal tradition that presumes innocence until proven otherwise. The EARN IT Bill is playing a clever positioning game with its name, leveraging this assumption. It implies that the government gives people their rights, and that they must therefore earn the right to privacy, but this is the wrong way around. If one charitably assesses the purported goals of democracy, it is the process by which a citizenry chooses which liberties they might forgo for the collective. Government does not have the power to grant people the right to privacy. They already have it. Rather, people have the power to surrender that right.

Privacy and secrecy are not practically separable

People half-sold on the argument for privacy technology often attempt to draw a line between privacy and secrecy. Their argument is that you have a right to privacy in the sense that you can close the door to the bathroom when you’re using it. But you don’t have the right to secrecy: if you’re suspected of doing something nefarious behind that closed door, they argue that bursting in would be in service of a “greater good”, and that the ability of our enforcers to do so is a critical capability.

The people making this argument appear to intuitively understand the importance of privacy technology, but have been so deeply compromised by the narrative that they can’t see the flaw in splitting privacy from their definition of secrecy. The two cannot be practically separated. In the connected world, either all the information you’re sending can be snooped on, or none of it can. This is a fundamental, practical limitation. But once a secure communication is complete, nothing stops an individual from divulging a secret from that communication. The answer to this conundrum isn’t spying, it’s old-fashioned police work.

The “greater good” argument can also be turned on its head, particularly when it comes to the state. One of the strongest cases for encryption is accessible human resilience in the face of tyranny. As most people sit at home, locked up with their liberties suspended for the “greater good” in the face of the Coronavirus, it should be increasingly clear how easily tyranny can be leveraged against people in the modern age, despite the assurances they’ve told themselves about how democracy fixes this. This is true even if one charitably grants that the current response is justified. Today, the average person’s fragility is strikingly clear. The value of their capacity to conduct activities in secret ought to be obvious, not just in principle, but in the increasingly likely case that they need it.

The Fallout

Putting aside the philosophical reasons that one might care to protect encryption, it’s arguably the practical implications that make a stronger, more immediate argument. A successful attack on encryption undermines the fundamentals of many tools and practices we have come to rely on.

A sly, roundabout attack on encryption

The EARN IT Bill requires that all online platforms comply with a list of as-yet undefined “best practices” that will be created by a commission of 19 people controlled by US Attorney General William Barr and law enforcement agencies. These committee members will have the power to unilaterally define and enforce criteria that online platforms must meet in order to retain legal protections from criminal and civil liability for user-generated content under Section 230 (47 U.S.C. § 230). In typically Orwellian fashion, they’ve even managed to word it such that encryption might be theoretically nullified without mentioning the word:

“This bill says nothing about encryption. Have you found a word in this bill about encryption?” — EARN IT Bill co-sponsor Sen. Blumenthal

So, you ask, if the criteria are yet to be drafted, and the Bill makes no mention of encryption, what’s the cause for concern here?

To those paying attention, this is transparently the latest in a slew of attacks on encryption that have ramped up dramatically in the past year. The primary indicator is who, exactly, is expected to be involved in this commission: the Attorney General and the law enforcement agencies described above.

As the EFF points out:

“You can’t have an Internet where messages are screened en masse, and also have end-to-end encryption any more than you can create backdoors that can only be used by the good guys. The two are mutually exclusive. Concepts like “client-side scanning” aren’t a clever route around this; such scanning is just another way to break end-to-end encryption. Either the message remains private to everyone but its recipients, or it’s available to others.”

A primer on encryption

To understand why attacks on encryption are so damaging, it’s important to have a base level of awareness about how encryption works. To dramatically oversimplify, encryption is the process of encoding information in such a way that only authorized parties can access it.

Early encryption followed a method known as symmetric key cryptography, meaning both the sender and receiver required the same knowledge to obscure and reveal a message. For example, a symmetric encryption algorithm might shift each character by one place in the alphabet. ABC would be encrypted by moving to the right and becoming BCD. The recipient could decrypt it by reversing the same algorithm: moving each character one spot to the left and arriving back at ABC. Think of it like a sender and receiver sharing a post box to which they both have an identical key. The sender opens the box with their key to deposit a message, and the recipient uses their identical key to retrieve the message from that same box in order to read it.
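
To make the shared-key idea concrete, here is a minimal Python sketch of the shift cipher described above (a toy for illustration only, not a secure algorithm): the very same secret, the shift amount, is all that is needed both to encrypt and to decrypt.

```python
# Toy symmetric cipher: sender and receiver share one secret (the shift).
# This is the Caesar-style shift described above, NOT a secure algorithm.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def shift_encrypt(plaintext: str, shift: int) -> str:
    """Move each letter `shift` places to the right, wrapping around."""
    return "".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] for c in plaintext)

def shift_decrypt(ciphertext: str, shift: int) -> str:
    """Undo the operation with the SAME key: move `shift` places to the left."""
    return "".join(ALPHABET[(ALPHABET.index(c) - shift) % 26] for c in ciphertext)

shared_key = 1                              # the secret both parties hold
print(shift_encrypt("ABC", shared_key))     # -> BCD
print(shift_decrypt("BCD", shared_key))     # -> ABC
```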


More recently, encryption has been employed asymmetrically, in what is known as public-key cryptography. Rather than having a single point of failure as in symmetric key cryptography (where both sender and receiver hold the same key), everyone has their own unique private key from which they can derive a public key. We needn’t go into the technical details here, so to simplify, this public key can be seen as an address that can be published openly. A sender can encrypt a message to the recipient’s public key, and only the recipient can ever read it by using their private key. To use the post box example again, it’s like everyone having their own box which only they can open, but they can share its address with correspondents. PGP, discussed earlier, was an early example of this being used at a consumer level, and it is still rather successful.
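
For a flavor of how this looks in code, here is a minimal sketch using RSA via the third-party Python cryptography package (an illustrative assumption; PGP wraps similar primitives in its own message format). The recipient publishes only the public half of the key pair, and only the matching private key can recover the message.

```python
# Minimal public-key sketch using RSA from the `cryptography` package
# (pip install cryptography). Illustrative only; PGP builds its own format
# around comparable primitives.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes only the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone can encrypt to the public key...
ciphertext = public_key.encrypt(b"meet me at the usual place", oaep)

# ...but only the holder of the private key can read the result.
plaintext = private_key.decrypt(ciphertext, oaep)
print(plaintext)  # b'meet me at the usual place'
```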


You use encryption every day without knowing it

At a pragmatic level, attacks on encryption threaten the most basic technologies you’ve come to rely on, across both the meatspace and internet economies. Their importance should be increasingly obvious if you consider the needs of all the individuals around the world working from home under quarantine.

End-to-end encrypted messaging and voice calls allow distributed teams to securely discuss sensitive topics. Encryption ensures that important documents can be verified as coming from who you think they’re coming from, and that those containing sensitive material, such as claims on your health plan, haven’t been exposed to malicious actors. That little padlock icon you see in the address bar of your browser is telling you that your connection to the server hosting this blog post is secure. It verifies that the information rendered on this web page has been sent from the server you think it was sent from, without a malicious actor injecting code or other content in the middle, or snooping on your credit card details.
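
For the curious, here is a small sketch of what that padlock represents, using Python’s standard-library ssl module (the hostname example.com is just a placeholder): the client refuses to exchange any application data until the server has proven its identity with a certificate and an encrypted channel has been negotiated.

```python
# A glimpse of what the browser padlock represents, using Python's
# standard-library ssl module. "example.com" is only a placeholder host.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()  # verifies the certificate chain and hostname

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # Reaching this point means the server proved its identity and
        # everything sent from here on is encrypted in transit.
        print("Negotiated:", tls.version())
        print("Certificate subject:", tls.getpeercert()["subject"])
```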

Beautyon has even pointed out that an attack on encryption would be game-theoretically stupid for America in today’s connected world:

“The men advising Trump can demand that encryption has back doors in America, but they cannot demand that anyone anywhere else follows her. This would mean that only US web sites and services are vulnerable; the entire US internet would be globally recognized as an unsafe zone for e-commerce. It would be a disaster for the tech sector of the US.”

The central, practical problem with attacking encryption is that it introduces a fundamental weakness into this infrastructure. What these actors want is something known as a backdoor: an intentional exploit, known to law enforcement, that can be selectively used in the course of an investigation. Leaving aside the unearned trust that law enforcement will not abuse such access, the most obvious problem is that other bad actors will find these exploits. Even without the intentional addition of weaknesses, the pace and magnitude of these exploits are already growing at a concerning rate: just look at how malicious actors have been able to gain access to Amazon’s Ring cameras, or how almost half of America had their social security numbers exposed by Equifax.

While the arguments above should all be considered of utmost importance, here’s the real kicker: even with a backdoor, law enforcement won’t be able to achieve their goals. These laws are futile for their purported aim. Encryption standards are open source and broadly distributed. If you need to use it, you’ll be able to. The end result is mass surveillance of average citizens in exchange for trivial improvements to the efficacy of law enforcement. The EARN IT Bill, enforceable at the platform level rather than targeting encryption programs themselves, only violates those who don’t think they have “anything to hide” and thus don’t bother to find alternatives. If the government forces a backdoor into iMessage, for example, those who wanted to would still be able to send encrypted messages with a little more effort. They would simply encrypt messages on their computer before copying them over to the communication application. The recipient would also have to make a little more effort to decrypt the message, but the result would be the same. Those snooping on iMessage would just see what appears to be a long string of random characters. If a group is sophisticated enough to run a global pedophile ring, they’re going to do this. But if you’re telling your friend that you’re going for a cheeky second run in violation of your quarantine quota, you might not bother to encrypt it, and could soon get a knock on the door.
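
To make that workaround concrete, here is a rough sketch using Fernet from the Python cryptography package (an illustrative assumption: it presumes the correspondents already share a key exchanged out of band, whereas real PGP tooling would encrypt to the recipient’s public key). Whatever messenger carries the result only ever sees an opaque string.

```python
# Sketch of the "encrypt locally, paste the ciphertext anywhere" workaround,
# using Fernet from the `cryptography` package. Assumes a key shared out of
# band; real PGP tooling would encrypt to the recipient's public key instead.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()   # exchanged once, outside the monitored channel
cipher = Fernet(shared_key)

# Sender: encrypt on their own machine, then paste the token into iMessage,
# email, or any other (possibly backdoored) messenger.
token = cipher.encrypt(b"going for a cheeky second run")
print(token.decode())  # looks like a long string of random characters

# Recipient: copy the token back out of the messenger and decrypt locally.
print(cipher.decrypt(token).decode())  # -> going for a cheeky second run
```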

Rise up

Security researchers often argue that the government shouldn’t be in the average person’s threat model. The argument is that you’re probably not important enough for them, but if you are, then they will always get you anyway, so what’s the point? They point to Julian Assange, a heroic activist against tyranny, detained and tortured in London. Aaron Swartz, who wanted people to have access to the research they had funded, driven to death by DoJ bullying at the behest of academic publisher JSTOR. This case is particularly jarring amidst the Coronavirus crisis, as JSTOR has been seeking brownie points by temporarily allowing open access to a limited subset of its database. Ross Ulbricht, set up with egregious false accusations that were dropped before trial to sway the jury’s perception. These are the sorts of online actors who know many of the best practices for covering their tracks, yet they are hunted relentlessly by the full scope and fury of the state.

Yet these same people also demonstrate the exact activities that should be a rallying cry for why encryption is so important. Maybe stories of child abuse can pull at your heartstrings (and obviously one shouldn’t diminish these stories: it’s the purest evil), but the same tools in the hands of a principled, independent journalist can be harnessed to expose the child abuse rampant among the world’s most powerful people themselves, as with Jeffrey Epstein’s network. Maybe the specter of terrorism is a strong argument for you as to why such powerful tools shouldn’t be in the hands of the layperson. But without encryption, Julian Assange’s WikiLeaks might never have been able to pull off the clandestine operations required to expose the Obama administration’s role in arming ISIS. If terrorism is such a threat, it might be beneficial to understand exactly who’s involved.

This argument, that it isn’t worth bothering to evade the government, is defeatist, but it is also incorrect. These attempted abuses of our liberties, and the government’s role in feeding the problems it professes to solve, demonstrate that the government should be the centerpiece of everyone’s threat model. The cynical nature of the EARN IT Bill shows that it is one of the greatest threats to human life and liberty.

The second- and third-order effects of a successful attack on encryption are poorly understood and dramatic. Honest citizens will lose their liberties while the worst people will continue to find workarounds. There’s a counter-intuitive case that the fight against child abuse could actually be made more effective by strengthening encryption, not undermining it, since strong encryption protects investigative journalists and encourages whistle-blowers with stronger assurances of anonymity.

It’s an intellectually lazy shortcut to think that attacking encryption would magically improve things. If anything, it will make things worse.

Encryption is antifragile. Encroachments by malicious actors will only strengthen the need for it, and the assurances that its developers seek to offer. It also forces knowledgeable users to broaden the scope of their threat model. Using free and open source software like PGP and Signal, the everyday person can build a web of trust including people who can verify the software they’re using and certify who they’re communicating with. Begin to do the hard work. Fight for your right to privacy.
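
As a closing illustration of the certification side of that web of trust, here is a minimal digital-signature sketch using Ed25519 from the Python cryptography package (illustrative only; PGP’s own signature format is more elaborate, and the release string below is hypothetical): anyone holding the public key can check that a message, or a piece of software, really came from the holder of the private key and has not been tampered with.

```python
# Minimal digital-signature sketch with Ed25519 from the `cryptography`
# package (pip install cryptography). Illustrative only; PGP's signature
# format is more elaborate but rests on the same idea.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()   # kept secret by the author
verify_key = signing_key.public_key()        # shared through the web of trust

message = b"release-1.0.tar.gz"              # hypothetical software release name
signature = signing_key.sign(message)

try:
    verify_key.verify(signature, message)          # passes: the message is authentic
    verify_key.verify(signature, message + b"x")   # raises: the content was altered
except InvalidSignature:
    print("Signature check failed: the content was tampered with.")
```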

Sincere thanks to Alex, Thib, hodlonaut, Max Hillebrand, Matt Odell and Hass McCook for their help with this piece.

If you want to learn about responsible Bitcoin custody for your fund, exchange, or other vehicles, have strict LP and risk management requirements, or otherwise appreciate a trust-minimized profile, we’d love to talk.

Please email us at custody@kn0x.io
