The Fine Line Between Government and Data Privacy

How lawmakers can collaborate with technology companies on encryption and national security

Mayer Mizrachi
Criptext
Sep 7, 2018


Over the last few years, the dispute between government oversight and data privacy has intensified. The most prominent event to trigger this conversation was the investigation of the 2015 San Bernardino shooting, in which the FBI requested that Apple create a backdoor to unlock the shooter’s iPhone. The saga put Apple on the hot seat for what is an industrywide responsibility, and its decision would set a precedent for how companies should act in the future. On one hand was the moral obligation to help bring justice to an ungodly act of hatred that left 14 people dead and 22 injured. On the other was the company’s responsibility to protect the privacy and security of its users. There were critics on both sides with valid arguments to back their stances, and Apple was in a tough position. What the FBI asked of Apple was not to hand over data that Apple already had, but to build a tool to forcibly access the data it wanted by weakening the very security Apple had worked so hard to create.

After much deliberation, and in spite of media and political scrutiny, Apple decided not to concede to the FBI’s demand, and by doing so drew a clear line as to how far tech companies should go in assisting law enforcement. Apple CEO Tim Cook argued that “the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.” Ultimately, what this case did was trigger the conversation about government overreach in accessing private user data. As we speak, Australia is looking to take government surveillance and overreach to a whole new level.

The Five Eyes

On Tuesday, the New York Times published an opinion piece by Lizzie O’Shea that weighs in on a new bill in Australia that would help intelligence agencies circumvent encryption. Under the proposed law, technology companies would be legally compelled to cooperate with law enforcement by delivering not just information, but also access to that information. This means that if the information were encrypted, the company would be forced to circumvent its own security, decrypt the data and deliver it to law enforcement. Lawmakers and law enforcement agencies argue that tech companies should all create a ‘backdoor that only law enforcement could use,’ but technology companies, industry insiders and cybersecurity reporters agree that any such ‘backdoor’ would be a vulnerability in the entire system, one that would enable hackers and bad actors to access private user data.

At first thought, one could dismiss Australia’s actions as those of some faraway country falling into disarray and ignoring the fundamental facts of information security. However, Australia’s actions could spread to the US, Britain, New Zealand and Canada, as all five countries belong to the ‘Five Eyes’ alliance, which collaborates across member nations on intelligence matters. In 2017 the Five Eyes convened and concluded that the single biggest impediment to their intelligence agendas was encryption and its broadening use across communication platforms such as messaging and email. Since then, every member country has made it a goal to create legislation that grants intelligence agencies ‘backdoor’ access to encrypted data and technology platforms. Australia was the first to materialize something with its recent bill proposal, and you can expect the US, Britain, New Zealand and Canada to follow suit. In fact, one can expect things to move more quickly in the US, considering that Sheryl Sandberg of Facebook and Jack Dorsey of Twitter both testified before the Senate Intelligence Committee this week. Of course, they were grilled over the way their platforms have aided misinformation and election tampering by foreign powers and bad actors, but make no mistake: this only adds fuel to the fire.

The Dragnet Solution

The main argument used by lawmakers to justify breaching user privacy is that encrypted communication platforms can be, and are being, used by bad actors. Furthermore, they claim that social media and messaging platforms are being used by extremists and terrorists to fulfill their evil agendas against society; as a consequence, intelligence agencies cannot protect national security if they cannot snoop on these conversations. The problem with this argument is that it is the same one used by the US government in 2013 to defend the NSA’s mass surveillance practices once they were disclosed by whistleblower Edward Snowden. In addition, lawmakers’ proposed solution is a dragnet method fundamentally similar to the NSA’s, which drew heavy public criticism for collecting massive amounts of private data from the country’s own law-abiding citizens. Phone records, text messages, phone calls, emails, browsing histories and social media interactions from private citizens who weren’t even persons of interest to intelligence agencies were being collected indiscriminately.

At this point it’s very important to point out that the dragnet method has proven ineffective. In fact, since Snowden’s leaks in 2013, there has been no evidence to show that the NSA’s mass surveillance programs thwarted any terrorist attacks in the US. And what price has been paid? Loss of privacy, government overreach, misuse of government funds, and the regression of social values.

Fundamentally, dragnet solutions are the equivalent of using a nuke to destroy a hornet’s nest: yes, you’re solving the problem, but you’re destroying an entire ecosystem in doing so. Terrorists in the encrypted communication space are like lionfish in a coral reef. If we use a dragnet to remove the invasive species from the reef, we’ll also be destroying the very ecosystem we’re trying to protect. In this sense, creating laws that force platforms to build backdoors for the sake of ‘national security’ would end up putting at risk the very society the government is looking to protect. A society without privacy rights is as bad as a society without free speech. In fact, Snowden put it best when he said:

“Arguing that you don’t care about privacy because you have nothing to hide is like arguing that you don’t care about free speech because you have nothing to say.”

A regulated future

Now, we can preach all we want about the importance of privacy rights and how wrong it is to use dragnet solutions such as the ones proposed by the Australians, but in a world where politics reigns over reason, we must expect the worst and prepare for it. So, for practical purposes, let us explore how this ‘Assistance and Access Bill’ would work and where its pitfalls lie. For this experiment we’ll use communication platforms built on the Signal Protocol, which is widely regarded as the most privacy-oriented encryption protocol in the cybersecurity space.

Problem 1: The law would obligate a company like WhatsApp (encrypted messaging) or Criptext (encrypted email) to deliver encryption keys to intelligence agencies so they can decrypt messages. The problem is that both Criptext and WhatsApp use the Signal Protocol, in which encryption keys are not generated by the service provider’s server but are created on the user’s physical device, be that a computer, smartphone or tablet. The communications company therefore never has access to the keys, which means it cannot deliver them to the requesting intelligence agency.
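To make that concrete, here is a minimal sketch in Python of what device-side key generation looks like under this design. It is illustrative only, using the open-source cryptography library rather than any provider’s real code:

```python
# Illustrative sketch: an X25519 identity key generated on the user's device,
# in the spirit of the Signal Protocol. This is not Criptext or WhatsApp code.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives import serialization

# This runs on the phone, tablet or computer, never on the provider's server.
identity_key = X25519PrivateKey.generate()

# Only the public half ever leaves the device, so other users can start an
# encrypted conversation with its owner.
public_half = identity_key.public_key().public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)

# The private half stays in the device's local storage. The provider never
# holds it, so there is nothing to hand over in response to a key demand.
```

The point of the design is exactly what the paragraph above describes: a provider cannot be compelled to disclose a key it never possessed.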

Problem 2: Continuing, let us assume that by some miracle the intelligence agency is able to intercept a user’s encryption key. This is where the Signal Protocol’s forward secrecy comes in. Each message is encrypted with a unique key, and the key material is replaced every time a message is sent. This forward-ratcheting system prevents any single key from decrypting a user’s entire history of messages or emails.
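As a toy illustration of that ratcheting idea (a simplification of the Signal Protocol’s chain-key derivation, not the full double ratchet, with names of my own choosing):

```python
import hmac
import hashlib

def advance_chain(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key plus the next chain key from the current chain key."""
    message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    next_chain_key = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    return message_key, next_chain_key

# In practice the starting chain key comes from the initial key agreement;
# a fixed value is used here purely for illustration.
chain_key = bytes(32)
for _ in range(3):
    message_key, chain_key = advance_chain(chain_key)
    # message_key encrypts exactly one message and is then discarded. Because
    # the chain only moves forward, an intercepted key cannot be rolled back
    # to recover the keys that protected earlier messages.
```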

Problem 3: The overarching assumption in these regulatory efforts is that the service provider hosts the user’s data. This is a huge pitfall, as companies that use the Signal Protocol do not collect users’ messages on their servers. They simply can’t, even if they wanted to. All the data in WhatsApp and Criptext is stored exclusively on the user’s device, just like the encryption keys. This means that even if the encryption keys were somehow compromised, the service provider would still have no data to hand to authorities. This is an important point to cover: Brazil drew public scrutiny in 2016 when, succumbing to its government’s ignorance, it jailed a Facebook executive for not delivering WhatsApp user data, ignoring the fact that WhatsApp didn’t collect its users’ data in the first place.
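As a rough sketch of that storage model (an assumed shape for illustration, not any provider’s actual schema), the gap between what a server relays and what a device keeps looks something like this:

```python
from dataclasses import dataclass

@dataclass
class RelayEnvelope:
    """The most the provider's server ever handles: routing info plus opaque ciphertext."""
    recipient_id: str
    ciphertext: bytes  # unreadable without keys that exist only on the recipient's device

@dataclass
class LocalMessage:
    """The readable message, which exists only in the device's local database."""
    sender: str
    body: str

# A data request served on the provider can, at most, surface RelayEnvelope-style
# records (and often not even those once messages are delivered and deleted);
# the decrypted LocalMessage never exists on the provider's side.
```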

Solving the problem, together

There is absolutely no doubt that bad actors can and do use encrypted communication services to carry out acts of terror, and we therefore have to find a solution to this very real problem. The best way to go about solving it is to involve the tech companies that develop these platforms in the conversation. The examples above show that regulators don’t understand encrypted communication platforms well enough to draft bills that regulate them. As it stands today, the “Assistance and Access Bill” proposed by the Australian parliament would bring more harm than good.

Going back to the lionfish example, we’ve seen local governments incentivize the hunting of individual lionfish. In Jamaica, restaurants have started serving lionfish as a delicacy, and tourists love it. This has given an entire society an incentive to hunt the fish and profit from solving the problem. This precision-guided approach has allowed the coral reef to regrow and prosper without putting the rest of the ecosystem and its inhabitants at risk. What if lawmakers took a similarly surgical approach to the broader encrypted communication issue? What if the government proposed incentives that were pushed all the way down to the end user, so that bad actors could be pointed out by the very society the government seeks to safeguard?

We live in a world where innovation outpaces regulation, and therefore, when legislating at the nexus of technology and national security, it is imperative that lawmakers and service providers collaborate in seeking a viable solution to a shared problem. As the CEO of an encrypted communication platform, I can say that no one is in this business to empower terrorists and extremists. On the contrary, the technology being built by companies like Criptext, ProtonMail and Tutanota brings the world closer together for good. We, as a society, must be very wary of solving problems by compromising privacy. Those solutions are the kind that take us three steps forward and five steps back.
