A short list of actions required to regulate encryption

The topic of encryption has come up again in the United Kingdom and Australia as governments try to legislate against it, once again scrabbling around for ways to make people safer and looking for scapegoats for various complicated problems that they haven’t been able to deal with effectively.

There have been many attempts to explain why this probably won’t work, but apparently none of them have managed to make their way into the hands of the people proposing these new policies. To many technical people familiar with information security, it seems that our politicians are either wilfully ignoring everything that has been said on this topic, or are simply surrounded by people without the technical background to understand what is going on.

Rather than pointing out the problems of legislating away mathematics, or mentioning that open source has already opened Pandora’s box, or the commercial and security issues of weakening encryption so unnamed government observers can attempt to literally read everything everyone writes everywhere, I thought I’d try a slightly different approach: what kind of things would have to happen to successfully regulate crypto in a useful manner?

First, the legislation

Let’s start with the easiest part first — writing new laws. Now, I say “easy” because this part only requires political will. It just needs a bunch of politicians to all temporarily point in the same direction, write some stuff down, and all vote on it.

The reality is that this is of course a staggeringly complex process that will take an extremely long time, if it’s even possible. But I still think it’s the easiest part.

What would such laws look like? When it comes to crypto, I guess there are several alternatives:

  1. Outlaw encryption entirely. This is unworkable; too much depends on it already.
  2. Strategically weakened encryption. This means that we the people could only use certain types of cryptography containing certain flaws known only to the government, allowing it to easily break them. This is also unworkable: a flaw rarely stays known only to the party that planted it.
  3. Back-doored encryption. Strong encryption that is hard for “the bad guys” to break, but with a special method that allows the government (presumably “the good guys”, always and for all time) to still read it. A rough sketch of how such a scheme is usually imagined follows this list.
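
To make option 3 a little more concrete, here is a minimal sketch of how a back door is often imagined: a key escrow scheme, where the key protecting every message is also encrypted to a government-held key. It uses Python’s open-source cryptography library; the scheme and all of the names in it are illustrative assumptions on my part, not any actual proposal.

    # A minimal sketch of "key escrow": the symmetric key protecting a message
    # is wrapped for the recipient AND for a (hypothetical) escrow authority,
    # so either one can decrypt. Purely illustrative, not a real proposal.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Key pairs for the recipient and for the escrow authority.
    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Encrypt the message with a fresh symmetric key.
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(b"meet at noon")

    # Wrap the data key twice: once for the recipient, once for the escrow key.
    wrapped_for_recipient = recipient_key.public_key().encrypt(data_key, OAEP)
    wrapped_for_escrow = escrow_key.public_key().encrypt(data_key, OAEP)

    # Whoever holds the escrow private key can recover any message ever sent,
    # which is exactly why "only the good guys" is such a fragile promise.
    recovered_key = escrow_key.decrypt(wrapped_for_escrow, OAEP)
    assert Fernet(recovered_key).decrypt(ciphertext) == b"meet at noon"

The entire security of everyone’s traffic then rests on that one escrow private key never leaking, which is the heart of the objection to this option.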

Crypto isn’t the only thing on the table though. There is also data and metadata storage and retention, supplied on demand at the request of government agencies. This legislation will be equally complex: how long do places need to store their data for? How do they need to store it? What is their liability in the (increasingly likely) situation that their data is stolen wholesale, because it was being stored per this legislation? What are their obligations to comply with government requests? What is a satisfactory turnaround for a request: an hour? A week? And so on. A rough sketch of what a single retained record might even look like follows below.
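
To give a sense of what “data and metadata retention” means in practice, here is a rough guess at the kind of record a provider might be expected to keep for every connection. Every field and the retention period are assumptions I have made up for illustration; they are not drawn from any actual bill.

    # A purely illustrative guess at a per-connection retention record.
    # Every field and the retention period are assumptions, not legislation.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    RETENTION_PERIOD = timedelta(days=730)  # "two years", say: an assumption

    @dataclass
    class ConnectionRecord:
        subscriber_id: str    # who held the account or was assigned the address
        source_ip: str        # address the connection came from
        destination: str      # host or service that was contacted
        started_at: datetime
        ended_at: datetime
        bytes_sent: int
        bytes_received: int

        def expired(self, now: datetime) -> bool:
            """True once the record has passed the mandated retention window."""
            return now - self.ended_at > RETENTION_PERIOD

Multiply something like that by every connection from every customer, for years, and the storage, security, and retrieval questions above stop being hypothetical.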

But this is just writing a big set of rules. That’s the easy part. Actually complying with them though… that’s another story.

Next: getting the big software companies to do something

So now the laws are on the books, let’s say, in the UK. What does this actually mean?

Well, first of all, remember that many of the companies that make the software that uses encryption aren’t actually based in the UK. The software is written (or if not written, housed from an intellectual property standpoint) in the US. So whatever laws are written need to take that into account.

Let’s pick Facebook as an example, mostly because WhatsApp’s strong end-to-end encryption is a great target for governments. What laws in the UK can compel Facebook to change their software to suit, bearing in mind that Facebook do not want to make this change, as they know it weakens their software and is worse for their customers?

Obviously this is not impossible. Companies are required to comply with local legislation all the time, even in matters of software. But from a technical perspective alone, it is quite challenging: maintaining multiple codebases is more complicated than maintaining one, especially if the changes are fairly low-level (as I imagine they would be for the crypto in WhatsApp).

Getting everyone else affected to comply

So Facebook have to write some new code. Big deal? But it’s not just Facebook. Google would have to do it too. Microsoft. Amazon. All the big companies with heaps of money and smart programmers, so that’s a piece of cake. Oh yeah, and all the small companies too, including cafes, hospitals, and libraries. These places are somewhat less well-equipped to comply with technical requirements.

The burden on them is disproportionately great as a result. It will almost certainly require them to buy and use off-the-shelf software that can comply with these requirements. How they can be sure they’re complying with the specifics of the legislation, though, I have no idea. If you’re a cafe or a hospital or a library providing free wifi to your visitors and you need to supply network access information about a particular user that visited a week ago, that would be very hard. A month ago? A year ago? Without collecting and storing vast amounts of data on every user (which would be expensive), it would be practically impossible. A rough back-of-envelope calculation follows below.
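
To put even a vague number on it, here is a back-of-envelope estimate for a small cafe’s free wifi. Every figure in it is an assumption I have picked purely for illustration.

    # Back-of-envelope storage estimate for retaining per-connection logs.
    # Every number here is an assumption chosen purely for illustration.
    visitors_per_day = 200
    connections_per_visitor = 500    # one record per flow or DNS lookup, say
    bytes_per_record = 300           # timestamps, addresses, identifiers
    retention_days = 365

    records = visitors_per_day * connections_per_visitor * retention_days
    storage_gb = records * bytes_per_record / 1e9
    print(f"{records:,} records, roughly {storage_gb:.0f} GB per year")
    # -> 36,500,000 records, roughly 11 GB per year

The raw bytes are the least of it: a consumer access point doesn’t capture, store, secure, or index any of this today, so every one of these places would need new equipment and software just to be able to answer a single request.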

Enforcement: making sure everyone has actually complied

The government also now needs to be responsible for making sure that every company affected has actually complied with whatever their legislation said.

It simply would not do to pass these laws and then find out that a bunch of companies claimed to be compliant (or, more likely, did nothing, because there probably was no process to even allow them to pretend they’d been compliant), only to discover months down the track, once there’d been criminal activity, that they had no way to provide the information the government requested. “What do you mean you can’t decrypt this message?” is the start of a conversation that would probably not end well in the event the authorities were chasing down the tail end of a criminal investigation.

I cannot think of a systematic way for the government to do this kind of verification with any sort of useful efficiency. There is too much software from too many sources, and too many places where it is used. Random sampling with very strong penalties for non-compliance is the only way that seems even remotely feasible. Even that would be expensive and time-consuming for most of the parties involved, and I am not confident it could get wide enough coverage.

Remember that every company that fails to comply (and “company” in this sense means every solo app developer, every abandoned open source software project, etc) is just another hole in this line of defence that can be exploited by “the terrorists”… or whoever.

Dealing with open source crypto

Many companies depend on open source software for their cryptographic needs. Using popular, well-maintained open source crypto libraries is generally considered to be the most secure way to deal with issues of encryption in your software.

I don’t even know where to begin dealing with this issue. It’s certainly possible to create new versions of these libraries that comply with legislation; companies that are required to comply can do this and then use the new (weaker) crypto. But the original (strong) crypto is still out there in the world. Pandora’s box has been opened.

Legislation could be attempted to restrict or prohibit the use of, or contribution to, open source software. It could block access to popular sites hosting common libraries, I guess? But this would be as useless and unworkable as basically any sort of high-level Internet filtering, which doesn’t mean it won’t be suggested. And of course it would only “work” (for definitions of work that basically mean “not work”) in the country of the legislation anyway.

Any half-assed criminal with even a basic understanding of software can now build their own secure communications channel using this open source software; a rough sketch of just how little code that takes follows below. It can never be removed from the Internet or from public circulation.
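
As an illustration, here is a complete, authenticated, end-to-end encrypted exchange using PyNaCl, an open-source Python binding to libsodium. The choice of library is just an example; plenty of others would do the same job in a similar handful of lines.

    # A minimal end-to-end encrypted exchange using PyNaCl (open-source
    # libsodium bindings). The point is how little code strong, freely
    # available crypto actually requires.
    from nacl.public import PrivateKey, Box

    # Each party generates a key pair and shares only the public half.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts and authenticates a message for Bob.
    alice_box = Box(alice_key, bob_key.public_key)
    ciphertext = alice_box.encrypt(b"nobody else can read this")

    # Bob decrypts it with his private key and Alice's public key.
    bob_box = Box(bob_key, alice_key.public_key)
    assert bob_box.decrypt(ciphertext) == b"nobody else can read this"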

It’s a bit like Digital Rights Management: you know, that stupid shit that stops you from copying a game or movie that you own to make a backup or watch on a different device. You already know that if you wanted to pirate it, you could. DRM basically punishes you for doing the right thing and buying whatever media it was, by limiting your rights.

Similarly, these crypto laws just mean that you, as a legitimate citizen of your country, will be left with less secure communications: your online banking, your personal conversations with your partner, sending photos of your children to your family. All while criminals still have totally unfettered access to unrestricted, highly secure communications built on excellent open source software.

Well, now what?

The main takeaways:

  1. Assuming new laws can even get written (which, despite the current attempts of politicians and the media to cultivate a culture of fear, is far from assured), they will take a long time and a lot of effort and money to implement.
  2. Even if they are passed and companies agree to implement them, it will be extremely challenging to ensure they do so widely and compliantly enough to have any practical effect.
  3. Even if compliance is high, it doesn’t solve the problem of widely available crypto in open source software.

… do not paint an encouraging picture. The sheer staggering amount of work that has to go into getting this even remotely off the ground is enough to make me cringe, without even considering the potential for privacy violations, spectacular financial failures, and whatever else could happen without strong encryption to protect us in the digital realm.

Few other options have been presented as alternatives. But the lack of a good option doesn’t mean that one idea should win by default.

It is almost impossible not to see this as policy being created out of sheer desperation. To quote David Allen Green’s excellent commentary on this topic: “Law-making creates the illusion of something being done.”