The Senate crypto bill is comically bad

A visual guide

Isaac Potoczny-Jones
3 min read · Apr 8, 2016

Some Senate members are promoting a bill that is apparently in response to the Apple vs. FBI case (see my visual explanation of that case here). But that bill is so broad that no one could truly predict its implications.

If you’re curious about the draft text of the Senate crypto bill, please read the text for yourself or a summary on Wired. If you have ever used a security product, you’ll probably quickly realize that it would make most (if not all) of today’s encryption illegal.

For example, a product like a hard drive with built-in encryption is covered, since Seagate provides a process for storing data. Upon a court order, Seagate must provide the data on that drive by making it intelligible: either it was never encrypted, or, if it is encrypted, Seagate must decrypt it.
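The bill's distinction between intelligible and unintelligible data can be sketched in a few lines. This is a toy repeating-key XOR cipher, purely illustrative — real drives use AES, and nothing here resembles Seagate's actual firmware — but it shows the step the bill would compel a provider to reverse on demand:

```python
# Toy illustration only: repeating-key XOR is NOT real disk encryption.
# It just demonstrates "intelligible" vs. "unintelligible" data.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the key; applying it twice restores the input."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"tax records"                 # intelligible data
key = b"secret"                            # held by the provider, not the court
ciphertext = xor_crypt(plaintext, key)     # unintelligible without the key

assert ciphertext != plaintext
# The bill would require the provider to perform this step upon a court order:
assert xor_crypt(ciphertext, key) == plaintext
```

The point of the toy: decryption is trivial *for the key holder*. The bill's demand only makes sense if the provider designs the product so that it, rather than only the user, holds the key — which is exactly the back door.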

The following graphic illustrates the paths by which nearly all secure storage or communication would be required to have a back door.

A back door would be required for most security products.

The upshot is that any feature, service, or product created by, controlled by, or provided by a covered entity (a device manufacturer, software manufacturer, remote computing service, communication service, or product or method for storing data) that receives a court order must be made intelligible (never encrypted, or, if encrypted, decrypted) and provided to the government.

The law does not seem to prevent users from using crypto directly after a product is purchased, but of course most users won’t. The issue isn’t strong crypto, it’s easy crypto.

Does it cover source code in addition to crypto?

Another amusing aspect of the bill is that it doesn’t just cover encryption. It also includes any data that’s been “encoded, modulated, or obfuscated”.

The process of turning human-readable source code into something that computers can understand often requires encoding it into a binary format. Furthermore, the definition of data includes “information stored on a device designed by a software manufacturer”, which would certainly seem to include the programs stored on that device. Does this require developers to provide source code?
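That "encoding" step isn't hypothetical. Even Python, an interpreted language, compiles human-readable source into binary bytecode before running it — a minimal sketch using only the built-in `compile` function:

```python
# Even "interpreted" languages encode source into a binary format.
source = "print('hello')"                        # human-readable source (data)
code_obj = compile(source, "<example>", "exec")  # encode it into bytecode

# The result is a raw byte string, unintelligible without tooling:
assert isinstance(code_obj.co_code, bytes)
assert code_obj.co_code != source.encode()       # not the source anymore
```

If bytecode on a device counts as data that has been "encoded," then under the bill's plain text the party that did the encoding could be ordered to make it intelligible — which is hard to distinguish from handing over source.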

During the FBI vs. Apple situation, the FBI had a specifically scoped warrant for a specific phone. Their request was for Apple to modify their OS’s source code to remove certain security features. The FBI could remove those features themselves, but they would more or less need Apple’s source code. (They would also need the signing key, but let’s leave aside the question of the signing key for now.)

My reading is that this law would give the FBI a new power to request the OS source code under the scope of a warrant to search a specific phone. They would not need a search warrant issued against Apple.

An app (product) designed by Microsoft (software manufacturer) compiles (encodes) its source code (data) and stores it on a user’s device.

Binary code stored on a device would appear to be in scope.

What about obfuscation? That can mean, for example, deliberately making your application’s code confusing so that hackers cannot reverse engineer it to find vulnerabilities. Does this require software companies to turn over the source code of obfuscated programs?
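A toy version of obfuscation makes the problem concrete. This sketch just compresses and Base64-encodes the source (real obfuscators rename symbols and restructure control flow instead), but the effect is the same: the program still runs, while the stored form is no longer readable:

```python
import base64
import zlib

# Toy obfuscation: compress + Base64-encode the source. Real obfuscators
# rename identifiers and mangle control flow, but the legal question is
# identical: the stored form has been "obfuscated."
source = "def add(a, b):\n    return a + b\n"
obfuscated = base64.b64encode(zlib.compress(source.encode()))

# The obfuscated program is still fully usable -- after decoding:
namespace = {}
exec(zlib.decompress(base64.b64decode(obfuscated)).decode(), namespace)
assert namespace["add"](2, 3) == 5
```

Under the bill, that decoding step is precisely what the government could compel — and for a seriously obfuscated commercial product, "making it intelligible" plausibly means producing the original source.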

Heck, if the government does not want to write a Base64 decoder for your communications protocol, can they require you to write one for them?
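For scale, here is that entire "decoder" — the standard library does it in one call, which is what makes compelling a company to provide it so absurd:

```python
import base64

# A protocol that merely Base64-encodes its messages is "encoded" data
# under the bill's definition, yet decoding it is a one-liner.
message = base64.b64encode(b"meet at noon")
assert message == b"bWVldCBhdCBub29u"

# The "decoder" the government supposedly needs help writing:
assert base64.b64decode(message) == b"meet at noon"
```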

Conclusion

The shockingly bad implications of this bill are most likely unintentional. It’s likely that these Senate members simply don’t realize that the technologies that make data unintelligible to anyone, including the government, are 1) the cornerstone of cybersecurity as we know it, and 2) already widely available, and have been for many years.


Isaac Potoczny-Jones

Isaac Potoczny-Jones is an authentication and privacy specialist. He is the CEO of Tozny, a security startup that offers two-factor authentication services.