After all this time, we still can’t crypto

Earlier today, someone posted a blog post about their experience auditing NQ Vault, an Android app that claimed to be “a safe place to store your private data.” The result was not pretty.

TLDR: basically it turns out the encryption the app was using was, how shall we say, less than effective (a 1-byte ‘key’ XOR’d over the first 128 bytes of the file).
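To see just how weak that is, here’s a minimal Python sketch of the scheme as described in the audit (the function name and sample data are mine, not NQ’s actual code):

```python
def vault_encrypt(data: bytes, key: int) -> bytes:
    # XOR a single-byte 'key' over the first 128 bytes; leave the rest alone
    head = bytes(b ^ key for b in data[:128])
    return head + data[128:]

plaintext = b"%PDF-1.4 ...the 'protected' file contents..."
ciphertext = vault_encrypt(plaintext, 0x5A)

# XOR is its own inverse, and file headers are predictable,
# so one known byte of plaintext hands over the key:
recovered_key = ciphertext[0] ^ plaintext[0]
assert recovered_key == 0x5A

# Even with no known plaintext at all, there are only 256 keys to try:
assert any(vault_encrypt(ciphertext, k) == plaintext for k in range(256))
```

One XOR against a predictable file header recovers the key, and a brute force of all 256 possible keys fits in a loop. That’s the level of “protection” we’re talking about.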

As developers in this freshly security-conscious world, what lessons can we learn from this?

1. Encryption is still too hard for developers.

One of the first thoughts that came to mind was this: with all of the effort involved in constructing this custom ‘encryption’ scheme (especially the key generation), wouldn’t it have only been a few more lines of code for the NQ Vault developers to use a real framework?

import org.keyczar.Crypter;

Crypter crypter = new Crypter("/path/to/your/keys");
String ciphertext = crypter.encrypt("Secret message");

Fortunately, there are OSS projects and companies, e.g. Virgil Security, attempting to make crypto easier to use correctly. The equivalent code in Keyczar, for example (shown above), makes this kind of work dead simple. Yet developers either aren’t aware these libraries exist, or there isn’t support for their language or platform of choice.

Standard libraries for programming languages need to make easy, basic encryption a first-class citizen. Yeah, I get it: it’s not going to be easy, or maybe even possible, to communicate when to use symmetric vs. asymmetric crypto, or how many bits developers need in their keys, or how to build out a PKI. Fine.

But for the basic things, like storing hashed passwords or encrypting small strings to be read only by the app: wouldn’t it be awesome if someone could learn one basic API, something like POSIX for crypto, and be able to move among platforms easily?
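For the password-storage case, at least, some standard libraries are getting there. Here’s a minimal sketch in Python using only the stdlib (the function names are mine, and the scrypt work-factor parameters are illustrative, not a tuning recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str):
    # One call with sane-ish defaults: a random salt plus a memory-hard KDF
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # Constant-time comparison avoids timing side channels
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

That’s roughly the level of API I’m arguing for: learn it once, and the library handles salts, work factors, and comparison pitfalls for you.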

2. Security Transparency is not a real thing.

Perhaps the most hilarious/tragic part of the blog post was its reposting of this statement, which appeared in the first few sentences of the NQ Vault app’s Play Store description:

TRUSTe — Received “TRUSTe Privacy Seal”

NQ Vault used this ‘credential’ as a sales tool, to lull consumers into trusting that the company was doing the right thing security-wise. In reality, the measures NQ Vault took would not protect your privacy against a smart high-school kid who had read one basic chapter on crypto, never mind the NSA.

As sad as it sounds, the best vehicle we have right now for security transparency is compliance, and even then it’s often just an auditor verifying that you have written down the controls you claim to have in place, for risks that may not even apply or be top priority. Furthermore, even if a vendor you use is PCI-certified or HIPAA-compliant or FIPS-whatever, you often have no visibility into what that really means.

For consumer applications, there is no security transparency. Until companies are brave enough (or shamed, or compelled by the market) to reveal the details of their real security posture, trust is simply a matter of reputation, or convenience. It’s just a calculated risk you take based on a brand name.

3. To get real security, we will have to pay.

As @codinghorror wrote, to get real security auditing, you need to put real dollars into the effort.

You could trust the vendor to do it themselves, but we’ve seen how well that works. Even if they aren’t outright lying, like most organizations they have their own agenda and priorities, and that agenda almost never includes admitting to you, the consumer: ‘Ehh… we did an okay job guys, but we could definitely be more secure.’

That said: if there were a place I could go to ‘crowd source’ code review and pentesting for the most sensitive apps and OSS projects I use, I would totally throw some dollars at that effort. Companies like Bugcrowd, HackerOne, and Synack have a huge opportunity to deliver on this, given that they already have the infrastructure and the community of crowdsourced security researchers. Currently, however, they are designed to deliver on bounties funded by a single corporation.

I don’t want to wait for Google to decide it’s important to audit OpenSSL and put a lump of cash toward that effort. Let the community raise the money and pay.