Pandora’s Cloud Key Vault

Pwn All The Things
Aug 16, 2016 · 7 min read


A couple of days ago I tweeted about an observation from Apple’s presentation at BlackHat 2016 on their “Cloud Key Vault” — essentially a data-center sized Secure Enclave built out of HSMs. It caused a bit of a storm, and it needs some clarification.

The controversial tweet in question was this one:

It’s an admittedly bold claim, and it takes considerable explaining to see where I’m coming from. Twitter is a terrible place for bold claims or for explaining things, so it was perhaps unsurprising that the point was widely misinterpreted when compressed into 140 characters.

Johns Hopkins Professor Matthew Green is one such person who disagreed. His blog post about it is here. He does a good job of thoroughly torching an argument I didn’t make, and then compounds the problem by arguing vociferously about how secure CKV is. Director Comey should send him a thank-you cake for his help.

The problem, as I see it, for the privacy community isn’t that CKV isn’t secure. It’s the opposite. It’s that it is so obviously secure.

This sounds counterintuitive. So let me unpack it.

The problem is this. The argument that high-value keys are inherently dangerous and can’t be kept secure is the most important technical argument against law-enforcement-mandated access to encrypted devices. All of the other problems (building the software securely, coping with international law enforcement, making the system auditable against misuse, and so on) are losing arguments for the privacy community, because they’re all small, political, or solvable engineering problems.

Once you concede that keeping high-value decryption keys secure against theft is possible, you’re left fielding a combination of technically weak arguments and political arguments against law-enforcement exceptional access. The debate then quickly moves from the technical sphere, where the privacy community has the upper hand, to the political one, where they don’t.

Everything in the “Going Dark” debate hinges on the argument that “keys can’t be held securely”, on hoping that the media and public don’t investigate that claim too closely or notice that the other problems are all tractable, and on hoping that the government isn’t technical enough to notice and explain why.

You might well now be thinking “Apple Cloud Key Vault doesn’t concede that keys can be held securely”.

Wrong.

Apple might not be conceding that argument technically, but they’ve conceded it politically. They’re trusting CKV to store a key that, logically, they must consider astronomically dangerous; they’re doing so for ordinary business purposes unrelated to security, in exchange for a tiny user-interface improvement; and they built it entirely of their own volition.

During the Apple-FBI dispute, the FBI asked Apple to write a custom firmware for an iPhone 5C so that the FBI could unlock Syed Farook’s phone and brute-force the PIN code on his phone without being locked out after 10 tries. In response, Apple did a big PR campaign against this request — culminating in Tim Cook hyperbolically calling the hypothetical firmware “the software equivalent of cancer”.

Whether that characterization is sound or not doesn’t matter. What matters is that Apple made it clear that a firmware image that can allow law-enforcement to brute-force PINs is exceedingly bad, and that creating it would cause huge damage to Apple if they could not keep it secure. CKV’s private key must logically be equivalently bad. If the CKV private key is lost, law-enforcement can brute-force uploaded iCloud Keychain files to recover iCSCs (which are nearly always device PINs). The only mitigating factor to the huge security dangers of the CKV key is that it is kept securely inside the CKV itself.
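
To see why, it helps to look at what an offline brute-force of a PIN-derived key involves. The sketch below is purely illustrative: it assumes a hypothetical PBKDF2-SHA256 derivation with made-up parameters, not Apple’s actual iCloud Keychain format.

```python
# Illustrative only: a 4-digit PIN yields just 10,000 candidate keys.
# The KDF and its parameters here are assumptions, not Apple's design.
import hashlib
import os
from itertools import product

salt = os.urandom(16)
true_pin = "4821"  # hypothetical user PIN / iCSC
target_key = hashlib.pbkdf2_hmac("sha256", true_pin.encode(), salt, 100_000)

# Offline, with no CKV enforcing a 10-guess limit, enumerating the whole
# PIN space is minutes of single-core work even with a costly KDF.
for digits in product("0123456789", repeat=4):
    guess = "".join(digits)
    if hashlib.pbkdf2_hmac("sha256", guess.encode(), salt, 100_000) == target_key:
        print("recovered PIN:", guess)
        break
```

The only thing standing between an attacker and that loop is the vault keeping its private key, and its guess counter, out of reach.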

It’s also important that Apple introduced the CKV private key entirely of their own volition. They weren’t compelled to by the government, and they could have chosen another design that avoided needing the CKV entirely. For example, instead of uploading your keychain file encrypted with a key derived from your short, brute-forcible device PIN, they could have generated a long, non-brute-forcible random sequence of characters and made you type it in from your iPhone to your Mac. Or they could have let you share the key between your iPhone and Mac by displaying a QR code on the iPhone’s screen and holding it up to the Mac’s webcam. Either design would eliminate the need for the CKV and its private key to exist at all; a sketch of the first follows below. Apple chose to do neither, because they want your user experience to be slightly less awful when sharing saved passwords between your iPhone and your Mac.
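
Here is roughly what that high-entropy alternative could look like. This is a sketch of the general idea, not Apple’s design:

```python
# Sketch of the "long random secret" alternative (not Apple's design):
# roughly 128 bits of entropy instead of a guessable 4-digit PIN.
import secrets

escrow_secret = secrets.token_urlsafe(16)  # 16 random bytes ≈ 128 bits
# With ~2^128 possibilities there is no feasible offline brute-force,
# so no vault is needed to rate-limit guesses.
print("Enter this code on your other device:", escrow_secret)
```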

That’s a serious commitment to user interface design.

But it’s a problem for the privacy community. This is a case of a technology company building — for their own business purposes — a key they think is extremely dangerous, stored inside a secure vault so that it can’t be hacked or stolen by insiders. And worst of all, doing so not in support of some security purpose, as automatic update keys or SSL keys might be, but in support of making user interfaces slightly easier to use.

When Apple and the privacy community later try to argue that CKV or an equivalent won’t be able to hold dangerous keys for law enforcement, their arguments are going to sound much more hollow.

It also matters that the CKV key is a decryption key, not a signing key, which neutralizes lots of minor technical arguments against law-enforcement exceptional-access keys. Arguments that “decryption keys are more dangerous”, and concerns about “cipher agility”, “key revocation”, and “key rolling”, are all now easy for the government to answer: “Look at CKV”. If CKV solves it, it can be solved in the same way. If CKV doesn’t solve it, it clearly doesn’t matter all that much to Apple or their customers.

Finally, it matters because the government can now play Apple off against itself in the Going Dark debate to get past the technical “we can’t keep high-value keys secure” argument.

I don’t work for the Justice Department, and never have. But if I did, I’d be asking every investigator to pore over their backlog of cases, find every non-decryptable PIN-locked iPhone, and issue a subpoena for any corresponding iCloud Keychain files for that phone. Many phones won’t have one (not every user turns on iCloud Keychain), but the government only needs one.

The next step would be to ask a court to issue an All-Writs-Act request against Apple to decrypt it with the CKV key so that the FBI can brute-force the iCSC (device PIN) on the inner vault. The iCSC will let law-enforcement access the phone bypassing the PIN guess limit, so it seems likely they would have grounds for the AWA order.

I can already hear you shouting at the screen: “Apple won’t be able to help! The whole premise of CKV is that Apple can’t help!” It doesn’t matter. Apple will have to go to court and publicly argue that they can’t possibly unlock the device: CKV is secure, it’s built secure, nobody at Apple is smart enough to hack into CKV, and so on.

And simultaneously, the government can introduce legislation compelling exceptional access to encrypted devices via split-key access. Something along the lines of taking the device encryption key, wrapping it with an FBI public key, and then wrapping that with an Apple public key, and storing the resulting onion ball on the unencrypted part of the drive (i.e. storing Apple(Fbi(Device key))).
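
A sketch of what that onion might look like, using libsodium sealed boxes (via PyNaCl) as a stand-in for whatever scheme a real proposal would actually specify; the key names here are hypothetical:

```python
# Hypothetical split-key escrow sketch. Sealed boxes stand in for
# whatever scheme a real proposal would specify.
import os
from nacl.public import PrivateKey, SealedBox

fbi_sk = PrivateKey.generate()    # held by the FBI, in its own vault
apple_sk = PrivateKey.generate()  # held by Apple, in a CKV-like vault

device_key = os.urandom(32)       # the phone's data-encryption key

# Inner layer: wrap the device key to the FBI's public key.
inner = SealedBox(fbi_sk.public_key).encrypt(device_key)
# Outer layer: wrap that blob to Apple's public key.
escrow_blob = SealedBox(apple_sk.public_key).encrypt(inner)
# escrow_blob is Apple(FBI(device key)): stored on the unencrypted part
# of the drive, and useless without BOTH private keys.

# Recovery under a court order needs both parties, outer layer first.
recovered = SealedBox(fbi_sk).decrypt(SealedBox(apple_sk).decrypt(escrow_blob))
assert recovered == device_key
```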

Such a system would allow the FBI to unlock devices under court order, and it has ideal security if you can solve two problems: first, you need to keep the private keys safe from theft; and second, you need to prevent the key-guardians in the two organizations from colluding or getting hacked.

The problem for the privacy community is that the first argument holds no weight if Apple is in court arguing as hard as it can that it’s technically impossible to hack their high value keys out of the CKV. Apple will be left in an impossible position. Any argument they make that maybe the CKV could be hacked via “X” leads the FBI to ask Apple to do “X” to service the All-Writs-Act order. Any argument that they make that the CKV is secure and can’t be hacked in the All-Writs-Act case leads the FBI to assert that the device must therefore be safe enough for law-enforcement decryption keys.

And if the government decides that CKV should generate a rolling cryptographic tamper-proof ledger of decryption requests from inside the vault, the second argument starts to look pretty shaky too. It doesn’t guarantee against collusion or key-guardian hacking. But it does guarantee that it can’t happen without it being detectable after the fact.
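
A ledger like that is not exotic. Here is a minimal sketch of the idea, using a hash chain (a production design would also have the vault sign each entry with a key that never leaves the HSM):

```python
# Minimal sketch of a hash-chained, append-only request ledger. Each
# entry commits to its predecessor's hash, so any retroactive edit or
# deletion breaks the chain and is detectable after the fact.
import hashlib
import json
import time

def append_request(ledger, request_id, court_order_ref):
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "request_id": request_id,
        "court_order": court_order_ref,
        "time": time.time(),
        "prev": prev_hash,
    }
    # Hash the entry (before the hash field is added) to seal it.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)

def verify(ledger):
    prev = "0" * 64
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False  # chain broken: tampering detected
        prev = entry["hash"]
    return True

log = []
append_request(log, "device-0001", "court-order-123")
append_request(log, "device-0002", "court-order-456")
assert verify(log)
```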

And once you’re past those arguments, we’re past the technical question of “can we do law-enforcement access securely?” and fully into the political questions of “but should we?” and “how would we coordinate such a thing internationally?”

That’s why I called out Apple’s Cloud Key Vault design.

It’s not that CKV is technically insecure. It’s that it gives the government the political space to robustly argue that asymmetric split-key law-enforcement device decryption isn’t especially technically dangerous, and move the Going Dark debate forwards in a direction that the privacy community won’t like.

There’s a good chance nobody from the government will notice. But if they do, we might all have to reflect on the irony that in building a box full of badness that can’t be opened, Apple unintentionally opened the Going Dark Pandora’s box.
