Quantum security and crypto agility — prepare for a few choppy years
At Q2B (organized by QCWare) this year, I had the pleasure of moderating a quantum security panel with experts from Sandbox AQ, Air Force Research Labs, the National Institute of Standards and Technology (NIST), and Boston Consulting Group. Insights from this panel are worth sharing.
It appears that corporate CIOs and CISOs are increasingly aware of the risks posed to their security posture by quantum computing in general and Shor's Algorithm in particular. They are worried about SNDL — Store Now, Decrypt Later — where hostile actors capture highly sensitive yet encrypted data today with the expectation that they will be able to crack the encryption later. Cynically, one could attribute the C-suite's attention to "it could happen during my tenure" syndrome. Still, even companies that estimate the danger is a decade away seem to be preparing to address it.
But how? Entanglement swapping and other quantum-secure networks are still years away, and Quantum Key Distribution (QKD) is not popular in the US, in large part because the US National Security Agency publicly states that it "…does not support the usage of QKD or QC to protect communications in National Security Systems, and does not anticipate certifying or approving any QKD or QC security products for usage by NSS customers". This leaves Post-Quantum Cryptography (PQC): a set of encryption protocols that are believed to be resistant to breakage by quantum computers.
"Are believed to be resistant" is the key phrase. NIST has been leading a worldwide process to create a PQC standard and selected four leading algorithmic candidates earlier this year, yet Rainbow, a late-round candidate in that process, has already been cracked. Thus, not only are the standards not yet final, but it is unclear what additional changes the selected algorithms may still need. Unlike the 45 years of experience the world has with RSA, quantum-resistant algorithms are very new and have not been analyzed as deeply as classical algorithms.
Given this moving target, what are CIOs to do? The prevalent opinion is that companies should follow a structured process: take a detailed inventory of all their encrypted communication systems, prioritize them, and then gradually upgrade their encryption to quantum-resistant alternatives. This is an arduous process: just taking inventory might take 2–3 years. If PQC standards were already set, upgrading to quantum-safe systems would be difficult enough; given that standardization work is still ongoing, it is even harder. Because of this, companies are advised to implement "crypto-agility", which means implementing PQC in a software layer so that algorithms can be modified or replaced as standards mature.
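To make the crypto-agility idea concrete, here is a minimal sketch in Python. The interface, registry, and toy "KEM" below are illustrative placeholders, not real cryptography or any particular library's API: the point is only that application code depends on a generic interface, so swapping the underlying algorithm is a configuration change rather than a rewrite.

```python
"""Crypto-agility sketch: code against a generic key-encapsulation
interface; pick the concrete algorithm by name at runtime."""
import os
from abc import ABC, abstractmethod


class Kem(ABC):
    """Generic KEM interface the application codes against."""

    @abstractmethod
    def keygen(self):
        """Return (public_key, secret_key)."""

    @abstractmethod
    def encapsulate(self, public_key):
        """Return (ciphertext, shared_secret)."""

    @abstractmethod
    def decapsulate(self, secret_key, ciphertext):
        """Recover the shared secret."""


class ToyKem(Kem):
    """Stand-in for a real scheme (NOT secure -- for illustration only)."""

    def keygen(self):
        sk = os.urandom(32)
        return sk, sk  # toy scheme: public key equals secret key

    def encapsulate(self, public_key):
        shared = os.urandom(32)
        ct = bytes(a ^ b for a, b in zip(shared, public_key))
        return ct, shared

    def decapsulate(self, secret_key, ciphertext):
        return bytes(a ^ b for a, b in zip(ciphertext, secret_key))


# The registry is the "software layer": when a standard matures or an
# algorithm is broken, updating this mapping re-points the whole app.
KEMS = {"toy-kem-v1": ToyKem}


def negotiate(name: str) -> Kem:
    return KEMS[name]()


kem = negotiate("toy-kem-v1")
pk, sk = kem.keygen()
ct, sender_secret = kem.encapsulate(pk)
assert kem.decapsulate(sk, ct) == sender_secret
```

In practice the registry would map names to vetted implementations (and the negotiation would happen in a protocol handshake), but the structural idea — isolate the algorithm choice behind one seam — is the same.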
Another recommendation is to strive for hybrid security, layering PQC on top of existing classical encryption such as RSA. The rationale is that by adding quantum on top of classical, instead of just having quantum replace classical, the security posture is never degraded. Even if the particular quantum algorithm is broken, the classical layer is still in place. Having said that, a hybrid strategy carries its own set of problems. For instance, IoT devices often have weak processors, and thus might not have enough computational resources to implement hybrid encryption.
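The hybrid approach can also be sketched briefly. The function below derives one session key from both a classical shared secret and a PQC shared secret using an HKDF-style extract-and-expand step; the input secrets are stand-ins (in practice they would come from, say, an ECDH exchange and a PQC KEM), and the construction shown is one common way to combine them, not a mandated standard.

```python
"""Hybrid key derivation sketch: the session key depends on BOTH a
classical and a post-quantum shared secret, so it stays safe as long
as at least one of the two schemes remains unbroken."""
import hashlib
import hmac
import os


def hybrid_session_key(classical_secret: bytes, pqc_secret: bytes,
                       context: bytes = b"hybrid-v1") -> bytes:
    # Concatenate-then-KDF: recovering the session key requires
    # recovering both input secrets.
    ikm = classical_secret + pqc_secret
    # HKDF-style extract...
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()
    # ...and a single expand block, bound to a context label.
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()


classical = os.urandom(32)  # stand-in for e.g. an ECDH-derived secret
pqc = os.urandom(32)        # stand-in for e.g. a PQC KEM shared secret
key = hybrid_session_key(classical, pqc)
assert len(key) == 32
```

The extra hashing is cheap on a server, but note how even this small overhead — two secrets established instead of one, then a KDF — illustrates why constrained IoT processors struggle with hybrid schemes.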
Combining all the ingredients — a serious quantum threat that is real today because of SNDL, evolving standards, millions upon millions of systems that need an upgrade, IoT systems that might be too weak — we need to prepare for a few choppy years. But the quantum threat is real, and companies should have already started addressing it.