Classical Cryptography in a Post-Quantum World

Large-scale quantum computing has been “ten years away” ever since it became apparent that a quantum computer could be built. Recently, however, something quite unexpected happened: this moving timeline changed.

Roman Štrobl
Wultra Blog
10 min read · Jun 2, 2020


IBM Q System One

If you ask a quantum computing scientist at a conference, they’re now more likely to tell you that large quantum computers are “five years away”, or even less. Some may claim that such quantum computers have already arrived, but that those who have them would not tell you about it, so that they can spy on you. Conspiracy theories aside, the fact is that no one knows when quantum computers will become powerful enough to break classical cryptography; we only know it is becoming inevitable. Based on the estimates of many experts, it is likely to happen within this decade.

In 2019, Google claimed that its 54-qubit Sycamore processor had achieved quantum supremacy: it completed in 200 seconds a sampling task that would, according to Google’s estimates, take a classical supercomputer 10,000 years. So, should we start getting concerned about the security of our classical ciphers?

The theory of quantum computing is evolving much faster than the hardware implementations. In 1994, Peter Shor discovered an algorithm that can factor large integers in polynomial time on a quantum computer. This algorithm not only breaks RSA; it also breaks elliptic curve cryptography by providing a polynomial-time solution to the elliptic curve discrete logarithm problem (ECDLP). This means that popular asymmetric cryptography algorithms such as ECDH and ECDSA will be broken once a sufficiently powerful quantum computer is built.
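
To make this concrete, here is a toy classical sketch of the reduction at the heart of Shor’s algorithm: factoring N reduces to finding the order r of a random base a modulo N, after which two gcd computations reveal the factors. A quantum computer finds r in polynomial time; the brute-force order finding below, the class name, and the demo values are merely our illustration:

```java
import java.math.BigInteger;

public class ShorReduction {

    // Smallest r > 0 with a^r mod n == 1. This order-finding step is the
    // part Shor's algorithm performs in polynomial time on a quantum
    // computer; here we brute-force it classically in exponential time.
    static int order(int a, int n) {
        long x = a % n;
        for (int r = 1; ; r++) {
            if (x == 1) return r;
            x = (x * a) % n;
        }
    }

    public static void main(String[] args) {
        int n = 15, a = 7;   // toy demo values with gcd(a, n) = 1

        int r = order(a, n); // order of 7 mod 15 is 4
        if (r % 2 == 0) {
            BigInteger half = BigInteger.valueOf(a).pow(r / 2); // a^(r/2)
            BigInteger bigN = BigInteger.valueOf(n);
            // If a^(r/2) != -1 (mod n), these gcds are nontrivial factors.
            BigInteger p = half.subtract(BigInteger.ONE).gcd(bigN);
            BigInteger q = half.add(BigInteger.ONE).gcd(bigN);
            System.out.println(n + " = " + p + " * " + q); // prints: 15 = 3 * 5
        }
    }
}
```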

In 1996, Lov Grover discovered a quantum search algorithm that speeds up unstructured search from O(N) to O(√N). While seemingly unrelated, this algorithm has quite a significant impact on symmetric cryptography algorithms, such as AES, and it even affects some hash functions. To see how the search speedup affects the security of symmetric encryption, simply substitute N = 2^k (where k is the encryption key length in bits) into the formula.
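
Carrying out that substitution (a one-line derivation, shown in LaTeX notation):

```latex
% Grover reduces brute-force key search from N to about sqrt(N) evaluations.
% Substituting N = 2^k, where k is the key length in bits:
\sqrt{N} = \sqrt{2^{k}} = 2^{k/2}
% A k-bit key thus offers only k/2 bits of quantum security: AES-128 drops
% to roughly 2^{64} operations, which is why doubling the key size (AES-256)
% restores a comfortable margin.
```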

Besides these two well-known impacts of quantum computing advances, there will be many less visible impacts. Classical random number generators may prove not to be quantum-safe, because they may rely on weak hashing algorithms. Many cryptanalysis tasks will become easier for an attacker with a quantum computer. We are only at the beginning of the quantum computing revolution, and new algorithms and improved versions of Shor’s and Grover’s algorithms will surely be discovered. So yes, we should be asking whether our classical cryptography is still safe to use today.

Is a Cryptography Apocalypse Coming or Not?

We know that quantum computing will break asymmetric cryptography and weaken symmetric cryptography. Luckily, we still have some time, because there are inherent challenges with the physical implementation of quantum computers, such as:

  • Quantum decoherence — the quantum processor needs to operate close to absolute zero so that it interacts minimally with its surroundings. This makes the computer very expensive and unfit for large-scale public use (except for experimenting, which you can currently even do for free).
  • Too few qubits are available — to crack a 4,096-bit RSA key, you need approx. 8,200 stable qubits. Elliptic curve cryptography can be broken more easily on a quantum computer (unlike on classical computers), so you need only approx. 2,300 stable qubits to break ECC.
    Note: The D-Wave quantum computers cannot solve generic quantum computing tasks; we are talking about qubits in universal quantum computers, such as those built by IBM.
  • Too many errors when executing quantum algorithms — getting reliable results from a quantum computer is very tricky. While algorithms for quantum computers are often stochastic by nature, undesired errors outside the expected stochastic model accumulate as well. Error-correcting codes are being researched to tackle this problem; right now, for every stable (logical) qubit, you may need hundreds to thousands of ancillary physical qubits. It is also quite challenging to execute algorithms with long execution times, because the errors compound over time. This is why the current state of quantum computing is often labeled Noisy Intermediate-Scale Quantum (NISQ).

On the other hand, technological evolution cannot be stopped. Remember that the first classical computers needed a large air-conditioned room to perform tasks you can now do much faster on your mobile phone.

Inevitably, quantum computers with thousands of stable qubits capable of running Shor’s and Grover’s algorithms will be available. We just don’t know exactly when. We can use the time we have to get ready, because unlike the previous cryptography algorithm migrations (such as SHA-1 to SHA-2), the breaking of existing algorithms may happen suddenly.

Broken and Weakened Algorithms

Let’s analyze how a powerful quantum computer would break all these algorithms. We will also mention the mitigation strategy for each of the algorithm families, with more details coming later on.

These algorithms are known not to be quantum-safe, provided a large enough quantum computer is available:

  • DH, RSA, DHIES, ElGamal — Completely broken: Shor’s algorithm solves both integer factorization and the discrete logarithm problem in polynomial time.
    Mitigation: Short term: update algorithm parameters. Long term: migrate to PQC or use a hybrid classical/PQC solution.
  • ECDH, ECDSA, EdDSA, ECIES — Completely broken: Shor’s algorithm also solves ECDLP in polynomial time.
    Mitigation: Short term: update algorithm parameters. Long term: migrate to PQC or use a hybrid classical/PQC solution.
  • AES-128, ChaCha20-Poly1305 — Weakened: Grover’s algorithm halves the effective key strength, so key sizes need to be doubled.
    Mitigation: Use AES-256 instead of AES-128 (see the sketch after this list).
  • SHA-2, SHA-3, HMAC — Only slightly affected; migrating to a stronger variant of the hash function is a good idea.
    Mitigation: Use SHA-384 instead of SHA-256.
  • PRNG / DRBG random algorithms — Depends on the underlying algorithm. If the algorithm can be attacked, the randomly generated keys may not be quantum-safe.
    Mitigation: Update the random generator algorithm, if needed.
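
As a concrete illustration of the AES-256 mitigation referenced above, here is a minimal sketch using the standard Java cryptography APIs (key management and decryption omitted for brevity; the class name is ours):

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class Aes256Demo {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256); // 256-bit key: ~128-bit security even against Grover
        SecretKey key = kg.generateKey();

        byte[] iv = new byte[12]; // 96-bit GCM nonce, unique per encryption
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("hello".getBytes(StandardCharsets.UTF_8));
    }
}
```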

If you are in doubt about which parameters to choose, you can check out the Transitioning the Use of Cryptographic Algorithms and Key Lengths document. This NIST document provides guidelines before the final PQC algorithms are announced.

The Path to Quantum-Safe Cryptography

The previous section might have scared you a bit. Indeed, quantum computers will break algorithms that are used in e-government solutions and digital banking security. There is some good news to share, though:

  • Many algorithms exist that do not have known attacks using a quantum computer. These algorithms are called post-quantum algorithms, and most of them have been under development for many years already.
  • There is still quite a lot of time available to migrate to post-quantum algorithms, given all the physical implementation issues with current quantum computers (most likely several years, but we do not know for sure). This, of course, does not mean we can just relax and wait for the powerful quantum computers to arrive, we need to prepare for their arrival actively.
  • NIST is running a competition for post-quantum cryptography (PQC) algorithms. Similar competitions were run in the past with success. For instance, the AES and SHA-3 algorithms were standardized thanks to NIST efforts. Although NIST had a questionable reputation in the past, it is now well regarded among cryptologists, and some of the top experts are now participating in analyzing the post-quantum algorithms.

NIST PQC Competition Status

We are currently in “round 2” of the NIST competition, and “round 3” is expected to start later this year. Currently, 26 algorithms remain in the competition, with candidates in two families:

  • Public-key encryption and key-establishment algorithms.
  • Digital signature algorithms.

NIST published an API that is common to both families of algorithms. The beautiful thing about this API is that once the winners are announced, it should be possible to switch algorithms rather easily in case your favorite algorithm does not make it.

Note: The key-establishment algorithms differ from the traditional DH or ECDH algorithms. The new protocols are key encapsulation mechanisms (KEMs) rather than key exchange protocols. NIST has additional security requirements for these protocols, including perfect forward secrecy, which ensures that previously recorded sessions cannot be decrypted even if a long-term private key is later stolen. Furthermore, resistance to side-channel attacks is a desirable property. The complete list of evaluation criteria is published on the NIST website.
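
To make the encapsulation flow concrete, here is a hypothetical Java sketch mirroring the three operations the NIST API defines for KEMs (crypto_kem_keypair, crypto_kem_enc, crypto_kem_dec in the C API); the Java names and types below are ours, invented for readability:

```java
// Hypothetical Java mirror of the NIST KEM API; not a real library.
interface Kem {
    KemKeyPair keypair();                     // generate (pk, sk)
    Encapsulation encapsulate(byte[] pk);     // fresh shared secret + ciphertext
    byte[] decapsulate(byte[] ct, byte[] sk); // recover the same shared secret
}

final class KemKeyPair {
    final byte[] pk, sk;
    KemKeyPair(byte[] pk, byte[] sk) { this.pk = pk; this.sk = sk; }
}

final class Encapsulation {
    final byte[] ct, sharedSecret;
    Encapsulation(byte[] ct, byte[] ss) { this.ct = ct; this.sharedSecret = ss; }
}

// Flow: Alice publishes pk from keypair(). Bob calls encapsulate(pk), keeps
// sharedSecret, and sends ct to Alice. Alice calls decapsulate(ct, sk) and
// obtains the same sharedSecret. Unlike DH-style key exchange, there is no
// "combine" step: one side generates the secret and encapsulates it.
```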

It is expected that multiple winners will be announced because each of the algorithms has some benefits and some drawbacks. There is no clear winner yet because of various aspects, such as key sizes, performance, signature sizes, security levels, etc.

The PQC Algorithm Zoo

Post-quantum algorithms are based on different types of underlying mathematical problems:

  • Lattice-based cryptography — 12 algorithms are competing in this category, for both key establishment and digital signatures. The cryptography is based on problems such as finding the shortest vector in a lattice, a special mathematical structure. The most well-known candidate is NTRU.
  • Code-based cryptography — 7 algorithms are competing in this category, all of them for key establishment. The cryptography is based on error-correcting codes, which are used as one-way functions. The most well-known candidate is McEliece.
  • Multivariate cryptography — 4 signature schemes are competing in this category. The underlying problem is solving systems of multivariate polynomial equations. All of these signature schemes are stateless (stateful signatures were removed from the NIST competition). The most well-known candidate is Rainbow.
  • Hash-based signatures — SPHINCS+ is the only candidate in this category. It uses SHA-2, SHA-3, or Haraka as the underlying hash function.
  • Zero-knowledge proofs — Picnic is the only candidate in this category. The signature scheme uses non-interactive zero-knowledge proofs, so the whole proof fits into a single message.
  • Supersingular elliptic curve isogenies — SIKE is the only candidate in this category. The scheme is based on elliptic curve cryptography; however, it uses the computation of isogenies between curves, instead of ECDLP, as the underlying hard problem.

What Can We Do Now?

As you can see, there is still a wide range of post-quantum (PQC) algorithms to choose from. For this reason, we expect that “round 3” of the NIST competition will be announced later this year, and that the final winners will be announced between the years 2022 and 2024.

However, some steps can be taken right now, independent of the result of the NIST PQC competition. These are our recommendations:

  • Update the random generator algorithms so that they use a strong cryptographic function. This change ensures that keys generated now are not weakened by quantum attacks on the underlying cryptographic function. An example would be migrating from SHA1PRNG (potentially insecure) to DRBG_HMAC_SHA512 (see the SecureRandom sketch after this list). This change is quite easy and has only a minimal impact on performance.
  • Update key sizes for symmetric encryption algorithms. The general advice is to double the key sizes: use AES-256 instead of AES-128, and so on. For hash functions, use the recommended variant, such as SHA-384 or better. Allow only the stronger suites in the TLS protocol; your backends should not accept weak algorithm suites (see the TLS sketch after this list). Changing the allowed suites for TLS should not be difficult on the backend, as long as the clients support the newest suites. Increasing the key sizes for symmetric encryption may be more challenging, but it is still not rocket science.
  • Update the parameters of asymmetric algorithms. Did you know that the commonly used NIST P-256 curve is no longer recommended for ECC? Although migrating to a curve with a bigger field prime will not make ECC quantum-resistant, it will make the attackers’ life harder (see the key generation sketch after this list). If you still use RSA, using a bigger key size is a good idea before you are ready to deploy a PQC algorithm.
    Note: These changes should only be temporary, for the transition period. The end goal is replacing RSA and ECC-based cryptography with PQC algorithms, because neither is going to stay secure in the long term.
  • Start experimenting with PQC algorithms for key encapsulation and build proofs of concept. We have already done the same at Wultra, and it has helped us immensely to understand what we need to do to migrate to PQC algorithms. The security of key agreement algorithms is critical from the point of view of quantum attacks, because an attacker who records traffic today will potentially be able to decrypt it once a quantum computer is available.
    Note: The PQC algorithms are still under development. It is probably not a good idea to deploy any of them in production before they get properly reviewed. Unfortunately, there exists no mathematical proof that the algorithms are quantum-safe (or even safe against classical attacks), so we may have to wait for the “test of time” with the PQC algorithms.
  • Educate yourself about PQC signature schemes. It may be a bit too early to start integrating PQC signature schemes into your solution. Currently, most of the schemes have less than optimal signature sizes, and depending on the PQC algorithm family, they may require extra key pairs, which may also be quite large. There is much less rush in adopting PQC signature schemes, because the harm will be done only once a large enough quantum computer exists. This is very different from the case of key agreement (including TLS!), where traffic recorded today is already at risk. You can wait until the algorithms are reviewed and optimized and the signature and key sizes become smaller. However, as a proactive step, we also recommend having a look at the Long-Term Validation (LTV) schemes for validating signatures. By using an LTV scheme, you will be able to easily update your signatures later as needed.
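
As referenced above, here is a minimal SecureRandom sketch for the random generator update, using the standard Java DRBG API available since Java 9 (the class name RngMigration is ours):

```java
import java.security.DrbgParameters;
import java.security.SecureRandom;
import static java.security.DrbgParameters.Capability.PR_AND_RESEED;

public class RngMigration {
    public static void main(String[] args) throws Exception {
        // SP 800-90A DRBG at 256-bit strength with prediction resistance
        // and reseeding; the concrete mechanism (e.g. HMAC_DRBG with
        // SHA-512) is selected via the securerandom.drbg.config property.
        SecureRandom rng = SecureRandom.getInstance(
                "DRBG", DrbgParameters.instantiation(256, PR_AND_RESEED, null));

        byte[] keyMaterial = new byte[32]; // e.g. a 256-bit AES key
        rng.nextBytes(keyMaterial);
    }
}
```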
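
For the TLS hardening step, a minimal server-side sketch with the standard JSSE API; the port and the exact suite list are illustrative, and TLSv1.3 requires Java 11 or newer:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLServerSocket;

public class TlsHardening {
    public static void main(String[] args) throws Exception {
        SSLContext ctx = SSLContext.getDefault();
        SSLServerSocket server = (SSLServerSocket)
                ctx.getServerSocketFactory().createServerSocket(8443);

        // Allow only TLS 1.3 and TLS 1.2 with AES-256-GCM suites.
        server.setEnabledProtocols(new String[] { "TLSv1.3", "TLSv1.2" });
        server.setEnabledCipherSuites(new String[] {
                "TLS_AES_256_GCM_SHA384",                  // TLS 1.3
                "TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", // TLS 1.2
                "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384"
        });
        // server.accept() and handle connections as usual...
    }
}
```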
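
And for the asymmetric parameter update, a minimal key generation sketch that moves EC keys from P-256 to P-384 (JCA curve name secp384r1):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.spec.ECGenParameterSpec;

public class EcParamsUpgrade {
    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
        kpg.initialize(new ECGenParameterSpec("secp384r1")); // P-384, was secp256r1
        KeyPair keyPair = kpg.generateKeyPair();
        // Not quantum-safe; it only raises the classical bar during the transition.
    }
}
```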

Conclusion

Powerful quantum computers are coming whether you like it or not. It is time to get ready: start educating yourself about post-quantum algorithms and prepare a migration path if you use classical cryptography. Some migration steps can already be taken today, but the most important ones still need some time until it is clear which PQC algorithms survive the NIST competition as well as the test of time.

We prepared this post in close cooperation with Tomáš Rosa from the Cryptology and biometrics competence center of Raiffeisen Bank International (RBI). RBI is one of the few financial institutions with its own in-house theoretical research capabilities. As an innovator in the field, RBI actively examines the potential use of quantum computers and the application of post-quantum cryptography. Cooperation with the RBI competence center provides a solid background for educational posts like this one and for practical implementations of proof-of-concept solutions.

We hope this article educated you about the migration to PQC. We are currently preparing the migration of the products we use in our digital banking security solutions, and we will share our experiences in a future post. If you have any questions or feedback on this topic, feel free to add a comment to this article or send us a message at hello@wultra.com.
