Going Quantum Resistant In Blockchain: A Plausible Timeframe?

This is an elaboration of what I wrote about this subject earlier in “An Addition To The Bitcoin Wiki Page On Quantum Computing”.

Allen Walters
Published in The Dark Side
Oct 13, 2019

--

For an in-depth analysis of quantum computing and blockchain, and of what it takes for current blockchains to upgrade, see: Quantum resistant blockchain and cryptocurrency, the full analysis in seven parts.

To make a complete and realistic estimate of the expected timeline for upgrading and coin migration, we use Mosca’s theorem of risk determination. For blockchain, the theorem can be adjusted as follows:

The sum of v, w, x, and y should obviously be smaller than z − q for a system to end up secure.
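Stated as an inequality, this simply restates the condition above in symbols:

```latex
% Adjusted Mosca inequality for blockchain, restating the condition above:
% the total time to upgrade and migrate must fit within the remaining
% safety-adjusted window before a critical quantum computer exists.
\[
  v + w + x + y < z - q
\]
```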

v = selection of a signature scheme and a proposal for implementation. Different signature schemes are available, but there is no plug-and-play scheme to replace the current ones, and there are several imaginable solutions for handling the bigger signatures. To estimate a realistic timeframe for this process, a comparison can be made with the SegWit timeframe (the idea dates from 2012, and it was officially introduced in late October 2016) or with the implementation of Schnorr signatures. If we look at Schnorr, we see that an advanced discussion was already in progress in May 2016, and a possible implementation proposal is planned for this year.

V should be estimated to be a couple of years or more.

w = reaching consensus and upgrading the nodes. An upgrade will not be implemented without some form of review and discussion. After a formal proposal is made, reaching consensus will take time. Since phase “v” quite likely results in multiple options, all of which will have significantly bigger signatures and possibly different ways of dealing with those bigger sizes, a quick majority for one proposal is not a given. Consensus might follow a trajectory like the one we have seen with SegWit, or with SegWit2x, which has not been implemented to this date. Besides the need to choose between different options, it must also be decided when the upgrade will take effect. This is a second subject that will cause debate. Since many blockchain devs are skeptical that quantum computing will be realized within our lifetime, if ever, this skepticism will be an additional cause of delay in activating any upgrade.

Once quantum computing is close to forming an actual threat, or is known to form one, the moment of implementation will no longer be an issue in reaching consensus. Which signature scheme to use, and how to implement it efficiently, will still be a rough discussion.

W is a very unpredictable factor. A smooth consensus process is unrealistic, though.

x = migration period. After an upgrade of the signature scheme, all the coins are still stored on the old, vulnerable public-private key addresses. The upgrade simply gives users the tools to create a new, quantum resistant address and migrate their coins to the safety of that address. Without migration, there is no quantum resistance. Hacks of coins that are still vulnerable will affect the value of all coins, including those on quantum resistant addresses. Due to the decentralized nature of blockchain, migration can only be done by the users themselves, since only they have the private key and thus only they have access to the coins. Quantum resistance will depend on the actions of millions of users. If we consider human nature, we can conclude that this process will take time. Comparing with SegWit adoption, we see that two years after SegWit activation, just over 50% of all transactions were SegWit transactions. Without some form of serious incentive, this migration period will be several years. The incentive could be a deadline (which would need consensus again to be enforced), or the actual capability (or near capability) of quantum computers to break ECDSA. Even then, human nature should be considered a weak link in this process, mainly because we will depend on more than 7 million people to act.

X is another unpredictable factor in the timetable.

y = stagnant phase to minimize the risk of burning live funds. This last phase is advised because, for most existing blockchains, a considerable amount of the circulating supply is lost and can never be migrated to quantum resistant addresses. A solution should be found for these so-called lost addresses. The only solution to this problem would be to burn them; otherwise these coins will hang like a sword of Damocles over the value of the blockchain forever. Because none of the users are registered and thus cannot be contacted, you cannot determine which addresses are really lost and which simply belong to long-term holders. If we take another look at the research by Chainalysis, which concluded that between 17% (low estimate) and 23% (high estimate) of BTC was lost at the time of publishing, the gap between the two estimates is roughly 1 million BTC. That discrepancy shows how hard it will be to determine with certainty which stagnant addresses are lost and which belong to long-term holders. This is important to note for anyone who proposes to simply burn the lost addresses within a negligible period of time after upgrading to a quantum resistant signature scheme. This phase should be a serious period of time. And even then, if at a certain point in time the decision is made to burn any leftover coins, you will risk burning people's live funds. This makes the last phase controversial, if not impossible to fulfill without trading one risk for the other.
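To make the size of that uncertainty concrete, here is a back-of-the-envelope calculation. The percentages come from the Chainalysis figures quoted above; the circulating supply of roughly 16.7 million BTC around the time of that report is my own assumption, not a number taken from the report itself:

```python
# Back-of-the-envelope check of the ~1 million BTC gap between the
# Chainalysis estimates quoted above. The circulating supply figure
# is an assumption for illustration only.
CIRCULATING_SUPPLY_BTC = 16_700_000  # assumed supply at publication time

low_estimate = 0.17 * CIRCULATING_SUPPLY_BTC   # 17% lost (low estimate)
high_estimate = 0.23 * CIRCULATING_SUPPLY_BTC  # 23% lost (high estimate)

gap = high_estimate - low_estimate
print(f"Low estimate:  {low_estimate:,.0f} BTC")
print(f"High estimate: {high_estimate:,.0f} BTC")
print(f"Gap:           {gap:,.0f} BTC")  # roughly 1 million BTC
```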

Y should be several years. A period of five years has been mentioned in discussions a few times.

z = the time we have until a quantum computer of a critical level materializes. Estimates are all over the place, ranging from a couple of years to never.

Z is the great unknown.

q = the margin we should deduct from z as a safety margin, to compensate for the blind spot caused by the fact that any assessment of the development curve of quantum computers is based on incomplete information. Additionally, q accounts for the fact that developments on other levels, like algorithm improvements, can bring forward the moment a quantum computer can break the cryptography in question.

Q should be a certain percentage of Z.

Development in quantum computing is likely to speed up. Several credible organizations, such as the NSA, NIST, and the NAS, advise planning ahead of a possible emerging threat rather than acting reactively once the threat has materialized. The reasoning for a proactive attitude is substantiated in the article in this link, under the header “Timeline/Plausibility”.

Summarized:

1. The hazard, and the security disaster it would create, is of such significance that one can’t afford to take any gambles.

2. Public and universal analysis of a possible critical date can only be based on public information. Because there are huge interests at stake (commercially and strategically), not all developments will be shared publicly. So, in assessing the risk, you should assume the possibility of a blind spot: you must seriously consider that your estimate would shift to an earlier date if you had all the information at your disposal when analyzing the development curve. In addition, there are developments in other fields that can bring a critical date closer. To give an example: a new algorithm called Variational Quantum Factoring is being developed and looks quite promising. “The advantage of this new approach is that it is much less sensitive to error, does not require massive error correction, and consumes far fewer resources than would be needed with Shor’s algorithm. As such, it may be more amenable for use with the current NISQ (Noisy Intermediate Scale Quantum) computers that will be available in the near and medium-term.” See here for more information.

3. Implementing new cryptography takes time. While the needed timeframe depends on the system, an analysis of this timeframe should be made. If this isn’t done carefully, there is no way to make a total risk analysis in which you weigh the expected timeframe against the expected time at which the risk will materialize. The estimates for v, w, x, and y will need to be made individually for every single blockchain that is serious about risk determination (a minimal sketch of such a check follows below).
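As a minimal sketch of what such a per-blockchain risk check could look like, the inequality from the adjusted Mosca theorem can be turned into a trivial piece of Python. All numbers below are hypothetical placeholders, not estimates for any particular chain:

```python
# Minimal sketch of the risk check described above. All numbers are
# hypothetical placeholders; every blockchain would have to fill in
# its own estimates (in years) for v, w, x, y, z and q.
def is_secure_in_time(v, w, x, y, z, q):
    """Return True if the upgrade-and-migration path (v + w + x + y)
    fits inside the safety-adjusted window (z - q)."""
    return v + w + x + y < z - q

# Hypothetical example estimates, in years:
v = 3        # selecting a signature scheme and proposing an implementation
w = 2        # reaching consensus and upgrading the nodes
x = 4        # migration period for users to move their coins
y = 5        # stagnant phase before any decision on leftover coins
z = 15       # assumed time until a critical quantum computer exists
q = 0.2 * z  # safety margin, taken here as 20% of z (an assumption)

print(is_secure_in_time(v, w, x, y, z, q))  # False: 14 is not < 12
```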

There are some blockchains that have implemented quantum resistance from the launch of their genesis block; QRL is my personal favorite. But for other, existing blockchains, the fact is that going from where we are now to a fully quantum resistant cryptocurrency is a process that will take a serious number of years.
