The two (Quantum) cultures

Simon Thompson
Jul 29 · 27 min read

Abstract

It’s likely that practical quantum computing (QC) is many years away, and yet there is a great deal of excitement about potential applications of QC. Many financial institutions are actively developing quantum computing with a view to applying it to computational finance. Problems that are being targeted include portfolio optimization and asset pricing using Monte Carlo methods. This paper argues that excitement from investors and financial institutions is founded on a mismatch between what the QC community is doing and what those outside it believe it is doing. When the QC community says that it is building a quantum computer, this rests on a model of what a future quantum computer is that is radically different from the idea that investors and end users have.

Preamble

I’m not an expert on quantum physics, but nor are many of the people who are excited about Quantum Computing. Like many others, I was asked to consider Quantum Computing as a commercial opportunity, possibly because, of all the people badly placed to look into it, I was considered the least badly placed.

It was a challenging task which meant I had to disentangle (!) the perception of some technologists from the reality of the research community.

This blog documents what I found out and explains what I think is going on when one side (the researchers) says A but the other side (investors) decides that it means B.

Executive Summary

· Progress on quantum computing is real and should not be dismissed out of hand.

· Very significant investment is being made by both state actors and private investors.

· There is a zoo of technology, and great caution and care is required when separating real progress from generalisations about the field and the ultimate goal of a Practical Universal Quantum Computer (PUQC).

· The research program underway in the quantum computing community does not share outsiders’ conception of what a “real quantum computer” will be when it is realised.

· Heightened venture capital (VC) investment can be understood in terms of the function of the VC process and market.

· Quantum-inspired algorithms may support new applications on classical devices.

· Large financial institutions have a fiduciary duty to understand tail risks of this kind (the possible emergence of QC). Their investment can also be exploited for PR and brand development, defraying the cost, so their interest is rational at this level and should not be interpreted as an indication that they seriously believe that a quantum computing revolution is going to occur.

· The challenges of system integration and development, together with theoretical efforts by classical computer scientists, may erode the advantages of quantum computing.

· It is entirely plausible (based on current progress) to speculate that it will take 40+ years for devices that resemble the concept of a PUQC to become available; 70 years would be unsurprising.

Introduction

All modern computers are in a sense quantum computers, in that semiconductors harness quantum effects to provide the logic that allows programs to be written and run. Semiconductors, along with lasers, transistors and some other devices, are known as “type 1” quantum technology (Prichard and Till 2014). These devices exploit an effect that has been observed and then understood using quantum mechanics. The new wave of technology (Quantum 2!) generates novel effects that can be created only because our understanding of the subatomic world has advanced to a point where it is possible for us to manipulate these microscopic systems into new and exotic behaviours.

Quantum 2 technology includes:

· quantum sensors, such as those used to track submarines (by responding to the change in conduction of a volume of water that includes a massive metal object), and to find buried objects by measuring their impact on gravity

· super accurate compact and low energy timers and clocks

· sensors that can determine inertial force (acceleration)

· quantum information processors (QIP) — including quantum computers (QC) and universal quantum computers (UQC)

The development of all of these technologies interacts, because the manipulation and measurement required to make progress on each of them supports progress on all of them. It is also worth noting that developing these technologies has provided improved insight into the physical world and advances scientific knowledge separately from the development of applications.

Arguably, the greatest near-term value from quantum 2 technology will be delivered by quantum sensors, but these applications are beyond the scope of this review[1].

Quantum Information Processing (QIP)

It’s important to understand that there is a range of devices that could potentially be used to deliver results in computational finance, and that these are not all equally useful or equally difficult to make. They all have a couple of things in common — they must implement a way of storing quantum states (known as qubits). They must implement a way of setting up the state, processing it and then reading it out/measuring it after processing. The level of control and scale of the device (how many qubits, what processing, how fast) defines the power of the device. There are many physical ways of implementing each part of the device (photonics, superconductors, crystals) but the device builders tend to care less about how a subsystem is implemented than the capability that it delivers.

Unfortunately, advocates and boosters of QC tend to equate all of these devices and to use progress in the development of one (simpler) type to make the case for interest in applications that can only be tackled by the implementation of one of the more complex devices. For example, quantum annealers are QIPs, as are quantum simulators. Both of these devices may be of value as scientific instruments or as demonstrators of scientific principle. Neither is (currently) viable as what would generally be understood as a universal quantum computer (UQC).

This is one of the more minor breakdowns in communication between the QC research community and the rest of the world; most QC researchers wouldn’t assert that a quantum annealer should be thought of as a candidate for QC today (many would say that quantum annealers might be turned into UQCs by some breakthrough, but it’s not clear what that breakthrough would be). It is the case, though, that QC researchers and quantum 2 technologists would view quantum annealers as interesting devices that might have great applications in their own right. Unfortunately, in the gap between these two reasonable and honourable positions there is a lot of space for misunderstandings to emerge.

What all of these devices do is make use of quantum 2 technology to create extra “quantum power” which delivers something new and valuable that a classical digital computer can’t. This is where their value lies, but for this value to be realised, this “quantum power” must be controlled and harnessed. This is where the real communication gap has emerged. It is now clear that if “quantum powers” could be completely and skilfully controlled, then a QC could be built; but the gap between that device, even if the research programs of today are realised, and a practical universal quantum computer, which is what most people think a “real quantum computer” would be, is very great.

Quantum Power(s)

QIP can only be implemented on a device that has some sort of added “quantum power” in addition to the information processing power of a traditional computing platform. This power is derived from two features of subatomic systems that are described by quantum physics: superposition, the idea that many possible states of a particle can be combined together and the combined state later collapsed to just one of the possibilities; and entanglement, the idea of a single state that is shared by a number of different particles.

Superposed and entangled states are how “quantum information” is created and represented. A QIP applies a sequence of quantum processes to quantum information to transform it. This can be a process very similar to the logic processing in a classical computer (with different rules), or it can be more analogous to the processes that happen in a flock of birds, the evolution of a population on an island, or the growth and formation of a crystal. QIPs that work in this second way are known as quantum annealers, and these are under development by D-Wave, a Canadian company[2].

A QIP manipulates a superposition before it is collapsed, allowing many different states to be manipulated at once. If a QIP manipulates an entangled superposition, it effectively processes many possible states for many possible items — simultaneously. It’s this many times many simultaneous processing that makes QIPs powerful, because each (quantum) logical step of the processor performs the equivalent of many steps of a classical processor.
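To get a feel for why this matters, consider how a classical machine must simulate a quantum register: n qubits are described by 2^n complex amplitudes, and a single gate acts on all of them at once. The sketch below is a minimal NumPy simulation, purely illustrative and not a model of any real device; it shows a three-qubit register where one Hadamard gate touches the whole eight-amplitude state.

```python
import numpy as np

n = 3                                  # number of qubits in the register
dim = 2 ** n                           # a classical simulation must track 2**n amplitudes

# Start in the |000> basis state
state = np.zeros(dim, dtype=complex)
state[0] = 1.0

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on a single qubit
I = np.eye(2)

# A single-qubit gate applied to the first qubit acts on the whole 2**n-amplitude vector at once
gate_on_register = np.kron(H, np.kron(I, I))
state = gate_on_register @ state

print(f"{n} qubits -> {dim} amplitudes tracked classically")
print("non-zero amplitudes at indices:", np.flatnonzero(np.abs(state) > 1e-12))
```

Adding one more qubit doubles the number of amplitudes the classical simulation must track, which is the intuition behind the “many times many” phrase above.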

The limitation on quantum power is the type of processing that can be described by physics and implemented by a real device. There is only a small set of processing steps that can currently be implemented. Theorists and mathematicians can combine these steps to demonstrate that they have limited powers, and also to demonstrate that no device, including devices not yet developed, would allow processing beyond certain limits. One fundamental limitation that is known is that no QIP will be able to implement “uncomputable” functions, and no QIP will be able to compute functions that are fundamentally “complex” (non-polynomial) in a sub-exponential amount of time[3]. However, there are functions for which we currently have no sub-exponential classical algorithm, but which have not been proven to be fundamentally non-polynomial.

It may be that QIPs can be made that solve these problems in polynomial time, and that the QIP might therefore achieve exponential increases in performance. Some functions, such as factoring integers, are proven to be in this class of complexity; these are known as BQP functions. A BQP function can, in theory, be solved in polynomial time using a quantum computer. Of course, some of these functions can be solved in a relatively small polynomial amount of time, and some of them still need a relatively large number of steps to solve.

Currently (2021), the best QIP devices are on the threshold of what is called the NISQ era: the noisy, intermediate-scale quantum era. NISQ devices are provably real QIPs that can be made to do some marginally useful thing at great cost. Currently, that thing is to describe the change of state in the device itself (quantum circuits) in a way that a supercomputer can’t easily predict (Arute et al. 2019), which is very marginal indeed, but a genuine demonstration that a QIP can do something special, and as such a breakthrough in science.

Now that it is accepted that such a device can be made and does exist, in the near term it is expected that similar devices will scale and improve sufficiently to attack a wider range of problems more successfully. However, to make this progress, four fundamental problems must be overcome.

Problem 1: Noise

The current technology of QIPs means that noise builds up in the quantum states to the point that the effect of the processes on the states is obscured and no result can be read reliably. This problem is so bad that it was believed that intermediate-scale devices (50 to 1,000 qubits) could not be built with any of the technologies used to build the first generation of experimental devices. This blocked progress for several years from 2012 onwards, but it was realised that quantum error correction (QEC) procedures could mitigate the problem, and this has ushered in the NISQ era that is causing so much excitement.

The issue with QEC is that its overhead scales with the dimensions of the quantum state being protected, meaning (in some implementations) that the number of physical qubits required in a system using QEC scales as the square of the number of effective (logical) qubits that can be used.
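Taking the quadratic overhead described above at face value, the arithmetic is easy to sketch. The snippet below is purely illustrative: real error correction schemes (the surface code, for example) have their own constants and code-distance parameters, but the square law alone is enough to show how million-qubit requirements arise from thousand-qubit ambitions.

```python
# Illustrative only: the quadratic QEC overhead described above,
# i.e. physical qubits ~ (effective logical qubits) ** 2.
# Real schemes depend on code distance, gate fidelities and layout.
def physical_qubits_needed(logical_qubits: int) -> int:
    return logical_qubits ** 2

for logical in (10, 100, 1_000):
    physical = physical_qubits_needed(logical)
    print(f"{logical:>5} logical qubits -> ~{physical:>9,} physical qubits")
```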

Additionally, although error correction is demonstrably possible, implementing error correction schemes is an emerging scientific endeavour. For example, Google published a set of experiments showing how error correcting schemes behaved on their Sycamore device in July 2021.

Problem 2: Scale

Current NISQ systems have approximately 70 qubits. It is believed that devices that yield useful results in limited domains will require at least 1,000 qubits. This scaling is extremely challenging, requiring significant breakthroughs in manufacturing to achieve. Moore’s law has conditioned investors and technologists to expect a doubling of computing capacity on a regular heartbeat (say every 3 years), but Figure 1 shows that this is far from the case with quantum technology.


Figure 1. Scaling of quantum computing: a naïve linear fit to progress so far shows a pessimistic scenario where 100 qubits will not be available until the 2040s. An optimistic exponential fit and projection shows 100 qubits in 2023, and progress to 1,000 qubits by 2040. Data for the blue line taken from https://www.statista.com/statistics/993634/quantum-computers-by-number-of-qubits/ and amended for the Google Sycamore announcement in 2021.

Fitting an exponential to the growth in qubits so far, we see that 1,000-qubit machines may arrive in 20 years. A linear fit is only a little worse, however, and using that fit and projection leads to a reasonable expectation that QIP devices will have barely more than 100-qubit capability by 2040.
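The gist of Figure 1 can be reproduced in a few lines. The milestone data below is approximate and chosen for illustration (it is not the Statista dataset behind the figure); the point is simply how far apart a linear fit and an exponential (log-linear) fit land by 2040.

```python
import numpy as np

# Approximate, publicly reported qubit-count milestones, for illustration only
years  = np.array([1998, 2006, 2017, 2018, 2019, 2021])
qubits = np.array([   2,   12,   50,   72,   53,   72])

# Linear fit: qubits ~ a * year + b
a, b = np.polyfit(years, qubits, 1)

# Exponential fit: qubits ~ exp(c * year + d), fitted on log(qubits)
c, d = np.polyfit(years, np.log(qubits), 1)

for target in (2030, 2040):
    linear_proj = a * target + b
    exp_proj = np.exp(c * target + d)
    print(f"{target}: linear fit ~{linear_proj:5.0f} qubits, exponential fit ~{exp_proj:7.0f} qubits")
```

With these illustrative points the two projections diverge by an order of magnitude well before 2040, which is exactly the gap between the pessimistic and optimistic scenarios in the figure.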

As noted, current technology requires error correction codes, and these require a large number of qubits to provide effective storage and results. Engineering processors with 1,000,000 qubits (to escape from intermediate scale) will obviously be a huge challenge.

Problem 3: Decoherence/time

Even if devices with thousands or millions of qubits are developed, the current state-of-the-art devices can’t maintain their state for very long. Currently, QIPs maintain state for up to 26 cycles (Kjaergaard et al. 2020), and this presents a limit on the algorithms that they can run to completion. This constraint imposes a barrier between what a working QIP of a given scale can theoretically do and what it can practically do.

Problem 4: Algorithms

Building from the algorithmic constraints imposed by timing considerations, there are fundamental issues with quantum algorithms (QA) that limit the potential for application of QIPs.

First, many interesting QAs rely on the idea of a quantum RAM (QRAM), where qubits can be written and retrieved fast enough not to delay the algorithm. Current QRAM technology is extremely limited and physically large, often involving terahertz-frequency technology, high power, low fidelity and short (millisecond) storage times. No practical QRAM device has been built, as far as we are aware[4]. This has driven a burst of interest in algorithms that work on quantum data: for example, the control data in a quantum circuit, enabling better control of a QIP, or, more interestingly in terms of practical applications, data collected from a quantum 2 device such as a SQUID, a quantum gravity sensor or a quantum camera.

Secondly, very few of the existing QAs are interesting for solving important problems. The most famous is Shor’s algorithm for factoring integers, which would have an impact on cryptography. Shor’s algorithm is unique at this time in that it looks like an algorithm that can be implemented on projected QIPs without significant architectural advances (QRAM) and will produce exponential speedups. Oversimplifying: factoring an 8-bit integer takes 1 period on a classical machine, 9 bits takes two periods, 10 takes four, 11 takes eight, 12 takes sixteen, and so on; using Shor’s algorithm, 8 bits takes 1 period, 9 takes 1.42, 10 takes 1.95, 11 takes 2.5, 12 takes 3.8, and so on[5].
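A minimal sketch of the arithmetic behind those figures, assuming the classical cost doubles with every extra bit while Shor’s cost grows roughly as the cube of the bit length (the true growth rates are more complicated, as footnote 5 notes), both normalised so that an 8-bit integer costs one “period”:

```python
# Illustrative only: normalised cost of factoring an n-bit integer.
# Assumes classical cost ~ 2**n and Shor's algorithm ~ n**3 (see footnote 5),
# both scaled so that factoring an 8-bit integer costs 1 "period".
def classical_cost(bits: int) -> float:
    return 2 ** bits / 2 ** 8

def shor_cost(bits: int) -> float:
    return bits ** 3 / 8 ** 3

for bits in range(8, 13):
    print(f"{bits:2d} bits: classical ~{classical_cost(bits):5.0f} periods, Shor ~{shor_cost(bits):4.2f} periods")
```

The classical column doubles at every step while the Shor column creeps up, which is what an exponential versus polynomial gap looks like at small sizes.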

It looks likely that Shor’s algorithm will work, will provide a huge advantage using quantum powers and will be practical on current-architecture QIPs, but really… what is it good for? Factoring integers will mean that some current key exchange algorithms in cryptography become insecure, but in fact these could already be fundamentally compromised by the processes used to generate large prime numbers and the chipsets that are widely used to implement them. More secure encryption algorithms that can’t be compromised by integer factorisation are also available[6], as are other encryption systems. There don’t seem to be other applications of integer factorisation at this time. In addition, these results won’t be instantaneous: the state-of-the-art analysis for Shor’s algorithm is that it will take a 20-million-qubit QC 8 hours to factor one 2048-bit RSA encrypted message (Gidney and Ekerå 2021).

Other algorithms have been proposed, and these are of greater interest to markets and financial services. A QA known as HHL was of particular interest, enabling potentially very fast machine learning of the type that is widely used in recommendation and many other AI systems. However, HHL inspired new insights into classical algorithms and has been effectively “dequantised” (Tang 2019): a classical algorithm with comparable time complexity (number of steps per input) has been found, and there is no advantage to using a QIP with this algorithm. This work has provided benefits though, in the sense that it has inspired significant progress in classical computer science. Quantum-inspired work is a practical approach to developing new applications, but it does not rely on a working QIP, just a fundamental understanding of the mechanisms by which a QIP will work. This is a direct way that the QC research program is delivering real benefit now: new algorithms are available for exploitation that would not have existed were it not for the work that QC researchers have done.

Two other algorithms are widely cited as being interesting on QIPs for FS and markets. Quantum Monte Carlo algorithms will be useful for Bayesian reasoning, which is important for portfolio optimisation, and Grover’s algorithm is similar to the searching techniques used in many forms of classical machine learning. Both of these QAs seem secure from dequantisation at the time of writing and feasible[7] as implementation targets for QIP. However, both offer only square-root speedups vs. classical methods. On a large, practical and cheap QIP, this will certainly[8] be sufficient motivation to use them for problems beyond the reach of current methods, but for large inputs there will still be a limit on the practicality of these algorithms.
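To see what a square-root speedup buys (and what it doesn’t), the sketch below compares the expected number of oracle queries for an unstructured search over N items, classically (about N/2 on average) and with Grover’s algorithm (about (π/4)√N). These are textbook query counts, not a claim about any particular hardware.

```python
import math

# Expected oracle queries to find one marked item among N:
# classical brute force ~ N/2 on average, Grover ~ (pi/4) * sqrt(N).
# A quadratic speedup helps, but the quantum cost still grows with N.
for exponent in (10, 20, 30, 40):
    N = 2 ** exponent
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"N = 2^{exponent:2d}: classical ~{classical:.2e} queries, Grover ~{grover:.2e} queries")
```

The quantum column grows more slowly, but it still grows, which is the point made above about very large inputs remaining impractical.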

Figure 2 shows a projected timeline for the adoption of quantum Monte Carlo as projected by a quantum computing start-up.

Figure 2. QCWARE timeline for Monte Carlo solvers using quantum computing as an accelerator.

It’s notable that even advocates of QC project substantial time before appropriate hardware becomes available and are cautious about the impact of the devices on the state of the art.

The final category of algorithm worth considering is the quantum neural network (QNN). These are implementations of neural network architectures using quantum computers, either attempting to train the weights of a classical network using a quantum algorithm or attempting to encode the network as a quantum superposition and implement all elements on a QIP. Interesting progress has been made in this area, but it isn’t clear whether QNNs offer algorithmic quantum advantage[9], and it is clear that there are significant challenges in achieving structures with fidelity to classical neural network architectures.

All four of the fundamental problems outlined above are under attack in the QIP-building community. Progress on these problems will allow QIPs to grow to be used on problems that are currently intractable, but even if they are cracked and a scaled QIP is created, an important challenge remains.

Challenge: System Building

In this discussion, we have carefully focused on QIP, not QC. This is because what can be imagined as a quantum computer is more accurately dubbed a PUQC — a practical universal quantum computer. This is not a device likely to be recognised by QC researchers because it’s so far from the kinds of devices that they are currently contemplating; but it is the imagined device that people pouring money into QC research think that they are buying.

There are differences between an imagined PUQC and a QC as understood to be under development.

· A PUQC will operate over processes of arbitrary (or at least very long) length and will have a memory (QRAM). A universal set of quantum gates is available that will allow arbitrary logic sufficient to simulate a Turing machine and therefore compute any computable function, including BQP functions. But implementing all the gates in the universal set so that they can operate for long enough to execute enough operations for algorithms to run to completion is not so easy. As discussed, a QRAM is required for some interesting quantum algorithms, but also the conceptual model that outsiders have of a PUQC is that it will fluidly process data using different algorithms, in the way that a digital computer has ever since the Manchester Baby. This is hard to imagine without an effective QRAM, and so a gap will remain between what end users imagine will emerge in the future and what the QC community expects.

· A PUQC will have a practical operating infrastructure and management system. In addition to implementing the number of required qubits, scaling a PUQC will require mechanisms that can control and read the state and evolution of the processor. The Google Sycamore machine with 54 qubits requires 277 digital-to-analogue converters interfaced between room temperature and 20 mK. As the processors scale, the number of control elements required will also scale; given that comparable instruments such as the Tektronix AWG5200 series cost approximately $10k each, implementing the control and measurement plane beyond NISQ will present a massive scaling challenge (a rough extrapolation is sketched after this list).

· Current QIPs need to be cryogenically cooled and shielded from vibration and electrical noise. As well as being expensive and logistically difficult, these physical challenges of shielding and cooling point up the challenge of powering these devices; the (potential) tens of thousands of digital and analogue controllers and the cryogenic coolers will require large amounts of power. It does appear to be the case that, problem by problem, a QIP can solve particular problems with great efficiency, if we disregard the infrastructure that is needed to run it. It took five years from the implementation of a practical programmable digital computer (EDSAC at the University of Cambridge) to the implementation of an integrated commercial device (LEO), and as noted, the engineering challenges a PUQC presents are much greater. EDSAC didn’t sit in a pool of liquid nitrogen, and it didn’t take 36 hours to turn it on (Google_Quantum 2020).
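As a back-of-the-envelope illustration of the control-plane point above, the sketch below extrapolates from the Sycamore figures quoted earlier (roughly 277 control channels for 54 qubits, at around $10k per instrument), assuming, very optimistically, that channel count scales linearly with qubit count and ignoring cryogenics, wiring and any multiplexing advances that would change the picture.

```python
# Rough, illustrative extrapolation only: control electronics cost vs. qubit count,
# assuming channel count scales linearly from the ~277 channels / 54 qubits quoted
# above, at ~$10k per instrument. Ignores cryogenics, wiring and multiplexing.
CHANNELS_PER_QUBIT = 277 / 54
COST_PER_CHANNEL_USD = 10_000

def control_plane_cost(qubits: int) -> float:
    return qubits * CHANNELS_PER_QUBIT * COST_PER_CHANNEL_USD

for qubits in (54, 1_000, 1_000_000):
    print(f"{qubits:>9,} qubits -> ~${control_plane_cost(qubits):,.0f} of control electronics")
```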

These challenges and features create two distinct views of what a PUQC will look like.

The simple view of what a practical quantum computer will look like, mirroring the economics of the digital computers of the 1960s and 70s.

Figure 3. The system that would embody a “practical” universal quantum computer as conceived by end users and investors. The device can load classical data for processing in a scalable way; programs can be loaded and run on the machine in the same scalable way, enabling many users to access and utilise the system for different applications. Overall power consumption is similar to large servers and supercomputers. The machine is physically large but can co-exist with other machines in a data centre.

The kind of practical quantum computer that is likely to be developed from the current research program pursued by QC researchers.

Figure 4. The system that would embody a “practical” quantum computer (elements of this diagram are adapted, redrawn and developed from https://spectrum.ieee.org/computing/hardware/heres-a-blueprint-for-a-practical-quantum-computer). Data is introduced into the computer from quantum devices; a fleet of classical devices is required to control and use the QC; and calibrating and configuring the machine for each run requires hours or days. The physical scale of the machine may occupy an entire cloud data centre or research campus.

Figure 3 shows a PUQC. This is the aspirational vision pursued by end users and investors; a machine that can be used as part of a cloud to execute many programs and to process an array of classical data flexibly. The machine will be practical to maintain and will operate with overheads and costs similar to those of a contemporary supercomputer.

Figure 4 shows another PUQC. This is a vision based on a critical reading of the quantum computing literature in 2021. The machine has no QRAM, so quantum data is processed from quantum devices, including other quantum processors. Shielding and cryogenics are required to operate this level of the system. To support access and utilisation of the system, a fleet of devices is required to configure it, and to calibrate and detect the results that it produces. Programs take hours to load, and setting up the system for use can take days, which means that few end users can effectively exploit it. The overall system may require a facility like a cloud data centre to support its requirements for power and management.

These two ideas are caricatures, and every reader will find reasons to disagree with each of them, but both are reasonable to infer from the current state of the art: from what is advertised as the achievements of QC in the literature, but also from what is not advertised and is not easily understood by outsiders. The start-up times for the larger-scale devices like Sycamore are easier to find than the plans for a bypass in The Hitchhiker’s Guide to the Galaxy[10], but only just.

It is worth noting that quantum technologists are not working with a blank piece of paper; the skills and workforce created by the development of digital computing will be invaluable in enabling the program to resolve these issues. If an army of engineers did not exist, then it would be truly unrealistic to imagine that these problems could be overcome. Luckily these engineers do exist, along with the resources and infrastructure supporting vast cloud computing nodes and systems, and therefore, while the challenge of building these devices is daunting, it is conceivable that this will happen if sufficient investment is poured into the task.

National Investment

Since 2014, the UK has invested $1bn in quantum computing research. The USA and EU both launched $1bn investment programs in 2019, and China has committed to build the world’s largest quantum-research laboratory at a cost of $20bn (Biamonte, Dorzhkin and Zacharov 2019).

National governments are motivated to invest on multi-decade timescales. The development of industries such as biotechnology, digital computing and aerospace has unfolded on these terms, and it’s reasonable for policy makers to expect QC and Quantum 2 to do so as well. Quantum 2 technologies in general have national security implications[11]. Work on Quantum 2 is foundational: the technology developments, such as control or noise reduction required to improve a QIP, may have many applications in other Quantum 2 devices.

Private Investment

According to The Economist (The Economist 2020), approximately $600m of venture capital investment in QC was made in 2020, more than double the total for the previous year. More than 87 organisations are trying to construct quantum computers (most are probably building QIPs; probably fewer than 10 are trying to actually build a QC). A number of factors are driving this surge. The first is that sustained investment by Intel, Google, Microsoft and IBM creates a clear exit strategy for limited investments. If a start-up can create a subsystem or component that offers either a real advantage or a significant patent that blocks development of one of the big players’ devices, then a trade sale and profitable VC exit is on the cards. Of course, the bulging VC investment ecosystem also opens the possibility of an exit as one start-up buys another and the ecosystem “shakes out”.

Secondly, the metrics and investment practices that drive VC participation are lighting up green for quantum technology start-ups. VCs like to invest in teams, and because of the intellectual attractions of quantum physics and the (until now) lack of commercial opportunities for quantum physicists, the teams coming on the market smash the criteria that VCs use to assess them. Thirdly, the growth in both patent numbers (382 in 2009, 1,799 in 2018 (Williams 2021)) and investment from government sources (see previous section), as well as from other VCs, provides the evidence required to unlock participation from funds and enables the VCs to raise funds for QC investment.

Interest from Large Market Players

Several large banks and private equity managers have established teams working on QIP and related technology. There are two business processes in financial markets that depend on massive computing power: pricing derivatives and allocating assets to investment portfolios. Both of these are notionally tractable with QAs (Grover’s and Quantum Monte Carlo), and proprietary QAs or procedures using QIPs and classical machines together could conceivably be developed for these applications. The business value of such developments would be very clear. The compute bill for pricing and asset management is very large, and could conceivably pay for current-generation QIP access, so competitive performance could be worth having on a cost basis alone. Of course, a process that produced accurate pricing faster than is currently possible using classical computing could enable higher-risk trading and also allow an institution to outpace other, less sophisticated players, in much the same way that banks allegedly front-run retail investors in dark pools and HFT now, but legitimately (if using a secret QIP is legitimate) on the public markets.

It has been suggested that such technology would be “kept under the hat” (The Economist 2020), but a major player that did secure such an advantage, or was close to securing it, would surely have to disclose its capabilities to its regulators and central bank. Measures to prevent market disruption, as other participants were bankrupted or withdrew to avoid bankruptcy, would have to be taken! Additionally, it’s just not plausible that confidentiality would be maintained given the number of decision makers and the scope of such a step forward. The advent of a practical QIP for use in financial markets would be heralded with a lot of gossip at the very least.

There are other, more plausible, motivations for the big players to be engaged. First, ramping up for a PUQC will not be simple; building a culture and infrastructure that can work with machines that do become available will take years. Secondly, working on QC is good PR: asset and investment managers can convince funds that they have a cutting-edge attitude and capability, which is good for sales. Thirdly, there is a tail risk that a clutch of breakthroughs could pull the timeline for a PUQC sharply forward. A lot of money is funding a bunch of brilliant scientists who have spent their careers complaining that they could do much more, much faster, if they just had more cash. Maybe they’re right, and if you are sitting on $100bn of assets at risk, it’s rational to invest a few million dollars in covering that eventuality.

Conclusion

Quantum 2 technology is one of the most exciting frontiers of progress for the 21st century and is likely to create a number of significant new devices that will empower and enrich humans. Quantum information processors are one of these devices. They have been demonstrated to be theoretically interesting and possible on a small scale, but for QIP to be translated into practical universal quantum computers, four fundamental challenges (noise, scale, decoherence and algorithms) must be overcome with new science, and significant related engineering challenges must be tackled as well.

State and private investors and end users are demonstrating strong interest in the prospects of real quantum computing as they understand it, but rational and cyclical drivers for this can be identified without the need to believe that there is extra information or better-informed judgement being exercised behind the scenes. It doesn’t seem likely that “they know something we don’t”, although some of them might think they do. Rather, it seems that there is a breakdown in communication between the people doing real work trying to build real quantum computers and the people funding and writing about them.

In the minds of investors and potential end users, a quantum computer will share the practical and universal qualities that make digital computers so useful. This is quite reasonable. “Computers” as we know them are often identified as having come into being with the development of the Manchester Baby. Babbage’s analytical engine, the Z3 and the ENIAC “don’t count”, even though they computed! The Manchester Baby was rapidly translated into the Ferranti Mark 1 and made available to solve commercial problems because it had these properties. It could load and process data, run arbitrary programs and was reasonably practical to manage.

This is what almost all people outside the QC field believe will happen when a real quantum computer is developed. The reality could be radically different; unfortunately this is not clear from the literature, and a lot of effort is required to read it critically so as to understand the practical issues and aspects of the machines being developed. For example, it is claimed that the power consumption of QIPs is very low, which makes sense because they are superconducting devices and power does not dissipate in superconductors. On the other hand, it’s not easy to pin down exactly what the power consumption of the fleet of control devices required to run them is, for how long this power requirement is imposed when commissioning and calibrating the device, or how this will scale in the future.

So, what are the prospects for QC now? While optimists point to the astonishing development of classical computing in the late 20th century to illustrate that these problems can be overcome, pessimists might point to nuclear fusion and manned space travel as examples that show that humans can’t always make what they can imagine and understand.

A realist might also look at what has happened with molecular biology and posit that, given time, extraordinary science and patient investment, practical technology will emerge from fundamental understanding. In 1953, the helical structure of DNA was understood; the Human Genome Project was completed in 2003; and mRNA vaccines saved the world in 2021. It seems completely realistic to imagine that the development of a PUQC may also take 70 years.

Based on the rapid progress that has been made and the vast investment being poured in by governments and private enterprise, it seems possible that a PUQC will be created in 20 years. The most optimistic prediction backed by any form of track record is being made by PsiQuantum[12], which claims that it will have a photonics-based machine in five years. Google predicts a practical QC in approximately 2029. Perhaps the necessary engineering and scaling to develop a PUQC will then rapidly unfold. More likely, given the current rate of progress and the speedbumps that remain, it will take significantly longer for a workable implementation of the technology to emerge. If a working machine does appear in the near term, it will likely not be in the form imagined in Figure 3 and could well be much closer to Figure 4. The tail risk is not that PUQCs will fail to emerge in the next 40 years; it is that they will emerge any time before then.

Bibliography

Arute, Frank, et al. 2019. “Quantum supremacy using a programmable superconducting processor.” Nature 574 (7779): 505–510.

Biamonte, J.D., P. Dorzhkin, and I. Zacharov. 2019. “Keep Quantum Computing Global and Open.” Nature (comment). https://www.nature.com/articles/d41586-019-02675-5.

Gidney, Craig, and Martin Ekerå. 2021. “How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits.” Quantum 5: 433.

Google_Quantum. 2020. “Supplementary information for ‘Quantum supremacy using a programmable superconducting processor’.” arXiv, January 1. https://arxiv.org/pdf/1910.11333.pdf.

Kjaergaard, M., et al. 2020. “Superconducting Qubits: Current State of Play.” Annual Review of Condensed Matter Physics 11: 369–395.

Prichard, J., and S. Till. 2014. “UK Quantum Technology Landscape 2014.” EPSRC UK. https://epsrc.ukri.org/newsevents/pubs/dstl-uk-quantum-technology-landscape-2014/.

Tang, Ewin. 2019. “A quantum-inspired classical algorithm for recommendation systems.” Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing: 217–228.

The Economist. 2020. “Commercialising Quantum Computers.” The Economist, September 26. https://www.economist.com/science-and-technology/2020/09/26/commercialising-quantum-computers.

Williams, L. 2021. “Quantum Computing: Believe the Hype.” Investment Monitor, March 1. https://investmentmonitor.ai/ict-and-internet-infrastructure/quantum-computing-believe-the-hype.

Footnotes

[Footnote 1] The practical nature of quantum sensors explains why there is significant support from government for quantum information processors that act over quantum data vs. classical data encoded into quantum states.

[Footnote 2] It is possible both to buy outright and to rent time on D-Wave devices. So far, to our knowledge, while D-Wave devices have been shown to outperform non-specialist classical devices on particular problems, they have not been shown to outperform equivalent classical resources, such as a similarly priced supercomputer, on these problems. For example, there is no result showing better performance on a problem vs. an optimised supercomputer that cost $10m, or even a large cluster running on AWS.

[Footnote 3] Functions are how sets of inputs are mapped to sets of outputs. Performing these mappings (working out the outputs from the inputs) requires a minimal number of steps. For some “complex” functions this number of steps grows at least as fast as x^n for input size n. This is known as a non-polynomial function, and it is believed that there is no trick that makes these functions solvable in fewer than x^n steps. If such a trick exists, a host of problems could be solved very much faster than is currently possible. This would have profound philosophical implications for the meaning of time and infinity, which would change our view of reality significantly. Because of this, most people think that P != NP even though they can’t prove it.

[Footnote 4] In the sense of a device that encodes classical data into a superposition in linear time at scale, and can store a large number of qubits for long enough to be used in an algorithm.

[Footnote 5] In fact, the growth in both cases is more complex, but the point is the same. The time to execute Shor’s algorithm on a QIP will grow in polynomial time, whereas the time to factor on a classical machine grows (as far as we know) in exponential time.

[Footnote 6] Post-quantum public key encryption can be implemented, but it is much less efficient than RSA or Diffie-Hellman; novel quantum key distribution systems can also be implemented.

[Footnote 7] The author does not understand how quantum Monte Carlo can be implemented without QRAM but this seems to be the case from looking at the literature.

[Footnote 8] Although Monte Carlo methods offer approximate solutions — so if you are willing to accept a larger error margin, a classical machine might still be good enough.

[Footnote 9] There are simulations that indicate that there is a quantum advantage over quantum data in terms of the quality of the results (generalisation) that can be obtained. However, we have not found literature that offers proof of speedup or improvement in generalisation from QNN algorithms.

[Footnote 10] They were in a disused toilet with a sign on the door that said “beware of the leopard”, but the Sycamore startup times are in the additional material for their Nature paper.

[Footnote 11] For example, Astute Submarines use a quantum compass to navigate without GPS (to within 1 m worldwide). Compact atomic clocks using Quantum 2 have applications for targeting and communications. Sensors such as SQUIDS, Quantum Radar and Gravity Sensors are also significant.

[Footnote 12] https://psiquantum.com/about


Written by Simon Thompson, Head of Data Science at a Financial Services Consultancy in the UK. These are my personal views, not the views of my employer.
