Navigating the Quantum Computing Frontier

By David C. Brock

Computer History Museum
Core

--

CHM’s Center for Software History Director David C. Brock in conversation with Microsoft Research’s Matthias Troyer, Google Quantum AI Lab’s John Martinis, and IBM Research’s Pat Gumann.

Perhaps you are like me: You’re aware that quantum computing is a hot topic today but have a nagging feeling that you don’t really have a good picture of what it’s all about. Sure, you know it has something to do with the unintuitive behavior of the world described by quantum mechanics — cats in boxes that are blends of alive and dead until you look inside, and photons that coordinate their properties instantaneously over great distances and that are sometimes a particle and sometimes a wave. And you also know that somehow in this weird behavior, researchers see the possibility for a new kind of computer that could accomplish feats that computers like the ones you own could never dream of doing. Oh, and you know there is something about these quantum computers being able to break all the codes.

On August 15, 2018, I had the chance to sit down with three of the best people you could hope to meet if you wanted help getting a better grasp of what the story of quantum computing is really all about. We called the event “Quantum Questions,” because this was our chance to ask some of the world’s leading researchers to help us understand what quantum computing is really about, where it stands today, what most of us are getting wrong about it, and what it might mean. They did not disappoint.

Our trio represented some of the most prominent efforts in realizing both working hardware for quantum computers — quantum processors — and the new forms of software for using them. Pat Gumann joined us from IBM Research. John Martinis represented Google’s Quantum AI Lab. And Matthias Troyer brought the perspective from Microsoft Research. It is a wonderfully candid and clear discussion of our quantum questions and what life is like on this research frontier.

“Quantum Questions: Microsoft Research’s Matthias Troyer, Google Quantum AI Lab’s John Martinis, and IBM Research’s Pat Gumann,” July 25, 2018.

I was very surprised to learn that it was the famous Caltech physicist Richard Feynman who first opened the eyes of many researchers to the idea of using quantum mechanics to calculate, and to the very notion of a “quantum computer.” In the 1990s and early 2000s, researchers in the then-hot topic of nanotechnology also traced the origins of their field back to a talk by Feynman. For the quantum computing community, it was a 1981 conference talk, “Simulating Physics with Computers,” published in the International Journal of Theoretical Physics the next year, that really opened up the field. (The official copy can be found here. There are others.)

In his paper, Feynman makes the argument that electronic digital computers as we know them will never be effective tools for making useful calculations about the parts of the world described by quantum mechanics. Not only are some quantum mechanical situations too complex to ever be calculated by conventional means — even if you turned the entire universe into a digital computer and ran it for the age of the universe — but Feynman argues that even for some very simple quantum mechanical situations, digital simulations will produce the wrong answers.
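To get a rough sense of the scale Feynman was pointing at, consider a back-of-the-envelope sketch (mine, not his, and framed in today’s language of “qubits” rather than his own terms): simply writing down the state of n interacting two-level quantum systems on a classical machine takes about 2^n complex numbers, so the bookkeeping alone quickly overwhelms any conceivable memory.

```python
# Illustrative sketch only: the exponential-scaling argument in rough numbers.
# Naively simulating n interacting two-level quantum systems ("qubits," in
# today's terms) requires storing 2**n complex amplitudes. At 16 bytes per
# double-precision complex number, memory grows exponentially with n.

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 30, 50, 80, 300):
    amplitudes = 2 ** n
    gigabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e9
    print(f"{n:>3} two-level systems -> {amplitudes:.3e} amplitudes "
          f"(~{gigabytes:.3e} GB)")

# Around n = 50 the state vector already strains the memory of the largest
# supercomputers; by n = 300, 2**n exceeds the estimated number of atoms in
# the observable universe.
```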

Feynman proposed, in the face of this limitation of digital computing, to simply change the way we compute in order to explore the parts of the world dominated by quantum mechanics. His thought was to use “quantum mechanical systems” to make calculations about other quantum mechanical systems. Which quantum mechanical systems could serve as the “quantum computer elements” he did not say, nor even speculate. Today’s researchers in quantum computing are trying to find the best answer to that question. Interestingly, for Feynman, part of the appeal of quantum computing was that in it he saw a new way to explore physics itself, a way to learn new things about how the world works. For the representatives of today’s quantum computing community who joined our August 15 panel, their research encompasses this very same aspect. Today’s quantum computing is not only a complex engineering challenge requiring creativity and care, but also a big experiment in physical science in which we are learning new things about our world. Some of these new lessons may make the path to more powerful quantum computing clearer and easier. Some may severely disappoint. Only continued effort will tell.

Lastly, I was particularly struck by the panel’s idea that quantum computing may represent the opening of a new chapter in the story of humanity’s 5,000-year relationship with calculation. Contemporary scholarship suggests that humanity’s invention of numbers is at least as old as our engagement with reading and writing, likely older. As the historian Lorraine Daston puts it in a recent essay, “[O]ur oldest evidence for writing systems, for example from ancient Mesopotamia and the Mediterranean, suggests that alphabets are parasitic upon numerals. Somewhat disappointingly, many of the earliest surviving texts in Sumerian (ca. 3500 BCE) and other ancient languages record not epics like the Gilgamesh and the Iliad but rather what sound like merchants’ receipts: five barrels of wine, twenty-two sheepskins, and so on. The earliest use of reading and writing appears to have been to keep track of calculations, mostly for commercial and administrative purposes.” (Download Daston’s remarkable essay here.) The deep history of the spreadsheet is deep indeed.

The anthropologist Caleb Everett believes that the invention of number was a leap from recognizing the common patterns between our bodies — literally the digits on our hands — and objects in nature to a symbolic representation of this relation: a word, a mark. From this, the possibility of calculation opens. (See a nice interview with Everett about his work here.) If our relationship to number and calculation up to the present, and all that it has afforded (not least our digital world), rests at root on this intuition of our bodies in the world, it would indeed seem that quantum computing could require us to develop new intuitions through new experiences to make the fullest use of what it may afford. In the end, if it is successful, quantum computing should open up even more quantum questions than it answers. I’ll continue to watch, and wonder.

About the Center for Software History

The purpose of the Center for Software History at the Computer History Museum is to collect, preserve, and interpret the history of software and its transformational effects on global society.

Software is what a computer does. The existence of code reflects the story of the people who made it. The transformational effects of software are the consequences of people’s creation and use of code. In the stories of these people lie the technical, business, and cultural histories of software — from timesharing services to the cloud, from custom code to packaged programs, from developers to entrepreneurs, from smartphones to supercomputers. The center is exploring these people-centered stories, documenting software-in-action, and leveraging the Museum’s rich collections to tell the story of software, preserve its history, and put it to work today for gauging where we are, where we have been, and where we might be going.

About the Author

David C. Brock is an historian of technology and director of the Center for Software History at the Computer History Museum. He focuses on histories of computing, electronics, and instrumentation, as well as on oral history. Brock’s work in the history of semiconductor electronics includes Thackray, Brock, and Jones, Moore’s Law: The Life of Gordon Moore, Silicon Valley’s Quiet Revolutionary (Basic Books, 2015); Lécuyer and Brock, Makers of the Microchip: A Documentary History of Fairchild Semiconductor (MIT Press, 2010); and Brock (ed.), Understanding Moore’s Law (CHF, 2005). He has served as a writer and executive producer for several recent documentary shorts and hour-long television documentaries, including “Moore’s Law at 50,” “Scientists You Must Know,” “Gordon Moore,” and “Arnold O. Beckman.” Brock is on Twitter @dcbrock.

Originally published at www.computerhistory.org.
