How Ewin Tang’s Dequantized Algorithms are Helping Quantum Algorithm Researchers

Published in Qiskit · Mar 15, 2023
Ewin Tang (Illustration: Russel Huffman)

By Robert Davis, Technical Writer, IBM Quantum

Most quantum researchers eagerly look forward to a day when quantum computers will outperform classical computers at some useful task. At the same time, perhaps a little paradoxically, quantum researchers also want classical computers to keep beating expectations and keeping pace with quantum algorithms, since classical computation is so important for verifying the results of quantum experiments. Only rarely, however, do we find this tension encapsulated in the work of a single researcher.

Take 22-year-old University of Washington computer science PhD student Ewin Tang, for example. Tang first came to prominence in the quantum research community in 2018 with the publication of her undergraduate senior honors thesis. That paper investigated a quantum machine learning (QML) algorithm that had been published just two years prior, and which claimed to deliver an exponential speedup over all known classical techniques for the same task. At just 17 years old, Tang developed an alternative that delivered an equivalent runtime using only classical methods, effectively “dequantizing” the original QML algorithm. In the years since, she has built a reputation for dequantizing a number of additional QML algorithms claiming similar speedups.

So, how can research that disproves quantum speedups actually help push the field of quantum computing forward? Conversations with Tang herself suggest the answer might not be as obvious as you’d expect.

Readers who liked this quantum algorithm may also enjoy…

If you’ve ever purchased a product through an e-commerce platform, or watched a video on a streaming website, you’re likely familiar with recommendation algorithms, which use data on consumers’ past purchases or product ratings to generate personalized recommendations for new products they might enjoy. In 2016, a pair of well-established quantum researchers developed a quantum recommendation algorithm, which they believed could tackle these problems exponentially faster than any classical technique.
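
To make the setting concrete, here is a toy, entirely classical version of the problem these algorithms target, sketched under the standard assumption that the ratings matrix is approximately low-rank. The matrix values and rank below are illustrative only, not drawn from the original paper:

```python
import numpy as np

# Toy user-by-product ratings matrix; 0 marks an unrated product.
# Recommendation models typically assume such matrices are close to low-rank.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

# Rank-k truncated SVD as a simple low-rank "preference" model.
k = 2
U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Recommend, for each user, the unrated product with the highest predicted score.
for user in range(ratings.shape[0]):
    unrated = np.where(ratings[user] == 0)[0]
    best = unrated[np.argmax(approx[user, unrated])]
    print(f"user {user}: recommend product {best}")
```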

The next year, in early 2017, Ewin Tang began taking famed computer science researcher Scott Aaronson’s Introduction to Quantum Information course at the University of Texas at Austin. On his Shtetl-Optimized blog, Aaronson recounts how Tang quickly caught his attention as a promising young student, prompting him to offer her an independent project.

As Tang recalls, Aaronson’s offer was as much a warning as it was an invitation. “The first time I was introduced to quantum machine learning, I think it was [Scott Aaronson] who said something to the effect of, you know, ‘it’s a messy field, but … cleaning that up is a perfect research project for grad school,’” she said.

In the original paper, the authors of the quantum recommendation algorithm make a persuasive argument that their method offers a speed-up over all possible classical analogues, but they do not actually prove this to be true. Aaronson asked Tang to do exactly that: to find a lower bound on the number of queries any classical recommendation algorithm would need to perform the same task, and thereby prove that no classical algorithm could match the quantum recommendation system’s runtime.

Tang would ultimately fail at this task, albeit in the best possible way. After spending months working to prove the original paper’s authors right, she came to suspect that she could, in fact, devise a classical algorithm that matched the runtime of the authors’ quantum method. Eventually, she did, replacing the authors’ quantum linear algebra with classical sampling techniques to create a classical algorithm that ran exponentially faster than any previously known classical technique.
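
The core classical primitive here is ℓ², or “length-squared,” sampling: drawing row indices of a matrix with probability proportional to each row’s squared norm, much as measuring a quantum state returns basis states with probability proportional to their squared amplitudes. The sketch below is a minimal illustration of that primitive, in the spirit of the sampling-based low-rank approximation literature, not Tang’s actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def length_squared_sample_rows(A, num_samples):
    # Probability of picking row i is ||A_i||^2 / ||A||_F^2,
    # mirroring the Born-rule probabilities of a quantum measurement.
    row_norms_sq = np.sum(A**2, axis=1)
    probs = row_norms_sq / row_norms_sq.sum()
    return rng.choice(A.shape[0], size=num_samples, p=probs), probs

# Rescaled sampled rows give an unbiased, much smaller sketch of A^T A.
A = rng.standard_normal((1000, 50))
idx, probs = length_squared_sample_rows(A, num_samples=200)
S = A[idx] / np.sqrt(200 * probs[idx, None])
relative_error = np.linalg.norm(S.T @ S - A.T @ A) / np.linalg.norm(A.T @ A)
print(f"relative error of the sketch: {relative_error:.3f}")
```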

The ‘trick’ to dequantizing quantum machine learning algorithms

After publishing her thesis work, Tang graduated from UT Austin and went on to become a computer science PhD student at the University of Washington, where she is based today. In the fall of 2019, she published a follow-up to her thesis project, a second paper in which she and her co-authors demonstrate a simple framework for dequantizing a wider class of quantum machine learning algorithms — not just the original recommendation algorithm.

Tang was able to do this, at least in part, because of a quality that many quantum machine learning algorithms share: a reliance on pre-processed data. “The idea is that we have these quantum algorithms,” she said. “Quantum researchers want to find quantum advantage. So, what they do is they take a classical problem — say, recommendation systems — and then they do some pre-processing to put it in a form that the quantum algorithm can operate on.”

In other words, QML algorithms can only operate on classical data if that data is first encoded as a quantum state — i.e., if the classical data is pre-processed. Often, when QML researchers claim a quantum speed-up for a new algorithm, they do not include the computational steps needed to pre-process the classical data as part of their QML algorithm’s runtime, and they compare their algorithms to classical alternatives operating on raw, unprocessed data.
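
For a vector, the usual form this pre-processing takes is a binary tree over the squared entries, which supports both entry lookups and samples proportional to squared magnitude in logarithmic time. The sketch below is one plausible minimal implementation of that kind of “sample and query” access, not the exact data structure from any particular paper:

```python
import numpy as np

class SampleQueryAccess:
    """Binary tree over a vector's squared entries, supporting O(log n)
    updates and O(log n) sampling of index i with probability
    |v_i|^2 / ||v||^2 -- the classical analogue of the access that
    quantum state preparation assumes."""

    def __init__(self, v):
        self.n = len(v)
        size = 1
        while size < self.n:
            size *= 2
        self.size = size
        self.v = np.zeros(size)
        self.v[: self.n] = v
        # tree[1] holds ||v||^2; the leaves hold the squared entries.
        self.tree = np.zeros(2 * size)
        self.tree[size : 2 * size] = self.v**2
        for i in range(size - 1, 0, -1):
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def query(self, i):
        return self.v[i]

    def update(self, i, value):  # O(log n): fix one leaf, then its ancestors
        self.v[i] = value
        j = self.size + i
        self.tree[j] = value**2
        while j > 1:
            j //= 2
            self.tree[j] = self.tree[2 * j] + self.tree[2 * j + 1]

    def sample(self, rng):  # O(log n): descend, branching by subtree weight
        j = 1
        while j < self.size:
            left = self.tree[2 * j]
            j = 2 * j if rng.random() * self.tree[j] < left else 2 * j + 1
        return j - self.size

rng = np.random.default_rng(1)
sq = SampleQueryAccess(np.array([3.0, 0.0, 4.0]))
print([sq.sample(rng) for _ in range(10)])  # 2s and 0s, in a ~16:9 ratio
```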

Tang’s work shows that, at least sometimes, you can achieve equivalent speedups using only classical resources if you perform the same pre-processing on the data you feed into your classical algorithm. In her 2018 dequantized recommendation algorithm, for example, it’s this pre-processing that makes it possible to apply classical sampling techniques to the data and achieve a significant speedup over previous classical techniques.

This, incidentally, is also why you won’t be seeing Tang’s dequantized algorithms replacing the algorithms that companies actually use. “If you are a classical person, you would not do this pre-processing,” Tang said. “I have to use this quantum-type pre-processing with these data structures and these formats that are amenable to quantum computers, but this is not necessarily the right way to do it from a classical perspective.”

Tang went on to explain that if you were to actually run her quantum-inspired recommendation algorithm, or any of the other dequantized algorithms she’s developed, you would get something that’s actually fairly slow. “I mean, it’s not wildly slow compared to other things that are similar, but it doesn’t give an advantage because you’re sort of going about it in this roundabout way, rather than going the direct route to the recommendation or whatever it is that you wanted,” she said.

Cleaning up the field of quantum algorithm development

Tang says the goal of her research isn’t to develop classical algorithms that are useful in the real world. Instead, her motivation has more to do with that “messiness” Scott Aaronson described to her all those years ago.

“As a field, quantum computing and quantum machine learning are pretty messy,” Tang said. “There’s a lot of work where it’s unclear what the purpose is. You’re talking about things that won’t really be possible until a distant future, and it’s not clear whether that future will actually come to pass.”

Tang doesn’t think this is a bad thing, necessarily, and says that it’s true for many other fields of research as well. Indeed, this sort of “messiness” is common in emerging fields like quantum computing, where researchers are just getting started figuring out what’s possible, and where advances in theoretical research often require broad assumptions that may later prove false.

At the same time, Tang argues that these areas of research benefit a great deal from outsiders who challenge those assumptions, and who push the field to embrace greater scientific rigor as it matures. In her view, the effort to create dequantized classical algorithms doesn’t just help the research community understand the limitations of quantum computing; it also helps researchers better understand quantum computing’s possibilities.

For example, while Tang and her co-authors were able to dequantize a number of QML algorithms in her 2019 paper, they also found that the algorithm for quantum topological data analysis proved resistant to her dequantization techniques. Topological data analysis is a subfield of machine learning that makes it possible to envision the “shape” of a data set, which in turn is useful for understanding the data as a whole, and for considering questions like whether there are holes in the data.
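
The simplest topological invariant of this kind is the number of connected components of a data set at a given scale (the zeroth Betti number); higher-order invariants count the holes and voids Tang mentions. As a toy illustration, unrelated to the quantum algorithm itself, here is one way to count components with a union-find over a distance-threshold graph:

```python
import numpy as np

def connected_components(points, radius):
    """Count connected components (the 0th Betti number) of the graph that
    links any two points closer than `radius` -- the simplest of the
    invariants topological data analysis tracks across scales."""
    n = len(points)
    parent = list(range(n))

    def find(i):  # union-find root lookup with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < radius:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(n)})

# Two well-separated clusters: two components at a small radius, one at a large one.
rng = np.random.default_rng(2)
cloud = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
print(connected_components(cloud, 1.0))   # 2: the clusters stay separate
print(connected_components(cloud, 10.0))  # 1: everything merges at this scale
```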

“It’s nice to see that I did this work, and [since then], there’s been a lot of work on that particular algorithm to try to get it to a state where quantum computers can try to use it to get an advantage,” Tang said.

In a way, it’s like Ewin Tang is conducting her own topological data analysis of the state of quantum machine learning, probing the shape of the entire field to help quantum and classical researchers alike better understand where it begins and ends. It may be messy, but for now, that’s fine with her. As Tang says herself, “I don’t know if I would want it any other way.”
