Quantum Optimizations at Qubits, Day 1

D-Wave’s Quantum Annealing Success Story

Nicholas Teague
From the Diaries of John Henry
Jun 21, 2024


The following notes represent an unofficial meeting minutes of sorts, recording the discussions presented at the Qubits Conference hosted by D-Wave Inc on June 17, 2024 in the beautiful city of Boston, MA. (A writeup for Day 2 of the conference follows in a separate publication to be linked below.) The conference focused on quantum annealing, the adiabatic form of quantum computation that D-Wave has been pioneering for optimization applications for going on 25 years, and the scope of the proceedings was more aligned with the conventions of a commercial user meetup (unlike some of the research venues that this blog has previously visited). The technical breadth of this writeup will fall on the lighter side — there are more traditional academic venues in this domain, like the Adiabatic Quantum Computing conference (AQC), if you are research inclined. As will likely become apparent from the reading, it is an exciting time for the commercialization of quantum computing, and for D-Wave in particular given their extremely differentiated contributions to the field of high performance computing.

The format of this writeup will quite simply adapt various presentations into a “key takeaways” form of prose, following the published agenda and sequence of speakers from the conference — which the author has attended in person for a second year running. The author has also sat in on several other forms of training offered by D-Wave and so considers himself literate in their product line, which includes the Ocean API for high bandwidth / low latency optimization, the Leap cloud service for quantum solver access, and the Launch professional services available to support organizations along their journey from developing proofs of concept all of the way to scaled deployment.

As a disclosure, please note that the author has active investing interests in D-Wave through their publicly traded $QBTS equity. He is a long term / passive investor and has no intent of using this publication for purposes of “trading”. He offers this writeup as an attempted contribution to the field as well as a thank you to the employees for their kindness and hospitality in these events.

Great, so without further ado:

Opening Keynote

Dr. Alan Baratz, D-Wave CEO and selected guests, D-Wave

D-Wave finds themselves in the center of a pivotal moment in the history of quantum computing, or the history of computation in general for that matter. As the first firm to offer a commercial quantum computing product to industry, they have followed a long and winding road over the last 25 years to get here, one that started purely in the realm of research and theory but has over time progressed to include hardware development, full stack software integration, cloud computing, and extending now into professional services available to support your organization’s unique road to business model innovation.

Now, as their platform is finally catching up to approach the full promise of quantum supremacy that we nerds have been anticipating for so long, it feels like we are at the front row of history, watching a paradigm shift unfold in real time. The curtain pulled back to reveal this stage before us and the hint of an untold number of additional curtains to come.

Owning one of the largest patent portfolios in the field of quantum computing, D-Wave is mostly known for their quantum annealing technology, which represents a fundamentally different model for quantum computing compared to the rest of the field. Quantum annealing, aka adiabatic quantum computing, is expected to have enduring advantages over other quantum computing conventions for those subclasses of applications involving optimization, as will be demonstrated further in talks to follow. Put differently, quantum annealers are designed to be native optimizers, and as a result outperform gate based circuits not just through the massive scale of qubit counts available from D-Wave’s hardware relative to other conventions, but also because the optimization algorithms available to gate based quantum computers suffer from slower speeds and lower solution quality.

That is not to say that there will not be a marketplace dedicated to the utilization of gate model quantum computers available from competing firms. For example, although annealers are awesome for optimization, gate based circuits likewise have inherent advantages, notably for those applications that can be modeled by systems of differential equations. Although the annealing product’s strengths lie outside that realm, D-Wave does have a previously announced gate model system in development in parallel, and partly owing to countless channels of technological synergy with their annealing circuits it already has performance properties matching demonstrations with comparable (fluxonium qubit) designs reported in peer reviewed literature. With a logical qubit prototype on hand it is reasonable to expect that D-Wave’s gate model hardware will be competitive with the industry when released, which will make them the only quantum computing vendor with local integration for algorithms that leverage both gate and annealing forms of circuitry.

The surrounding context of the user conference was not just to restate and celebrate these talking points, however valid they may be as a demonstration of technological superiority. It also comes along with a host of product advances that are compounding such capabilities one after the other. Consider D-Wave’s new Advantage2 platform, which is currently midway through a staged rollout. Not only are the hardware specifications staggering in the context of the rest of the industry (4800+ working annealer qubits, with a 7000+ qubit scale now being fabricated), the surrounding metrics are likewise significant. The Advantage2 has doubled qubit coherence times, meaning qubit superposition is now accessible through longer annealing times; the qubits are more densely coupled, with 20-way connectivity between nodes; and importantly the systems are capable of a 40% higher energy scale — which is a way of saying that the solvers have improved access to finer grained regions of an optimization problem’s fitness landscape, yielding more diverse distributions of sampled solutions.

These hardware based innovations are coupled with the new fast anneal convention, derived from a ground up rewrite of the software stack for performance and speed. The implication of a faster software stack is that for a greater proportion of its runtime the computer maintains a coherent state of superposition, realizing a quicker convergence to an optimal solution. The forms of control enabled by fast anneal have likewise for the first time enabled the demonstration of a violation of the Bell inequality on annealing hardware, a significant milestone and form of validation for the hardware.

The fast anneal software update was a significant contributor to the historic results reported in a recent preprint currently undergoing peer review [Andrew King et al 2024, arXiv:2403.00910], in which D-Wave has for the first time in the field demonstrated what can only be described as quantum supremacy, a term here referring to D-Wave’s annealers (an early Advantage2 prototype with ~1222 qubits) being benchmarked solving a useful / real world class of problems in about 10 minutes that one of the world’s most powerful supercomputers could be expected to take over a million years to complete.

It is worth pausing for a second to let that sink in.

This scaling advantage of quantum computers is most commonly discussed in this manner, reporting lengths of time to complete some sophisticated calculation. That isn’t the only framing though. Consider that across each generation of D-Wave product rollouts, the successive generations have retained an effectively constant scaling of energy demand. That problem that would have taken a million years for a GPU cluster to solve? It would have also taken about the current annual world energy demand to complete on classical hardware. It took D-Wave’s annealers around 12 kWh (about the same amount of energy as you might draw over the course of a year watching television a few hours a day). In a world with data center power consumption climbing by whole percentage points of global energy demand from year to year…

It is worth pausing for a second to let that sink in.

D-Wave embarked on a 25 year journey towards building a sustainable and differentiated quantum computing business, the proverbial determined tortoise racing a field full of hares. The world’s first quantum computing business. Sales of the world’s first quantum computers as early as 2011. The Leap cloud service introduced in 2018. With every milestone and high water mark there have of course been detractors coming from every direction. Was D-Wave actually utilizing quantum phenomena? (Yes, see their Nature paper demonstrating quantum tunneling.) Was D-Wave useful compared to tensor networks? (Yes, see the recent benchmarking results.) Will annealing ever perform better than quantum Monte Carlo? (They have now demonstrated a scaling advantage.) Won’t error corrected gate model computation address this scope? (Scalable error correction for gate model systems is still far from realization.) Why can’t NISQ era quantum computers address optimization? (Noise makes these systems impractical to scale.)

The D-Wave tortoise has slowly plotted a long and methodical course down the winding road to quantum supremacy. Commercial quantum annealers capable of solving business scale applications are finally here, and D-Wave owns a commanding lead as they sprint across the finish line with fresh legs and a victory smile.

Computational Supremacy

Andrew King, D-Wave Senior Distinguished Scientist, D-Wave

Quantum supremacy has at times been an ambiguous term in the field of quantum computing, where various firms have from time to time attempted to publish benchmarks making similar sounding claims. An issue with prior claims is that they have involved benchmarks with uninteresting and contrived problem class setups. The D-Wave standard for quantum supremacy is quite simply based on solving a useful class of problem outside the reach of best in class classical supercomputers. D-Wave offers further unambiguous terminology for the tiers of performance characteristics that their solvers can be counted on for, including quantum utility (solving a problem with reliable and accurate solutions) and quantum advantage (solving a problem with some better measure of performance than classical resources, like cost, speed, solution quality, etc). The latest benchmarks, documented in King’s paper “Computational supremacy in quantum simulation” [arXiv:2403.00910], were conducted with the support of reputable experts from the field, leveraged the world’s most powerful supercomputers like Frontier and Summit, and achieved each and every one of these thresholds:

  • Quantum utility
  • Quantum advantage
  • Quantum supremacy

What thresholds may they reach next?

How Tomorrow’s Technologies Will Shape Today’s Economy — the Inception of a Global Quantum Hub

Damir Bogdan, CEO, QuantumBasel

What is QuantumBasel? A privately financed European initiative approaching the second anniversary of its formation on the Switzerland innovation campus of Uptown Basel, with an explicit charter to help democratize access to quantum computing for mainstream industry and promote sustainable practices for real world applications realized through novel technologies. They host D-Wave’s European headquarters as a partner for this initiative, along with several other hardware providers and a surrounding ecosystem of homegrown startups and university partners. Examples of their initiatives include partnering with industry on practical and reasonable channels for leveraging quantum compute in the commercial sector, like HVAC system design, solar panel / battery placements, last mile delivery routing, and targeted mitigations for various other industrial processes with known environmental externalities.

It is worth highlighting that the percentage of energy consumption from our world’s data centers has skyrocketed from 8% to 13% since ChatGPT was introduced in 2022. Even if we don’t yet have a direct porting of large language models to quantum hardware, there are subdomains of scalable applications that are capable of being addressed even today with vastly more energy efficient quantum computing resources.

QuantumBasel is building what can only be described as a comprehensive quantum industry ecosystem, with venture partners, a homegrown Basel University curriculum, and a well attended global symposium with participation from industry and government. Damir closed his speech with an invitation for outreach and what appeared to be a heartfelt ThanQ.

Quantum Computing Applications at Retail

Lindsay Dukowski, Director of Data & Analytics, Pattison Food Group

Speaking of leveraging quantum compute for practical applications in industry, Lindsay has sought to adapt these new capabilities in a retail environment that each of us in the audience likely faces from week to week: grocery store management and operations. Pattison Food Group is a 100 year old company with several sub brands like SaveOnFoods. Among the challenges of their unionized operations are 13 different collective bargaining agreements covering the 30,000 employees working across their 13 different sub brands. Looking at their operations, they found that a particular challenge involved the practice each store faces of drafting employee schedules from week to week. The prior art relied on schedulers whose years of experience were necessary to understand all of the various sources of complexity unique to each store.

With the support of D-Wave professional services beginning in 2020, Lindsay and her colleagues developed an employee allocation framework used to import each of the distinct conventions from the different collective bargaining agreements for their unionized workers into a mathematical constraint framing. With the annealing resources they found that they were able to derive scheduling updates in minutes, compared to a process that each store previously had to begin weeks in advance. Perhaps more importantly, not only were they able to populate such employee schedules, they could do so in a manner adaptable to changing workforce conditions in near real time (when someone calls in sick or takes vacation days, they can instantly adapt). In parallel they are extending these conventions towards adjacent challenges like delivery driver scheduling and are in the process of rolling out to production across the majority of their store brands.
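To make the constraint framing concrete, here is a minimal classical sketch of the general pattern: binary assignment variables, feasibility rules standing in for collective bargaining clauses, and a cost to minimize. The employees, costs, and rules below are invented for illustration (this is not Pattison’s actual model), and a production version would submit such a formulation to a D-Wave hybrid solver rather than brute-force it:

```python
from itertools import product

# Toy instance: 3 employees, 3 shifts; x[e][s] = 1 if employee e works shift s.
# Hypothetical rules standing in for collective bargaining terms:
#   - every shift needs exactly one employee (coverage)
#   - employee 0 may not work shift 2 (e.g. a seniority clause)
#   - each employee works at most 2 shifts (hours cap)
cost = [[4, 2, 3],   # cost[e][s]: preference / overtime cost of assigning e to s
        [3, 1, 2],
        [2, 3, 1]]

def feasible(x):
    if any(sum(x[e][s] for e in range(3)) != 1 for s in range(3)):
        return False                                   # coverage
    if x[0][2] == 1:
        return False                                   # seniority clause
    return all(sum(row) <= 2 for row in x)             # hours cap

best = None
for bits in product((0, 1), repeat=9):                 # brute force over 2^9 states
    x = [list(bits[3 * e:3 * e + 3]) for e in range(3)]
    if feasible(x):
        c = sum(cost[e][s] for e in range(3) for s in range(3) if x[e][s])
        if best is None or c < best[0]:
            best = (c, x)

print(best)   # lowest cost assignment satisfying every rule
```

Each contract clause becomes one more predicate; the annealer’s role is sampling low-cost feasible assignments at scales where this kind of exhaustive search is impossible.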

Panel Discussion: Quantum Success

  • Lorenzo Martinelli, D-Wave Chief Revenue Officer, D-Wave
  • Steve Flinter, Distinguished Engineer and Senior VP, Mastercard
  • Scott Buchholz, Deloitte
  • Lindsay Dukowski, Pattison Food Group

Since D-Wave is the first in the quantum computing industry to reach the stage of commercial applications suitable for industry, their sales team has had to identify the best practices appropriate for each customer’s journey towards implementing solutions to the unique challenges in their organization. They welcome input from industry on what they are doing well or could do better.

Consider the Mastercard R&D department, which after evaluating technologies selected D-Wave quantum annealers for building solutions to enhance offer allocation for customer loyalty & rewards programs and to supplement machine learning pipelines for fraud detection by composing feature pipelines from a vast data lake — using annealers to make the most of expensive AI compute cycles.
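Feature selection of this kind is commonly framed as a quadratic binary optimization. The following is a hedged sketch of that general pattern, with invented numbers (not Mastercard’s model): diagonal terms reward individually relevant features, off-diagonal terms penalize redundant pairs, and the lowest-energy bitstring is the selected subset:

```python
from itertools import product

# Toy QUBO for feature selection: diagonal terms reward relevance (more
# negative = more useful feature), off-diagonal terms penalize redundant
# pairs. All numbers are invented for illustration.
relevance = [-3.0, -2.5, -2.4, -0.5]                    # per-feature reward
redundancy = {(0, 1): 2.8, (1, 2): 0.3, (0, 2): 0.4}    # pairwise penalty

def energy(x):
    e = sum(r * xi for r, xi in zip(relevance, x))
    e += sum(p * x[i] * x[j] for (i, j), p in redundancy.items())
    return e

# Exhaustively find the lowest-energy subset; an annealer samples
# low-energy states of exactly this form of objective in hardware.
best = min(product((0, 1), repeat=4), key=energy)
print(best)   # features 0 and 1 are strongly redundant, so only one survives
```

The appeal for a data lake setting is that the pairwise penalty matrix grows quadratically in feature count, which is precisely where exhaustive search breaks down and sampling-based solvers earn their keep.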

As a global consultancy offering solutions across industries, Deloitte has faced similar challenges. They found that their emerging technology initiatives benefited not only from hiring new expertise but also from retraining their staffed consultants. They tapped their talent pool of data scientists and mathematicians and cross-trained them for optimization using the D-Wave Ocean API. A big part of the art was pairing the right tools with the right optimization applications. They first sought to identify those “discrete” applications which could be tackled without reworking whole business models. As optimizations are often one piece in some chain of business operations, it helped to find applications that could be extracted to demonstrate concrete value through implementations.

The Pattison Foods team found they could structure their projects by following the pipeline recommended by D-Wave professional services: starting with proofs of concept, progressing to pilot projects where results can be measured, and graduating to scaled, productized applications. An unexpected perk of these initiatives was a newfound respect from tech workers and new university graduates, benefiting their recruitment of high tech talent.

It is worth remembering that when seeking those pairings of new technologies to existing business problems, it is not uncommon that some new technology in a field won’t be immediately competitive with existing solutions, particularly as workers seeking to implement may need to be retrained to leverage new forms of software. Don’t only look for where a technology is today, look at the pace of change in an improvement curve, as tomorrow’s competitive advantages will only be realized with those investments you start making today.

Look for those employees in your organization that are already exploring quantum computing or next generation technologies like AI and reward them. Harness what passion and energy already exists for tech innovation to promote awareness of the space of the possible. The whole point of the D-Wave platform is that workers won’t need a PhD in quantum information science in order to transform your business model. Look for ways to leverage these new capabilities for continuous improvement of existing processes as well as innovations in all of those aspects of your field that could move the needle on diverse segments like supply chain efficiency, technology deployments, and customer experience. Don’t only focus on upgrading existing solutions; the most valuable applications will likely become those that leverage real time solvers for high bandwidth optimization challenges enabling new business models that are only now reaching the space of the possible.

Panel Discussion: Quantum for the Public Sector

  • Allison Schwartz, D-Wave VP of Global Government Relations, D-Wave
  • Rima Ouied, Senior Commercialization Executive, US Department of Energy
  • Elnaz Kanani K., Partner, Deloitte
  • Dana Linnet, Senior Executive, Artificial Brain

Each of the panelists in this session was an expert on adapting technologies to challenges in the public sector. Sometimes as new technologies are investigated for adaptation to such domains, projects may be structured with research and R&D primarily conducted in an exploratory mode. A benefit of channeling energy into quantum annealing initiatives is that the robust implementations available for D-Wave solvers allow research and practical considerations of deployment to be investigated all in the same initiative.

As someone looking to adapt annealing forms of optimization to your field, it will help to focus on locating a specific problem statement. Managers aren’t likely to want to use some technology just for the sake of using that technology. Try to clear pathways so that various departments have unobstructed access to trained talent and grant pools to fund initiatives.

Consider that the researchers from Artificial Brain are finding challenges needing annealing scale solvers in high value asset management for domains like satellite routing, wind farm optimizations, smart grid control, and other sustainability initiatives. The US Department of Energy likewise has identified hundreds of applications that could benefit from annealing technology and are in need of resources for deployment.

The US government is not alone in this space. Other countries are actively making large investments and don’t want to get left behind; they see these technological transformations as somewhat existential. There is a risk that the US could even fall behind smaller countries. The industry appears to operate under the convention that if you want to participate in a meaningful sense it is not enough to fund your own projects; you also need to help advance the field in aggregate.

From a policy perspective, many countries are taking the approach of making commercialization a key pillar of their technology deployment initiatives. That means that federal funding is investing with explicit intent for promoting sustainable businesses for commercial industry. They are investing in mission driven programs, not just one-off grants.

It is unfortunate that all kinds of misconceptions come into play when people see the term quantum in use. Some may extrapolate to science fiction interpretations; others may become distracted by popularized quantum physics thought experiments. In practice quantum annealing can quite simply be thought of as a powerful resource for evaluating optimization problems, with unparalleled performance on metrics like energy efficiency, latency, and solution quality, and unmatched capabilities at the highest scale applications. Don’t let the use of the word “quantum” distract you.

Even more importantly, don’t let the fact that gate based quantum computers are so far off from usefulness become a misconception about the viability of quantum annealing technology towards transforming applications in the industries of today.

Lunch Workshop: Building a Quantum Community

Victoria Goliber, D-Wave Global Head of Technical Advising, D-Wave

As organizations and regional ecosystems look to accelerate the integration of quantum annealing into their industries, an impactful means of training and networking can be achieved through even informal sources of collaboration that become available from a local technology community. Even simple forms of unstructured meetups with loose agendas, like paper discussions, vendor presentations, tutorial sharing, and socials, can be impactful in making a region desirable for emerging technology enthusiasts and promote extracurricular explorations that enable cross-pollination of ideas across industries and silos of application. There are public apps like Meetup or Eventbrite that make implementing such events nearly trivial. There are countless tutorials and public code demonstrations available on GitHub. There are even more channels of videos and tutorials on YouTube that could be sourced as endless content for seminars and as a basis for attendee discussions. And with the availability of D-Wave cloud services, any group that wishes to extend to formal demonstrations has easy access to do so.

There is no need to start from scratch if you wish to see such activity in your community. Examples abound of government funded regional initiatives with similar agendas, like those facilitated by the UK National Quantum Computing Center, South Korea’s QCenter, the Vancouver Quantum Algorithms Institute, Yale’s Quantum Computing Institute, and the Chicago Quantum Computing Summit to name a few.

Possibly the highest return on investment for promoting a quantum community with competence in the field could come through industry hackathon events, whose competitive dynamics can make these forms of meetups fun and social as well as educational. Consider partnering with local industry to find business problems that could be channeled to competing teams for evaluation. There are countless examples of ad hoc hackathon teams leading to new jobs and sometimes even new startups. Bonus: when you give away t-shirts to the participants they become walking advertisements that spread word to their peers.

Quantum for National Defense

Dale Moore, President, Davidson Technologies

Davidson Technologies is a US contractor building applications for security and national defense. They seek to leverage emerging technologies like machine learning and quantum algorithms in their work, optimizing probabilistic solutions towards obscured domains to solve mission critical applications like interceptor assignment and radar scheduling. The speed of quantum annealers towards high bandwidth applications makes their work possible, and it was a significant announcement of the conference that they are installing a new D-Wave quantum computer near their headquarters in Alabama.

Towards Material Discovery with Quantum Annealing Hardware

Carleton Coffrin, Staff Scientist, Los Alamos National Laboratory

There is a respected quantum computing researcher at Caltech named John Preskill who famously dubbed the recent paradigm of gate based quantum computers the NISQ era, referring to noisy intermediate scale quantum. Carleton notes that Preskill’s paper with this proposal made an unfortunate projection about adiabatic quantum computers: “We anticipate analog quantum simulators will eventually become obsolete… they will be surpassed someday by digital quantum simulators”, although that statement was paired with an adjacent comment: “when seeking near-term applications of quantum computing technology we should not overlook the potential power of analog quantum simulators.” Carleton’s subsequent research with Los Alamos now suggests that analog quantum circuits will always have a runtime advantage over gate based circuits for optimization, even if gate circuits become capable of error correction.

Carleton’s latest work seeks to adapt the capabilities of quantum annealers to a demanding application in material science involving magnetic lattices. Although the fundamental properties of ferromagnetism have been well understood for some time, recent work related to Penrose quasicrystals has illustrated how some quantum properties of magnets may lead to frustrated spin states, in which particle interactions produce chaotic spin lattices with ambiguity in their optimal state and “islands” of divergent properties within a material, giving rise to complicated behavior patterns with quantum characteristics that are challenging to simulate. As such magnetic materials may be adapted to high value domains in industry (like memory storage, sensors, and conductivity), it was exciting that their experiments found quantum annealing forms of simulation were the first to demonstrate emergent phenomena seen in experimental data that other forms of simulation had omitted, suggesting that quantum annealers are uniquely capable of simulating operational regimes for magnetic lattices that would be difficult to access in a laboratory.
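For readers unfamiliar with frustration, a minimal worked example: three Ising spins with antiferromagnetic couplings on a triangle can never satisfy all three bonds at once, producing a degenerate ground state, which is the seed of the ambiguous, chaotic lattice behavior described above. A brute-force check of this textbook case (not Carleton’s actual model):

```python
from itertools import product

# Three spins on a triangle with antiferromagnetic couplings (J > 0
# penalizes aligned neighbors). No assignment satisfies all three bonds
# simultaneously -- the hallmark of a frustrated lattice.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

states = list(product((-1, 1), repeat=3))
ground = min(energy(s) for s in states)
minima = [s for s in states if energy(s) == ground]
print(ground, len(minima))   # every ground state leaves one bond unsatisfied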

D-Wave’s Murray Thom further clarified that these magnetic lattices with frustrated spin states bear some similarity to phenomena now being found in modern neural networks.

D-Wave Technology: Advantage2

Trevor Lanting, D-Wave Chief Development Officer, D-Wave

The D-Wave product line progression over the years has sought to leverage all of those inherent advantages of quantum annealing that should give them enduring performance advantages over gate based quantum circuits for optimization applications targeting those business scale problems found in industry. Compared to gate model circuits, annealers are more resistant to errors from interactions with their environment. They don’t require significant tuning or preprocessing, as performance tends to be stable over a range of parameters. Scaled circuits are available today with thousands of qubits, in vast excess compared to current gate based models. All of these advantages collectively sum to advantages in scale, quality, and speed.

Leap, the D-Wave cloud service, offers resources that can be sampled from their installations like those in Vancouver, Germany, and California. Their Ocean software stack contains a growing suite of algorithms for hybrid quantum / classical solutions. They offer a rich set of annealing controls to support more advanced forms of investigation into annealing use cases.

The new Advantage2 platform is currently available in the cloud in a 4800+ qubit configuration, and the 7000+ qubit stack is in the works. These recent extensions to the architecture have improved hardware performance along the axes of quantum coherence, qubit connectivity, qubit count, and energy scale. Coupled with the new fast anneal software implementation, they have enabled operation that can outrun fluctuations in the thermal environment, provably demonstrated a violation of the Bell inequality as proof of quantum characteristics, and now benchmarked a useful form of computation running in 10 minutes that would have taken the world’s top supercomputer a million years to complete.

All of these innovations take place in conjunction with extensive synergies in technology stack and cost of investment towards D-Wave’s development of their gate model systems. We can be assured that when it is ready to release it will be a competitive product, making D-Wave the only firm offering native solutions for algorithms that may leverage the best of both annealing and gate model forms of quantum computation.

D-Wave Technology: Hybrid Solvers

Alex Condello, D-Wave Senior Director of Algorithms, Performance, and Tools, D-Wave

On the subject of new tools available in their library of algorithms for quantum / classical solvers, consider the example of a set of shipping containers arriving at a harbor, requiring a scheduled flow of various operations (like off-loading, inspection, cranes, and truck loading). For each sequence of operations associated with a given container, there is an upstream conditional variable describing the order in which containers are accessed. One of the new resources now available in the API is the NL (nonlinear) solver, with a special “list” variable type allowing problems to be framed such that an order of operations can be natively encoded by a single list variable entry. This form of problem formulation abstracts away complicated math and allows the efficient expression of optimization targets in which a solution schedule is fully dependent on some upstream order of operations. Such efficiently expressive formulations shorten solver time and expand the available scale of variables for integration into an optimization objective. A demonstration of this flow shop scheduling formulation is now available in the D-Wave set of tutorials.
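To see why a single ordering variable suffices, here is a classical stand-in for the flow shop formulation described above (this is not the Ocean NL solver API; durations are invented, and the brute-force search stands in for what the hybrid solver does over a native “list” variable):

```python
from itertools import permutations

# Toy flow shop: each container passes through three sequential stages
# (off-load, inspection, truck loading) with the durations below. Stage s
# of a container can start only after its own stage s-1 finishes AND after
# stage s of the previous container in the chosen order finishes.
dur = [(3, 1, 2),   # container 0
       (1, 4, 1),   # container 1
       (2, 2, 3)]   # container 2

def makespan(order):
    finish = [0, 0, 0]            # completion time of each stage's last job
    for c in order:
        t = 0                     # completion time of c's previous stage
        for s in range(3):
            t = max(t, finish[s]) + dur[c][s]
            finish[s] = t
    return finish[-1]

# The only decision variable is the processing order itself -- the role a
# single "list" variable plays in the NL solver. Here we brute-force it.
best = min(permutations(range(3)), key=makespan)
print(best, makespan(best))
```

The whole downstream schedule unrolls deterministically from the chosen permutation, which is exactly why encoding the order natively keeps the model small compared to a binary-variable formulation.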

D-Wave Technology: Leap Cloud

Fiona Hanington, D-Wave VP of Program Management, D-Wave

D-Wave’s Leap cloud portal offers lots of tutorials and examples that may be adapted to countless adjacent domains and industries. The time is now to sign up for an account and begin your explorations.

Offering real time solver access to currently 42 countries and growing, Leap offers a track record of >99% availability and last year secured SOC 2 Type 2 compliance as a demonstration of high standards for data security and quality. They offer production ready systems with full stack application layers from API calls down to the hardware. Their quantum annealing solvers are supported by a library of hybrid quantum / classical algorithms that may be channeled to their system through an API call for virtual and automatic routing to a cloud of quantum and classical resources, suitable for easy deployment to industrial scale applications, e.g. with scales of 2M variables and constraints applied to an optimization target. The cloud resources can even be accessed through cloud based integrated development environments like those available in GitHub Codespaces, allowing for ease of collaboration between professionals. The Leap service has built in administration privileges to allow organizations to allocate resources across evolving teams.

In the 5+ years since launch, more than 40k users have signed up for accounts and they have solved more than 185M submitted customer jobs. As they roll out recent advances they continue to see immediate usage traction. They offer development support, service agreements, and customer support throughout a customer’s development cycle. There is online training suitable for different expertise levels. You should seriously reach out to their sales team as soon as possible. Just do it!

Quantum-Assisted Generative AI for Simulations at the Large Hadron Collider Experiments

Wojtek Fedorko, Deputy Department Head Scientific Computing, TRIUMF

TRIUMF is a Canadian particle accelerator facility located near the D-Wave offices in British Columbia. Conducting experiments involving high speed particle collisions, they measure interactions to source data bearing on fundamental questions in physics: the origins of mass, the behavior of the universe near the time of its formation, the sources of dark matter, validations of the standard model of physics, and more. Such experiments often turn to simulations to compare simulated samples with real data, where the cost of such simulations increases with the number of particle interactions. Leveraging generative models like variational autoencoders, with sampling conventions ranging from restricted Boltzmann machines all the way to real quantum computers, their work has succeeded in reproducing real distributions, with agreement between classically sourced and QPU sourced histograms sampled across a wide variety of conditions. Their work could be considered a validation of quantum annealers as a resource for quantum assisted generative AI.
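For readers unfamiliar with the classical baseline in that comparison, here is a minimal sketch, not TRIUMF’s pipeline, of block Gibbs sampling from a tiny restricted Boltzmann machine. The weights are arbitrary illustration values rather than trained parameters; a QPU-assisted workflow would draw samples from quantum hardware and compare histograms against a chain like this one.

```python
import math, random

random.seed(0)

# Tiny restricted Boltzmann machine: 3 visible units, 2 hidden units.
# Weights and biases are arbitrary illustration values, not trained.
W = [[0.5, -0.3], [0.1, 0.8], [-0.6, 0.2]]  # W[i][j]: visible i <-> hidden j
b = [0.0, 0.1, -0.1]   # visible biases
c = [0.2, -0.2]        # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gibbs_step(v):
    """One block-Gibbs sweep: sample hidden given visible,
    then visible given hidden; returns the new visible state."""
    h = [1 if random.random() < sigmoid(c[j] + sum(v[i] * W[i][j] for i in range(3)))
         else 0 for j in range(2)]
    return [1 if random.random() < sigmoid(b[i] + sum(h[j] * W[i][j] for j in range(2)))
            else 0 for i in range(3)]

# Run the chain and histogram visible states -- the classical
# distribution one would compare against QPU-drawn samples.
v = [0, 0, 0]
counts = {}
for _ in range(5000):
    v = gibbs_step(v)
    counts[tuple(v)] = counts.get(tuple(v), 0) + 1
print(sorted(counts.items()))
```

The appeal of sampling from annealing hardware is that it can serve as a drop-in replacement for this Gibbs chain, which can mix slowly for rugged energy landscapes.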

The Quantum Leap in Generative AI: Implications for Industry

  • Aaron Pressman, Reporter, Boston Globe
  • Yudong Cao, Co-founder & CTO, Zapata AI
  • Jon Zorio, CRO, Zapata AI

It would be an understatement to say that ChatGPT changed the field of artificial intelligence. The founders of Zapata AI from this panel are hoping to contribute directly to the interplay between such forms of generative AI and quantum computing. Yudong Cao, the CTO, spent the preceding decade conducting research on quantum computing applied to chemical simulations and optimization. With Zapata he now turns his focus to answer broader questions about how quantum computing can be used to enhance those expansive capabilities of generative AI that have become apparent from resources like ChatGPT.

He finds that there are some fundamental distinctions between what can be done with classical resources alone and with systems aided by quantum computation. For example, quantum annealers have the native ability to access an expanded probability space. In the traditional forms of supervised learning common in industry, the data intensive applications in which large companies make massive investments may find that annealers are capable of driving down compute durations and energy usage. And in application domains where the data is inherently quantum mechanical and driven by quantum mechanisms, the search space may become quite large; Zapata AI expects these domains will have an even bigger role for quantum computers to play.

As modern practices in AI accelerate into diverse modalities like text, image, audio, and video generation, the number of ideas and published papers is likewise becoming difficult to keep up with. For quantum computers to make their mark, we will need to find a suitably stable positioning for them. After all, there are lots of ways that quantum computers can be inserted into various forms of AI implementations; the hard part is finding the ways in which there will be enduring advantages to leveraging such resources.

Zapata is looking to leverage their local Boston presence and the vast local ecosystem of business operations and reputable research institutions. They value intellectual rigor, target applications without obvious ethical issues, and seek to incorporate data governance and security practices directly into their applications in domains like cybersecurity, LLM operations, and so on. They are approaching completion of several proofs of concept and finding material value compared to classical solutions.

Ascent to Quantum GenAI

Ed Heinbockel, CEO, SavantX

Presenting at the Qubits conference for a second year running, Heinbockel’s firm SavantX made a big splash last year by demonstrating tremendous gains in operational efficiency realized by introducing a quantum annealing based solution for crane scheduling and port operations in shipping container management at the Port of Los Angeles.

It came as an unexpected announcement to this blogger that his firm, which has been working with quantum technologies for going on five years now, has expanded its product line well beyond the field of supply chain management. In fact, Ed says the journalist David Pogue, who presented (and played piano :) at last year’s Qubits conference in Miami, was a significant inspiration for SavantX’s move to leverage the emerging technologies of generative AI in their application development. They now believe that quantum versions of AI are critical to the future of the field.

The demonstration in this talk in fact turned to an application leveraging language models and database lookup techniques enhanced by quantum annealing to support police investigations. Their patented Seeker tool may be particularly valuable for “cold case” investigations, in which some large, domain specific text corpus, e.g. from old officer reports and evidence documentation, may be intelligently queried in aggregate to help identify suspects, establish case theory, and support the other operations that help police make a safer world for our communities.
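The details of Seeker are proprietary, but the basic retrieval step such a tool builds on can be illustrated with a toy bag-of-words ranking over a made-up case file archive. All document names and contents below are invented; a production system would layer language models (and, per SavantX, annealing-enhanced optimization) on top of retrieval like this.

```python
import math
from collections import Counter

# Toy corpus standing in for a case file archive (illustrative only).
docs = {
    "report_1994_03": "suspect seen near warehouse blue sedan",
    "report_1994_07": "witness describes blue sedan leaving scene",
    "evidence_log_12": "fiber samples recovered from warehouse floor",
}

def vectorize(text):
    """Bag-of-words term counts for a lowercase, whitespace-split text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def query(q, k=2):
    """Rank documents by similarity to the query; return the top k names."""
    qv = vectorize(q)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorize(docs[d])), reverse=True)
    return ranked[:k]

print(query("blue sedan warehouse"))
```

Querying "in aggregate" then amounts to surfacing the documents most relevant to an investigator’s hypothesis across the whole corpus at once, rather than reading files one by one.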

Extending well beyond this narrow albeit important application, Heinbockel sought to put into perspective the significance of the work being presented by D-Wave. Consider the huge investments going into the construction of new data centers at this very minute from all directions of the AI economy. Consider that large language model training, and to a lesser extent generative inference at the largest scales, will require gigawatts of energy to operate. Consider the potential of D-Wave’s annealers, capable of an important and useful class of optimization algorithms with almost negligible energy demand in comparison.

It is worth pausing for a second to let that sink in.

Day 2 essay follows in a separate post linked here.

For more essays please check out my Table of Contents, Book Recommendations and Music Recommendations.

© Nicholas Teague 2024, all rights reserved
