Quantum Computing: The Next Big Breakthrough

Brandon Gomes
Mar 12, 2024 · 4 min read


Photo by Yorgos Ntrahas on Unsplash

Over the past few years, AI development has advanced by leaps and bounds: OpenAI’s ChatGPT, Meta’s Llama, Google’s Gemini and, more recently, Gemma, among many others. As the training requirements for these Large Language Models (LLMs) keep growing, the demand for processing power has never been higher.

For example, ChatGPT has used an estimated 20,000 Nvidia A100 Tensor Core GPUs and is projected to eventually need over 30,000. To put this into perspective, a single A100 unit sells for around $20,000, so building these behemoth models is no small fee. Even without accounting for initial training, hardware, and energy costs, the GPT-3-based ChatGPT costs OpenAI up to $700,000 per day to run, a figure that is reportedly even higher now.

  • Refer to the authors Mok, Aaron and Liu, Zhiye in the Works Cited.

As impressive as the current technology is, better alternatives are on the horizon. Quantum computing promises to perform certain tasks at higher speeds, with greater efficiency, and at lower cost, potentially making classical computing obsolete for those workloads.

Recent developments in quantum computing point toward a massive leap in commercial use, starting with IBM’s Heron quantum processor and its 133 high-quality qubits. I will explain this in further depth later, but for now, think of a qubit as the quantum counterpart of a bit: instead of being strictly 0 or 1, it exists in a superposition of both states until it is measured (the sketch after the citation below makes this concrete). The Heron processor powers Quantum System Two, a modular quantum computer unveiled at the company’s Quantum Summit in 2023, where IBM also announced its focus on long-term research into large-scale quantum computing.

  • Refer to the author Dotson, Kyt in the Works Cited.
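
To make the qubit idea a little more concrete, here is a minimal sketch of my own (plain Python with NumPy, not IBM’s hardware interface or any real quantum SDK) that models a single qubit as a two-amplitude state vector and simulates repeated measurements:

```python
import numpy as np

# A qubit is a two-amplitude state vector |psi> = a|0> + b|1> with
# |a|^2 + |b|^2 = 1. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2.
def measure(state, shots=1000):
    p0 = abs(state[0]) ** 2
    outcomes = np.random.choice([0, 1], size=shots, p=[p0, 1 - p0])
    return {0: int((outcomes == 0).sum()), 1: int((outcomes == 1).sum())}

plus = np.array([1, 1]) / np.sqrt(2)  # equal superposition of 0 and 1
print(measure(plus))  # roughly {0: 500, 1: 500}; "uncertain" until measured
```

Running this, the equal superposition comes out 0 about half the time and 1 the other half, which is exactly the “uncertainty” described above.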

Why is this REALLY important? To cut to the chase: quantum algorithms can simply be better. Quantum computing reduces the complexity of certain classical algorithms anywhere from quadratically to exponentially. To put this into perspective, do you remember Big O notation?

Picture the classic Big O chart: the x-axis counts elements from 0 to 100, and the y-axis counts operations from 0 to 1,000, a 1:10 scale. As a quick recall: as the number of elements increases, that is, the number of things an algorithm needs to process, the number of operations increases as well, that is, how many steps the computer must take to return a result. This growth rate represents the complexity of an algorithm and how much time it takes to complete a task. Keep this in mind.
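
If the chart isn’t handy, a few lines of Python (an illustrative sketch of my own) make the same point by printing the raw operation counts for each growth rate:

```python
import math

# Operation counts for common complexity classes as the input size n grows.
for n in (10, 20, 40, 80):
    print(f"n={n:>2}  O(sqrt n)={math.isqrt(n):>2}  O(n)={n:>2}  "
          f"O(n^2)={n**2:>4}  O(2^n)={2**n}")
```

By n = 80 the O(2^n) column has already outgrown anything a computer can realistically finish, while the other columns remain tiny.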

Now, let us observe two algorithms. Let’s start with a problem: we want the prime factors of 9.10043815x10⁵³. A classical algorithm for this task, such as trial division, has a complexity that is exponential in the size of the input, on the order of O(2^n) for an n-bit number. This is terribly cumbersome and extremely time-consuming. However, Shor’s Algorithm, a quantum factorization algorithm invented in 1994, performs this task exponentially faster, bringing the complexity down from exponential to polynomial in n (roughly O(n³)). This is incredible! And to quench some of your curiosity, that large number is 2¹⁰⁰ × 3⁵⁰; a small classical sketch after the citation below shows why factoring is so hard.

  • Refer to the Classiq article in the Works Cited.
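
Running Shor’s algorithm for real requires quantum hardware or a simulator, so as a stand-in here is a hedged classical sketch: plain trial division, whose roughly O(√N) divisions are exponential in the bit length of N. It finishes instantly below only because our example number’s prime factors are tiny; a similar-sized number with two large prime factors would be hopeless.

```python
# Classical trial division: O(sqrt(N)) divisions in the worst case,
# which is exponential in the bit length of N. Shor's algorithm, by
# contrast, factors in time polynomial in the bit length.
def trial_division(n):
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

print(trial_division(2**100 * 3**50))  # {2: 100, 3: 50}, finishes instantly
```

The instant result is only because the worst-case loop never runs here; the hardness of factoring, and the value of Shor’s algorithm, shows up when the factors themselves are hundreds of bits long.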

Further, assume we have an unsorted database or, for our example, an unordered list; what are some ways we could search through it? There are two methods we can take: A) linear search, or B) sort the information, then perform a binary search. Sorting the list would be a heavy job up front but would make searches down the line easier; for now, though, we will focus on linear search. This method gives every item in the list equal importance, checking them one by one, so linear search has a complexity of O(n). Not great at scale. Quantum computing, on the other hand, may use Grover’s algorithm, which brings the complexity down to an impressive O(√n), a quadratic speedup (see the sketch after the citation below).

  • Refer to the author Kothari, Robin in the Works Cited.
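
To see that gap in concrete numbers, here is another small sketch of my own (classical Python; the Grover figure is just the textbook query-count formula, not a quantum implementation) comparing worst-case classical lookups against Grover’s roughly (π/4)√n oracle queries:

```python
import math

# Classical linear search: worst case examines all n items one by one.
def linear_search(items, target):
    for queries, x in enumerate(items, start=1):
        if x == target:
            return queries
    return len(items)  # target absent: every item was checked

n = 1_000_000
items = list(range(n))
worst_case = linear_search(items, n - 1)        # 1,000,000 queries

# Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries.
grover = math.ceil(math.pi / 4 * math.sqrt(n))  # 786 queries
print(worst_case, grover)
```

A million classical checks versus a few hundred quantum queries is exactly the quadratic speedup described above.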

The power, precision, efficiency, and cost advantages of quantum computing over classical computing form a combination that cannot be denied. With the first major steps taken only very recently, quantum computing is nowhere near its peak. Big tech companies are already funding quantum research and expecting game-changing developments soon. With companies like Google, Microsoft, IBM, Intel, and even AWS pursuing this technology, we are on the verge of the next breakthrough.

If you would like to know more about quantum computing, read my next post:

What is Quantum Computing? (Brandon Gomes, Medium, March 2024)

Works Cited

Mok, Aaron. “ChatGPT Could Cost over $700,000 per Day to Operate. Microsoft Is Reportedly Trying to Make It Cheaper.” Business Insider, 20 Apr. 2023, businessinsider.com.

Liu, Zhiye. “ChatGPT Will Command More Than 30,000 Nvidia GPUs: Report.” Tom’s Hardware, 1 Mar. 2023, tomshardware.com.

Dotson, Kyt. “IBM Unveils Next-Gen 133-Qubit Heron Quantum Processor and Its First Modular Quantum Computer.” SiliconANGLE, 4 Dec. 2023, siliconangle.com.

Classiq. “Quantum Cryptography — Shor’s Algorithm Explained.” Classiq: News and Insights, 19 July 2022, classiq.io.

Kothari, Robin. “Quantum Speedups for Unstructured Problems: Solving Two Twenty-Year-Old Problems.” Microsoft Research Blog, 4 May 2020, microsoft.com.


Brandon Gomes

Entry-level software engineer interning and working toward a Bachelor’s degree.