Image of quantum computer. Source: IBM Image Gallery

Unlocking New Possibilities: The Power of Quantum Computing

GeoTechInsights
8 min read · Apr 3, 2023


As we move further into the 21st century, it is difficult to imagine a world without the technology that surrounds us. The invention of the microchip marked the beginning of a technological revolution, unleashing an exponential increase in computing power alongside falling manufacturing costs. This trajectory was anticipated by Moore’s Law, which projected a doubling of the number of transistors on a chip roughly every two years. From desktop computers to smartphones, the speed of technological progress has been staggering: within a few decades we saw humans walk on the moon, self-driving cars reach public roads, and virtual reality transport us to new worlds.

However, the exponential growth in computing power that fueled these innovations has slowed. The vertical leap set off by the invention of the semiconductor has flattened into incremental, horizontal improvement. As Moore’s Law approaches its practical limits, computer hardware has reached a point of diminishing returns, where the cost of adding more computing power no longer justifies the modest gain in performance. But there is hope on the horizon: a new technology called quantum computing has emerged, with the potential to overcome these limitations and usher in a new era of technological advancement. This essay explores the limitations of classical computation and the potential of quantum computing, which is poised to become a headline innovation in the coming years as the fight for high-tech dominance unfolds.

By examining the inner workings of both classical and quantum computing, we’ll gain a deeper understanding of how quantum computing overcomes the limitations of classical computing and unlocks new possibilities in applied technology. Ultimately, this essay will demonstrate the transformative impact that quantum computing is likely to have on the future of computing and the broader technological landscape.

Understanding where classical computing falls short requires understanding how it works. Computers process information in the form of binary digits, or “bits,” represented as 1s and 0s, and making sense of that requires a look at the layers of computing. At the highest level, users interact with software: typing on a keyboard generates words on the screen, a user action followed by a software reaction. Software is the medium through which users communicate with the computer and tell it what to do.

Developers create software by writing instructions that tell the computer what to do in response to users’ actions. They write these instructions in a programming language, a human-readable way of giving commands to the machine. That code is then compiled: in many systems it is first translated into bytecode, a platform-independent intermediate form, which is in turn translated into machine code.
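As a small illustration (a sketch using Python, whose interpreter happens to compile source into a platform-independent bytecode), the standard-library `dis` module can display the bytecode produced for a simple function:

```python
import dis

def add(a, b):
    # A simple high-level instruction written by a developer.
    return a + b

# dis.dis prints the platform-independent bytecode that the Python
# compiler produces from the source; the interpreter then executes
# these instructions on whatever hardware it runs on.
dis.dis(add)
```

The printed opcodes, such as LOAD_FAST and RETURN_VALUE, are the intermediate layer sitting between the code a developer writes and the machine code the processor ultimately runs.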

Machine code is just a series of 1s and 0s that represent on and off signals. It is the language through which software communicates with the hardware. These signals determine which components of the computer are activated, allowing electrons to flow through certain circuits. For example, when a user types the letter “A” on their keyboard, the software generates machine-code instructions, a series of 1s and 0s such as 01000001 (the binary encoding of “A”), that tell the computer which circuits to activate to display the letter at a specific location on the screen.
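A quick Python sketch makes that mapping concrete: 01000001 is simply the binary form of 65, the ASCII/Unicode code point for “A”.

```python
# The letter "A" has code point 65, which is 01000001 in binary.
letter = "A"
code_point = ord(letter)             # 65
binary = format(code_point, "08b")   # "01000001"
print(letter, code_point, binary)    # A 65 01000001
```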

Computers are designed around a fixed word size: a 32-bit architecture can address up to 4 GB of memory, roughly 32 billion bits. Increasing the number of bits a computer can handle does have benefits, such as improved performance in applications like video rendering. However, there are diminishing returns. As the word size grows, so does the required hardware, raising power consumption and production costs. There are also physical limits to how many transistors can fit on a single chip, which is the main factor driving Moore’s Law toward its end: each further doubling of transistor density becomes harder and more expensive to achieve. Modern 64-bit processors, for example, can address up to 16 exabytes of memory, but that capacity comes at the cost of more expensive hardware and offers little added benefit for many applications.
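The arithmetic behind those address-space figures is straightforward; a minimal sketch, where the 4 GiB and 16 EiB values follow directly from the word size:

```python
# An n-bit address can reference 2**n distinct byte locations.
addressable_32 = 2 ** 32                          # 4,294,967,296 bytes
addressable_64 = 2 ** 64                          # ~1.8e19 bytes
print(f"32-bit: {addressable_32 / 2**30:.0f} GiB addressable")   # 4 GiB
print(f"64-bit: {addressable_64 / 2**60:.0f} EiB addressable")   # 16 EiB
```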

To better understand this process, consider an analogy: a computer is like a factory that produces products based on customer orders. The customer (user) requests a specific product (e.g., save a file) from the storefront (software), the part of the factory (computer) designed to look appealing to customers. The factory’s managers (developers) design the product and write a set of instructions (high-level code) for a factory worker (compiler) to follow when that product is ordered. The worker translates the instructions into a language the machines (hardware) can understand (machine code). The machines in the factory are controlled by switches that can be turned on and off (binary code, 1s and 0s). The worker uses the machine code to activate the switches in a specific sequence, which produces the desired product.

The hardware architecture of a computer determines the maximum number of switches that can be activated simultaneously. For example, a 32-bit architecture allows the worker to activate only a limited number of switches at once. If the worker needs to activate more switches than the architecture allows, they must wait for the first set of orders to finish before inputting the next set of instructions (sequential processing). Upgrading to a 64-bit architecture would allow the worker to activate more switches, resulting in faster factory production. However, this upgrade comes at a significant cost. Beyond a certain point, adding more switches may not result in any significant performance gains due to the law of diminishing returns in classical computation. In fact, adding too many switches could even slow down the computer if the hardware is not capable of processing that many switches efficiently. For instance, if the worker has a tray with 128 switches, but the machines can only handle 64 signals in one operation, the worker may not be able to execute instructions any faster than with a 64-bit architecture.

Despite the remarkable progress made in classical computation, its limitations have become increasingly apparent and hindered further advances in technological capabilities. However, the emergence of quantum physics and quantum computing provides a promising avenue for transcending these limitations and unlocking unprecedented levels of computational power.

Our world operates under the laws of nature, which we describe using mathematics. Physics, in particular, seeks to provide mathematical explanations for natural phenomena, and these rules help us grasp familiar concepts like gravity. At a very small scale, however, we cross the “quantum divide,” beyond which the familiar rules no longer apply and a different set of rules takes over. The field of quantum physics describes the laws and mechanics that govern matter at this scale. By understanding quantum mechanics, we could unlock unprecedented computational power and technological capability, much as humans harnessed gravity to generate electricity through hydroelectric power, using water flowing from a high elevation to a low elevation to turn turbines and produce electricity.

In the world of quantum mechanics, information behaves differently than it does in classical mechanics, which is the observable physics we experience in everyday life. In quantum mechanics, information can exist in superposition, which means it can exist in multiple states at once until it’s measured. In contrast, in classical mechanics, information is deterministic and exists in a single state at any given moment.

When we talk about computers processing bits, we typically think of them as either 1 or 0, on or off. In the world of quantum mechanics, however, information can be in a state of superposition, existing as both 1 and 0 simultaneously. This is a challenging concept to grasp, but it is similar to our understanding of gravity: a natural phenomenon that exists regardless of our ability to fully comprehend it. Quantum computing uses the principles of quantum mechanics to manipulate qubits, which can represent 0 and 1 at the same time. For certain types of problems, this allows a quantum computer to explore many computational paths at once rather than one after the other as a classical computer would, leading to an exponential increase in computational speed.
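To make superposition a little more tangible, here is a toy state-vector simulation of a single qubit (a sketch assuming NumPy is available; it only mimics the mathematics and is not real quantum hardware): a Hadamard gate takes a qubit that starts in the definite state 0 and places it in an equal superposition of 0 and 1, so a measurement returns either outcome with 50% probability.

```python
import numpy as np

# Basis state |0>: the qubit definitely reads 0 before any gate is applied.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ ket0                     # amplitudes [1/sqrt(2), 1/sqrt(2)]
probabilities = np.abs(state) ** 2   # measurement outcome probabilities
print(probabilities)                 # [0.5 0.5]
```

Simulating many qubits this way quickly becomes intractable on classical hardware, because the state vector grows exponentially with the number of qubits, which is part of why real quantum machines are so interesting.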

The potential of quantum computing to revolutionize sectors from healthcare to finance to national security cannot be overstated. In healthcare, quantum computing could analyze vast amounts of genetic and patient data to enable more accurate and personalized treatments. In finance, it could optimize investment portfolios and improve risk management. In artificial intelligence, it could drastically improve model performance, with benefits across industries. And in national security, quantum computing could enhance the ability to encrypt and decrypt communications and improve the accuracy of missile defense systems. With its immense computing power, it could crack military codes and penetrate secure communication systems, giving an advantage to the countries that possess it. It could also be used to simulate complex military scenarios and design new weapons systems, making it a new frontier in the arms race. By breaking through the barrier currently facing classical computation, quantum computing could restart the kind of exponential growth in compute power and technological innovation last seen in the 1960s, when semiconductors were adopted, revolutionizing industries and shaping the future.

The impact of technology on our lives since the invention of the computer chip is hard to overstate. The pace of innovation has been staggering since the microchip arrived, but the limits of classical computing have slowed the exponential growth of technology. Moore’s Law, which predicted a doubling of transistors on a chip every two years, is reaching its end, as the cost of adding more computing power no longer justifies the slight increase in performance. Fortunately, quantum computing has emerged as a new technology with the potential to overcome these limitations and usher in a new era of technological advancement.

By examining the inner workings of classical and quantum computing, we can gain a deeper understanding of how quantum computing overcomes the limitations of its classical counterpart and unlocks new possibilities in applied technology. Its transformative impact on the future of computing and the broader technological landscape is likely to be immense, and we can expect quantum computing to become a headline innovation in the coming years as the fight for high-tech dominance unfolds. The potential to revolutionize fields such as drug discovery, cryptography, and machine learning is enormous.

As we move into the era of quantum computing, it is crucial that we approach this technology with an understanding of its unique capabilities and limitations. Quantum computing is not simply a faster version of classical computing, but rather, a fundamentally different way of processing information. As we continue to explore the potential of quantum computing, we must also grapple with the ethical and societal implications of this technology. As with any powerful technology, there is the potential for both good and harm, and it is up to us to ensure that the benefits of quantum computing are harnessed for the betterment of humanity.

Thank you for reading this article and supporting my work. If you want future articles delivered straight to your inbox, follow my content and subscribe to my mailing list.
