What, Why and How Qubits?

Krish Mittal
2 min read · Dec 11, 2023


In the realm of classical computing, the fundamental unit of information is the bit, representing a binary state of either 0 or 1, akin to a simple on/off switch. However, the landscape of computing is evolving, and at the forefront of this evolution is Quantum Computing (QC). So, what exactly is QC, how does it operate, and why should we be intrigued by its potential?

At the heart of QC lies the quantum bit, or qubit. Unlike classical bits, qubits can exist not only as 0 or 1 but also as a superposition of both states simultaneously. This property is the source of quantum computers' potential advantage on certain classes of problems.
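To make superposition concrete, here is a minimal sketch (plain NumPy, no quantum hardware or quantum library assumed) of a qubit as a two-component vector of complex amplitudes, where the squared magnitudes give the measurement probabilities.

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalised so that |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # the classical-like state |0>
ket1 = np.array([0, 1], dtype=complex)   # the classical-like state |1>

# An equal superposition of both states at once.
psi = (ket0 + ket1) / np.sqrt(2)

# On measurement, the squared magnitudes give the outcome probabilities.
p0, p1 = np.abs(psi) ** 2
print(f"P(measure 0) = {p0:.2f}, P(measure 1) = {p1:.2f}")   # 0.50 each
```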

To understand qubits, consider the electron and its intrinsic magnetic property, spin. Picture each spin as a tiny bar magnet: in a magnetic field, the spin-down state has the lowest energy and the spin-up state the highest. It's akin to a compass needle without its glass cover, naturally pointing north at its lowest energy but requiring energy to point south.

Creating qubits involves manipulating electrons. Starting at the lowest energy level with a spin-down, a strong magnetic field — generated by a superconducting magnet in liquid helium — flips the spin to an up state. Maintaining temperatures close to absolute zero prevents thermal energy from disrupting the electron’s state.
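To get a feel for why such extreme cooling matters, here is a back-of-the-envelope sketch comparing the spin-flip energy gap with the thermal energy at a few temperatures. The constants are standard values; the 1-tesla field is an illustrative assumption, not a figure from any particular machine.

```python
g = 2.0023          # electron g-factor (dimensionless)
mu_B = 9.274e-24    # Bohr magneton, J/T
k_B = 1.381e-23     # Boltzmann constant, J/K
B = 1.0             # magnetic field strength in tesla (illustrative)

gap = g * mu_B * B  # energy difference between spin-down and spin-up

# If the thermal energy k_B*T rivals the gap, heat randomly flips the spin.
for T in (300.0, 4.2, 0.02):   # room temperature, liquid helium, dilution fridge
    print(f"T = {T:6.2f} K -> thermal energy / spin gap = {k_B * T / gap:7.3f}")
# Only at millikelvin temperatures does the ratio drop well below 1.
```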

To write information to qubits, a precise pulse of microwaves, dependent on the magnetic field, excites the electron. This controlled pulse, akin to a radio tuned to a specific station, induces superposition. Reading this superposition is achieved through transistors.
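As a rough illustration of this "tuned radio" picture, the sketch below first estimates the resonance frequency set by the same assumed 1-tesla field, then models the microwave pulse as a rotation of the state vector: a half-length (π/2) pulse leaves the electron in an equal superposition. This is a textbook state-vector toy, not a description of any real control system.

```python
import numpy as np

# Resonance: the pulse must match the spin-flip frequency f = g * mu_B * B / h.
g, mu_B, h = 2.0023, 9.274e-24, 6.626e-34   # standard constants (SI units)
B = 1.0                                      # assumed 1 T field, as above
f = g * mu_B * B / h
print(f"Resonant microwave frequency at {B} T: {f / 1e9:.1f} GHz")   # ~28 GHz

# The pulse acts like a rotation of the spin state; its duration sets the angle.
def rx(theta):
    """Rotation of the qubit state by angle theta about the Bloch x-axis."""
    return np.array([
        [np.cos(theta / 2), -1j * np.sin(theta / 2)],
        [-1j * np.sin(theta / 2), np.cos(theta / 2)],
    ])

ket0 = np.array([1, 0], dtype=complex)
print(np.abs(rx(np.pi / 2) @ ket0) ** 2)   # [0.5 0.5] -> equal superposition
print(np.abs(rx(np.pi) @ ket0) ** 2)       # [0. 1.]   -> full spin flip
```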

The question arises: why bother with the complexities of qubits? Compare a pair of classical bits with a pair of qubits. Two classical bits hold only one of four states (00, 01, 10, 11) at any moment, whereas describing two qubits requires a probability amplitude for each of those four states at once. Extrapolate this to 300 entangled qubits, and a classical description would need 2³⁰⁰ values, a number exceeding the estimated count of particles in the observable universe.
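A quick way to see this blow-up is to count the complex amplitudes a classical simulation would have to store; the sketch below assumes 16 bytes per amplitude, a typical double-precision figure.

```python
# An n-qubit state needs 2**n complex amplitudes; at 16 bytes each, the
# memory a classical simulator would need grows exponentially.
for n in (2, 10, 30, 50, 300):
    amplitudes = 2 ** n
    print(f"{n:3d} qubits -> {amplitudes:.3e} amplitudes, ~{amplitudes * 16:.3e} bytes")
# 30 qubits already needs ~17 GB; 300 qubits dwarfs any conceivable classical memory.
```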

This has cryptographic implications. RSA encryption relies on the fact that classical computers struggle to factor very large numbers into their prime factors; a sufficiently large quantum computer running Shor's algorithm could, in principle, crack such problems in a tiny fraction of the time.
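To see why factoring is the hard direction, here is a toy sketch using naive trial division. Real attacks use far cleverer algorithms, but they still scale super-polynomially with the size of the number; the semiprime below is cracked instantly only because it is tiny.

```python
import math

# Naive classical factoring: trial division up to sqrt(N). Each extra bit in
# N multiplies the work by roughly sqrt(2).
def trial_division(n):
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d          # found the two factors
    return None                       # n is prime

# A deliberately tiny semiprime; real RSA moduli are on the order of 2048 bits.
print(trial_division(101 * 103))      # (101, 103)
```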

However, the promise of QC doesn’t imply superiority in all computational tasks. For everyday activities like watching videos, playing games, or coding websites, classical computers remain efficient. The true potential of QC lies in exploring realms of mathematics and nature that classical computers find daunting.

In conclusion, the era of quantum computing beckons, promising a paradigm shift in computational capabilities. As we unlock the secrets of qubits, the synergy of classical and quantum computing may hold the key to solving problems previously deemed insurmountable.
