Introduction to quantum mechanics and quantum computing (part 1)

antonious iskander
6 min read · Aug 30, 2023

This series of articles is part of my Womanium Global Quantum Project.

Quantum Mechanics

Quantum mechanics is a branch of physics that deals with the world at the nanometer scale. Classical mechanics, which we usually just call physics or classical physics, tries to understand the world at larger scales, above the nanometer. For contrast, a nanometer is 1,000,000,000 times smaller than a meter, or 0.000000001 meters. What is so special about the nanometer scale? It turns out that matter exhibits different behaviors at that scale, and quantum mechanics is what describes them. The nanometer is not a hard line that separates classical from quantum mechanics; there are also areas of modern physics that blend the two.

History

The story of quantum mechanics starts in the early 19th century, although the term quantum mechanics itself was only coined in the early 20th century. In the 17th century, within classical mechanics, the famous Isaac Newton developed a theory that light behaved like a particle. At the beginning of the 19th century, a physicist named Thomas Young performed a real-world experiment, called the double-slit experiment, to demonstrate that light acted like a wave. In it, light passes through two narrow slits cut into a barrier, such as a card or a screen, and lands on a second screen behind. Instead of two bright bands directly behind the slits, the light builds up a pattern of alternating bright and dark fringes, with the brightest band in the middle where there is no slit at all, as if each bit of light had passed through both slits at the same time (picture shown below).

Image by Jordgette, CC BY-SA 3.0 via Wikimedia Commons

That central bright band looks much like what a single slit placed in the middle of the barrier would produce, where the light lands directly behind the slit. The experiment supported the theory that light was a wave, which had been developed back in Newton's time (by his contemporary Christiaan Huygens), even though Newton's particle theory of light was more popular then. After this experiment and many others, statistical mechanics was developed, which uses statistics to model randomness and uncertainty in large collections of particles.
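For readers who want to see the fringes in numbers, here is a minimal sketch of the ideal two-slit interference pattern (my own illustration, with assumed values for the wavelength, slit separation, and screen distance; the single-slit envelope is ignored for simplicity):

```python
# A minimal sketch of the double-slit interference pattern on a far screen.
import numpy as np

wavelength = 500e-9       # green-ish light, in meters (assumed value)
slit_separation = 50e-6   # distance between the two slits, meters (assumed)
screen_distance = 1.0     # slits-to-screen distance, meters (assumed)

# Positions along the screen, centered on the point between the two slits.
y = np.linspace(-0.01, 0.01, 9)  # meters

# Phase difference between the two slits for each screen position,
# and the resulting relative brightness (cos^2 of half the phase).
phase = 2 * np.pi * slit_separation * y / (wavelength * screen_distance)
intensity = np.cos(phase / 2) ** 2

for pos, inten in zip(y, intensity):
    print(f"y = {pos * 1000:+.1f} mm  relative brightness = {inten:.2f}")

# The brightest spot is at y = 0, directly between the two slits,
# and the brightness alternates between bright and dark moving outward.
```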

In the early 20th century, two physicists, Otto Stern and Walther Gerlach, performed what is now called the Stern-Gerlach experiment. They shot silver atoms between two magnets, one presenting a north pole and the other a south pole, onto a screen, so that each atom passed through a magnetic field on its way. The mass of the atom, the orientation of the magnets (horizontal, vertical, or at an angle), and the strength of the magnetic field determined how far, and along which axis, the atoms were deflected; but regardless of how an atom was shot in, it landed in one of only two spots, on one side or the opposite side, e.g. top or bottom, left or right. This is not to say that these factors steered a given atom in a particular direction: with everything held the same, the atoms went up or down in a random order, and we cannot control when they go up or when they go down.

We do not know the exact path an atom takes, but we do know where it ends up, and recording that is what we call a measurement. The experiment also showed that particles have a property called spin, measured as either spin up or spin down, and which one you get is completely random: it can be neither predicted nor controlled. Later physicists built on these experiments with measurements made in sequence, showing that atoms can remember a previously measured orientation and can also be made to forget it. We use these later experiments in building quantum computers, where we can measure a state and also make it forget its state. We will talk about quantum computers in the next article.
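To make the remember-and-forget idea concrete, here is a minimal sketch (my own illustration, not part of the historical experiment) that models a spin-1/2 atom as a two-number state vector in plain NumPy and runs three Stern-Gerlach measurements in a row, along z, then x, then z again:

```python
# A minimal sketch of sequential Stern-Gerlach measurements, using plain
# NumPy to model a spin-1/2 atom as a two-component state vector.
import numpy as np

rng = np.random.default_rng(0)

# Basis states along the z axis: "up" and "down".
up_z, down_z = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Basis states along the x axis, written in the z basis.
up_x = np.array([1.0, 1.0]) / np.sqrt(2)
down_x = np.array([1.0, -1.0]) / np.sqrt(2)

def measure(state, basis):
    """Measure `state` along a basis (plus, minus); return outcome and new state."""
    plus, minus = basis
    p_plus = abs(np.vdot(plus, state)) ** 2  # Born rule: probability of the + outcome
    if rng.random() < p_plus:
        return +1, plus   # the state collapses onto the measured direction
    return -1, minus

# Step 1: measure along z -> always +1 (the atom "remembers" its z orientation).
# Step 2: measure along x -> +1 or -1 with probability 1/2 each.
# Step 3: measure along z again -> now 50/50 (the x measurement made it "forget" z).
counts = {+1: 0, -1: 0}
for _ in range(10_000):
    s = up_z                             # prepare the atom as "up" along z
    _, s = measure(s, (up_z, down_z))    # step 1
    _, s = measure(s, (up_x, down_x))    # step 2
    out, _ = measure(s, (up_z, down_z))  # step 3
    counts[out] += 1

print(counts)  # roughly 5000 up and 5000 down
```

The third measurement comes out 50/50 even though every atom started out spin up along z: measuring along x erased, or made the atom forget, its earlier z orientation.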

Theories

About half a decade after the Stern-Gerlach experiment, in the mid to late 1920s, theories began to develop to explain the phenomena that experiments had been revealing for more than a century. Until then, two leading but contradictory pictures existed side by side, which raised the question: is light a particle or a wave? Complementarity, developed by Niels Bohr, says that we can measure something like light and find it behaving as a particle or as a wave, but never as both at the same time. This is similar to superposition, but not quite the same, as we will discuss later. A closely related idea is wave-particle duality, which holds that light has properties of both a particle and a wave; which property shows up most depends on the environment and on what is done with the light. As physicists later found out, the same duality applies to all particles, which also behave like waves.

The Copenhagen interpretation grew out of discussions surrounding many past experiments and how to interpret them; it concerned quantum mechanics in general, not one type of experiment or a particular phenomenon. About a decade after the Copenhagen interpretation, a thought experiment called Schrödinger's cat was proposed in response to it. This was only an illustration, never an actual experiment. Schrödinger imagined putting a cat in a closed box together with a container of poison rigged to break if a random quantum event, such as a radioactive decay, happens: is the cat then dead or alive? According to the thought experiment, before we look inside the box the cat is both dead and alive, a situation called superposition. Only after looking inside can we determine that the cat is either dead or alive, but not both; looking inside the box counts as a measurement.

Superposition is when an object, such as light, takes both paths at once, as in the double-slit experiment. It can also mean something doing two things at the same time, or an object being in two states at once, like Schrödinger's cat. Quantum computing uses superposition to hold two states simultaneously. The object can be anything, for example an atom or an electron. The object does not break into pieces, take all the paths, and then become whole again; the object as a whole takes all paths at the same time.

There is only a certain chance of observing or measuring the object in each state at a given point in time. For example, there may be a 1/2 chance of finding the object in one state and a 1/2 chance of finding it in the other. How this happens is unknown, because we cannot watch all the paths at once; watching counts as an observation or measurement. When we try to observe the paths or states, the superposition disappears, or collapses, into a single state or path. Since the object can take all paths, its path is indeterminate: there is no way to know the current path without observing or measuring, and only once we observe does the outcome become definite.
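As a tiny sketch of that half-and-half picture (my own illustration, with made-up labels "path A" and "path B"), an equal superposition can be written as two amplitudes whose squares give the probabilities, and each measurement randomly collapses it to one path:

```python
# A tiny sketch of a 50/50 superposition: two paths with equal amplitudes.
import numpy as np

rng = np.random.default_rng(1)

# Equal superposition over two paths: amplitude 1/sqrt(2) for each.
amplitudes = np.array([1.0, 1.0]) / np.sqrt(2)

# The Born rule: the chance of seeing each path is the amplitude squared.
probabilities = np.abs(amplitudes) ** 2  # -> [0.5, 0.5]

# Each measurement picks one path at random; the state collapses to that path.
outcomes = rng.choice(["path A", "path B"], size=10, p=probabilities)
print(outcomes)  # a random mix, about half A and half B in the long run
```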

In the early 20th century, a physicist named Werner Heisenberg created a mathematical statement of the limit on how much we can know about an object, now called the Heisenberg uncertainty principle. It says that the more precisely we know an object's position, the less precisely we can know its momentum, and vice versa. For example, the better we pin down the position of an electron around an atom, the less we can say about the electron's momentum; conversely, if we know an atom's momentum precisely, we cannot know its location precisely. We can never know both exactly at the same time; the two uncertainties trade off against each other.
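The usual mathematical statement is that the uncertainty in position times the uncertainty in momentum is at least the reduced Planck constant divided by two. Here is a rough numeric illustration (my own sketch, with an assumed position uncertainty of about the width of an atom):

```python
# A rough numeric illustration of the Heisenberg uncertainty principle:
# delta_x * delta_p >= hbar / 2.
hbar = 1.054571817e-34  # reduced Planck constant, in J*s

# Suppose we pin down an electron's position to about the size of an atom.
delta_x = 1e-10  # meters (roughly one angstrom, an assumed value)

# Then the spread in its momentum is at least hbar / (2 * delta_x).
delta_p = hbar / (2 * delta_x)  # kg*m/s

# Convert that to a spread in velocity using the electron mass.
m_e = 9.1093837015e-31  # kg
delta_v = delta_p / m_e

print(f"minimum momentum spread: {delta_p:.2e} kg*m/s")
print(f"corresponding velocity spread: {delta_v:.2e} m/s")  # about 5.8e5 m/s
```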

In the mid 20th century, a physicist named John Wheeler wondered whether we can decide to observe an object, or not, after it has already taken both paths. This became known as Wheeler's delayed-choice experiment, which was originally a set of thought experiments, each posed as a question. Later, real experiments showed that we can in fact do this: the setup lets us choose how to measure after the object has passed both paths but before we observe it. Wheeler also wondered whether the object somehow knows, while passing through, which choice we will make afterwards. The consensus is no, but either answer is hard to prove through experimentation.

In my next article (part 2) we will talk about quantum computing and show how the theoretical and experimental fields of quantum mechanics have influenced the development of real-world quantum computers.

I would like to thank the Womanium Global Quantum Computing Program for encouraging me to start sharing my journey into the quantum world.

Below are resources and further reading:

edX course:
https://www.edx.org/learn/quantum-physics-mechanics/georgetown-university-quantum-mechanics-for-everyone

Wikipedia:
https://en.wikipedia.org/wiki/History_of_quantum_mechanics
https://en.wikipedia.org/wiki/Quantum_mechanics
https://en.wikipedia.org/wiki/Timeline_of_quantum_mechanics
https://en.wikipedia.org/wiki/Introduction_to_quantum_mechanics
https://en.wikipedia.org/wiki/Stern-Gerlach_experiment
https://en.wikipedia.org/wiki/Schr%C3%B6dinger%27s_cat
https://en.wikipedia.org/wiki/Double-slit_experiment
https://en.wikipedia.org/wiki/Wave-particle_duality
https://en.wikipedia.org/wiki/Copenhagen_interpretation

https://en.wikipedia.org/wiki/Wheeler's_delayed-choice_experiment
https://en.wikipedia.org/wiki/Uncertainty_principle

book: Introduction to Quantum Mechanics by David Griffiths (textbook also referred to as Griffiths)

lecture: www.feynmanlectures.caltech.edu


antonious iskander

I started my journey in Quantum Information in 2020. I'm here to share my journey in quantum mechanics and quantum computing.