Limitations of scientific language: Quantum physics

Toni Ram
May 4, 2024


The evolution of scientific language, a practical understanding of how science refines its theories, and the limitations of the "first quantization" quantum physics developed by Heisenberg, Bohr and others.

Literal human language and science

Consider a simple statement that could be part of a book: "In a room there is a table and a chair." The information about the room is conveyed in a simple description. But does the statement fully describe reality? Does it capture all the intricacies and complexity of the table and the room? Of course not.

A more complicated statement can be written that includes a description of the table, for example: "The table has such a form and is of such a width…" and so on. That is an improvement on the previous statement, but it is still a crude approximation of reality.

Then again, and this matters for understanding science, the problem is: does the statement include the cracks in the table, or information about its molecular structure? Does it include the positions of the atoms that constitute the table? Does it include information about the energy levels of the electrons in each atom? About the protons and neutrons in the nuclei? Does it include the interaction of electrons and other particles with vacuum energy? Does it include the number of photons in the room: photons from the cosmic background radiation, photons emitted by matter in the room, neutrinos… and so on?

Obviously it does not, so all of our language is an imperfect description of reality. One could write a whole book about the room and it still would not be precise enough to contain all the information about it. The conclusion: the more ambitious the attempt to describe reality, the more information is required, and it gets exceedingly complicated, in practice so complicated that no machine could contain all the information and solve the problem. For the same reason, physicists use approximations. The statement "In a room there is a table and a chair" is a literary approximation of reality. It contains the information that matters to a human being; the number of photons in the room is not important, so it is not part of the descriptive model.

Roger Penrose's point about mathematics: "[Our understanding of] physical reality depends on something which is more precise than [literal human language]."

All language describes reality up to some precision; the more precision is required, the more complicated the language or statements become. What is important is to know where a statement or theory fails and should be replaced by a better alternative. Science is just that: a series of mathematical approximations of reality. As human civilization advances, the theories of science become more precise and serve their purpose better. Here are a couple of truisms: language can be crude or precise; statements can be true, false, or anything in between.

The language of science is mathematics, not English or any other human language. No one in their right mind would try to use literature to describe quantum physics or to build a quantum computer. Mathematics was developed specifically for the purpose of understanding physical reality. Physics is far more accessible with mathematics, but the paradigm is not perfect: mathematical statements can be true, false, or anything in between.

No wonder the complete field of mathematics is considerably more vast than the specialized mathematics that describes physics, the sciences, and reality. Any language can be abstracted to enormous complexity without any of it referencing what exists in reality.

Human languages have been evolving and becoming a more precise description of reality. Physics and mathematics are the same: they have been evolving for centuries. The ancient Greeks first used arithmetic and geometry for problems that are simple by today's standards. Euclidean geometry, developed by Euclid, was already a mathematical theory built for the purpose of understanding reality. Galileo's principle of relativity and Newton's laws were the beginning, simple statements about our reality and how the universe works. They used simple mathematical language, a bit of calculus (derivatives, gradients, integrals) and nothing more. You can use them to calculate the orbital periods of planets or to build simple mechanical machines.
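As a small illustration of how far this simple language already reaches, here is a minimal sketch (the function name and setup are my own illustrative choices, assuming a circular orbit in Newtonian gravity) that computes a planet's orbital period from Kepler's third law, T = 2π√(a³/GM):

```python
import math

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / (G*M)).
# Constants are standard SI values; this is a toy illustration.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # mass of the Sun, kg
AU = 1.496e11      # astronomical unit, m

def orbital_period(a_m: float) -> float:
    """Return the orbital period in seconds for a semi-major axis in metres."""
    return 2 * math.pi * math.sqrt(a_m**3 / (G * M_SUN))

# Earth's orbit: should come out close to one year.
print(f"{orbital_period(AU) / 86400:.1f} days")  # ~365.2 days
```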

At the next level, in the 18th and 19th centuries, Lagrange, Euler, Hamilton, Poisson and others developed a mathematically sophisticated theory of classical mechanics that was a better description of the universe and allowed more sophisticated study of many-particle systems, but many properties of the universe were still not included, such as the structure of atoms or a finite limit to the speed of light c. The next description of reality was even more advanced, as Einstein developed special and general relativity, which described gravity as the curvature of spacetime. Yet the structure of atoms remained unexplained until Heisenberg and Bohr, and quantum physics was developed. And so on.
The ultimate limitation on the precision of our understanding of reality was revealed by Werner Heisenberg (with the help of Bohr, Born, Pauli and others), the man who made the greatest development in physics and science in human history: the theory of quantum physics and the uncertainty principle. It is a limit on precision that is not caused by the limitations of technology but is intrinsic to reality. I have written extensively about the uncertainty of variables, or observable physical quantities, which arises not just from quantum theory but as a consequence of calculus.
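For reference, the uncertainty principle in its standard modern form bounds the product of the uncertainties (standard deviations) of position and momentum by the reduced Planck constant:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```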

Evolution of physics

Even the article I am writing right now is an attempt to convey a description of the theory of science, an attempt to make it as simple as possible without losing important information about reality and the theory of quantum physics. Here is another attempt, by means of categorization and the branching of the fields of physics.

Branching of theories of physics and their evolution from simpler theoretical models.

Classical mechanics leads to Newtonian gravity. Combined with electromagnetism, classical mechanics leads inevitably to special relativity, since elementary particles cannot move faster than the speed of light c. On a separate branch of evolution, classical mechanics evolves into quantum mechanics. But the quantization of electron and atomic properties is not enough.

Particles can be created: photons can convert into positron/electron pairs, electrons and positrons can emit photons, particles and antiparticles can annihilate into neutral particles such as photons, and so on. So quantum mechanics and special relativity combine into quantum electrodynamics, which describes the interactions of electrons, positrons and photons.

With gauge theory (which describes the fundamental forces as caused by the exchange of gauge bosons such as the photon), quantum field theory is the latest development: a theory that describes the interactions of all elementary particles, all of which have been detected in experiments, the last being the Higgs boson discovered at CERN. The standard model [2] has 23 free parameters (couplings of particles to the Higgs field, strengths of the fundamental forces, and so on).

Elementary particles of the standard model of particle physics.

The standard model of cosmology [1] is a wonder because it describes the universe with only a few parameters: the cosmological constant or vacuum energy as a fraction of the total energy of the universe, the fraction of total matter, the fraction of radiation energy, the Hubble expansion rate, and a few parameters that describe the specifics of inflation.
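For concreteness, the six base parameters conventionally varied in the Planck ΛCDM analysis [1] are:

```latex
\{\ \Omega_b h^2,\quad \Omega_c h^2,\quad 100\,\theta_{\mathrm{MC}},\quad \tau,\quad \ln(10^{10}A_s),\quad n_s\ \}
```

that is, the baryon density, the cold dark matter density, the angular scale of the acoustic peaks, the optical depth to reionization, and the amplitude and tilt of the primordial spectrum; quantities such as the Hubble rate and the dark energy fraction are derived from these.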

An important distinction has to be made between mathematical theorems that are shown to be true by logic (statements that are true unconditionally, always) and the other use of mathematics, its use in science, which is an attempt to model reality and the physics of the universe. That is an important difference, so I define two fields of mathematics:

LOGIC MATHEMATICS deals with abstractions and theorems based on logic. For example, the equation x = 1 makes sense and is logical because both x and 1 are abstractions that can be equated.

and SCIENCE MATHEMATICS, which deals with describing reality as well as it can be done, based on experimental and observational data with current technology.

In some cases there is overlap between logical and scientific mathematics, as with the law of probability, which is unconditionally true in reality and confirmed both by logic and by experiment. That is an indication of a fundamental law, and in fact quantum physics is built entirely from probabilities.
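A worked statement of that law, with its quantum counterpart alongside: the probabilities of a complete set of mutually exclusive outcomes sum to one, and the Born rule imposes the same normalization on the wavefunction:

```latex
\sum_i p_i = 1, \qquad \int_{-\infty}^{\infty} |\psi(x,t)|^2 \, dx = 1
```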

Just like the opening statement about a table, every language, including the mathematics of science (every statement and equation), is as a description of reality an imperfect, constantly improving attempt to understand how the universe works. It is not unconditionally true. Certain aspects of reality can nonetheless be approximated so well with mathematics that for all practical purposes it works almost perfectly. Mathematics can also lie, and represent pseudo-science and complete nonsense.

Yet for some reason, when certain people use mathematics in science, they treat it as if it were unconditionally true.

Limitations on the precision of language and information are not just hypothetical or ontological; they are also grounded in physical theory and experiment. According to the holographic principle, the maximum amount of information possible in the universe is related to the number of bits represented by the number of Planck areas on the surface formed by the cosmological horizon, the boundary of the observable universe. The Planck area is defined by the smallest length possible, the Planck length. The holographic principle is based on the study of semi-classical (quantum) gravity and horizons in spacetimes, and its ultimate consequence is that all information can be written on the area of a sphere formed by a horizon of spacetime. The entire universe could thus be a hologram, with all its information written on the area formed by the cosmological horizon.
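A rough order-of-magnitude sketch of that bound (the horizon radius is an assumed round number, and I crudely count one bit per Planck area, ignoring the factor of 4 and the ln 2 in the Bekenstein-Hawking formula):

```python
import math

# Order-of-magnitude estimate: number of Planck areas on the sphere
# formed by the cosmological horizon, counted as one bit each.
hbar = 1.055e-34   # J s
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m / s

l_planck_sq = hbar * G / c**3          # Planck area, ~2.6e-70 m^2
R_horizon = 4.4e26                     # assumed horizon radius, m

area = 4 * math.pi * R_horizon**2      # area of the horizon sphere
n_bits = area / l_planck_sq            # crude bit count
print(f"~10^{math.log10(n_bits):.0f} bits")  # roughly 10^124
```

With these numbers the count comes out near 10^124, in the same ballpark as commonly quoted estimates of order 10^122.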

Increasing precision increases the required computational complexity. Occasionally, however, people make genuine contributions to the optimization of language, condensing and minimizing information into simple theoretical models and equations that correspond very precisely to experimental data.

The Einstein equations: a set of differential equations relating derivatives of the gravitational field, as curvature of spacetime, to the source of gravity, which is energy or mass.
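In compact tensor form the equations read:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

where the Einstein tensor G_{μν} encodes the curvature (second derivatives of the metric), Λ is the cosmological constant, and T_{μν} is the energy-momentum of the source; written out in coordinates this is a system of ten coupled nonlinear partial differential equations.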

Approximation

Because of the limitations of language and the mathematics of science, virtually every theoretical model involves some degree of approximation, reflecting the current limits of our knowledge and of our ability to describe reality. Theories of science evolve as human civilization evolves. Even if our theories are an incomplete (ontological) description of reality, as long as they can be used to develop technology that advances society and to build experiments that expose the failures of previous theoretical models, there is no problem or conflict.

As a simple example of ontological incompleteness not being a problem, consider the measurement of the position of a particle: it is a paradox to require measurement of the position to infinite precision. What would even be the purpose of such a futile attempt? No computer can write down the infinite number of decimal places that a measurement of infinite precision would require. Quantum mechanics is a necessary feature of the universe that prevents the paradox of "infinite precision measurement" (explained further in other articles). A universe without a small Planck constant would not just be unlivable; it would be an impossible paradox. The laws of physics can, and have to, work without infinite precision of observable quantities. An infinite precision measurement is an impossible, impractical and illogical requirement: it would require building a computer of infinite complexity. So every theory of science has to be an approximation to avoid the impossibility of infinite complexity, and quantum physics has to replace the classical world in microscopic measurements of reality.

In any case, Karl Popper argued that theories of science cannot be proven unconditionally true, but can be falsified: peer-reviewed theories that do not work are removed from academia and new theories are proposed. Which is true, since we do not have the ultimate Theory of Everything. Theories evolve from hypotheses and conjectures to successful models through verification by experiment.

In every universe, the postulates of classical physics (determinism, objectivity, realism) have to break down and be replaced by a quantum world where the exact, precise position or velocity of a particle is impossible to determine. It is an example of the impractical becoming illogical, because it is illogical to try to measure some observable value to an infinite number of decimal places. It is also why the quantum wavefunction is a necessity of reality: elementary particles have to be in more than one place, since it is impossible to localize an elementary particle to an infinitely dense point. Moreover, the universe cannot contain an infinite amount of information, which is what infinitely precise measurement would require. Quantum uncertainty is necessary to make sense of this impossibility. Ontological completeness is limited by the amount of information the universe can contain and that measurement requires.

The equations of quantum physics are highly sophisticated models that can describe microscopic quantum systems. The "first quantization" theory developed by Heisenberg, Born, Bohr and others was meant to describe primarily the atom. The term refers to the quantization of the energy levels of the atom, and also of quantities such as position and momentum. It did not describe where particles come from or how they are created, which is a result of the quantization of fields (the electromagnetic field, the electron field) and is described by quantum field theory.

First quantization

As a consequence of the experimental fact that the electron in the atom orbits the nucleus at certain distances and with quantized energy (it can take only specific values), the electron is described by a wavefunction and not by a definite, determined trajectory. The wavefunction describes the probability of the electron being at some location in space. As a simplified explanation, the electron wavefunction forms a standing wave that reinforces itself, which prevents the electron from falling into the positive nucleus. As I have described in another article, that is not necessarily a problem; it is perhaps a necessity of the universe that particles cannot be localized in space to an infinitely precise point.

The wavefunction, as a function of space and time, is obtained by solving a differential equation that depends on the Hamiltonian [2], which represents the total energy of the system (kinetic plus potential energy).
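Concretely, for the non-relativistic case this is the Schrödinger equation, with the Hamiltonian as the sum of kinetic and potential energy; for the Coulomb potential of the hydrogen atom it yields the familiar quantized energies:

```latex
i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi,
\qquad
\hat{H} = -\frac{\hbar^2}{2m}\nabla^2 + V(\mathbf{r}),
\qquad
E_n \approx -\frac{13.6\ \text{eV}}{n^2} \ \ \text{(hydrogen)}
```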
The first quantization equations, which describe the electron through its wavefunction, are a more accurate scientific-language description of microscopic reality than Newton's laws, but they are still an imperfect theoretical idea with severe limitations.

Certain philosophers and physicists use the "first quantization" equations as if they were the final equations of quantum theory and reality. They do not understand the limits of these equations: that they are an imperfect statement about reality. First quantization works in specific systems such as the hydrogen atom. It can calculate the energy values of electrons to a certain precision in decimal places. Because it does not include relativistic effects (the finite speed of light c), and since the electron's velocity in hydrogen is of order 1% of the speed of light (v ≈ αc ≈ c/137), the energy levels in the real hydrogen atom are slightly shifted. The effect of spin can be incorporated by hand, but there are better alternatives, namely quantum field theory (QFT). First quantization is an incomplete approximation of reality, although an improvement over Newton's laws and classical mechanics.

All non-relativistic first quantization formulations are equivalent to one another (including the original matrix mechanics developed by Heisenberg and Born), which can be shown by mathematical transformations. This demonstrates that the theory is mathematically self-consistent.

‘Entangled living creature’ experiment

When physicists try to give a popular explanation of quantum physics, they usually describe an experiment in which a living creature is entangled with a single elementary particle (through some mechanism that measures the particle), the particle being in many states at once (also called superposition). The measurement collapses the quantum state of the particle from many possible states into one, the machine registers the result, and the fate of the creature is determined. Before that, it is also claimed that, according to the equations of quantum physics, the living creature can be in many states at once, as a quantum wavefunction entangled with the machine and the elementary particle.

We know that the "entangled living creature" experiment has never been, and can never be, successfully performed. Even the most advanced recent quantum computers keep qubits entangled for no more than about a second. So first quantization is an insufficient description of reality, only an approximation that fails to fully describe the universe. It works for certain systems such as the hydrogen atom, but only up to a certain limit in accuracy (the Dirac equation has better precision because it includes relativistic effects, and vacuum fluctuation effects can be included with quantum field theory).

Limitations of the first quantum theory

The problem is that "first quantization" is really a severe simplification of quantum physics and reality. First quantization, developed a century ago, has many limitations:

— it describes only one particle (or several particles, with one equation for each), which can be complicated further by adding interaction terms that are themselves approximate;
— the potential in the equations is a mathematical approximation of more complex interactions, with no description of the origin of the potentials (in reality they are created by other quantum fields/particles);
— it does not include interactions with other quantum fields, even the electromagnetic field, unless they are approximated;
— it does not incorporate the creation and annihilation of particles: particles appear out of nowhere and exist forever, with no explanation of their existence and no account of their origin;
— it does not take into account that particles cannot exceed the speed of light; relativistic effects imply the existence of antiparticles;
— it does not include spin by default (if spin is added, the non-relativistic equation becomes the Pauli equation);
— it does not include antiparticles: before the Dirac equation [2] posited their existence, antiparticles did not exist even in theory, nor had they been detected experimentally;
— it offers no explanation for the infinite self-interaction of the electron, which makes its mass and energy infinite (as I have explained in another article);
— it has no renormalization of the fine-structure constant, no running of the interaction strength with energy (confirmed experimentally and described by renormalization group theory), and no resolution of infinities (as I have explained in another article);
— it has no explanation for mass generation: how does the electron have mass? The explanation is the Higgs field.

I emphasize: no interpretation or theory of quantum physics can be complete without an explanation of how particles are created and annihilated.

The electron is not a static object. As a simple description, it is constantly bombarded by virtual and real particles from the environment, exchanging photons and gravitons with other particles and changing its momentum and energy; even in the atom its energy is slightly shifted by the Lamb effect, caused by vacuum fluctuations (interactions with virtual particles). This process is called decoherence, because the wavefunction is no longer coherent. Decoherence is not fully understood; it is basically an approximation of a very complicated process. The environment consists of photons, neutrinos, vacuum fluctuations as virtual particles, gravitons, the Higgs field with which the electron interacts, all other particles, and so on. Experimentally only the average energy value makes sense, which is why in practice a measurement is performed many times and the values averaged.

Mathematically, we cannot have a lot of confidence in the original Heisenberg/Bohr quantum theory. After a measurement, after the collapse of the wavefunction, the elementary particle should be localized in space to a point: the electron is represented mathematically by a Dirac delta function, an infinitely dense point in space. A mathematical fact: after the collapse of the wavefunction, the equations of quantum physics evolve the initial Dirac delta function by spreading the point out in space.

The quantum wavefunction after collapse by measurement is represented by an infinitely precise point, after which it spreads in space if not measured again.

If we take first quantization literally, as equations that completely describe our reality, then every free particle should in time develop an infinite spread in its wavefunction (starting from the delta function) if no measurement is performed (if there is no collapse). The solutions of the same equations describe an electron probability wavefunction that spreads and diffuses into every part of the universe if not measured. Just as classical energy spreads from a point source, the wavefunction spreads over all of space, out to infinity.

Cosmic rays that travel through the universe without measurement for millions of light years would reach planet Earth with a nearly infinite spread in the wavefunction, and would have almost exactly the same probability of being anywhere in space, if there were no decoherence. From its initial condition, the electron wavefunction evolves into a function of ever-increasing width. Eventually, if never measured, the statistical distribution would encompass the entire universe, which is not possible.
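A minimal numerical sketch of that spreading, using the standard free-particle Gaussian wavepacket result (the numbers here, such as the one-ångström initial width, are illustrative assumptions):

```python
import math

# Free-particle Gaussian wavepacket: the position uncertainty grows as
#   sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2)
hbar = 1.055e-34   # J s
m_e = 9.109e-31    # electron mass, kg

def spread(sigma0: float, t: float) -> float:
    """Width of a free electron wavepacket after time t (seconds)."""
    return sigma0 * math.sqrt(1 + (hbar * t / (2 * m_e * sigma0**2))**2)

sigma0 = 1e-10     # assumed initial localization: one angstrom
for t in (1e-15, 1e-9, 1.0):
    print(f"t = {t:.0e} s  ->  width = {spread(sigma0, t):.2e} m")
# After one second the unmeasured wavepacket is ~6e5 m wide.
```

After just one second, an unmeasured free electron's position uncertainty has grown to hundreds of kilometres, which is exactly the runaway spreading described above.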

But we know from experiments that this is not true.

The fact that the "entangled living creature" experiment is impossible to perform in reality is the best indication that first quantization is wrong and works only in special cases. It is not possible in practice to entangle a quantum particle with an entire macroscopic living thing or object while the state remains stable.


What is important is to differentiate quantum field theory (QFT) from the original quantum physics developed by Heisenberg and Bohr, and from the Dirac equation, which incorporates Einstein's special relativity with a finite speed of light c. First quantization deals with elementary particles under the assumption that the particles already exist; it handles low-energy problems where particles are not created or annihilated, such as the hydrogen atom, which is already a stable configuration. QFT deals with the fundamental forces and with the origin of elementary particles and how they are created. The insight gained from the study of QFT cannot be overstated. Drawing any serious conclusion about quantum theory from the first quantization equations, which are a simplification of quantum theory, is overambitious.

It is possible to attempt to study the entire universe as a single wavefunction. However, an electron being in a superposition of different states is not the same as the universe being in a superposition of many states. The electron is just one particle, and the equations of quantum mechanics apply to it, although only for a short time, since in an experiment the electron will eventually interact. Even if you somehow create a perfect vacuum (impossible in practice), there is still the lowest energy state of the quantum fields, and quantum effects that interfere with the measurement. The universe has some 10⁸⁰ particles and emerges as a classical system that cannot be studied with one quantum wavefunction. You cannot ignore quantum decoherence even for a system of one quantum particle, and you certainly cannot study the entire universe as a simple quantum system based on first quantization. Neither can you perform a measurement of the entire universe to confirm the theory. All of this shows how much of physics is approximation and applied theoretical modelling.

Mathematics is a language designed for science. But, just like any other language, mathematics is imperfect and in the process of evolving. I believe this is one of the main reasons why almost every interpretation of quantum theory (based on first quantization) has so far been inadequate. Using first quantization to describe reality in its entirety is, I believe, a misguided attempt, doomed to failure. Probably, until a Theory of Everything is developed, there will not be a successful, complete interpretation of quantum physics.

In the next article I will describe the next advancements in quantum physics: the Dirac equation and quantum field theory.

[1] Planck Collaboration, "Planck 2018 results. I. Overview and the cosmological legacy of Planck", Astronomy & Astrophysics 641, A1 (2020).
[2] W. N. Cottingham and D. A. Greenwood, "An Introduction to the Standard Model of Particle Physics", Cambridge University Press (2007).
