On Contingency

Dean Taciuch
Published in The Startup · Jan 1, 2020

In the introduction to The Human Use of Human Beings, Norbert Wiener describes the concept of the “contingent universe.” This is the world in which we live, in which events are contingent on other events. This world is deterministic, yes, but at some level always uncertain. Wiener’s Cybernetics explains how complex systems, such as humans, animals, and automatic machines, handle such contingencies: we gather information about the world and its responses to our actions, and we adjust future actions accordingly. Such adjustments are never perfect, because the contingencies of the world are always changing.

These contingencies are rooted in quantum physics, but the complex interconnections of real-world events add even greater uncertainty. Contingency is an informational issue. Complete information is not possible, even in the simplest of systems. In complex systems, there is always more information to gather, but never all of it.

In an ideally deterministic world, Newtonian mechanics is perfectly valid. The universe runs like clockwork, wound up by a divine hand and left to run with perfect precision. In such a world, any unexpectedness is the result of instrumental error, a lack of precision on the part of the human doing the measuring or recording. If one had a perfectly accurate account of a system at any point in time, the future and the past of that system could be calculated. Such precision is indeed possible with very large, or very simple, systems — the mechanical orreries of planetariums, for example, are as accurate as modern astronomical software for determining the positions of the planets. The limits of Newtonian mechanics, however, are evident when we try to use these laws to predict something like the weather.

Weather is a purely deterministic system — there is nothing inexplicable about it. As measuring devices get better, predictions improve. Radar, satellites, and computer modeling all give meteorologists more information about the complex interactions in the atmosphere. But the amount of available information is staggering — to fully describe the entire system would require knowing the position and momentum of every molecule, every airborne particle, and every force acting on those particles. And that information is dynamic and interconnected, so one would need to continually update the information in real time. The atmosphere is a Newtonian system, but one of immense complexity.

This complexity was first noticed in the physics of steam engines — systems, like the atmosphere, composed mostly of gases. The physics of steam engines gave us the laws of thermodynamics, which, unlike Newton’s laws, are statistical. These statistical laws are always a bit uncertain, and this uncertainty has a name: entropy. As initially understood, entropy was a measure of the energy lost in any heat engine, the energy that cannot be recovered as useful work. The amount of energy stored in the fuel is greater than the amount of energy obtained from burning it, which is greater than the amount transferred to the water to create steam. The steam contains more energy than the final output of the engine. That is, the engine is never 100% efficient.
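
In the standard textbook formulation, Clausius defined the entropy change of a body that absorbs heat Q reversibly at temperature T, and Carnot showed that the efficiency of any engine running between a hot source and a cold sink is bounded:

\Delta S = \frac{Q_{\mathrm{rev}}}{T}, \qquad \eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}

Because the cold temperature is never zero, the maximum efficiency is always less than one: even an ideal engine must discard some of its heat to the colder surroundings.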

This inefficiency is unsurprising for 19th-century technology, but thermodynamics also tells us something a bit more surprising. Unlike Newtonian systems, thermodynamic systems are irreversible. The lost energy in a steam engine cannot be recovered without doing more work (which would require more energy). Entropy was understood as a form of irreversible disorder in a system. The stored energy in coal, for example, is more ordered than that in steam. And some of the energy escapes the system entirely, in the form of waste heat. One cannot create coal by running a steam engine in reverse.

The law describing this irreversibility is the Second Law of Thermodynamics, which states that the entropy of a closed system never decreases. Understanding why requires looking into the statistical nature of thermodynamic systems. Concepts such as temperature and pressure are actually statistical: temperature is a measure of the average kinetic energy of the gas molecules; pressure is the average force per unit area these molecules exert on the walls of the container. Left alone, the system will inevitably settle into a state of thermal equilibrium, in which the energy is spread evenly throughout the gas and no region is hotter than any other. Any minor variations become smaller and smaller, as fast-moving molecules collide with slower-moving ones or lose energy to the walls of the container. A hot system tends to cool down.
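
In the standard kinetic-theory picture, for instance, the temperature of a monatomic ideal gas is fixed by the average kinetic energy of its molecules,

\left\langle \tfrac{1}{2} m v^{2} \right\rangle = \tfrac{3}{2} k_{B} T

where k_B is Boltzmann’s constant. No single molecule has “a temperature”; only the ensemble does, and thermal equilibrium is simply its most probable arrangement.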

This tendency for systems to cool down never spontaneously reverses. Another way to state the second law is that heat never spontaneously flows from a colder area to a hotter one. The outside environment is colder than the hot gas in the chamber, so any energy lost to the outside cannot be recovered. The walls of the piston heat up, and in turn heat the air around them. This energy dissipates into the environment and is lost. This irreversibility alarmed scientists of the time, as it implied that the universe would eventually run down. This “heat death of the universe” would mean that everything would reach the same temperature, and entropy would reach a maximum.

Recall, though, that the energy in the system is measured statistically, not directly. To measure it directly would require knowing the instantaneous position and momentum of every molecule in the piston. If one could do that, the system would be a Newtonian system. James Clerk Maxwell devised a thought experiment to do exactly this: Maxwell’s Demon. The demon is an entity which can observe every molecule in a chamber divided in two by a partition. The partition has a “frictionless” door, which the demon opens to let fast-moving molecules pass to one side and closes to keep slow-moving molecules on the other. One side of the chamber would grow hotter while the other cooled, defying the second law.

Of course, this was only a thought experiment. Frictionless doors don’t exist. But friction isn’t why Maxwell’s Demon fails. The failure of Maxwell’s Demon is due to another surprising attribute of entropy: it is deeply connected to the idea of information. To reduce the entropy of the system, Maxwell’s Demon has to gather information about the position and energy of every molecule in the system. But the formal connection between entropy and information wouldn’t be made until the mid-20th century.
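
A later line of work makes the cost of that information-gathering explicit. Landauer’s principle (1961) holds that erasing one bit of recorded information dissipates at least

k_{B} T \ln 2

of heat, so whatever entropy the demon removes by sorting molecules is paid back when its finite memory of their positions and speeds must eventually be cleared.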

Claude Shannon was studying communication systems for Bell Labs in the 1940s. His 1948 article, and later monograph, A Mathematical Theory of Communication, derived an equation for the statistical behavior and analysis of information that had the same form as the earlier equation for entropy derived by Boltzmann in the 1870s. Shannon called this informational concept “entropy,” as it followed the same laws as thermodynamic entropy. Like its thermodynamic counterpart, Shannon’s entropy is a statistical measure of disorder. And just as thermodynamic entropy is associated with waste heat, informational entropy is associated with noise.
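
The parallel is easy to see in the formulas themselves. Shannon’s entropy of a source whose symbols occur with probabilities p_i, and the Boltzmann–Gibbs entropy of a system whose microstates occur with probabilities p_i, are

H = -\sum_i p_i \log_2 p_i \qquad \text{and} \qquad S = -k_B \sum_i p_i \ln p_i

identical in form, differing only in the base of the logarithm and in the constant k_B that converts the result into thermodynamic units.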

What is sometimes confusing about Shannon’s concepts of information and entropy is that entropy is a measure of the informational content of a message. But in his theory, information is not to be confused with meaning. Shannon information is not necessarily meaningful information — it is simply unexpected. If you were expecting a particular message, and you receive it, you have gained no information. The information content, and thus the entropy, of that message is very low. But a completely unexpected, even random, message would have very high information content, and very high entropy.
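
A short sketch makes the point concrete (an illustration in Python, not an example from Shannon’s paper): a source that almost always sends the same symbol has entropy near zero, while a source whose symbols are all equally likely has the maximum possible entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source that almost always sends the same symbol: the message is expected,
# so it carries almost no information and has very low entropy.
nearly_certain = [0.99, 0.01]

# A source that sends any of eight symbols with equal probability: every message
# is unexpected, so the information content (and the entropy) is as high as it can be.
uniform_eight = [1 / 8] * 8

print(round(shannon_entropy(nearly_certain), 3))  # about 0.081 bits per symbol
print(round(shannon_entropy(uniform_eight), 3))   # 3.0 bits per symbol
```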

When Wiener was writing his introduction to The Human Use of Human Beings, he did not quite appreciate the implications of Shannon’s theory. Wiener was concerned with meaningful information. Meaningful information increases as the expectedness of a message decreases, but only up to a certain level. Completely random information may be as useless as completely expected information:

[Figure: Shannon versus Wiener information]

For Shannon, as a message passes through a circuit, it can pick up information (gain entropy) from the noise in the circuit — and all circuits have some noise, such as line noise caused by interference. Even a perfectly shielded line will have “shot noise” caused by the quantum fluctuations of electrons or photons. These fluctuations manifest as noise in the circuit: random light or dark pixels on a screen, for example. In Shannon’s theory, noise adds entropy, or disorder, to a message; it increases the message’s information content because it is unexpected.
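
Shannon’s framework keeps this distinction explicit. If X is the message sent and Y the signal received, the information that actually gets through is the mutual information

I(X;Y) = H(Y) - H(Y \mid X)

where H(Y|X) is the entropy contributed by the noise. Noise makes the received signal more surprising, but that extra surprise tells the receiver nothing about what was sent.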

These quantum contingencies are real, and they are a real part of our world. If we examine a steam engine in terms of information theory, we see that we have incomplete information about the system. If we actually knew the momentum and position of each molecule, and kept that information continuously updated, the entropy of the system would remain constant. It would not become more disordered, because we would have all of the information needed to describe the system at any point in time. The system would also be reversible, as any completely Newtonian system would be. We could be Maxwell’s Demon.

But we cannot get all of this information simultaneously. Following all of those molecules gets us down to the realm of quantum mechanics, just as the shot noise in Shannon’s circuits did. And like entropy, quantum mechanics is statistical. Famously, the Heisenberg Uncertainty Principle tells us that one cannot know both the position and momentum of a particle simultaneously. If we know the position of a particle with certainty, we will be uncertain about its momentum, and vice versa. The more certain the position is, the less certain the momentum.
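
Stated formally, the principle bounds the product of the two uncertainties:

\Delta x \, \Delta p \ge \frac{\hbar}{2}

where ħ is the reduced Planck constant. Shrinking the uncertainty in position necessarily inflates the uncertainty in momentum, and vice versa.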

Some non-technical accounts of Heisenberg suggest that the measurement of one property disturbs the other (measuring momentum will disturb position, and vice versa). While this may be true in a classical Newtonian system, that is not what the Uncertainty Principle says. Quantum indeterminacy is not the result of instrumental error. Position and momentum are related in such a way that they cannot both be determined at the same time. In many interpretations of quantum physics, these complementary properties are undetermined until measured, and accurate measurement of one property precludes accurate measurement of the other. Unlike a classical system, a quantum system is not disturbed by a measuring instrument — rather, the properties of such a system are determined by the choice of measurement.

These quantum effects are generally too small to influence our everyday (“classical”) world. Most of our day-to-day contingencies are due to the vast amounts of information generated by the interrelationships between events. But even if we could follow all of the information in our contingent world, quantum indeterminacies would still remain.

On a practical level, we cannot know the positions and momenta of gas molecules in a cylinder; the amount of information is too great. If we make the system smaller, though, quantum effects prevent us from having complete information about the system. If we make the system larger, we have a world: a world in which we can always get more information, but never complete information. This is our ever-changing, dynamic, contingent world.

Dean Taciuch (pronounced TASS-itch) teaches in the English Department and the Honors College at George Mason University in Fairfax, Virginia.