Chaos Theory, The Butterfly Effect, And The Computer Glitch That Started It All
For centuries, we thought that the Universe was completely deterministic. But even if you know all the rules, you can’t get rid of chaos.
As Bob Dylan famously sang, “You don’t need a weatherman to know which way the wind blows.” Yet if you do have enough wind velocity information, combined with an array of readings from barometers, thermometers, and such, you might ask a weatherman, particularly a trained meteorologist with access to state-of-the-art computers and software, to make a sound forecast. We often plan our outdoor activities these days with the help of newscasts, websites, apps, and voice assistants that provide reasonable forecasts hours or days in advance. It is rather amazing that meteorology can perform such a feat.
On the other hand, if we happen to rely on a sunny forecast to schedule a picnic, and it rains instead, we don’t condemn the entire field of meteorology, or dismiss it as useless guessing. We recognize that it is an imperfect science. Moreover, we recognize that it can only give us probabilities of a particular outcome, not a definitive prediction for what must come to pass. While forecasts are far better than they were decades ago, they’re still far from flawless. And even with advances in technology, the theory of deterministic chaos shows that they’ll never be perfect.
Everyone knows that quantum theory embodies randomness — or, as Einstein famously put it, “dice-rolling.” But the weather is a large-scale effect, which Newtonian physics should be able to handle. Indeed, it does, and quite well. However, chaos theory points to the limitations of prediction even for deterministic, Newtonian physics.
Newton’s second law of motion, the net force on an object equals its mass times its acceleration, embodies the type of mathematical relationship known as a differential equation. That equation acts as a kind of machine for processing the raw data of initial conditions for a system of particles — its precise set of positions and velocities at any given moment, along with the forces of interaction — and churning out location and speed coordinates indefinitely into the future.
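To see how such a differential-equation “machine” churns out future positions and velocities from initial conditions, here is a minimal numerical sketch. The system (a mass on a spring), the parameter values, and the crude Euler stepping scheme are all illustrative assumptions, not anything from the article.

```python
# Minimal sketch: Newton's second law, F = m*a, stepped forward in time
# for a mass on a spring (force law F = -k*x). All parameter values
# here are illustrative choices, not taken from the article.

def simulate(x0, v0, k=1.0, m=1.0, dt=0.001, steps=10_000):
    """March the initial conditions (x0, v0) forward with simple Euler steps."""
    x, v = x0, v0
    for _ in range(steps):
        a = -k * x / m      # acceleration dictated by the force law
        x += v * dt         # update position from velocity
        v += a * dt         # update velocity from acceleration
    return x, v

# Feed in the state at time zero; the equation grinds out the state
# ten simulated seconds later, close to the exact solution cos(10).
x_final, v_final = simulate(1.0, 0.0)
```

Given perfect initial data, nothing stops this loop from running indefinitely into the future — which is exactly the intuition behind Laplace’s claim below.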
In his 1814 treatise, “A Philosophical Essay on Probabilities,” French mathematician Pierre-Simon Laplace speculated that Newtonian mechanics heralded a rigid determinism that would theoretically enable the successful prediction of the entire future of the Universe, given absolute knowledge of its complete state at any given time. The only catch is that the prognosticator would somehow need to step outside of the Universe and obtain, all at once, a complete snapshot of every particle in it and its instantaneous trajectory. In philosophical discussions, such a hypothetical being has been dubbed Laplace’s Demon. As Laplace wrote:
We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.
In the same essay, Laplace argued that any need to invoke probability in nature stemmed from ignorance, including uncertainty in weather forecasts. Someday, he suggested, weather forecasts would be perfectly accurate — as predictable as the orbits of the planets — with nothing left to chance. Yet even if it weren’t for quantum phenomena like Heisenberg’s uncertainty principle, this wouldn’t be the case. No matter how well you know the initial conditions, deterministic laws don’t guarantee a predictable Universe.
In the early 1960s, MIT meteorology professor Edward Lorenz was convinced that the mainframe computers used to great effect in planning weapons tests and launching satellites into orbit would help yield accurate weather forecasts. Given that weather is determined by a set of measurable factors, such as temperature, pressure, and wind velocity, conventional wisdom at the time was that a solid model, complete set of data, and a powerful number-crunching device, could, in principle, predict the weather conditions well into the future. With that goal in mind, Lorenz constructed a simple set of equations for air convection and programmed them into his cabinet-sized, vacuum-tube-based Royal-McBee computer.
He input an initial set of data, switched the computer on, and waited for the printout. Placing the output next to the machine, he decided to re-enter some of the data and run the program longer. Typing it in meticulously, he was astonished to find that the program yielded a radically different forecast. Finally, he realized that the computer printout had rounded the data, and what he had input was slightly different the second time than the first. Somehow, even for a straightforward, deterministic set of equations, a minute change in initial conditions yielded radically different behavior.
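Lorenz’s experience is easy to reproduce. The sketch below runs his 1963 convection equations twice from starting points that differ only in the sixth decimal place, mimicking the rounding in his printout. The parameter values are the classic published ones (σ = 10, ρ = 28, β = 8/3); the step size and run length are illustrative choices, and this is of course not his original program.

```python
# The Lorenz system, run twice from initial conditions that differ
# by one part in a million. Classic parameters sigma=10, rho=28,
# beta=8/3; the Euler step size and duration are illustrative.

def lorenz_step(x, y, z, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz convection equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run_pair(s1, s2, steps=40_000):
    """Advance two states in lockstep, tracking their greatest separation."""
    max_sep = 0.0
    for _ in range(steps):
        s1 = lorenz_step(*s1)
        s2 = lorenz_step(*s2)
        sep = sum((p - q) ** 2 for p, q in zip(s1, s2)) ** 0.5
        max_sep = max(max_sep, sep)
    return s1, s2, max_sep

# A one-in-a-million nudge to the first coordinate...
a, b, max_sep = run_pair((1.0, 1.0, 1.0), (1.000001, 1.0, 1.0))
# ...and within the run the two "forecasts" drift far apart,
# even though both trajectories stay bounded.
```

The tiny initial gap grows roughly exponentially until the two trajectories bear no resemblance to each other — exactly the behavior that startled Lorenz.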
As he would later note, in what was dubbed the ‘butterfly effect,’ this extreme sensitivity to initial conditions meant that the flapping of a butterfly’s wings over the Amazon could influence the weather in China. The study of this phenomenon, pioneered by Lorenz and others, has since become widely known as deterministic chaos.
Lorenz not only discovered chaos, he also identified its key mechanism. When he graphed his data along several axes, he noted the strange property that iterating (plotting the trajectory over time) any two nearby points resulted in their separation. The gap would grow greater and greater with each iteration until the mathematical “offspring” of the two points would be so widely separated that they would lie in completely different regions of the cloud of data. On the other hand, points off the cloud, if iterated, would quickly approach it. Thus the dynamics of Lorenz’s equations served two contradictory purposes: repulsion of trajectories within the data set and attraction toward it from beyond. Such a set is called a “strange attractor,” with the specific one discovered by Lorenz called the “Lorenz attractor.”
Other strange attractors were discovered soon thereafter, notably the Hénon attractor, identified in 1976 by French mathematician Michel Hénon. Strange attractors possess a peculiar self-similar structure, dubbed “fractal” by Polish-born French-American mathematician Benoit Mandelbrot. If you map out a strange attractor and “blow up” any given region, that smaller region appears similar in structure to the whole. Enlarge any tiny section of that region and a similar pattern appears again, and so on. Mathematically, such self-similarity implies a fractional dimensionality, hence the term “fractal.”
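The Hénon attractor is simple enough to generate in a few lines. The sketch below iterates Hénon’s map with his classic parameters (a = 1.4, b = 0.3); the starting point and iteration counts are illustrative choices, and plotting the points (to see the attractor’s banded, self-similar structure) is left out.

```python
# A sketch of the Henon map:
#   x_{n+1} = 1 - a*x_n**2 + y_n,   y_{n+1} = b*x_n
# with the classic parameters a=1.4, b=0.3. A starting point near the
# origin is quickly pulled onto the strange attractor.

def henon(x, y, a=1.4, b=0.3):
    """One iteration of the Henon map."""
    return 1.0 - a * x * x + y, b * x

def attractor_points(x0=0.1, y0=0.1, n=5000, skip=100):
    """Iterate the map, discarding a transient so the orbit settles on the attractor."""
    x, y = x0, y0
    for _ in range(skip):      # the "attraction" step: transients die away
        x, y = henon(x, y)
    points = []
    for _ in range(n):         # the orbit now traces out the attractor itself
        x, y = henon(x, y)
        points.append((x, y))
    return points

pts = attractor_points()
```

Plotted in the plane, these points fill out the familiar crescent of bands; zooming in on any band reveals finer bands within it, the self-similarity described above.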
We owe Lorenz a debt for finding a key flaw in Laplacean determinism. Even in Newtonian classical mechanics, with its clockwork regularity, some systems are so sensitive to initial conditions that they are effectively impossible to predict. Unless you know every data point with perfect precision — next to impossible with realistic measuring devices — such chaotic systems act as randomly as a series of coin tosses. Thus along with randomness in quantum systems, effective randomness in some classical systems, such as the weather, seems a key feature of nature. God plays dice in more ways than one.
Paul Halpern is the author of fifteen popular science books, including The Quantum Labyrinth: How Richard Feynman and John Wheeler Revolutionized Time and Reality.