Boltzmann brains and entropy
Ludwig Boltzmann was an Austrian physicist. His greatest achievement was the formulation of statistical mechanics. The consequences of his work are still being investigated today — including, interestingly enough, in the field of cosmology, under the guise of the Boltzmann brain. But first, a little background.
Entropy can be defined (in a non-rigorous sort of way) as a measure of how disordered a system is. The famous tendency is for systems to go from an ordered, defined, structured sort of state (like the moment when you first pour milk into coffee) to a more uniform state, equilibrium (like when the milk has fully dispersed through the coffee).
The reason for this has to do with phase space. Think about the pieces of a jigsaw puzzle. There's only one way they can all be correctly placed, right? But there are a huge number of ways they can be incorrectly placed. Phase space is like that: it's the set of all possible states of a system, and for most systems, the disordered states vastly outnumber the ordered ones.
So, let's think about a cloud of steam. When you pass your hand through it, it doesn't really look much different, right? It's in a high-entropy state. Why? Because there are so many water molecules in the cloud that if you rearrange a few of them, it looks the same. Boltzmann realized this. He said that systems go from low entropy to high entropy because there are many more disordered states than ordered states. It's a matter of probability.
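To see how lopsided those probabilities get, here's a tiny counting sketch (my own toy model, not anything from Boltzmann): N particles, each of which can sit in either the left or the right half of a box.

```python
from math import comb

# Toy model: N particles, each independently in the left or right half
# of a box. A "state of the system" is one specific assignment of
# particles to halves, and every assignment is equally likely.
N = 50
total_states = 2 ** N

# Perfectly ordered: all N particles on the left. Only 1 way to do it.
ordered = 1

# Evenly mixed: 25 on each side. There are "50 choose 25" ways.
mixed = comb(N, N // 2)

print(f"P(all on the left) = {ordered / total_states:.1e}")
print(f"P(evenly mixed)    = {mixed / total_states:.3f}")
```

Even with only 50 particles, the all-on-one-side state has a probability below one in a quadrillion, while the evenly mixed arrangement alone accounts for about 11% of all states. With the ~10^23 molecules in a real steam cloud, the imbalance is unimaginably more extreme.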
Now, because it is a matter of probability, sometimes things move from high entropy to low entropy. This is called a fluctuation. Fluctuations are really unlikely, which is why we don't see things spontaneously appear or reassemble. We drop a mug and it breaks, but we sure don't see mugs coming back together and leaping into our hands. That sort of thing happens only over timescales orders of magnitude longer than the age of the universe. In other words, you're not going to see anything spontaneously reassemble anytime soon.
But the probability still stands. There are so many more high-entropy states in phase space that it is overwhelmingly more likely for a system to move from low entropy to high entropy. This leaves us with the question of why the universe started with low entropy. If systems are "naturally" high entropy, why was there ever anything that was low entropy? Why, if things go toward equilibrium, does anything distinct (people, computers, paintings, anything) exist? This is still an open problem in cosmology.
What does all this mean in the context of Boltzmann's formula, S = k log W? Here, S is the entropy of a system, k is the Boltzmann constant, and W is the number of microstates in a given macrostate. This is what we were talking about earlier with phase space.
The number of microstates is basically the number of ways we can rearrange, say, the molecules in a steam cloud, so that the overall steam cloud looks exactly the same (this overall picture of the steam cloud would be the macrostate). So, when there are many microstates within a given macrostate, the system is higher entropy. When there are fewer microstates within a given macrostate, the system is lower entropy.
To illustrate the second part, think about a few wooden toy blocks, maybe four. If the macrostate is "all four blocks in a vertical stack," there really aren't that many ways to rearrange the blocks while keeping that same overall look: you can shuffle which block sits where in the stack, but you can't take a block off without changing the macrostate. So the system is low entropy.
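As a rough numerical illustration (a toy sketch with made-up numbers, not a real physical calculation), we can plug both examples into S = k log W:

```python
from math import log, factorial

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

# Four distinguishable toy blocks in one vertical stack: every ordering
# of the blocks is a distinct microstate of the same "stack" macrostate.
W_blocks = factorial(4)  # 24 microstates

# A wildly simplified "steam cloud": pretend each of 10^23 molecules can
# independently be in one of just 2 states, so W = 2^(10^23) and
# log W = 10^23 * log 2. (W itself is far too large to compute directly.)
logW_steam = 1e23 * log(2)

print("S_blocks =", k_B * log(W_blocks), "J/K")  # tiny: ~4e-23 J/K
print("S_steam  =", k_B * logW_steam, "J/K")     # about 1 J/K
```

The point isn't the specific numbers (the two-state-molecule assumption is mine, purely for illustration); it's that entropy grows with the logarithm of the microstate count, so the steam cloud's entropy dwarfs the block stack's by over 22 orders of magnitude.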
Interestingly, this is why entropy has such a deep connection to information theory: Shannon's entropy also counts possibilities, in that case the number of possible messages consistent with what you know, which plays the same role as the number of microstates in a macrostate.
Boltzmann’s answer to the cosmological problem
Imagine the entire universe in thermal equilibrium: the highest possible state of entropy. Since the entropy cannot increase any further, it will stay steady...except for fluctuations. We can calculate how likely fluctuations are. As you might expect, larger fluctuations are exponentially less likely than small ones, but given enough time, every possible fluctuation will eventually happen.
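That exponential suppression can be sketched numerically. A standard back-of-the-envelope estimate (my assumption here, following the usual statistical-mechanics argument rather than anything specific in Boltzmann's papers) is that a fluctuation which lowers the entropy by an amount ΔS has relative probability roughly proportional to e^(-ΔS/k):

```python
from math import exp

# Relative probability of a fluctuation that lowers the entropy by
# delta_S, measured in units of Boltzmann's constant k so the exponent
# is a pure number. Each increase in the entropy drop makes the
# fluctuation astronomically less likely.
for dS_over_k in [1, 10, 100, 500]:
    print(f"dS/k = {dS_over_k:4d}   P ~ {exp(-dS_over_k):.2e}")
```

Real macroscopic fluctuations involve ΔS/k values on the order of 10^23 and beyond, which is why a whole low-entropy universe is such a desperately improbable fluctuation compared to a small one.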
In other words, maybe our universe is in a state of fluctuation away from its normal equilibrium. The low entropy of the early universe, according to this idea, is a “statistical accident”. Another way you could look at it is that the big fluctuations create a smaller, new, low-entropy universe within the larger maximum-entropy one.
After this, Boltzmann used anthropic reasoning to explain why we find ourselves in a fluctuation region rather than in the vast, vast majority of the universe's history, which is spent in thermal equilibrium (anthropic reasoning is based on the anthropic principle). A quick summary of the principle is that we can only wonder about the problem if we're here to wonder about it, which to a lot of people seems like an evasion of the question.
With this reasoning, you might as well say, “That’s just the way it is.” Instead, we have to consider what experimental results we should see if this idea is true. People have done this, and, well, let’s just say there are problems.
At last, the subject of the article! The most basic problem with Boltzmann's idea is called a Boltzmann brain. The low-entropy fluctuations we are talking about are very rare, and the lower the entropy goes, the exponentially rarer they become. So if we explain the low entropy of the early universe with the anthropic principle, we should expect to find ourselves in the smallest, least improbable fluctuation that still allows for existence.
That minimum fluctuation is a Boltzmann brain: a fluctuation just large enough to produce a conscious brain with enough sensory input to look around and recognize that it exists before dissolving back into equilibrium. These fluctuations are rare, but they are much, much less rare than the type of fluctuation we are in.
So…why are we in the type of fluctuation we are in? We have no idea. Honestly, none. And that leaves us (sort of) back where we started — why did the universe start in such a low entropy configuration? Here, again, we don’t know.
“We are an impossibility in an impossible universe.” -Ray Bradbury
For learning about the entropy arrow of time and the cosmological side of entropy, I'd recommend From Eternity to Here by Sean Carroll. Wikipedia's article on entropy is, as always, useful. A note about this article: I write answers on Physics Stack Exchange, and I drew heavily on two of them while writing this piece.