The Second Law of Thermodynamics

Matthew Gliatto
Published in ILLUMINATION · Dec 18, 2020

The Second Law of Thermodynamics relates the concept of entropy, a measure of disorder, to the concepts of heat and temperature. It states that the change in a system’s entropy can never be less than the heat added to the system, divided by the ambient temperature. One very important corollary is that if a system is isolated (meaning it doesn’t interact with its surroundings), then its entropy can never decrease, and for any real-world process it will always be increasing: in an isolated system, the entropy is always rising.

Mathematically, the Second Law of Thermodynamics is written as:

δQ/T ≤ dS

The Second Law of Thermodynamics

where δQ is the heat added to the system, T is the ambient temperature, and dS is the change in the system’s entropy. (I explained the concept of a “system” in my previous post. All systems are assumed to be closed systems unless stated otherwise.)

But this mathematical statement leaves so many questions unanswered; most importantly, I still haven’t explained what entropy is. In this article, I will shed some light on these terms. After briefly discussing heat and temperature, I will explain the concept of entropy; I will then discuss the Second Law of Thermodynamics (2nd Law of TD) and its significance.

I. Heat and Temperature

Heat and temperature are not the same thing. In fact, in the context of science, “heat” is really a verb rather than a noun. In science, heat is a process, while temperature is a quantity.

Heat is when you transfer energy to a system by means of conduction, radiation, or latent heat release. Conduction is when a warmer object heats an adjacent cool object. Radiation is energy traveling through space (like how the sun heats the earth). And latent heat release is when something changes from gas to liquid, or from liquid to solid, and releases energy by doing so. All of those transfers of energy are classified as heating. Heat is one of two ways of transferring energy to a system, work being the other.

The earth absorbs radiation from the sun. That’s how the sun heats the earth.
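
To put a rough number on latent heat release, here is a minimal Python sketch. It uses a commonly quoted approximate value for water's latent heat of vaporization (about 2.26 x 10^6 J/kg); the mass is made up purely for illustration.

```python
# Rough estimate of the energy released when water vapor condenses into liquid.
# The latent heat value is an approximate, commonly quoted figure.

L_VAPORIZATION = 2.26e6  # J/kg, approximate latent heat of vaporization of water

def latent_heat_released(mass_kg: float) -> float:
    """Energy (in joules) released when mass_kg of water vapor condenses."""
    return mass_kg * L_VAPORIZATION

# Condensing one gram of water vapor releases roughly 2.3 kJ of heat.
print(f"{latent_heat_released(0.001):.0f} J released by condensing 1 g of vapor")
```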

Temperature, meanwhile, is a measure of the average kinetic energy of the random motion of the particles in a system. It is a property very familiar to us and can be measured with a thermometer. Temperature can be reported in the Fahrenheit (F), Celsius (C), or Kelvin (K) scales, but in the context of the 2nd Law of TD, it must always be reported in kelvins. As a result, it will always be a positive number.

In the 2nd Law of TD, the temperature must be reported in the Kelvin scale
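
Since the 2nd Law of TD needs temperatures in kelvins, here is a small Python sketch of the standard conversions from Celsius and Fahrenheit (K = °C + 273.15).

```python
# Convert Celsius and Fahrenheit readings to Kelvin, the scale required by
# the Second Law (which is why T is always a positive number).

def celsius_to_kelvin(t_c: float) -> float:
    return t_c + 273.15

def fahrenheit_to_kelvin(t_f: float) -> float:
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

print(celsius_to_kelvin(25.0))     # 298.15 K (room temperature)
print(fahrenheit_to_kelvin(32.0))  # 273.15 K (freezing point of water)
```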

Heat and temperature are related to each other: when you heat a system, its temperature usually (but not always) rises. Nevertheless, they are two different things: heat is a process, while temperature is a quantity.

II. Entropy: Microstates and Macrostates

Entropy is a measure of disorder. It is one of the strangest concepts in all of science, because it unites the molecular world and the macroscopic world. It’s also a remarkable concept, because it has always surprised me that there’s a way to quantify disorder. But there is, and it’s called entropy.

Although some students would prefer to just leave it at that, entropy does have a formal definition. It is defined by Boltzmann’s equation:

S = k ln W

Boltzmann’s Equation

where S is the entropy of a system, W is the number of microstates associated with the system’s current macrostate, ln is the natural log, and k is Boltzmann’s constant. Boltzmann’s constant is equal to 1.38 x 10^(-23) J/K. (J/K means joules per kelvin. That’s just the units that are associated with the constant.)

But as with the Laws of Thermodynamics themselves, there’s a lot to unpack in that one little equation, especially the concepts of microstates and macrostates.

Boltzmann’s equation and Boltzmann’s constant are both named after Ludwig Boltzmann (1844–1906). Boltzmann was the founding father of a branch of physics called statistical mechanics. And indeed, microstates and macrostates are concepts in statistical mechanics.

Ludwig Boltzmann (1844–1906)

I don’t want to delve too deeply into statistical mechanics, so I’ll try to keep this brief. The macrostate of any given system refers to the set of its macroscopic properties: the sort of things that you can directly measure and observe. For example, if I measured the temperature, pressure, volume, and density of the air in this room, then that would be its macrostate. But if I changed any of those properties — for example, if I cranked up the thermostat and raised the air temperature — then that would be a new macrostate.

Meanwhile, a microstate refers to the specific arrangement of all the molecules in a system. For any given macrostate, there are lots and lots of microstates that could produce that same macrostate. Just think of all the ways you could “scramble” all the air molecules in a room, all of which would yield the same macrostate (the same temperature, pressure etc.).

How many ways could you arrange all the molecules in a room? (Image from silcom.com)
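
To make microstates and macrostates concrete, here is a toy Python sketch (my own illustration): four coins, where the macrostate is just the total number of heads and a microstate is the exact head/tail pattern of each coin.

```python
import math

# Toy model: four coins, each either heads or tails. The "macrostate" is the
# total number of heads; a "microstate" is the exact head/tail pattern.
n_coins = 4
for heads in range(n_coins + 1):
    W = math.comb(n_coins, heads)  # number of distinct microstates for this macrostate
    print(f"macrostate: {heads} heads -> W = {W} microstates")

# The "2 heads" macrostate is realized by the most microstates (W = 6). The same
# counting idea, scaled up to ~10^23 molecules, is what gives a real gas its entropy.
```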

For a given system in a given macrostate, the number W refers to the number of distinct microstates that could produce that same macrostate. And Boltzmann’s equation defines the entropy of the system to be the natural log of W, multiplied by Boltzmann’s constant (k). As such, the entropy (S) is just a modified form of W. Entropy is an indirect measure of how many microstates could produce the system’s current macrostate.

For any macroscopic, real-world system, the number W will be astronomically large, probably well over 10^(10^10). That’s why we take its natural log and multiply it by Boltzmann’s constant, which is an extremely small number. (Multiplying by k also gives entropy its units of joules per kelvin.) After making these adjustments, the number S is small enough that we can work with it. Entropy (S) measures the same thing as W, but on a much more practical scale.
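
As a rough sketch of how the logarithm and Boltzmann’s constant tame these huge numbers, suppose (purely for illustration) a system of about 6 x 10^23 independent particles, each with two equally accessible states, so that W = 2^N. We never need to compute W itself, only ln W = N ln 2:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Toy system: N independent two-state particles, so W = 2**N. Forming W directly
# would overflow any float, but Boltzmann's equation only needs ln(W) = N * ln(2).
N = 6.022e23  # roughly a mole of particles
ln_W = N * math.log(2.0)

S = k_B * ln_W  # Boltzmann's equation, S = k ln W
print(f"ln W ~ {ln_W:.3e}   (so W is roughly 10 to the power {ln_W / math.log(10):.3e})")
print(f"S    ~ {S:.2f} J/K")  # about 5.76 J/K, a perfectly manageable number
```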

III. The Significance of Entropy

Again, entropy is an indirect measure of W, the number of microstates that could produce the system’s current macrostate. But why do we care about this quantity? Why does entropy matter?

The significance of entropy is that it tells us how close a system is to equilibrium, a state in which all opposing processes are in balance. A system that is closer to equilibrium has more entropy than a system that is farther from equilibrium.

This seesaw is in equilibrium — albeit an unstable one — because all forces are in balance. (Image from schoolphysics.co.uk)

When I first learned about this, I found it very counterintuitive. Equilibrium is when all things are in balance — that sounds like order. Meanwhile, entropy is a measure of disorder. But a system in equilibrium has the maximum possible amount of entropy. How does that make any sense?

The trick is to realize that the subject of statistical mechanics understands the ideas of “order” and “disorder” very differently than we do. In statistical mechanics, a system is considered to be more disordered when there are a larger number of microstates that could yield its current macrostate. And that number is highest when the system is in thermodynamic equilibrium. Thus, equilibrium is the state of maximum disorder. It’s weird, but it’s true.

As such, entropy is a measure of how close a system is to equilibrium. Whenever a system approaches equilibrium, its entropy increases, and whenever it strays farther away from equilibrium, its entropy decreases.

For example, let’s say I have a bathtub filled with water. Let’s say I drop a bucket of ice in one end and pour some boiling water into the other end. Now, one end of the bathtub is warmer and the other is cooler. But over the next few minutes, the warmth will be redistributed by circulation and diffusion, and all the water will settle back to a single, uniform temperature (roughly its original one).

Let’s say I filled this tub with water and then warmed one end and cooled the other end. After a few minutes, all the water would return to its original temperature

Over the course of that little experiment, the entropy started out high, plummeted downwards, and then returned to (roughly) its original value. Initially, the water in the tub was in equilibrium: maximum possible entropy. But when I added the ice and the boiling water, the temperature distribution was unbalanced. It was no longer in equilibrium and so the entropy was then much lower. But over the next few minutes, the water returned to its original, balanced state, equilibrium was restored, and the entropy returned to (roughly) its original value.

In my little experiment, the entropy plummeted downwards but gradually recovered
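
A rough calculation in the spirit of this experiment, assuming the water has a constant specific heat (about 4186 J/(kg K)) and that no heat escapes to the surroundings; the masses and temperatures are made up. As the tub equilibrates, the hot water loses entropy, the cold water gains more, and the total rises:

```python
import math

c_water = 4186.0  # J/(kg K), approximate specific heat of liquid water

def mixing_entropy_change(m_hot, T_hot, m_cold, T_cold):
    """Total entropy change (J/K) when hot and cold water mix and equilibrate.

    Assumes constant specific heat and no heat lost to the surroundings.
    """
    T_final = (m_hot * T_hot + m_cold * T_cold) / (m_hot + m_cold)
    dS_hot = m_hot * c_water * math.log(T_final / T_hot)     # negative: hot water cools
    dS_cold = m_cold * c_water * math.log(T_final / T_cold)  # positive: cold water warms
    return dS_hot + dS_cold

# 10 kg of near-boiling water mixed with 10 kg of near-freezing water:
print(f"{mixing_entropy_change(10.0, 370.0, 10.0, 275.0):+.1f} J/K")  # positive, as the 2nd Law demands
```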

There are two other important correlations regarding entropy. First, entropy is higher when the temperature is higher. For example, if I have two pots of water, one boiling and one at room temperature, then the boiling one has much more entropy than the room-temperature one. Second, entropy is an extensive property: it increases when you add mass. For example, a two-pound weight has more entropy than an otherwise identical one-pound weight, just because it has more mass.

The bigger weights have more entropy than the smaller weights, just because they have more mass. (Image from colourbox.com)
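
Both correlations show up in a quick sketch (same constant-specific-heat assumption as above): warming water raises its entropy, and doubling the mass doubles the entropy change.

```python
import math

c_water = 4186.0  # J/(kg K), approximate specific heat of liquid water

def heating_entropy_change(mass_kg, T_start, T_end):
    """Entropy gained (J/K) by warming water from T_start to T_end (kelvins)."""
    return mass_kg * c_water * math.log(T_end / T_start)

print(heating_entropy_change(1.0, 293.0, 373.0))  # ~1010 J/K for 1 kg heated to boiling
print(heating_entropy_change(2.0, 293.0, 373.0))  # twice as much for 2 kg: entropy is extensive
```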

IV. The Second Law of Thermodynamics

Having explained the meaning of entropy, we can finally return to the 2nd Law of TD. Recall that the 2nd Law of TD states that

δQ/T ≤ dS

where δQ is the heat added to the system, T is the ambient temperature, and dS is the change in the system’s entropy.

The Second Law of Thermodynamics states that for any system, over an infinitesimal interval of time (dt), the change in the system’s entropy can’t be any less than the heat that the system gained, divided by the system’s temperature. Thus, the 2nd Law of TD places a lower limit on the entropy change. It can’t be any lower than δQ/T.

As with the First Law, the reason the heat and entropy terms are written with d’s is that they are differential quantities. It is assumed that we were only observing the changes in the system over an infinitesimal interval of time, dt. But we could integrate the 2nd Law of TD over an actual, nonzero interval of time and then we could drop the d’s:

Q/T ≤ ΔS

The 2nd Law of TD over a nonzero interval of time, assuming constant temperature

where Q is the heat added and ΔS is the change in the system’s entropy. (But you can only do that if T is a constant.)
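
As a quick numerical sketch with made-up numbers: adding 1000 J of heat at a constant 300 K puts a floor of about 3.3 J/K on the entropy change.

```python
# Minimum entropy change allowed by the 2nd Law when heat Q is added
# at a constant temperature T:  delta_S >= Q / T.

def min_entropy_change(Q_joules: float, T_kelvin: float) -> float:
    return Q_joules / T_kelvin

# Example: 1000 J of heat added at 300 K.
print(f"dS must be at least {min_entropy_change(1000.0, 300.0):.2f} J/K")  # ~3.33 J/K
```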

Also like the First Law, the heating term is denoted δQ, rather than dQ, because heat is not a state function. A state function is a property of a system in the present, which you could measure without having to know about what was happening to the system beforehand. Entropy is a state function, but heat is not. That’s why we write the entropy change as dS but the heating term as δQ.

V. Reversible and Irreversible Processes

It may seem a bit odd that the 2nd Law of TD contains a “less than or equal to” symbol, ≤. Most laws in physics just have an equals sign, and a few of them might have greater-than or less-than signs, but ≤ is definitely unusual. However, it turns out that the 2nd Law of TD is really a combination of two statements: one for reversible processes and one for irreversible processes. For reversible processes, it’s an equals sign, and for irreversible processes, it’s a less-than sign.

In thermodynamics, a process that happens to a system is called reversible if you could, hypothetically, make it go backwards and recreate the system’s original state, while keeping the system in thermodynamic equilibrium the whole time. A process is called irreversible if it’s not possible to do that.

For a reversible process, we get the equality version of the 2nd Law of TD: δQ/T = dS. And for an irreversible process, we get the inequality version: δQ/T < dS. In the real world, no process can ever be truly reversible, although some of them can be pretty close. All real-world processes are irreversible. As such, it might make more sense to just write the 2nd Law of TD as an inequality: δQ/T < dS.
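
The standard textbook contrast is an ideal gas that doubles its volume at a constant temperature. Done reversibly (a slow isothermal expansion), the heat absorbed satisfies δQ/T = dS exactly; done irreversibly (a free expansion into a vacuum, with no heat exchanged at all), the entropy change is the same but δQ/T < dS. A Python sketch with illustrative numbers:

```python
import math

R = 8.314  # J/(mol K), ideal gas constant

n = 1.0        # moles of ideal gas
T = 300.0      # K, constant temperature
V_ratio = 2.0  # the gas doubles its volume

# Entropy is a state function, so the entropy change is the same either way:
delta_S = n * R * math.log(V_ratio)  # ~5.76 J/K

# Route 1: reversible isothermal expansion. The gas absorbs Q = n R T ln(V2/V1).
Q_reversible = n * R * T * math.log(V_ratio)
print(Q_reversible / T, "=", delta_S)  # equality: Q/T equals dS

# Route 2: free expansion into a vacuum. No heat is exchanged at all.
Q_free = 0.0
print(Q_free / T, "<", delta_S)        # strict inequality: Q/T < dS
```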

VI. The 2nd Law of TD for Isolated Systems

There is one very important corollary to the Second Law of Thermodynamics: if a system is isolated, then its entropy is always increasing. In fact, many textbooks will just state this as the Second Law itself, even though that’s not completely accurate.

A system is isolated if it has no interactions with any other systems. In particular, this means that the system cannot give or receive heat. Therefore, δQ = 0. Plugging that into the 2nd Law of TD gives us 0 ≤ dS, and since every real-world process is irreversible, dS is strictly positive. In an isolated system, the entropy is always increasing.

This system of particles is isolated. It does not interact with its environment. (Image from stackexchange.com)

The significance of this fact lies in two important facts about entropy that I had listed above: first, that it is a measure of disorder, and second, that it measures how close a system is to equilibrium. Since we know that the entropy of an isolated system is always increasing, that means that an isolated system is always becoming more disordered and is always approaching equilibrium. In other words, if you leave something alone, it will tend towards equilibrium, and by doing so it will become more disordered.

(Again, it may seem strange that “disordered” means the same thing as “close to equilibrium”. It has to do with the way statistical mechanics understands the concepts of “order” and “disorder”.)

When it comes to the tendency towards equilibrium, we can see this in countless instances in the real world: if you leave something alone, it ultimately approaches a balanced state. As just a few examples:

If you build a sandcastle, it will ultimately be flattened by water and wind.

If you put a drop of red food coloring in a pot of hot water, then within a minute or two, all the water will be pink.

If you leave a cup of hot coffee in the kitchen, the coffee will ultimately reach room temperature (see the sketch below).

If you’re filling a bathtub with water, and you turn up the hot water, then at first the front end of the tub will be warmer than the back, but by the time you’re done with your bath, all the bathwater will be at the same temperature.

If you build a sandcastle and then leave it alone, it will ultimately be flattened by water and wind

The moral of the story is that you can create an imbalance in a system, but as soon as you leave the system alone, it will ultimately re-establish balance. When left alone, a system’s entropy rises and it approaches equilibrium. That is the takeaway message from the Second Law of Thermodynamics.
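
To make the coffee example above concrete: the coffee by itself is not isolated, so its own entropy can fall, but the coffee plus the kitchen together behave roughly like an isolated system, and their combined entropy rises. Here is a rough sketch, treating the coffee as water with constant specific heat and the kitchen as so large that its temperature never changes; the numbers are made up.

```python
import math

c_water = 4186.0  # J/(kg K), approximate specific heat (coffee treated as water)

m = 0.3           # kg of coffee
T_coffee = 360.0  # K, initial coffee temperature (~87 C)
T_room = 295.0    # K, room temperature, assumed constant (the room is huge)

# Heat the coffee gives up while cooling down to room temperature:
Q = m * c_water * (T_coffee - T_room)

dS_coffee = m * c_water * math.log(T_room / T_coffee)  # negative: the coffee loses entropy
dS_room = Q / T_room                                   # positive: the room absorbs Q at T_room

print(f"coffee: {dS_coffee:+.1f} J/K, room: {dS_room:+.1f} J/K, "
      f"total: {dS_coffee + dS_room:+.1f} J/K")  # the total is positive
```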

VII. As Applied to the Whole Universe

In particular, this message applies to the biggest system of all: the entire universe. The universe is, by definition, an isolated system. And that means that the entropy of the entire universe is always rising. That means that the universe always tends towards disorder. But it also means that the universe is approaching equilibrium. In the very, very long term, we would expect that the universe would reach a state of perfect equilibrium. But who knows? We won’t be around by then, anyway.

The entropy of the entire universe is always increasing.

VIII. Other Applications

Entropy is also involved in two other important quantities in thermodynamics: the Helmholtz Free Energy (F) and the Gibbs Free Energy (G). These are defined by F = U − TS and G = H − TS, where U is internal energy, H is enthalpy, and enthalpy is defined by H = U + pV. The significance of the Gibbs Free Energy is that it dictates the spontaneity of a process: at constant temperature and pressure, a process can happen spontaneously if and only if ΔG < 0.
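
As a quick illustration of the ΔG < 0 criterion, consider ice melting, using commonly quoted approximate values (ΔH of fusion about +6.01 kJ/mol, ΔS of fusion about +22.0 J/(mol K), treated as constant over this range): ΔG changes sign right around 273 K, so melting is spontaneous above the freezing point and not below it.

```python
# Spontaneity of ice melting from delta_G = delta_H - T * delta_S.
# The enthalpy and entropy of fusion are approximate textbook figures,
# treated as constant over this temperature range.

dH = 6010.0  # J/mol, enthalpy of fusion of ice (approximate)
dS = 22.0    # J/(mol K), entropy of fusion of ice (approximate)

for T in (263.0, 273.0, 298.0):  # -10 C, ~0 C, 25 C
    dG = dH - T * dS
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:5.1f} K: dG = {dG:+7.1f} J/mol -> melting is {verdict}")
```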

IX. See Also

The First Law of Thermodynamics

Images are copyright-free, from Wikipedia and the sources noted in the captions. Formulas cited by the author.
