Image Credit: Simon Robertson / Flickr Creative Commons

Don’t believe the dictionary: entropy is information, not energy

Domino Valdano
Physics as a Foreign Language

--

If you look up the definition of the word entropy in the dictionary, there's a good chance you'll see it defined as a type of energy. Well, please don't believe everything you read, even if it's printed in an official-looking dictionary! The main point I'd like to make in this post is:

Entropy is not a type of energy, it’s a type of information.

I’m not trying to make a controversial statement here, just sharing a view which I think most physicists are plenty aware of: dictionaries often get physics words horribly wrong. Some physicists may prefer to define entropy in other equivalent ways, avoiding the specific term “information”, but I doubt there is anyone who actually works with entropy in physics today who thinks of it as a type of energy.

I have no problem with Wikipedia's definition of entropy, or with some dictionary definitions, but sadly most dictionaries get it wrong. As an example of one of the worst, Merriam-Webster's definition starts with:

a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system’s disorder

This definition is incorrect. In fact, I think it's possibly the worst dictionary definition of any word I've ever run across. There are two big problems with it:

First, entropy is a quantity which is well-defined for all systems (open or closed), so including the adjective “closed” in there is unnecessary and contributes to the widespread internet myth that the laws of thermodynamics apply only to closed systems rather than to all systems. Investigating the thermodynamic properties of open systems, including their entropy, is a huge area of active research in science. In fact, just about all of chemistry and biology deal with open systems. So giving people a dictionary definition which excludes them from consideration is a huge oversight.

But second and more importantly, entropy and unavailable energy are two different concepts. At best, there is a connection between the two which holds in some rare idealized cases. The phrase “unavailable energy” is actually ambiguous and could refer to several different things, but none of those are the same as entropy.

Every measurable physical quantity has units attached to it. So one necessary condition for two physical quantities being the same is that they have (or at least can be expressed in) the same units. For example, energy can be measured in units of joules, calories, ergs, kilowatt-hours, or electron-volts. All of these are equivalent in that the difference between them is only a conversion factor, similar to the conversion factor between miles and kilometers or gallons and liters. For example, 1 Calorie = 4184 Joules, and 1 Joule = 10 million ergs. If the dictionary told you that volume was the same thing as energy, then you'd know immediately it was lying, since volume is measured in units of cups, gallons, liters, cubic-centimeters, etc. and cannot be measured in joules or calories.
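To make the conversion-factor point concrete, here's a minimal Python sketch. The constants are the standard conversion factors quoted above; the starting value of 2.5 Calories is just an arbitrary example.

```python
# Standard conversion factors between energy units:
JOULES_PER_CALORIE = 4184.0   # 1 (food) Calorie = 4184 Joules
ERGS_PER_JOULE = 1.0e7        # 1 Joule = 10 million ergs

energy_cal = 2.5                             # an arbitrary energy, in Calories
energy_j = energy_cal * JOULES_PER_CALORIE   # the same energy, in Joules
energy_erg = energy_j * ERGS_PER_JOULE       # the same energy, in ergs

print(f"{energy_cal} Cal = {energy_j} J = {energy_erg:.3e} erg")
# Converting units never changes the quantity itself, only the number in front.
```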

What units is entropy measured in? The most natural units are bits (the basic unit you measure any kind of information in), but for convenience and historical reasons it's more common to measure it in Joules/Kelvin. These are equivalent units, as 1 Joule/Kelvin = 1.045×10²³ bits = 13 billion Terabytes = 13 Zettabytes. None of these is equivalent to joules, calories, or any of the units that energy can be measured in. Like volume, it is simply not the same type of thing as energy.
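Where does that strange-looking conversion factor come from? From Boltzmann's constant: one bit of entropy corresponds to k_B ln 2 Joules/Kelvin. A short Python sketch reproduces the numbers above:

```python
import math

K_B = 1.380649e-23                # Boltzmann constant, in Joules/Kelvin

jk_per_bit = K_B * math.log(2)    # one bit of entropy, expressed in J/K
bits_per_jk = 1.0 / jk_per_bit    # so 1 J/K is this many bits
zettabytes_per_jk = bits_per_jk / 8 / 1e21   # 8 bits/byte, 10^21 bytes/ZB

print(f"1 J/K = {bits_per_jk:.4e} bits = {zettabytes_per_jk:.1f} Zettabytes")
# Prints: 1 J/K = 1.0449e+23 bits = 13.1 Zettabytes
```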

When measured in Joules/Kelvin, there is a connection between energy and entropy. I usually avoid equations in this blog, but I’m going to write down a few to illustrate the connection between the concepts here. If we represent energy as E, temperature as T, and entropy as S, then there is another property of physical systems which physicists like to define called the “Helmholtz free energy”, often denoted by F:

F = E - TS

Like E, F is also measured in units of energy (joules, calories, etc.). That's why the two sides of this equation can be set equal at all: the product TS also carries units of energy, since a temperature in Kelvin times an entropy in Joules/Kelvin gives Joules. For certain types of systems (those held at constant temperature and volume), F represents the available energy of a system, the "free energy". However, for other types of systems (those held at constant temperature and pressure), F is not the available energy; the available energy is instead given by the Gibbs free energy, usually denoted by G:

G = E - TS + PV
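As a small illustration, here are both definitions in Python, with made-up numbers rather than any particular physical system. The point to notice is that every term (E, TS, and PV) carries units of energy, so they can be added and subtracted:

```python
def helmholtz_energy(E, T, S):
    """F = E - T*S, with E in Joules, T in Kelvin, S in Joules/Kelvin."""
    return E - T * S

def gibbs_energy(E, T, S, P, V):
    """G = E - T*S + P*V; P in Pascals times V in cubic meters gives Joules."""
    return E - T * S + P * V

# Made-up illustrative values:
E, T, S = 1000.0, 300.0, 2.0      # Joules, Kelvin, Joules/Kelvin
P, V = 101325.0, 0.001            # 1 atmosphere in Pascals, 1 liter in m^3

print(helmholtz_energy(E, T, S))     # 1000 - 600 = 400 Joules
print(gibbs_energy(E, T, S, P, V))   # 400 + 101.325 = 501.325 Joules
```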

Because neither G nor F represents the "free energy" of a system in all situations, a lot of scientific organizations now recommend dropping the prefix "free" and just calling them the Helmholtz energy and the Gibbs energy. This avoids anyone making the mistake of thinking that either of them refers to the one true "available energy" of a system, which isn't really a well-defined concept without specifying what type of system it is and how you plan on extracting energy from it.

At least with the Helmholtz energy, there is a reasonable interpretation of the second term on the right (TS = temperature × entropy) as the "unavailable energy" of a constant-temperature, constant-volume system. This is energy which you cannot extract as useful work without changing the temperature or volume of the system. But in just about any chemical or biological experiment, it's actually pressure that's held fixed rather than volume, so this interpretation doesn't even work for many very simple cases.

Beyond that, there are other situations where neither the temperature nor the volume of a system is held fixed. In those cases, neither the Helmholtz energy nor the Gibbs energy represents a free energy, even though the entropy of such systems is still well-defined. When the temperature is changing, this is called "non-equilibrium thermodynamics" and represents another large class of important physical systems and a large area of research.

Even in the cases where there is a direct connection between entropy and unavailable energy, it's TS rather than S itself that is the unavailable energy. But if the temperature is held fixed, the two are at least proportional to each other, so you can use one as a proxy for the other in this case (as the sketch below shows). Similarly, there are special situations where the volume of a system can be used as a proxy for its energy. But as far as I know, nobody has ever taken this as a license to define volume as energy or energy as volume. The same can be said for just about any two physical quantities you can think of: they are sometimes directly related, and sometimes not.
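Here is that proxy relationship in a couple of lines of Python; the temperature is an arbitrary fixed value:

```python
T = 300.0                      # a fixed temperature, in Kelvin

for S in (1.0, 2.0, 4.0):      # entropy values in Joules/Kelvin
    print(S, T * S)            # the "unavailable energy" TS doubles with S
# At fixed T, TS is just a rescaled copy of S, so either can stand in for
# the other. But that no more makes S an energy than it makes volume one.
```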

Why do dictionaries get this one so wrong? They're actually repeating a widespread myth that shows up in a lot of places in science articles and elsewhere on the internet. There was even a series of articles that came out in 2010 claiming physicists had demonstrated you could convert information into energy. The world is full of information; it's everywhere around us. So obviously if this were true, we would have an unlimited energy supply. Forget worrying about coal, oil, solar, or nuclear power: just tap pure information! Unfortunately, it's every bit as false as the Merriam-Webster definition, and based on the same myth. What's true is that you can convert information into entropy and back, because entropy is just one of many forms of information. Similarly, you can convert matter into energy and back, since matter is just one of many forms of energy. But matter and energy are very different from information and entropy. They represent different categories of thing, and that's why they are measured in different units.

How did this myth get started? Its origins can be traced back to a man named Peter Tait:

Peter Guthrie Tait

Peter Tait was a 19th century Scottish physicist, and a friend of two more widely known Scottish physicists of the late 19th century: James Clerk Maxwell and William Thomson (aka Lord Kelvin).

The term entropy was coined by the German physicist Rudolf Clausius in 1865. Clausius didn't know at the time that entropy represented the hidden information of a system; he just knew that he had found a particular combination of thermodynamic variables that seemed to play an important role in physics. So he labelled this quantity by the symbol S and named it entropy. In some of his earlier papers, he had referred to it as "transformation content" (because he discovered it while analyzing the capacity for heat engines to transform heat into work). So he took the Greek word trope, meaning "transformation", and added the prefix en- and the suffix -y to make it sound similar to energy. He saw the two as analogous because they were both conserved quantities in reversible physical processes such as the Carnot cycle of a heat engine.

Rudolf Clausius, who coined the term “entropy”

Tait wrote a more popular but less rigorous account of Clausius's work, in which he explained Clausius's entropy as "available energy". This annoyed Clausius, who wrote to Tait explaining that Tait was misrepresenting his work. Clausius may not have known exactly what entropy was, but he knew it wasn't a type of energy.

Clausius suggested that if anything, entropy seemed to have more to do with how much the molecules in a gas were spread out than how much energy they had. But overall, little was known about microscopic physics in the 19th century, so he knew that it was best to leave entropy defined only as a specific mathematical state function rather than to try to interpret its true meaning further. Tait was unconvinced that there was anything wrong with his idea of entropy as available energy, so he kept on spreading that around.

At some point, Maxwell made a similar mistake, writing in one of his books that entropy was "unavailable energy". But several years later, after reading the work of Josiah Willard Gibbs, and after some urging from Clausius, he publicly recanted and removed the reference to entropy as unavailable energy in an updated version of the same book. He blamed Tait for confusing him on this point.

To get a better visual picture of how clearly different the concepts of entropy and unavailable energy are, here is a diagram Gibbs drew illustrating the relationship:

Diagram by Josiah Willard Gibbs — 1873

The vertical axis measures energy, while the horizontal axis measures entropy. The available energy in this diagram, known more commonly now as the Gibbs energy, is the line segment AB. Obviously, the relationship between energy and entropy is complex, but entropy represents neither available nor unavailable energy. This was clear to Clausius, it was clear to Gibbs, and after reading Gibbs it became clear to Maxwell. Tait may have convinced his friend William Thomson, and he may have succeeded in getting his definition into the dictionary, but the way the word is used by actual physicists today is consistent with its original definition by Clausius (correctly understood by Gibbs and eventually Maxwell), not with Tait's distortion of it. Clausius was right to resist interpreting it further, as a full interpretation of what entropy is on the microscopic level required Shannon's information theory of the mid-20th century to complete.

150 years later, I think it's about time for Tait's misinterpretation of entropy to die. Entropy is a measure of the amount of hidden or missing information contained in a system, not a measure of the amount of available or unavailable energy. If you're reading this, Merriam-Webster, please consider updating your definition.

If you enjoyed reading this, please click the 💚 to recommend it to other Medium readers. Thanks!
