Entropy: How To Actually Measure Uncertainty

The science behind zip compression and password protection!

Hemanth
Street Science

--

Figure: A stick figure flips a coin and wonders "Heads or tails?"; below the thought bubble, the word "Entropy" appears in a highlighted box. Illustrative art created by the author.

In information theory, entropy is a measure of uncertainty. It lies at the heart of some of the most fundamental concepts in computer science, such as the unit of information (the bit), as well as advanced applications such as data compression (zip files) and password protection.

Since the concept of information goes beyond the world of computers, I often find myself applying my knowledge of entropy to a wealth of real-life applications. So, I thought it would be a worthy topic to cover in its own series.

In this essay, we will begin by understanding the notion of entropy with the help of simple examples. For this purpose, the only prerequisite is that you are familiar with fundamental probability theory concepts. Without any further ado, let us begin.

Flipping Coins and Counting Heads

Let us say that I flip a fair coin and ask you to guess the outcome. Although the outcome is “uncertain”, you and I can be “certain” that there are only two possible outcomes: Heads (H) or Tails (T).

In the world of computers, the outcome (H or T) can be represented by a single binary digit (known as a bit). In other words, one bit captures the level of uncertainty involved in a single fair coin flip.
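
To make this concrete, here is a minimal Python sketch (not part of the original essay) that computes the Shannon entropy of a coin flip using the standard formula H = -Σ p·log2(p). For two equally likely outcomes, it evaluates to exactly 1 bit; for a biased coin, the uncertainty is lower.

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin: two equally likely outcomes, Heads and Tails.
fair_coin = [0.5, 0.5]
print(shannon_entropy(fair_coin))    # 1.0 -> one bit of uncertainty

# For comparison, a heavily biased coin is much less uncertain.
biased_coin = [0.9, 0.1]
print(shannon_entropy(biased_coin))  # ~0.47 bits
```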
