Uncertainty: Entropy

Joshgun Guliyev
Mar 3 · 4 min read

Have you ever thought about how certain you are about the events happening around you or the decisions you make? If you have, let’s talk about it in more detail.

First, let’s start with “what is uncertainty?”. According to Wikipedia, “Uncertainty refers to epistemic situations involving imperfect or unknown information. It applies to predictions of future events, to physical measurements that are already made, or to the unknown.”

Basically, it is a measure of how sure you are about something. For example, suppose you know that there are 3 red balls in a box, and someone asks you to pick a ball and say its colour without looking at it. How sure will you be about your answer? You will say “red” with 100 percent certainty. Your uncertainty is at its minimum in this state, because you have complete knowledge about the box. Now, if there are 2 red and 2 green balls and the same question is asked, will you be as sure as before? Absolutely not: there are two equally likely outcomes, and in this scenario you are completely unsure about the result. That is what uncertainty is.

In information theory, there is a term called “Shannon entropy”, which measures exactly the uncertainty we are talking about. Let’s recall the previous scenarios and think about them mathematically.
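The Shannon entropy of a discrete random variable X with possible outcomes x₁, …, xₙ and probabilities p(xᵢ) is

H(X) = −Σᵢ p(xᵢ) · log₂ p(xᵢ),

measured in bits. For the box with only red balls, p(red) = 1, so H = −1 · log₂ 1 = 0. For the box with 2 red and 2 green balls, p(red) = p(green) = 0.5, so H = −(0.5 · log₂ 0.5 + 0.5 · log₂ 0.5) = 1 bit.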

We can see that higher entropy means we are more uncertain about something, while low entropy means we are more sure.

Alternatively, we can think of entropy as the average number of yes/no questions we need to ask, in the smartest possible way, to find out the random outcome. For example, in the first scenario, how many questions do you need to ask to find the colour of the ball? Zero, because you already know it is red, and the entropy is zero.

In the second scenario, if someone picks a ball, you will need 1 question to find its colour. For instance, you can ask “is it red?”; if yes, it is red, otherwise it is green. What if there are 2 red balls, 1 green and 1 blue? You can ask “is it red?”; if yes, it is red; otherwise ask “is it green?”; if yes, it is green, otherwise it is blue. So you ask 1 question half of the time (when the ball is red) and 2 questions the other half, which is 0.5 × 1 + 0.5 × 2 = 1.5 questions on average. Here the entropy will be 1.5 bits. Let’s check it with the formula:
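A minimal Python sketch (the helper function and hard-coded probabilities are just for illustration) confirms this:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Box with 2 red, 1 green and 1 blue ball: p(red) = 0.5, p(green) = p(blue) = 0.25
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits, matching the average number of questions
```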

Let’s say we have a fair coin with two possible outcomes, Head and Tail, and a box containing 3 red balls, 1 green, 1 blue and 1 yellow ball. The probability of Head (or Tail) is 0.5, and the probability of picking a red ball is also 0.5. Since they have the same probability, it looks like we should be equally sure of Tail and of a red ball, because there is half a chance of occurrence for both of them. But let’s look at the entropies of both situations.
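Working them out with the formula above (the box holds 6 balls, so p(red) = 0.5 and p(green) = p(blue) = p(yellow) = 1/6):

H(coin) = −(0.5 · log₂ 0.5 + 0.5 · log₂ 0.5) = 1 bit

H(box) = −(0.5 · log₂ 0.5 + 3 × (1/6) · log₂ (1/6)) ≈ 1.79 bits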

This shows that even though the probabilities are the same, we cannot be equally sure when there are more options. We are more certain about the outcome of the coin here, because its entropy is smaller than the box’s.

Entropy is a widely used measure in Machine Learning, especially in classification algorithms like Decision Trees, and I wanted to give a brief and easy explanation of “uncertainty” and its measure, “entropy”.
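As a rough illustration of how Decision Trees use it (the sample counts below are made up purely for this example), a candidate split is scored by how much it reduces entropy, which is called information gain:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical node with 10 samples (6 "yes", 4 "no") split into two branches.
parent = entropy([6/10, 4/10])
left   = entropy([5/6, 1/6])   # left branch gets 6 samples
right  = entropy([1/4, 3/4])   # right branch gets 4 samples

# Information gain = parent entropy minus the weighted entropy of the branches.
gain = parent - (6/10 * left + 4/10 * right)
print(round(gain, 3))  # ≈ 0.256 bits of uncertainty removed by the split
```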

Thanks for Reading!!!

