Information Theory and Sciences

Gani Çalışkan
Turk Telekom Bulut Teknolojileri
Aug 22, 2023

Introduction to Information Theory

Information Theory is a branch of applied mathematics and electrical engineering that deals with the quantification, storage, and communication of information. It was developed by Claude Shannon in the late 1940s and early 1950s.

Who is Claude Shannon?

He was an American mathematician, electrical engineer, and cryptographer, widely regarded as the “father of information theory”. He made foundational contributions to fields connected to information theory, including artificial intelligence and digital circuit design. His most notable work is “A Mathematical Theory of Communication”, published in 1948.

A Mathematical Theory of Communication

Shannon’s article includes a now-famous diagram of a general communication system and laid out its basic elements (a minimal simulation of this pipeline follows the list):

  • An information source that produces a message
  • A transmitter that operates on the message to create a signal which can be sent through a channel
  • A channel, which is the medium over which the signal, carrying the information that composes the message, is sent
  • A receiver, which transforms the signal back into the message intended for delivery
  • A destination, which can be a person or a machine, for whom or which the message is intended
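
To make the pipeline concrete, here is a minimal Python sketch of these five elements. All function names are illustrative, and the channel is modeled as a binary symmetric channel that flips each bit with a small probability, which is one common assumption rather than anything prescribed by Shannon’s diagram:

import random

def transmitter(message):
    # Encode each character of the message as 8 bits (the "signal").
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal, flip_prob=0.01):
    # Binary symmetric channel: each bit is flipped with probability flip_prob.
    return [bit ^ 1 if random.random() < flip_prob else bit for bit in signal]

def receiver(signal):
    # Decode 8-bit groups back into characters.
    return "".join(
        chr(int("".join(map(str, signal[i:i + 8])), 2))
        for i in range(0, len(signal), 8)
    )

message = "hello, world"                 # the information source produces a message
received = receiver(channel(transmitter(message)))
print(received)                          # the destination; noise may corrupt characters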

The Entropy Formula

Information entropy is a concept from information theory that measures the amount of uncertainty or randomness in a random variable: how much surprise is associated with its possible outcomes. For a discrete random variable X whose outcomes x occur with probabilities P(x), the entropy in bits is:

H(X) = -Σ P(x) * log2(P(x))

where the sum runs over all possible outcomes.
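
As a sketch, the formula translates directly into a few lines of Python using only the standard library:

import math

def entropy(probabilities):
    # H(X) = -sum of p * log2(p); zero-probability outcomes contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: about 0.47 bits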

Example Question

What is the information entropy of a coin flip?

H(X) = -(P(heads) * log2(P(heads))) - (P(tails) * log2(P(tails)))

Since both P(heads) and P(tails) are 0.5 (assuming a fair coin), the calculation simplifies to:

H(X) = -(0.5 * log2(0.5)) - (0.5 * log2(0.5)) = 1 bit

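To summarize the outcomes of repeated coin flips, the short simulation below (a sketch, assuming a fair coin and Python’s standard library) estimates the entropy from observed frequencies; the estimate converges toward the 1-bit theoretical value:

import math
import random
from collections import Counter

flips = [random.choice(["heads", "tails"]) for _ in range(10_000)]
counts = Counter(flips)
probabilities = [count / len(flips) for count in counts.values()]

# Empirical entropy computed from observed frequencies.
estimate = -sum(p * math.log2(p) for p in probabilities)
print(counts)    # roughly 5,000 heads and 5,000 tails
print(estimate)  # close to 1.0 bit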
