Mathematically, Too Much Data Kills Productivity

Natu Myers
Published in Free Startup Kits
Feb 26, 2018 · 5 min read

The idea of entropy in information theory is analogous to one of our biggest challenges in learning during the internet era.

Information Theory is…

The mathematical study of the coding of information in the form of sequences of symbols, impulses, etc., and of how rapidly such information can be transmitted, e.g., through computer circuits or telecommunications channels.

The concept of information entropy was introduced by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication.”

Shannon’s entropy formula, H(X) = -Σ p(x) log2 p(x), answers a simple question: what is the minimum number of storage “bits” needed to capture a certain piece of information? (A small worked example follows below.)
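Here is a minimal Python sketch of that idea (my own illustration, not Shannon’s notation; the function name shannon_entropy is just a convenience). It estimates the entropy of a message from the empirical frequencies of its symbols: a perfectly ordered message needs almost no bits per symbol, while a varied one needs many.

import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Estimate the Shannon entropy (bits per symbol) of a message
    from the empirical frequency of its characters."""
    counts = Counter(message)
    total = len(message)
    entropy = 0.0
    for count in counts.values():
        p = count / total            # empirical probability of this symbol
        entropy -= p * math.log2(p)  # each symbol contributes -p * log2(p)
    return entropy

print(shannon_entropy("aaaaaaaaaaaaaaaa"))  # 0.0 bits/symbol: totally predictable
print(shannon_entropy("abcdefghijklmnop"))  # 4.0 bits/symbol: 16 equally likely symbols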

Information, Entropy, Uncertainty

“Entropy” is a central concept in information theory. Entropy is a measure of your overall uncertainty about what has happened, or about what is going to happen.

Some elaborations:

You can think of entropy as the wasted energy in a system caused by chaotic or scattered disorganization. — http://qr.ae/TbSsvR

Entropy is the tax you pay to nature for converting one form of energy to another. — http://qr.ae/TbSsn0

Entropy is randomness, complexity, disorder, uncertainty, information, data, and chaos.

Too Much Information = Noise

Ironically (and to be frank), more order means less information, and more chaos means more information. Too much information confuses us and takes energy to deconstruct!

Our brains cannot make sense of too much noise because it is literally an overflow of information: there is too much entropy. In audio or visual processing, noise, or “white noise”, is a random signal with roughly equal intensity across frequencies. That randomness makes it uncertain, and it takes a lot of “energy” to decipher any meaning from it (from the chaos).

In a simple sound wave, our brain can pick out the pitch because the wave can be modelled cleanly by a function such as y = sin(x). A pure tone (say, a single B note) might look like this:

Low Entropy/Randomness/Information

A noise wave is unpredictable. It behaves like y = Random(x), where “Random” is a highly chaotic (even if technically deterministic) way of generating values, so there is no simple pattern to model.

High Entropy/Randomness/Information (visually, the same principles apply to white noise)
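To make the tone-versus-noise contrast concrete, here is a small NumPy sketch (my own illustration; spectral_entropy is a name I made up, and 494 Hz is roughly a B note). It measures how spread out a signal’s power spectrum is: a pure tone concentrates its power in one frequency bin, while noise spreads it across thousands.

import numpy as np

def spectral_entropy(signal: np.ndarray) -> float:
    """Entropy (in bits) of the signal's normalized power spectrum."""
    power = np.abs(np.fft.rfft(signal)) ** 2   # power at each frequency bin
    p = power / power.sum()                    # treat the spectrum as a distribution
    p = p[p > 0]                               # ignore empty bins
    return float(-(p * np.log2(p)).sum())

t = np.linspace(0, 1, 8_000, endpoint=False)
tone = np.sin(2 * np.pi * 494 * t)             # a single B note (~494 Hz)
noise = np.random.uniform(-1, 1, t.size)       # white-ish noise

print(spectral_entropy(tone))    # close to 0 bits: nearly all power sits in one bin
print(spectral_entropy(noise))   # far higher: power is spread across thousands of bins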

Thermodynamics

Entropy has implications in thermodynamics as well, in a different context. Solids have low entropy: their form is consistent. Liquids have more entropy, and gases have the highest entropy of all, because their particles are constantly rearranging.

The second law of thermodynamics can be paraphrased as “In any closed system, the entropy of the system will either remain constant or increase.”

Some argue that the second law of thermodynamics means that a system can never become more orderly. Not true. It just means that in order to become more orderly (for entropy to decrease), you must transfer energy from somewhere outside the system… — https://socratic.org/questions/what-is-an-example-of-entropy-from-everyday-life

Even thermodynamics’ definition of entropy, energy, or randomness is linked to information theory’s idea of excess information, and to how it takes more energy to move a system from a state of randomness to a state of order. Even the third law states that “The entropy of a perfect crystal at absolute zero is exactly equal to zero.”

Less information, less entropy, less uncertainty

Noam Chomsky created a hierarchy of all the possible types of grammars that can be generated (whether by animals or machines). The more complicated the language, the more computational power it takes to turn the language into meaning.
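As a hypothetical illustration (mine, not the post’s): a regular pattern can be recognized with a fixed regular expression, i.e. a finite amount of state, while even the simple context-free pattern “n a’s followed by exactly n b’s” already needs an unbounded counter standing in for a stack.

import re

def matches_regular(s: str) -> bool:
    """A regular language ("some a's, then some b's") needs only a finite automaton."""
    return re.fullmatch(r"a*b*", s) is not None

def matches_anbn(s: str) -> bool:
    """A context-free language (n a's followed by exactly n b's) needs unbounded
    memory; here a counter stands in for a pushdown stack."""
    count = 0
    i = 0
    while i < len(s) and s[i] == "a":
        count += 1
        i += 1
    while i < len(s) and s[i] == "b":
        count -= 1
        i += 1
    return i == len(s) and count == 0

print(matches_regular("aaabb"))  # True: it fits a*b*
print(matches_anbn("aaabb"))     # False: three a's but only two b's
print(matches_anbn("aaabbb"))    # True: the counts match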

Remove the Entropy in Your Life

Focus on your goals. There are natural processes that limit unnecessary information in your life, but since the advent of the internet, the analogy with information theory has never been more practical.

It would take roughly 23 million years to browse the entire internet, and that is not counting the deep web (roughly 90% of the internet) or video (which adds millennia more on its own).

Well over 295 exabytes of information has been stored on the internet. Call it 300 exabytes, which is 300,000,000,000,000 megabytes. The average webpage is about 2 megabytes, so 300,000,000,000,000 MB / 2 MB per page = 150,000,000,000,000 webpages. If you can read a webpage in 5 seconds, that works out to 150,000,000,000,000 pages * 5 seconds per page = 750,000,000,000,000 seconds, or roughly 23 million years.
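The same back-of-the-envelope arithmetic as a tiny script (the 300-exabyte, 2 MB, and 5-second figures are the post’s rough estimates, not measurements):

total_storage_mb = 300e18 / 1e6      # ~300 exabytes expressed in megabytes
avg_page_mb = 2                      # assumed average webpage size
seconds_per_page = 5                 # assumed skim-reading speed

pages = total_storage_mb / avg_page_mb
seconds = pages * seconds_per_page
years = seconds / (60 * 60 * 24 * 365)

print(f"{pages:,.0f} pages")                 # 150,000,000,000,000 pages
print(f"{years / 1e6:.0f} million years")    # about 24 million years (the post rounds to 23)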

Research Things With A Plan

If information is all we need, we would all be billionaires with six packs. The reason why so many people cannot make it work has nothing to do with information. — Matt Kay

It takes energy to mine useful meaning out of the excessive data we have access to.

The “plan” we make to achieve our goals should be our “grammar” or “blueprint” for extracting meaningful data from the giant miasma of information available to us.

When setting out to learn something new, we must make sure that what we’re learning really is among the best possible things to help us accomplish our goals.

The biggest threat and impediment to taking advantage of all the information we have is failing to scour the web with intent: getting distracted and drifting, unintentionally, away from our goals.

If we fail to plan, we plan to fail.

Get my eBook and the notes I got from Silicon Valley sent to you:

I’m freely giving out over 50 documents of notes, from Silicon Valley CEOs to MBAs, on how to start a startup, PLUS my eBook “Football Coding & Startups”.

Author: Natu Myers

(Website: Natumyers.com) (eBook: NatuBook.com)

