🦉 10x curiosity — Issue #58 — Probabilistic Thinking — Bayes Theorem

Tom Connor · Published in 10x Curiosity · 5 min read · May 2, 2018

Thinking….

Bayesian thinking is a way of looking at the world in a probabilistic fashion, accepting that nearly everything comes in shades of grey rather than 100% black or white. We intuitively use Bayesian reasoning all the time. Say you’re hungry and looking through the pantry for a snack — yum, there is that chocolate cake from a couple of days ago… or was it a week ago? No matter, chocolate cake doesn’t go off, you think to yourself. But maybe there is a small chance it isn’t quite OK to eat, say 10% odds that it is off, so you have a good look at it — surely that couldn’t be a bit of mould on it? The odds that it is not edible have now increased slightly, to maybe a 20% chance. You give it a sniff. Mmmm, a bit sour maybe? You break a piece open with a fork — is that a maggot crawling out?! All of a sudden your initial confidence that the cake is edible has evaporated, swinging completely the other way with each piece of disconfirming evidence. You won’t be eating the cake.

This is Bayesian thinking, and it is all about using the information around you to be less wrong. Start with an initial assumption (your prior belief) and then use each new piece of evidence to update that prior.
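The cake story can be written as a tiny Python sketch of sequential updating. The likelihood numbers below are invented purely for illustration (how probable each clue would be if the cake were off versus fine):

# Sequential Bayesian updating for the chocolate-cake example.
# The likelihoods are made up for illustration only.
def update(prior, p_clue_if_off, p_clue_if_ok):
    """Return P(cake is off | new clue), given the prior belief."""
    numerator = p_clue_if_off * prior
    return numerator / (numerator + p_clue_if_ok * (1 - prior))

belief_off = 0.10                             # initial hunch: 10% chance the cake is off
belief_off = update(belief_off, 0.70, 0.30)   # spot what might be mould
belief_off = update(belief_off, 0.80, 0.40)   # it smells a bit sour
belief_off = update(belief_off, 0.99, 0.01)   # a maggot crawls out
print(f"P(cake is off) = {belief_off:.0%}")   # roughly 98% -- don't eat it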

The Reverend Thomas Bayes was an 18th-century British clergyman who figured out exactly how to deal with these kinds of problems, where we want to know the probability that our hypothesis (that is, our causal explanation for why something happened) is correct given a new piece of evidence (that is, something that has happened).

Written out, the equation looks as follows:

P(A|B) = P(B|A) × P(A) / P(B)

Bayes’ Rule

We read the left side, called the posterior, as the conditional probability of event A given event B. On the right side, P(A) is our prior, or the initial belief of the probability of event A, P(B|A) is the likelihood (also a conditional probability), which we derive from our data, and P(B) is a normalization constant to make the probability distribution sum to 1.
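Translated literally into code, the rule is a one-liner. This is just a sketch; the function and argument names are my own shorthand, and the numbers in the example call are arbitrary:

# Bayes' rule: posterior P(A|B) = likelihood P(B|A) * prior P(A) / evidence P(B)
def bayes_posterior(prior, likelihood, evidence):
    return likelihood * prior / evidence

# Arbitrary illustrative numbers: P(A)=0.3, P(B|A)=0.6, P(B)=0.4
print(bayes_posterior(prior=0.3, likelihood=0.6, evidence=0.4))   # 0.45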

An important application of Bayes’ theorem is understanding the cost-benefit of extensive medical testing. Examples like this highlight where our intuition lets us down badly. Take a test for a disease that is 80% accurate (it correctly flags 80% of people who actually have the disease) and gives 10% false positives (it says you have the disease when you don’t). If 1% of the population has the disease and you test positive, should you be worried?

Let’s test our intuition. Take 100 people: only 1 of them has the disease (1%), and that person will most likely test positive (an 80% chance). Of the 99 healthy people, about 10% will also test positive, so we get roughly 10 false positives. Of all the positive tests, then, only about 1 in 11 is a true positive. Worked through exactly with Bayes’ theorem, a positive test has increased your risk from 1% to about 8% — still reasonable odds and not yet time to worry!
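The same calculation as a quick Python sketch, using the numbers from the example above (variable names are mine):

# Disease-test example: 1% prevalence, 80% sensitivity, 10% false-positive rate
prevalence = 0.01           # P(disease)
sensitivity = 0.80          # P(positive | disease)
false_positive_rate = 0.10  # P(positive | no disease)

# P(positive) = P(pos|disease)*P(disease) + P(pos|no disease)*P(no disease)
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# Bayes' theorem: P(disease | positive)
posterior = sensitivity * prevalence / p_positive
print(f"P(disease | positive test) = {posterior:.1%}")   # roughly 7.5%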

Bayes’ theorem is used extensively in applications such as spam filters and recommender algorithms. A combination of the sender’s email address and the words in the subject line (“viagra” and “sale”, for instance!) can shift the prior probability dramatically and allow accurate filtering to the junk mail folder.
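Here is a toy sketch of how a word-based filter might apply the same updating. It is a simplified, naive-Bayes-style illustration with made-up word probabilities, not how any real filter is implemented:

# Toy naive-Bayes-style spam score (word probabilities invented for illustration).
# Treating each word as independent evidence is the "naive" assumption.
p_spam = 0.20                    # prior: 20% of incoming mail is spam
word_likelihoods = {             # (P(word | spam), P(word | not spam))
    "viagra": (0.20, 0.001),
    "sale":   (0.30, 0.05),
}

posterior = p_spam
for word, (p_if_spam, p_if_ham) in word_likelihoods.items():
    numerator = p_if_spam * posterior
    posterior = numerator / (numerator + p_if_ham * (1 - posterior))

print(f"P(spam | 'viagra' and 'sale' in subject) = {posterior:.1%}")   # about 99.7%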

Another very interesting application came during the Cold War, when, at the height of hostilities, the USA always kept planes airborne carrying live nuclear weapons that could be deployed at a moment’s notice should the USSR strike. In the book The Theory That Would Not Die, McGrayne outlines how Bayes’ theorem was used by the statistician Madansky to reveal the frightening risk this policy exposed the population to:

Madansky calculated the number of “accident opportunities” based on the number of weapons, their longevity, and the number of times they were aboard planes or handled in storage. Accident opportunities corresponded to flipping coins and throwing dice. Counting them proved to be an important innovation.

“A probability that is very small for a single operation, say one in a million, can become significant if this operation will occur 10,000 times in the next five years,” Madansky wrote. The military’s own evidence indicated that “a certain number of aircraft crashes” was inevitable. According to the air force, a B-52 jet, the plane carrying SAC’s bombs, would average 5 major accidents per 100,000 flying hours. Roughly 3 nuclear bombs were dropped accidentally or jettisoned on purpose per 1,000 flights that carried these weapons. In that 80% of aircraft crashes occurred within 3 miles of an air force base, the likelihood of public exposure was growing. And so it went. None of these studies involved a nuclear explosion, but to a Bayesian they suggested ominous possibilities.
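The arithmetic behind that warning is easy to reproduce. A minimal sketch, assuming independent events and the one-in-a-million-per-operation figure from the quote:

# A tiny per-event probability compounds over many "accident opportunities"
p_single = 1e-6            # one-in-a-million chance per operation (from the quote)
n_operations = 10_000      # operations expected over the next five years
p_at_least_one = 1 - (1 - p_single) ** n_operations
print(f"P(at least one accident) = {p_at_least_one:.2%}")   # about 1%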

For those interested in finding out more, here are some excellent sources to geek out on.

Let me know what you think; I’d love your feedback. If you haven’t already, sign up for a weekly dose just like this.

Links that made me think…

Incredible drone photos by Büyüktaş create a warped view of the world

The Unforgiving Math That Stops Epidemics | Quanta Magazine — www.quantamagazine.org
If you didn’t get a flu shot, you are endangering more than just your own health. Calculations of herd immunity against common diseases don’t make exceptions.


Meet our newest self-driving vehicle: the all-electric Jaguar I-PACE — medium.com
Today Waymo and Jaguar Land Rover are announcing an electrifying new partnership. We’re joining forces to design and engineer the world’s first premium electric fully self-driving vehicle, built for…


How Humble Leadership Really Works — hbr.org
Top-down leadership is outdated and counterproductive.


Tom Connor · 10x Curiosity · Always curious - curating knowledge to solve problems and create change