Bayesian Thinking in Everyday Life

How Bayes’ Theorem can and should be applied in our everyday lives to help us be more rational.

Chong Han Khai
6 min read · May 25, 2020

More than 200 years ago, Thomas Bayes came up with a brilliant idea that has helped shape the modern world: Bayes’ Theorem. The theorem was famously used by Alan Turing to crack the German Enigma code during World War II and has applications across many different fields today. However, the intuition behind this seemingly intimidating concept (as with much of math and statistics) is actually simple. In fact, we constantly apply the theorem in our lives without realizing it.

In this blog post, I start off by explaining some concepts and giving a classic Bayesian 101 example, followed by the benefits of applying Bayesian thinking in our daily lives and some examples.

Here is the only formula in this blog post, so please do read on. I promise there will be no more formulas.

Bayes’ Theorem: P(A|B) = P(B|A) × P(A) / P(B), where P(A|B) is the posterior, P(B|A) the likelihood, P(A) the prior, and P(B) the evidence.

I will give a simple, classic Bayesian example to explain this equation. Suppose you go for a cancer test and the doctor claims that the test is 95% accurate (i.e. out of 100 people with cancer, 95 will test positive, and out of 100 people who do not have cancer, 95 will test negative). If you test positive, does it mean that there is a 95% chance that you have cancer? That conclusion is definitely wrong, but it is also the one most people jump to when they test positive.

This is where the beauty of Bayes’ theorem comes in: it teaches us to include our prior (one can think of it as knowledge or context) before jumping to a conclusion (the posterior probability). In the case of the cancer test,

  1. The prior is the proportion of the entire population who has cancer.
  2. The evidence is the proportion of the entire population who tests positive.
  3. The likelihood is the probability of testing positive given that you have cancer.

Assuming that 1% of the population has cancer, we actually arrive at only 16.1% (0.95 × 0.01 / (0.95 × 0.01 + 0.05 × 0.99)). Although 16.1% is still a rather high probability, you probably should not be too pessimistic and conclude that you have cancer.

Illustration of how we arrive at the posterior probability.

The diagram above shows what the entire population looks like. To simplify the explanation, let’s assume that 10,000 people take the test: 590 of them would test positive (10,000 × (0.99 × 0.05 + 0.01 × 0.95)), and only 95 of those actually have cancer, which gives 95 / 590 ≈ 16.1%.
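For readers who prefer code, here is a minimal Python sketch of the same calculation (the function and variable names are just illustrative, not from any particular library):

```python
def posterior(prior, true_positive_rate, false_positive_rate):
    """P(cancer | positive test) via Bayes' Theorem."""
    # Evidence: overall probability of a positive test.
    # P(positive) = P(positive | cancer) * P(cancer)
    #             + P(positive | no cancer) * P(no cancer)
    evidence = true_positive_rate * prior + false_positive_rate * (1 - prior)
    return true_positive_rate * prior / evidence

# 1% of the population has cancer; the test is 95% accurate both ways.
print(posterior(prior=0.01, true_positive_rate=0.95, false_positive_rate=0.05))
# -> ~0.161, the 16.1% from the diagram above
```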

In the rest of this blog post, I attempt to list some of the more common, real-life situations in which we should actively apply Bayesian thinking.

Remember your priors

In real life, many people hold stereotypes about different things. For example, suppose you see a nerdy-looking guy with glasses at the university and your friends ask you to guess whether he is a business major or a computer science major. Your first instinct is probably computer science. However, you might have forgotten that 90% of the students at the university are business majors, i.e. your prior.
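To make this concrete, here is a small sketch using the odds form of Bayes’ Theorem; the prior is the 90% figure above, while the likelihoods (how much more often a computer science student looks “nerdy” than a business student) are made-up numbers purely for illustration:

```python
# Odds form of Bayes' Theorem: posterior odds = prior odds * likelihood ratio.

prior_odds = 0.10 / 0.90        # 10% computer science majors vs. 90% business majors
likelihood_ratio = 0.60 / 0.20  # assumption: "looks nerdy" is 3x as likely for CS students

posterior_odds = prior_odds * likelihood_ratio
posterior_prob = posterior_odds / (1 + posterior_odds)
print(posterior_prob)  # -> 0.25: even given the "nerdy" evidence, business is still more likely
```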

To give another example, one of your colleagues, Alice, complains that Bob does not finish his work, and you conclude that Alice secretly hates Bob. However, maybe 80% of people complain about colleagues who do not finish their work, so complaining about someone not finishing their work is not really strong evidence of hatred.

Combining the prior knowledge you have with the evidence you observe gives you a more reliable and robust conclusion.

Think of the counterfactuals

You recently came up with an idea for a project and told one of your colleagues, Bob, about it. He gave positive feedback, and you are confident that your idea is going to work. After implementing the idea, you find out that it does not work. In this example, you probably should have asked a few more colleagues to validate the idea, because Bob is optimistic about almost all ideas. If you think about two parallel universes, one where your idea works and one where it does not, you would most likely have received positive feedback from Bob in both. In other words, Bob’s feedback is not very useful.
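In Bayesian terms, evidence you would see in both parallel universes has a likelihood ratio close to 1, so it barely moves your belief. Here is a minimal sketch, with made-up numbers for how often Bob praises ideas:

```python
def update(prior, p_praise_if_good, p_praise_if_bad):
    """Posterior probability that the idea is good, given the praise."""
    evidence = p_praise_if_good * prior + p_praise_if_bad * (1 - prior)
    return p_praise_if_good * prior / evidence

# Assume a 30% prior that the idea will work.
# Bob praises 95% of good ideas -- but also 90% of bad ones.
print(update(0.30, p_praise_if_good=0.95, p_praise_if_bad=0.90))  # -> ~0.31

# A more discerning colleague: praises 80% of good ideas, only 20% of bad ones.
print(update(0.30, p_praise_if_good=0.80, p_praise_if_bad=0.20))  # -> ~0.63
```

Bob’s praise moves the 30% prior to roughly 31%, while the discerning colleague’s praise moves it to roughly 63%; the more the likelihoods differ between the two universes, the more informative the feedback.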

Here’s another example related to the recent Covid-19 disease and how the governments of some countries are reacting to it. You have been reading a lot of negative comments and news about how slow some governments have been in taking precautionary steps such as lockdowns and mobility restrictions, and you conclude that these governments are doing a bad job of handling the crisis. Now consider a parallel world where these governments responded quickly and managed to contain the virus at the expense of an economic recession and high unemployment. You would probably also be reading negative comments and news, this time about how the governments overreacted to such a minor disease. So in this case, the negative comments are perhaps not very strong evidence that the governments’ response to the disease is bad.

Actively thinking about how likely you would be to see similar evidence in a parallel universe helps you gain a clearer understanding of how useful your evidence is.

Don’t stick to your priors forever

In Bayes’ Theorem, priors are the beliefs an agent holds regarding a fact. The reason they are called beliefs is that they are what a person believes to be the truth, not necessarily a universal truth. Most of the time, we are trying to get our beliefs as close to the truth (the posterior) as possible. Ironically, as humans, we are often too rigid about our beliefs, especially after the first piece of evidence we receive. I often hear people take a statement seriously the first time they hear it from a friend and adopt it as a strong belief without validating it with more evidence. Perhaps because of a fear of admitting mistakes, people are sometimes reluctant to gather further evidence, or even ignore evidence, that might change their beliefs.

In this case, we should drop the mindset that changing our beliefs means admitting mistakes. A better mindset is to treat updating beliefs as “I have gathered more evidence about a belief I previously held, and I am updating it gradually so that it is closer to the truth (the posterior)”. To give an example, you might have a very bad impression of someone you have never even met because you heard gossip from a close friend about a bad thing that person did. Let’s say this is a friend you trust a lot, so you start off with a strong belief. You should still keep an open mind, gather more evidence from more people, and change your belief accordingly if your friend turns out to be wrong about the person in the first place.
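One way to picture this is as sequential updating: today’s posterior becomes tomorrow’s prior, and each new piece of evidence nudges the belief rather than locking it in. Below is a small sketch of the gossip example, with made-up numbers for how strong each piece of evidence is:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """One Bayesian update: the returned posterior becomes the next prior."""
    evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / evidence

# Belief: "this person behaves badly". Start from a neutral 50% prior.
belief = 0.50

# Gossip from a trusted friend is fairly strong evidence, so the belief jumps...
belief = update(belief, p_evidence_if_true=0.80, p_evidence_if_false=0.20)
print(belief)  # -> 0.80 after the gossip

# ...but keep gathering evidence: three colleagues independently report
# pleasant interactions, each a weaker piece of evidence pointing the other way.
for _ in range(3):
    belief = update(belief, p_evidence_if_true=0.40, p_evidence_if_false=0.60)
print(belief)  # -> ~0.54: the belief drifts back as new evidence accumulates
```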

We are sometimes too stubborn to admit making mistakes. But we should change our mindset to “I am updating my beliefs based on the evidence I’ve gathered so that they are closer to the truth”.

Conclusion

In this week’s blog post, I shared an elegant statistical theorem that intimidates many people and applied it in the context of our everyday lives. I hope I managed to convince you that this theoretical concept can change how you think about things moving forward.

To recap:

  1. Don’t forget the power of priors and look solely at the evidence; zoom out and see the bigger picture before jumping to conclusions.
  2. Think about the usefulness of your evidence based on counterfactuals; ask yourself whether the evidence you see would really be any different from what you would observe in a parallel universe.
  3. Don’t be too certain about your priors without much evidence; keeping an open mind allows you to gather more evidence and form a belief that is closer to the truth.

