Update your thinking
On the morning after the first presidential debate, quite a few conversations I overheard included a variation of “I don’t think it changed anybody’s mind.”
Confronted with a problem, some (most?) of us take pride in “getting it right” the first time around, and this pride can get in the way of changing our point of view.
This pride, however, is misplaced; the truly commendable attitude is that of the person who, in light of new evidence, isn’t afraid to change their point of view. That is, we should strive to emulate John Maynard Keynes, who reportedly said: “When the facts change, I change my mind. What do you do, sir?” (It is actually unclear whether Keynes ever said this, but the point remains.)
Easier said than done
Let’s admit it, we’re biased. Yes, you too. Don’t believe me? Let’s run a test:
Consider psychologists Tversky and Kahneman’s example:
A cab was involved in a hit and run accident at night. Two cab companies, the Green and the Blue, operate in the city; 85% of these cabs are Green and 15% are Blue.
A witness identified the cab as Blue. The court tested the reliability of the witness under the same circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time.
What is the probability that the cab involved in the accident was Blue rather than Green, given that the witness identified it as Blue?
Take a moment to estimate the probability and write down your answer; we’ll come back to it shortly.
There’s a large body of evidence showing that our judgment is less than perfect. These intellectual imperfections affect every stage of our problem-solving, including our analysis of evidence, as we tend to reinterpret or discredit information that opposes the hypotheses we favor. This works in concert with motivated reasoning: applying different standards of evidence to propositions we wish were true than to those we wish were false. That is, when evaluating an agreeable proposition, we tend to ask, “Can I believe this?” whereas when evaluating a threatening one, we tend to ask, “Must I believe this?”
The point is that we probably shouldn’t trust our instincts when interpreting new information.
So, what can we do?
First, admit that you’re biased. Overconfidence, failure to account for base rates, anchoring, hindsight bias… the list goes on. Psychologists have identified many ways in which our judgment isn’t as good as we think it is. And yes, that means your judgment is probably not as good as you think it is. So reexamine your point of view and try to identify which bias(es) you most easily suffer from. Then step back and try to repaint the picture, accounting for their effects.
Flip the situation, if you can. Stanford’s Michael McConnell [1:15:35] was recently invited to debate whether President Obama had usurped Congress’s constitutional power. In his concluding remarks, he offered an approach to check your thinking: “I have long told my students in constitutional law that when thinking about issues of constitutional power, they should think not about the presidents and leaders that they admire and trust, but the ones that they disagree with, ones they don’t trust.”
Adopt a Bayesian approach. Since “shooting from the hip” doesn’t seem to cut it, we would be well advised to use a more structured approach when assessing new evidence. Bayesian inference can help us do just that, as it provides a framework for updating our beliefs in light of new evidence.
In the cab example above, the witness identified the cab as Blue, and yet, even though the witness is right four times out of five, the probability that the cab really was Blue is only about 41%.
With a simple equation, a Bayesian approach can help take away the guesswork.
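For readers who prefer code to equations, here is a minimal Python sketch of a single Bayesian update; the function name and interface are my own illustration, not from any particular library. We’ll put it to work on the cab problem below.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """One Bayesian update: P(H | E) from P(H), P(E | H), and P(E | not-H)."""
    # Law of total probability: P(E) = P(E|H) * P(H) + P(E|not-H) * P(not-H)
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    # Bayes' theorem: P(H | E) = P(E|H) * P(H) / P(E)
    return p_evidence_given_h * prior / p_evidence
```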
In his address to Caltech’s graduating class of 1974, physicist Richard Feynman advised his students to “not fool yourself — and you are the easiest person to fool.” Countless analyses support the claim that we are easily fooled in all sorts of situations, and failing to update our thinking in light of new evidence is one of them. We ought, therefore, to account for our imperfections: removing them when we can and, when we cannot, reducing their impact.
Let’s go back to our cabs. This is Bayes’ theorem; although it looks scary, it really isn’t that bad:

P(H | E) = P(E | H) × P(H) / P(E)
Applying it to our cab case, start by defining the hypothesis and the evidence:

H: the cab involved in the accident was Blue.
E: the witness identified the cab as Blue.
P(H) = 0.15 (the base rate of Blue cabs)
P(E | H) = 0.80 (the witness correctly identifies a Blue cab as Blue)
P(E | not H) = 0.20 (the witness mistakenly identifies a Green cab as Blue)
Next, substitute numerical values, expanding P(E) with the law of total probability:

P(H | E) = (0.80 × 0.15) / (0.80 × 0.15 + 0.20 × 0.85) = 0.12 / 0.29 ≈ 0.41
So the probability that the cab was Blue given that the witness identified it as Blue is only about 41%. If your estimate was significantly different, you may not want to rely as much on your instincts when updating your thinking in light of new evidence and should, instead, adopt a more structured approach.
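As a sanity check, plugging the cab numbers into the sketch from earlier reproduces the result. The second, independent witness below is a hypothetical of mine, added only to show that the posterior from one update becomes the prior for the next:

```python
# H: the cab was Blue.  E: the witness says "Blue".
p_blue = posterior(prior=0.15,                   # base rate of Blue cabs
                   p_evidence_given_h=0.80,      # witness is right 80% of the time
                   p_evidence_given_not_h=0.20)  # witness calls a Green cab Blue 20% of the time
print(round(p_blue, 2))  # 0.41

# Hypothetical: a second, independent witness also says "Blue".
# The previous posterior becomes the new prior.
p_blue_twice = posterior(prior=p_blue,
                         p_evidence_given_h=0.80,
                         p_evidence_given_not_h=0.20)
print(round(p_blue_twice, 2))  # 0.74
```

This is the essence of updating your thinking: each new piece of evidence shifts the belief, but by an amount the base rates dictate, not the amount our instincts suggest.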