Blinkers and intuitions
Our intuitions may inadvertently fit us with blinkers
If you live in the UK as an expatriate and have a name like mine, you stand out. Yes, you have to learn to live with a wide variety of creative pronunciations of your moniker, but it has advantages too. Having a rare name means people remember you more easily, and if they have to look it up, the odds are there is going to be just one person with your name.
It is easy to extrapolate from this, of course. My name is unusual to most people in Britain, and it is indeed rare here; so, intuitively, it follows that names that are unusual to me must also be rare. Easy to do, but easily mistaken. Time and time again, when I look for the Twitter handle of the author of an interesting scientific paper, my intuition tells me there will be hardly any users with, say, a particular Chinese or Korean name, simply because Chinese or Korean names are not common in my environment. And then, to my surprise, I find that there are dozens of people named Min-Hui Li or Sungsik Park.
The assumption that the things that are either common or unusual to us are common or unusual in general is, well, not that unusual. You have probably come across it on web forms where you need to enter your address: the postal code, say, comes before the town, whereas in your country it follows the town. Or, more annoyingly for many people with an Asian surname, the form may demand that your last name has at least three characters.
Aargh! A spider in my bed! (image via YouTube)
If you have young children around you, you may be familiar with the cartoon character Peppa Pig, a charming piglet girl exploring the world with her parents and her little brother George. In one of the episodes, she encounters a spider known as Mr Skinnylegs. Peppa is frightened of the creature, but Daddy Pig reassures her that “spiders are very, very small, and can’t hurt you”. This is pretty good advice for young people in the UK, where Peppa Pig originates. But when the episode was shown in Australia in 2012, a viewer complained that some local spiders were neither very, very small, nor harmless.
The complaint was upheld, and the broadcaster, the ABC, removed the episode from its online offering. As the programme is aimed at small children, this sounds reasonable: the ABC should not have aired the episode. But did the makers realize that this storyline might not be suitable for countries where spiders should be approached with care? Or were they a bit blinkered by the assumption that spiders in their environment are nothing to be afraid of?
Differences in social or geographic context are not the only reason why we can be lured into following an intuition that turns out to be incorrect. Consider this question: does screening for cancer prevent premature deaths? If the disease is detected earlier, it can be treated earlier, and the chance of success of many treatments is lower when the cancer is advanced. Obvious, no? And this would appear to be confirmed by the higher 5-year survival rates of people who participate in a screening programme. But our intuition might be fooling us.
Imagine 200 women aged 63, who all have breast cancer. 100 are in a screening programme and their condition is correctly diagnosed. They are treated, and two of them die in the next 5 years. The other 100 receive a breast cancer diagnosis because they find a lump in their breast at the age of 65. They too are treated, and five years later, 77 of these women have died. The survival rate for the women who were screened is 98%, while that of the women who were not is just 23%. A spectacular difference, and one which the largest breast cancer charity in the USA used to promote screening.
Screen your intuition
Hurrah for screening? Not so fast. What happened to the 98 surviving women in the screened group in years 6 and 7 after their diagnosis, when they were aged 69 and 70? Did they all survive those two years too? The 5-year survival rate doesn’t tell us. If, in our thought experiment, 75 of the 98 surviving women died in these two years, the screening would have had no benefit whatsoever: both groups started with 100 women aged 63, and 7 years later 23 would be alive aged 70 in either group. Yet one group shows a 98% 5-year survival rate, and the other just 23%.
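The arithmetic of the thought experiment can be sketched in a few lines of Python. The numbers are those from the example above; the function name is purely illustrative:

```python
# Thought experiment from the text: 200 women aged 63, all with breast cancer.
# Group A is screened and diagnosed at 63; Group B is diagnosed at 65 after
# finding a lump. The death counts per period are the ones given in the text.

def five_year_survival(diagnosed, dead_within_5_years):
    """5-year survival rate, measured from the moment of diagnosis."""
    return (diagnosed - dead_within_5_years) / diagnosed

screened = five_year_survival(100, 2)      # 2 of 100 die between ages 63 and 68
unscreened = five_year_survival(100, 77)   # 77 of 100 die between ages 65 and 70

# Now follow both groups all the way to age 70. Assume, as in the text, that
# 75 of the 98 screened survivors die in years 6 and 7 after their diagnosis.
alive_at_70_screened = 100 - 2 - 75
alive_at_70_unscreened = 100 - 77

print(f"5-year survival, screened:   {screened:.0%}")    # 98%
print(f"5-year survival, unscreened: {unscreened:.0%}")  # 23%
print(f"Alive at 70, screened:   {alive_at_70_screened}")
print(f"Alive at 70, unscreened: {alive_at_70_unscreened}")
```

The two survival rates differ wildly, yet exactly as many women in each group are alive at 70: the 5-year clock simply started two years earlier for the screened group.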
What we see here is known as lead time bias. If someone is going to die from cancer at age 70 regardless, it looks as if they survive longer when you diagnose them at 63 than when you diagnose them at 65. Yet our intuition is that early diagnosis means more effective treatment means longer survival. Worse, it gives us blinkers: when we see this, we don’t feel the need to look further for a measure that would really tell us whether early screening works. What happens to the overall number of people dying from cancer? If that falls, there is definitely an effect. What is the average age at which people die from cancer? If that increases, people may not be cured, but at least they live longer with cancer. Who needs this, though, if the data gives us comfort, confidence and confirmation of our intuition?
Data can, unfortunately, indeed exacerbate the problem. Statistics and figures give instant credence to, well, to something. But not necessarily to your intuition. There is an (apocryphal?) WWI story around the introduction of steel Brodie helmets in the British Army, intended to give soldiers better protection against flying shrapnel. However, when the number of wounded soldiers brought in with head injuries was counted, the data revealed that, surprisingly, it had gone up by a large percentage, instead of down. Did this mean that those who doubted the helmets would protect the soldiers’ skulls any better than cloth caps were right? Of course not. What had happened was that fewer soldiers died on the spot from their head injuries, and more of them survived, wounded, to be counted.
Another classic (and true) war story illustrates the same kind of intuitive blinker effect. In WWII the allied forces sought to minimize losses of aircraft to enemy fire. Researchers had been studying the damage to the aircraft that managed to return, and recommended that they be reinforced where the damage was worst. That was what their intuition told them, and what the data appeared to confirm: what was the point of strengthening the fuselage and the wings where there were no bullet holes? But Abraham Wald, a statistician, stopped them in their tracks. The holes, he pointed out, were exactly where the aircraft were the strongest: they were clearly capable of flying home even with the damage. It was the locations where the surviving planes were unharmed that needed reinforcement: the planes that got hit there didn’t make it back.
Intuition is a good thing to have, but we must be careful not to allow it to fit us with blinkers. We may believe we know the answer and have the data to support it, but it’s worth checking whether it is the answer to the right question. And for that, we should take off our blinkers.
Originally published at koenfucius.wordpress.com on February 8, 2019.
Thank you for reading this article — if you enjoyed it, let me (and others) know by tapping the applause button to give it some claps, and please also do share it. (There are Twitter and Facebook buttons, click here to post it on LinkedIn, or simply copy and paste this link.) And in case you want to read more articles like this one, see all my other posts (I publish one every week) here. Thanks!