What Can the Dunning-Kruger Effect Teach Technologists?

McKinsey Digital Insights · Mar 3, 2021

by Dave Kerr — Senior Expert, Digital — McKinsey & Company

Spend long enough working in technology and you will likely come across an interesting-sounding law, principle, theory or concept. Technologists tend to build their own library of such laws, frequently shared as useful reminders, interesting facts or sometimes simply funny one-liners.

Many are simple to explain. For example, Moore’s Law states ‘The number of transistors in an integrated circuit doubles approximately every two years.’ Others are easy to grasp because they convey an element of humor or irony, such as Hofstadter’s Law: ‘It always takes longer than you expect, even when you take into account Hofstadter’s Law.’

Unlike many other laws or concepts, the Dunning-Kruger Effect is not easy to explain. However, it is this complexity that makes it a particularly relevant principle for today’s technologists.

What is the Dunning-Kruger Effect?

In 1999, the Journal of Personality and Social Psychology published a paper, ‘Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments’. The paper’s abstract describes what has now become commonly known as the Dunning-Kruger Effect:

“People tend to hold overly favorable views of their abilities in many social and intellectual domains. The authors suggest that this overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it.”

~ Justin Kruger and David Dunning

David Dunning would later condense this into:

“If you’re incompetent, you can’t know you’re incompetent…The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”

~ David Dunning

If we attempt to reduce this further, we risk losing the nuance that makes the effect interesting. “If someone thinks they understand something really complicated, but actually knows almost nothing about it, we cannot trust that they know what they are talking about” is not particularly insightful.

Strangely enough, this simplification could itself be considered the Dunning-Kruger Effect in action: in attempting to explain a complex and subtle domain simply, we have trivialized it, and it could be said that we have not understood enough of its subtleties to describe it coherently.

In truth, Kruger and Dunning’s paper is not simple. Experiments were carried out across the areas of humor, logical reasoning and grammar, and similar results were found in all categories, but there are many potential sources of complexity. It may seem like common sense to declare that people who don’t know much about a domain grossly overestimate their knowledge of it, but does that hold in every domain?

The Dunning-Kruger Effect in Technology

Some would say that we rarely see the Dunning-Kruger Effect in action in technology. Even technologists with low levels of experience tend to understand that the domains they work in are highly complex and that their own understanding is limited.

A cybersecurity expert will understand how vast the world of cybersecurity is and how little of it one individual can genuinely comprehend. However, a novice will also likely grasp this, and may even be more likely to underestimate their ability (this is, interestingly, another common cognitive bias).

The results of Kruger and Dunning’s experiments actually tend to demonstrate that those with the lowest levels of ability in a domain suffer most from an inability to judge their own skills. As the authors note, this is not intuitive: we would expect life experience to teach us, over time, that things are often more complex than they appear. Yet frequently underestimating the complexity of a task does not necessarily teach you to account for this in the future.

The Implications for Technologists

The Dunning-Kruger Effect is a particularly relevant principle for technologists, as highly consequential decisions may depend on the expertise of a small group. We should remember:

1. The Dunning-Kruger Effect is an example of a cognitive bias, i.e. a flaw in objective reasoning

2. It may be the case that unskilled practitioners regularly suffer from this bias

3. It may also be the case that the opposite bias occurs — highly skilled practitioners are prone to overestimating the complexity of a task

Whether these effects apply to all technologists or only some, if we accept that bias may exist, we can work to account for it. Bias can be conscious or unconscious, and recognizing that the potential for bias exists is the first step in mitigating it.

It could be said that the Dunning-Kruger Effect’s lesson for technologists of all ability levels is to assume that you are always subject to some degree of knowledge gap or bias, and that this can be mitigated with stringent checks. In data science, assessing a model’s potential for bias or discrimination requires an examination of the metrics it measures. What are the potential sources of bias? Is there a risk in the dataset being used? Is there a risk in the features being measured?
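
To make one such check concrete, here is a minimal sketch in Python with pandas. The column names (gender, income, credit_score) and the synthetic data, including the built-in pay gap, are hypothetical stand-ins; a real audit would run against the actual training set. The idea is to flag features that correlate strongly with a protected attribute and could therefore act as proxies for it.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical applicant data; a real audit would use the real training set.
gender = rng.integers(0, 2, n)  # 0 = male, 1 = female (encoded only for the check)
income = rng.normal(60_000, 15_000, n) - gender * 11_000  # synthetic pay gap
credit_score = np.clip(550 + income / 400 + rng.normal(0, 40, n), 300, 850)

df = pd.DataFrame({"gender": gender, "income": income, "credit_score": credit_score})

# Flag any model feature that correlates strongly with the protected attribute,
# since it may act as a proxy even when the attribute itself is excluded.
for feature in ["income", "credit_score"]:
    r = df[feature].corr(df["gender"])
    print(f"{feature}: correlation with gender = {r:+.2f}")
```

A strong correlation does not prove discrimination, but it tells the team exactly which features deserve scrutiny before the model goes anywhere near production.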

For example, a financial services model charged with determining which loans should be underwritten will not be deliberately tasked with considering protected characteristics such as gender. However, income will be considered when evaluating creditworthiness, and we know that a gender pay gap persists worldwide: in the US, the average full-time female employee earns 81% of the salary of a male counterpart. This poses a gender discrimination risk: the model may be tasked with making a decision based on credit score, but inadvertently grant fewer loans to women. This discrimination is not a designed feature, and it will only become obvious by checking for it.

A multidisciplinary, diverse group can mitigate this risk. A data scientist can advise the team that the model may discriminate, but it may require a credit officer, with a completely different background and education, to advise that such discrimination is illegal (and certainly unethical). These are high-consequence actions, and pleading ignorance is unacceptable if the model is deployed.
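
As a rough illustration of checking for exactly this, the sketch below compares approval rates across a protected attribute and computes a disparate impact ratio, commonly assessed against the ‘four-fifths’ (80%) rule of thumb. The thresholding ‘model’, the data and the pay-gap figure are all invented for the example; a real review would use the deployed model and real applicant data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical applicants, with a synthetic pay gap feeding into credit score.
gender = rng.integers(0, 2, n)  # 0 = male, 1 = female
income = rng.normal(60_000, 15_000, n) - gender * 11_000
credit_score = np.clip(550 + income / 400 + rng.normal(0, 40, n), 300, 850)

# Stand-in for the underwriting model: approve above a credit score threshold.
approved = credit_score >= 650

rate_male = approved[gender == 0].mean()
rate_female = approved[gender == 1].mean()
impact_ratio = rate_female / rate_male

print(f"approval rate (male):   {rate_male:.1%}")
print(f"approval rate (female): {rate_female:.1%}")
print(f"disparate impact ratio: {impact_ratio:.2f} (rule of thumb: flag if < 0.80)")
```

The point is not the specific threshold: it is that the check is cheap to run and makes the discrimination risk visible before deployment, rather than after.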

To simplify: if there is potential for bias in our thinking, the correct approach is to source more data from a more diverse group. We avoid relying on single, potentially flawed data points. We understand that the observations we have made may be incorrect or may carry implicit assumptions. These actions should help us make better, more informed decisions and produce technology that is free of the inherent knowledge biases that all of us, novices and experts alike, carry to some degree.
