On Invisible Gorillas and Software Development — A Review of The Invisible Gorilla, by Christopher Chabris and Daniel Simons
As a reader of fiction, I enjoy the flash and bling of a Rushdie or a Márquez. But I like my non-fiction relatively sober (except when it’s written by Adam Gopnik, or maybe Ian Frazier). Sober doesn’t mean boring, of course. Sober still includes substantive information, relevant context, thoughtful opinion, clear exposition, enjoyable prose. What’s not to like?
Pop-sci essays and books have a delicate balance to maintain between the pop and the sci. Many, maybe most, have a tendency to teeter in the direction of the pop. That might lead to fame and fortune for the author (heard of a guy called Malcolm Gladwell, by any chance?) but while such material makes for fantastic party conversation, rarely does it rise too far above the level of entertainment.
It is always a pleasure to come across something that defies that stereotype (some call it the “high bar” established by Gladwell, among others) — a book that is fun to read *and* elucidates a topic of interest. Something that uses the tools of persuasion and pedagogy to engage *and* to educate in equal measure. Something that is well enough written to be fun *and* has substance enough to satisfy.
I’ve had the privilege and pleasure to read, over the past few months, three books that did all that, all three related to how our brains work.
All three have taught me interesting things about why we work the way we do, and all three hold lessons for me in my world of software development.
In Incognito: Secret Lives of the Brain, David Eagleman brilliantly deconstructed advances in neuroscience for people like myself, who use the term “neuroscience” as a proxy for “too complex for me”. In Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, Nassim Nicholas Taleb brilliantly played the role of your smart, well-traveled, polymath uncle, sharing stories and making highly opinionated points (with brio, but also with evidence) on how little we consciously understand randomness and probability, and how much that lack of understanding affects our behavior.
And finally with The Invisible Gorilla: How Our Intuitions Deceive Us, Christopher Chabris and Daniel Simons equally brilliantly demonstrate the various illusions under which we operate.
It is a great read, but a sobering one. Would you enjoy being told that your self-assessments of how much attention you are paying to a task, how well you remember an event, even how much you believe in your own potential, were all overestimations? Illusions? But we’re all scientists, deep down. We’re all rational beings, and only absorb such information in the context of bettering ourselves through deeper understanding. So it’s all good.
The book has a smooth, predictable pattern and rhythm. Start with a real story and quickly arrive at its inexplicable finale. Describe some hypotheses about why that finale is not only explicable, but even likely. Present empirical evidence of the illusion behind those hypotheses. Repeat. While the book loses a bit of steam near the end as the material becomes less compelling, the rhythm allows us to continue to absorb it almost as effectively as the more compelling material with which the book begins.
In all, the authors cover the illusions of attention, memory, confidence, knowledge, cause, and potential. Just for fun, and to genuflect in the direction of my paycheck, I thought about how these illusions explain some familiar tropes of my profession.
This is too easy. Have you seen a typical software developer at work? Constantly switching between this and that window? Responding to email while waiting for a build? Completing a checkin while reviewing that code — a review she started when that migration was running so slowly she needed something to do in the meantime? All this while sitting in the status meeting and trying to make sure she at least heard her name being called?
Can’t be done. Can’t be done *well*, anyway. Yet we prize multi-tasking and ooh and aah at those who appear to do it well.
(Also, driving while carrying on a phone conversation — try not to do that, okay? Hands-free don’t mean nothing.)
I’ll just say one thing here: while face-to-face communication is great in so many different ways, there is one superb advantage of email — it’s in *writing*. So I can archive it into my digital memory, my perfect and incorruptible digital memory, to recall it when needed. Knowing about the illusion of memory only makes this more obvious.
I guess I’m going to have to get better at writing up those meeting notes!
Here’s my personal favorite quote from the book. Many engineers will love this one:
…confidence appears to be a consistent quality that varies from one person to the next, but has little to do with one’s underlying knowledge or mental ability.
Relatively quiet and thoughtful engineers who never have simple yes/no answers to “can we do this?” questions, rejoice! A lot of the confidence you see around you is just bluster.
More than a few managers I’ve interacted with have told me of how they use the confidence shown by someone reporting status or projecting a schedule as an indicator of the probability of success. As an art, or at best a very soft science, project management rarely leaves room for scientific experimentation on approaches. And most of us don’t have the discipline, much less the opportunity, to conduct such experiments. So while we might be burning through dollars on a regular basis because we fail to learn something that might make us more productive or efficient, it is safer to play within the lines of tradition and “best practices”, all the while glossing over the fact that many of those best practices are mostly intuition behind a thin veneer of data.
Sadly, we’ve all seen big shows of confidence work for others. Maybe even ourselves. But let’s be aware that this is dangerous — confidence is not automatically a sign of competence.
On the flip side, if you lead a team or some such, it is important to know when to project confidence and when to open the kimono a bit. The vast majority of people expect confidence from their leaders (ever seen a diffident politician or CEO?).
There are entire books on this subject.
Oh, and why not a quote from a recent New York Times Magazine article titled What Do We Really Know About Osama bin Laden’s Death? (The concept of our almost innate need to see patterns in data is also covered in another article published on the very same day in the very same newspaper: Gamblers, Scientists, and the Mysterious Hot Hand.)
An appealing narrative can exert a gravitational pull that winds up pulling facts in its direction.
As so-called knowledge workers, we live in very complex environments and build very complex things. Let’s take an example: the time required to address a customer-reported product problem is affected by everything from the type of product and the customer in question to the specific people involved in responding to the problem, the processes in use, the tools in use, and more. We know that reducing this resolution time is important, but damned if we know what exactly we can do to make the change.
So we fall back on our experience, draw from “best practices” (those again!) and do some well-accepted Good Things (TM). We might improve diagnostics capabilities. We might streamline our processes. We might improve our tools. And then when the number goes down, we give (most of) the credit to the most recently implemented changes. Because, frankly, the alternative — that we couldn’t predict what would make the time go down — is too hard to admit.
Ever had anything to do with building software? You don’t need an explanation for this quote from the book, then:
Over and over, the illusion of knowledge convinces us that we have a deep understanding of what a project will entail, when all we really have is a rough and optimistic guess based on shallow familiarity.
I have to admit, this forces me to adopt a mindset of working against people’s illusions, rather than against their perceived incompetence. Not only do I understand the powerful influence of this illusion on those presenting their “rough and optimistic guess” as a plan, I am forced to confront the fact that my conception of my own knowledge, and the confidence I have in it, might themselves be illusions.
There you have it. A fun book, with much to teach. Fun, because the conclusions are so counter-intuitive. Much to teach, because who knew how little we knew of ourselves?
You can read it as a casual pop-sci book. You can look around and enjoy spotting the illusions in action every hour of every day. You can draw insights into being a better parent, a better colleague — really a better person — by being conscious of the illusions under which you might be operating, and by cutting others some slack, knowing the strength of these illusions.
Once you’ve read it, take a moment to absorb the blows that it deals your ego, then figure out how best to use your new-found knowledge.