How to spot bullshit like a boss
If you have the Medium mobile app, this is wayyyyy more entertaining as a dog-riddled “series” https://medium.com/series/how-to-spot-bullshit-like-a-boss-e1ffa0462c67
“Every kid starts out as a natural-born scientist, and then we beat it out of them.” — Carl Sagan
Beyond his tireless work around aliens, how on earth does astronomer Carl Sagan map to business research? Let me tell you.
But first, behold the lowly platitude: explaining critical reasoning skills is a formidable task.
In his book, “The Demon-Haunted World: Science as a Candle in the Dark,” Carl explores how he maintains integrity in his reasoning process with a “baloney detection kit.”
In a perfect world, everybody would use this thing. It’s like a mental “washing machine” for all colors of information.
Careful reasoning applies to all research and analysis. While research feels easier to approach as a training subject, analysis is more complex and textured. In practice, there are many researchers who act as analysts (and vice versa), but not all are capable of that shift in thinking.
As an example, consider a basic difference between the people who gather information (“researchers”) and the people who make sense of it all to inform decisions (“analysts”). A researcher is great at finding sources and figuring out whether they’re useful or reliable. He then goes about collecting that information and pulling it together around topics of interest.
Sure, turning those mountains into molehills involves making sense of it, but that still happens while foraging.
An analyst looks at that same information, but to gauge reliability he spends more time looking at it through the lenses of its creators as well as potential consumers of his analysis. What allows an analyst to generate a strong opinion or recommendation is when he combines that scrutiny with extra processing and reasoning skills.
And an analyst anchors all that to concrete awareness of the reason for “wanting to know that stuff” to begin with. Why? Because he’s supposed to be affecting an outcome. In contrast, not all researchers know (or need to know) the end-game while they’re running on the field.
That means that an analyst might examine how others will attack or dispute the information, and how he’ll defend findings. Or he might spend more time looking at potential bias or distortion around the source of information to figure out if it is as “reliable” as it is “interesting.” There are endless options for doubt.
If mainstream news were to use this approach, with emphasis on reliability, there would be almost no talking heads called “analysts.” In reality, most news just focuses on the “interesting” to spin up talking points. Eventually, people do call shenanigans. And we do see facts and evidence, but not often.
Carl’s kit is valuable because it can enhance the integrity of our reasoning process. And that may improve results whenever we are finding, evaluating and engaging information.
That means that the mindset of an analyst is equally beneficial to researchers. While gathering information, we are often thinking about (or planning for) post-processing and analysis. Regardless of who owns those next steps, understanding basic principles of analysis improves outcomes.
In the book’s chapter “The Fine Art of Baloney Detection,” Sagan introduces scientific approaches that translate unusually well when thinking about everyday skepticism. A skeptical attitude is the hallmark of a strong analyst, who must push for truth or validity in findings that stand up well to alternative interpretations.
His kit intends to help us all steer clear of being duped or misled (or hoodwinked, or bamboozled), and he shares several specific tools. He also spends considerable time digging into dozens of fallacies in logic. These tools teach us what to do as much as they teach us what not to do. At a minimum, they should prompt any researcher or analyst to think long and hard about what it means to call findings “reliable.”
Of course, reprinting Sagan’s lengthy prose would pretty much violate every kind of fair use standard, so just go buy a copy. Until your order arrives, I paraphrase this bit of his treatise and attempt to add some color. So let’s dig in:
1. We always want to get impartial, independent confirmation of facts and findings. That makes them more valid and reliable. That is why, for example, we speak to multiple people to corroborate findings.
2. When you find good information, get other smart people to discuss its integrity and validity. That means getting other perspectives on whatever you found out.
3. Beware the experts. Do not always take information from the best or most knowledgeable sources at face value. Even if a source is some kind of expert or guru, they can still be wrong. Very wrong.
4. Come up with more than one way to confirm an idea or fact, or to explain some crucial information. Think of ways that might show that some of those are true or false, reliable or unreliable. Once you get through that exercise, you have a better chance of being right about whatever you think you know.
5. Get over yourself. Do not get too attached to your ideas just because they are your own. Question them. If you can find any reasons for doubting your ideas, assume that others will too. If you can’t find any reasons, others will probably still find ways to punch holes.
6. Numbers are hard, but working with qualitative, non-numerical findings can be harder. Those kinds of findings are open to interpretation. If possible, find a way to put numbers around things. If that isn’t possible, expect to work harder and be ready to counter alternative interpretations of your findings.
7. “If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.” (No rewrite; those are Carl’s words, and they are as concise as it gets.)
8. Remember Occam’s Razor. When there is a hard way and an easy way to explain something, go with the simpler way.
9. When you come up with a valid idea for how to do something, such as “find something out,” ask whether it is easy to prove that your idea is wrong or bad. Assume that you will need to walk highly skeptical people through your thought process. Imagine that somebody else will have to repeat what you did, and ask whether they will get the same results.
Is this the beau ideal for critical thinking and analysis? Doubtful.
Just because your analysis clears these hurdles does not mean it’s any good. And no piece of the kit will allow you to sidestep the part where you do actual “thinking.” But it’s a great start for those making the leap from research to analysis. And for everybody else, it will make you a better everyday skeptic. In the meantime, question everything.
This article is inspired by www.theg2.com, research training that focuses on getting information about, around, from, and through people.