The unpopular, but more logical side of everything

The Complete Guide to Cognitive Biases

Less formally known as ‘Why People Believe The Bullshit They’re Told’

By now, you probably know that I don’t buy into things easily. I try to look into them, and a lot of the time, I spot the rubbish lying beneath. I’ve talked about brain gym, explaining why it’s a load of crap. I have talked about why GMOs aren’t just good, they’re great, and have asserted the value plastic bags bring to the environment (quite proud of that last one).

I’ve talked about why everyone would want feminism (even if you hate women) and have explained how data mining will help you. Big data collection methods like over-surveillance, though, aren’t going to help us spot any terrorists.

I have challenged juvenile detention, revealed the real reason we need prisons (and prisoners), and even refuted the concept of equality (“turned it on its head”, as one commenter put it).

And there is a lot more to come.

But all the while, I’ve wondered why people believe this stuff. Why they are so gullible. Here is the solution to the mystery that’s bugged me for so long.

This is just a part of the entire report. My subscribers get the whole thing for free as soon as they join. Want a copy of your own? Enter your email below!

Why Don’t People Think for Themselves?

There are a number of reasons why people choose to go with the flow, and they vary from person to person. But for the sake of my argument, I have narrowed them down to just three.

The list is not ranked, because ranking it would be impossible. Different practices rely on different factors to varying degrees, depending on how much effort they need to put in to con you.

1) The Placebo Effect

This is exploited so widely, it’s hard to find an instance where it’s not, at least partly, in play.

The Placebo Effect refers to the power of suggestion, especially in cases where medical treatment is involved. It is the driving force behind meditation and one of the main ones behind brain gym. But it doesn’t stop there. Vitamin supplements rely quite heavily on the Placebo Effect too, as Adam Conover explains.

Most people tend to believe hoaxes because a few (influential) people said they work, and everyone else simply chose to believe them. If the practice fails, they assume there’s something wrong with how they are doing it, not with the practice itself (I elaborate on this in my post on meditation).

2) Conformity

In the 1950s, Solomon Asch conducted an experiment in which he asked a group of people in one room to judge the length of a line. All but one of them were stooges, and they collectively agreed on a wrong answer. Now here’s the interesting part. The one true subject (who was unaware that the rest were stooges) conformed to the others’ opinion about a third of the time, despite knowing it was wrong.

3) Sciencey Sounding Terms

This is the main reason people buy into bogus practices.
When we think people are smart, we trust them more. It’s only natural. If you think someone knows what they are talking about, you’ll be inclined to listen to what they say.

Lots of people know this (mostly subconsciously), and people who are trying to sell you something new (not revolutionary new, just plain bad new) exploit it. It’s a fact. If a quack (a fake doctor) or a salesman includes technical terms in their pitch, you are more likely to give in. In fact, a whole study has been done on it!

In March 2008, the Journal of Cognitive Neuroscience published a series of experiments that elegantly demonstrated that people will buy into bogus explanations if they are dressed up in fancy science terms. Most of the time, these terms aren’t even relevant. Subjects were randomly given one of four explanations for various phenomena from the world of psychology. The explanations either contained neuroscientific terms or didn’t, and were either ‘good’ ones or ‘bad’ ones. An example of a bad one would be one that simply restates the phenomenon.

Here is one of the scenarios: “Experiments have shown that people are quite bad at estimating the knowledge of others: if we know the answer to a question about a piece of trivia, we overestimate the extent to which other people will know that answer too.”

For this scenario, the good explanation without neuroscientific terms was: “The researchers claim that this [overestimation] happens because subjects have trouble switching their point of view to consider what someone else might know, mistakenly projecting their own knowledge onto others.”

The explanation with neuroscientific terms was: “Brain scans indicate that this [overestimation] happens because of the frontal lobe brain circuitry known to be involved in self-knowledge. Subjects make more mistakes when they have to judge the knowledge of others. People are much better at judging what they themselves know.”

As I’m sure you can see, there is very little value added to the explanation here (which is quite a crappy one to begin with); the only difference is the addition of sciencey words. But… it performed better.

The experiment was conducted on three different groups: everyday people, neuroscience students, and neuroscience academics. All three groups judged the good explanations as more satisfying than the bad ones, but the two non-expert groups also judged the explanations containing neuroscientific terms as better.


