Don’t believe everything you read.

Quantum Thought

Earning your opinion through Bayesian reasoning

Feb 20, 2018 · 5 min read


A democracy only functions with a well-educated electorate, and those who are misinformed, or too intellectually lazy to find out the truth, are therefore dangerous to it. The misinformed aren’t “bad” or “stupid.” But their conviction that they have a right to an opinion rests on a fundamental flaw.

Not everyone is allowed an opinion. Opinions are earned.

First off, we need to accept that we can’t blindly rely on others for all of our opinions. Some people are smarter or more reliable than others about certain things, but that doesn’t make them smarter than everyone about all things. Overconfidence leads experts to make proclamations on matters far from their field of expertise: a Nobel Prize-winning chemist who becomes convinced that vitamin C is a wonder drug, a pediatrician from Australia who becomes a global spokesperson on Soviet foreign policy, or an atomic physicist who goes on TV to tell people that CO2 is good for the environment.

William Happer thinks more CO2 will be good for the planet. He compares the claims against carbon dioxide to the “demonization of Jews under Hitler.” Happer has never even published a scientific paper on climate science.

It’s deeply important to carefully consider and research a subject before landing on an opinion, because humans have many cognitive biases that make misconceptions and falsehoods difficult to overcome. People avoid cognitive dissonance, for one, by selectively learning facts that confirm their existing worldview, which is easier than ever in the age of the internet, where you can always find at least one other person who agrees with you. Related is the “anchoring effect,” the tendency to rely too heavily on the first piece of information acquired on a subject, which leads people to form a nascent opinion early on that’s difficult to correct later. And being “smarter” doesn’t make you resilient to these biases. If anything, higher IQ makes them worse, because smarter people are better able to narrate themselves out of inconsistencies.

We can’t necessarily trust our sources, and we can’t necessarily trust ourselves. So, how should we decide between fact and fiction in deciding our opinions?

I think the solution is to apply the scientific method to all subjects. You need to constantly be weighing probabilities and testing hypotheses. This is something Nick Szabo calls “quantum thought.”

“Totalitarian thought asks us to consider only one hypothesis at a time. By contrast, quantum thought demands that we simultaneously consider often mutually contradictory possibilities. I can be both for and against a proposition because I am entertaining at least two possible but inconsistent hypotheses. If you are unable or unwilling to think in such a manner, then either you’re not doing something cutting edge or you’re not being intellectually honest with yourself.”
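One way to picture this is as a weighted set of mutually contradictory hypotheses, rather than a single committed belief. Here’s a toy sketch in Python (my framing of the idea, not Szabo’s own formulation):

```python
# Toy model of "quantum thought": hold contradictory hypotheses
# simultaneously, each weighted by how probable you currently think it is.
beliefs = {"proposition is true": 0.6, "proposition is false": 0.4}

def reweight(beliefs, likelihoods):
    """Shift weight between hypotheses as new evidence arrives."""
    unnormalized = {h: p * likelihoods[h] for h, p in beliefs.items()}
    total = sum(unnormalized.values())
    return {h: w / total for h, w in unnormalized.items()}

# Suppose new evidence is twice as likely if the proposition is false.
beliefs = reweight(beliefs, {"proposition is true": 0.3,
                             "proposition is false": 0.6})
print(beliefs)  # weight shifts toward "false", but neither view is discarded
```

The point of the sketch: you’re never forced to commit to one hypothesis and abandon the other; you just keep redistributing weight as evidence comes in.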

Science doesn’t produce absolutes; everything is probabilistic.

This is also known more colloquially as the Bayesian approach: we weigh our prior knowledge along with new data, and we represent our belief as a probability distribution that quantifies our uncertainty about the world (e.g., “I’m 75% confident that this WSJ article is reliable, because the WSJ has been reliable in the past; however, this article has not been properly cited”). This scientific process is heavily exploited by pundits to sow doubt about any logical conclusion. But that doesn’t mean the process is wrong, or that you should abandon this kind of critical thinking.

Trust the process.

Bayes’ Law. Welcome back to Statistics 101, folks.
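To make the WSJ example concrete, here’s a minimal Bayesian update in Python. The likelihood numbers are invented purely for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Prior: 75% confident the article is reliable.
# Evidence: it isn't properly cited. Assume (hypothetically) that
# reliable articles omit citations 20% of the time, unreliable ones 60%.
posterior = bayes_update(0.75, 0.20, 0.60)
print(f"{posterior:.2f}")  # → 0.50: confidence drops, but not to zero
```

Note that the evidence moves your confidence without flipping it to certainty either way; that’s the whole point.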

I think you should reconsider beliefs when you realize that they’re really only opinions, and you should be particularly wary of beliefs you’ve developed on subjects you don’t know much about. Unfortunately, when people are asked for their opinions on topics outside their competence, most feel compelled to take a stance, and this is the most dangerous position to be in. Once you’ve aligned yourself with a belief, the consistency bias won’t let you backtrack: you’ve taken a stand on your opinions, and it’s difficult to contradict yourself, both publicly and privately. With the prevalence of social media, with everyone “liking” pages, “following” celebrities, and exposing themselves to propaganda from non-experts, our society has segregated itself into curated clusters of subjective opinions.

“[The consistency bias] scares me enough that I am rarely willing to sign a petition anymore, even for a position I support. Such an action has the potential to influence not only my future behavior but also my self-image in ways I may not want.” — Robert Cialdini

Daniel Patrick Moynihan said, “Everyone is entitled to his own opinion, but not his own facts.” I disagree with Moynihan. People aren’t entitled to opinions. Opinions are earned, and they should belong only to those who’ve carefully considered and evaluated the verity of their hypotheses.

We need to hold ourselves, and each other, to a higher standard. Our democracy depends on it.

I’ve written over 35,000 words about 20 topics in energy and environment — check them out if you’re looking to learn about the sector. See my Table of Contents for an index of everything I’ve written about so far.

Speaking of energy, help fuel my coffee addiction!

