The Mind That Watches Itself

Five rules for avoiding echo-chambers and cultivating intellectual growth

Jason Liggi
Small Business Forum

--

What does it mean to “know” something?

To many, this question may seem to have a straightforward answer, but it’s one of those questions that resists being nailed down. Consider an example known as a “Gettier problem”:

“Alice sees a clock that reads two o’clock, and believes that the time is two o’clock. It is in fact two o’clock. There’s a problem, however: unknown to Alice, the clock she’s looking at stopped twelve hours ago.”

Does Alice know that the time is two o’clock? The concept of knowledge itself is slippery, and this is just the tip of the iceberg.

Eliezer Yudkowsky, rationalist and artificial intelligence theorist, characterises the problem of belief and knowledge as a distinction between “the map” (belief) and “the territory” (reality). His repeated refrain, “the map is not the territory”, captures an important truth: all we have is access to our own map. We never have direct access to reality; we are always dealing with some representation of it. Immanuel Kant drew this distinction too, differentiating between things as they exist out there in reality (things-in-themselves) and things as they appear to a conscious observer (phenomena).

Our conscious experience, generated (mostly) by our brains via the processing of data received from our senses, is a map of a reality that exists “out there”, permanently beyond our reach. When you look at a mug of coffee on a table, the representation of that mug in your conscious experience is built from your sensory data and memories/concepts already in your mind. This data consists of things like: the wavelengths of light that hit your retina activating particular cones/rods, the molecules in the air that react with cells in your nostrils, the mind’s internal conceptual understanding of what a mug is, and your memories of having seen / smelled / tasted coffee on previous occasions.

Much like in The Matrix, if I could pump the right data into your sensory organs, I’d have the ability to shape aspects of your conscious experience. I could make you see a giant floating purple strawberry directly in front of you, as clearly as you saw the coffee mug.

So what does this mean? We only have access to our internal representation of a forever inaccessible outer reality… so what?

The important part is this: all we have is a map. Our subjective experience is our map of reality, and no map perfectly captures the territory it represents. No map of London could show the location of every pothole in every road. This would be impractical, and a map is, by definition, an imperfect representation.

Optical illusions are a testament to the inaccuracy of your map, and they happen because the hardware used to generate it has certain limitations. There is also a very long list of known cognitive illusions.

When it comes to our maps, as with city maps, much may be missing, aspects may be exaggerated or understated, some parts might just be plain wrong. When it comes to city maps, if we wanted to check the accuracy, we’d simply go to the location that the map represents and compare reality against what we had… but we can’t do that with our subjective experience. As we discussed above, we have zero access to reality as it really is.

When we talk about whether or not our beliefs are “true”, we are talking about the accuracy of our map in relation to reality as it really is.

Here we come to the root of our problem: how do we improve the accuracy of our map, if the only way we can navigate reality is by relying on it?

(1) Practice intellectual humility

Our knowledge falls into four categories: the known knowns, the known unknowns, the unknown knowns, and the unknown unknowns.

(Image source: http://www.ministryoftesting.com/2015/03/not-sure-about-uncertainty)

It’s important to always remember that the vast, vast majority of what there is to be known falls into that fourth category: the unknown unknowns, the things that we are neither aware of nor understand. Our map is missing far more than it captures, and even the known knowns may not be accurate (and often we discover they aren’t); a “known known” is simply something that we think we are aware of, and that we think we understand.

The greatest scientific discovery was the discovery of ignorance. Once humans realised how little they knew about the world, they suddenly had a very good reason to seek new knowledge, which opened up the scientific road to progress. — Yuval Noah Harari, Homo Deus

The first step to growing your knowledge (increasing the accuracy of your map) is admitting that you are mostly ignorant. You have far more to learn than you have already learned. This remains true for even the smartest, most well-educated person. In fact, intelligence can often act as an obstacle to intellectual growth if it results in a less self-reflective, contemplative attitude towards your own mind (which it often does).

The real source of our theories is conjecture, and the real source of our knowledge is conjecture alternating with criticism. […] The role of experiment is to choose between existing theories, not to be the source of new ones.
— David Deutsch, The Beginning of Infinity

According to Deutsch, the growth of knowledge occurs as a result of a repeating process of conjecture and criticism. One of the most fascinating things about this process is that it doesn’t require any sort of “mind” or consciousness to occur. Take evolution by natural selection, for example: the conjecture is the random mutation to genetic code that occurs occasionally during the copying process, and the criticism is natural selection: whether or not the organism survives to reproduce once again. Successful genes encode knowledge about the best way to build an organism that will survive and reproduce.

That’s great for science, and it’s great for natural selection, but how do we apply this to the growth of knowledge on an individual level?

(2) Seek out and understand those you disagree with and/or dislike

Your worldview is the collection of all of the individual beliefs that you hold. These vary in size and significance, ranging from the inconsequential (whether or not you think you left your oven on this morning) to the fundamental and foundational (whether or not you believe that God exists and takes an active interest in your life).

As we discussed above, a map is, by definition, an imperfect representation of reality. Unless you think that all of your beliefs are 100% true, and that you are not, never have been, and never will be mistaken about a single thing, you are tacitly admitting that lurking in your worldview are beliefs that are incorrect.

Every single one of us is currently wrong about something, probably many things. The problem is: how do we know which beliefs are accurate, and which are inaccurate?

As human beings, our nature is not to subject our own beliefs to constant critique. In fact, we much prefer the opposite. Often, criticising someone’s deeply-held belief can lead to the backfire effect, where a person presented with irrefutable evidence against their position instead entrenches themselves further in their pre-existing belief. At this point, I’m sure you’re thinking that this bias wouldn’t affect you, and so I’ll point you to the bias blind spot: another cognitive bias, in which we readily recognise the impact of biases on the judgement of others but not on our own. In fact, even when told specifically which biases are likely to be affecting them, people are still unable to counteract their effects.

The causes of our increasing political polarisation may lurk somewhere between these biases and our growing propensity to create echo-chambers for ourselves via social media (which has sky-rocketed in popularity over the last ten years). We follow and listen to those we already agree with, and we block or mute those we don’t, until essentially all we’re doing is validating our own opinions and beliefs over and over again. We see the same pattern in “campus culture” at universities, where some believe that trigger warnings and safe spaces are doing huge damage to our education systems, all in the name of avoiding ideas and discussions that make us uncomfortable and challenge our preconceived notions.

The solution to these issues on a personal level should be obvious. You must seek out and understand beliefs and opinions that oppose yours. It cannot be said more plainly: those that you disagree with are your most important allies when it comes to intellectual growth.

If all the news that you read, and all the people you follow on Twitter, all subscribe to the same belief system, be it religious, political, or anything else, then you are making a conscious decision to stop growing and to stay where you are intellectually.

You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.” — Daniel Dennett, Intuition Pumps and Other Tools for Thinking

It’s not as simple as reading or listening to the arguments of the opposing side; you have to make a concerted effort to really understand those arguments yourself. You need to “steel man” them (the opposite of the “straw man”, in which you caricature your opponent’s argument and respond to that).

In studying a philosopher, the right attitude is neither reverence nor contempt, but first a kind of hypothetical sympathy, until it is possible to know what it feels like to believe in his theories, and only then a revival of the critical attitude, which should resemble, as far as possible, the state of mind of a person abandoning opinions which he has hitherto held.
— Bertrand Russell, A History of Western Philosophy

(3) Don’t form your beliefs on the basis of how many likes / retweets they’d get

This may seem like an obvious point, but we shouldn’t underestimate the power of the human desire for conformity, and our deep-seated need to belong to a community of like-minded individuals.

Research has shown that adolescents respond to “Likes” on social media with activity in regions of the brain associated with reward circuitry, including the nucleus accumbens. This region of the brain is also implicated in addiction, and in the processes that lead to addiction. We are also more likely to “Like” risky photographs (sexually suggestive, or containing things like alcohol and cigarettes) when other people have already “Liked” them, and less likely when they haven’t.

Lonely dissent doesn’t feel like going to school dressed in black. It feels like going to school wearing a clown suit. — Eliezer Yudkowsky, Lonely Dissent

This desire to conform makes sense if you consider our tribal past. In fact, Deutsch speculates in The Beginning of Infinity that the driving force behind the evolution and origin of human creativity may have been to better equip us to understand and mimic the existing culture. We became really creative so that we could be the best at being just like everyone else. Being the outlier, the “odd one out”, in nascent human societies was the equivalent of a death sentence.

Here again, we face a problem: how do you tell when you’ve formed a belief just to conform?

Very few people (I’d be tempted to say none) deliberate about their position on a subject and consciously decide to conform to the view that they know their friends and peers will approve of. It’s more of a subconscious, emotional pull that manifests as a bias towards a particular side, one that you don’t even notice. And, as we discussed earlier, we have a blind spot for our own biases.

This is part of the reason that seeking out opinions that oppose your existing ones is so important. It forces you to find justifications for your own positions, and to reflect on why you believe what you believe. You can’t defend a position if one of the primary reasons you hold it is that other people will approve (even if you’re not aware that’s what you’ve done). Of course, people can start at a position because they want others to approve, and then read, research, and learn all of the responses to all of the arguments against that position for the sole purpose of maintaining it, but all they are doing in that situation is digging a trench.

Don’t take refuge in the false security of consensus, and the feeling that whatever you think you’re bound to be okay, because you’re in the safely moral majority.
— Christopher Hitchens

(4) Foster compassion for, and attempt to empathise with, everyone, especially those you disagree with

During the Stone Age, humans organised themselves into small, egalitarian hunter-gatherer societies. This was how all human beings originally lived. It wasn’t a conscious decision; it’s in our genes. It hasn’t left us, and it never will.

Originally, our tribes were most likely an accident of our geography. We coalesced with other nearby human beings via social relationships and family ties. At most, we probably knew and interacted with approximately one hundred and fifty other human beings. Our identity was tied up with the tribe that we belonged to and defined in opposition to other tribes.

The Robbers Cave Experiment bears this out: we find ways to form ourselves into tribes and delineate our differences even when we share almost everything with the out-group. Despite having nearly identical backgrounds, two groups of twelve-year-old boys formed themselves into distinct tribes, defined themselves by distinctive characteristics, and looked negatively upon the opposing group.

There were occasional references to the Eagles as “sissies,” “cowards,” “little babies,” etc. Upon returning to camp for supper, the Rattlers made it clear that they did not want to eat with the Eagles, who as it happened were not there. — Chapter 5, The Robbers Cave Experiment

In the modern era, we often form tribes based around shared morality / values, and as Jonathan Haidt explains in The Righteous Mind: “Morality binds and blinds”. We can often become blinded to the views and opinions of those we consider (explicitly or implicitly) the “out-group”.

The question here becomes: how can we avoid a tribal mindset? How can we avoid being blinded to the opinions of the “out-group”, and how can we prevent people that consider us to be the “out-group” from being blinded to our views / opinions?

The stories of once bigoted and insular people like Megan Phelps-Roper, ex-member of the Westboro Baptist Church, or former white nationalist Derek Black, show us that honest engagement and compassion are often what truly encourage people to re-examine their beliefs.

…supporters of superstition and pseudoscience are human beings with real feelings, who, like the skeptics, are trying to figure out how the world works and what our role in it might be. […] If their culture has not given them all the tools they need to pursue this great quest, let us temper our criticism with kindness. None of us comes fully equipped.
— Carl Sagan, The Demon-Haunted World

To be clear, this doesn’t entail being sympathetic towards all beliefs, or giving a voice to those with views we consider dangerous or bigoted. The important part is attempting to understand why people hold particular beliefs.

Understanding why the Nazis believed what they did is a key tool in preventing anything like the Holocaust from ever happening again. The worst atrocities in history weren’t simply caused by bad men, they were caused by men with fervent beliefs in dangerous ideologies.

Avoiding a blinkered tribal mindset requires fostering empathy, compassion and understanding towards members of “out-groups”. If we can learn to instinctively attempt to see things from the perspectives of those that we disagree with, we remain open to ideas / avenues of thought that would otherwise be closed.

(5) Re-evaluate your core beliefs on a regular basis

Being open and willing to think critically about the opinions / ideas of all human beings is vital, but it’s only one piece of the puzzle. Even a person who achieves this monumental task perfectly will only be overseeing the creation of a worldview built from the conjectures of others.

Every intellectual revolution in human history required one enterprising individual to search amongst our shared beliefs, find a hidden assumption, and subject it to criticism.

Copernicus questioned the mistaken assumption that the Earth was the centre of the Universe. Darwin, the assumption that life had been created in its current form. Einstein, that time, space and motion were absolute. Some future intellectual pioneer will question another assumption that we collectively hold, and evidence will show that he or she is correct and that we are deeply mistaken.

An intellectual is someone whose mind watches itself. I like this, because I am happy to be both halves, the watcher and the watched.
— Albert Camus

There are undiscovered incorrect assumptions lurking in your current worldview, and the hardest ones to find are those that seem sacred or fundamental. Intellectual growth is the process of identifying and correcting them. Your mind needs to learn to watch itself.
