What does it mean to change your mind? If we’re talking about something that truly matters to you, being wrong is a kind of moral shock, a realignment of personality and purpose. It’s a kind of life experience akin to other great shocks. Some of us are one kind of person before we have children and another kind of person afterward — not all at once, not overnight, but as our experience of parenthood seeps into old assumptions and alters them. Similarly, a realignment of reasoning is never brought about by reason alone. It requires us to admit not only that someone else’s stance is reasonable, but also that those reasons have a claim upon us — that we are willing to be changed by them.
Another way of looking at this is to consider the truism that if you were me, you would believe what I believe. If it’s true of us and others, it must also be true of our own past and present. Whoever you were when you believed something that you now disagree with — or when you felt something that you no longer feel — the old you is now lost to time. And it’s in our relationship with these past selves that I think we can find a model for thinking more usefully about thinking.
How do you feel toward your past? If you’re like me, I imagine you feel a mixture of sympathy, pride, anger, and bewilderment. You may now have a greater understanding of why you did many things than you had at the time, or at least you may believe that you do. But because you remember how it felt to be you and why you felt the need to do the things you did, you probably also know the futility of most of the advice you might wish to send backwards in time.
The lesson I take from this is that empathy is a more useful starting point than judgement.
This doesn’t mean dissolving all discussion into relativism. Rather, it means seeking common ground in lived experience — and tracing the limitations of our self-perceptions.
Transforming the World into Questions
Human beings don’t deal in neutral information. We exist inside our own minds and theories, glimpsing our shared reality only through the lens of individual experience. We cannot possibly take in all the information around us, understand everything, or spend our time considering all possibilities and perspectives.
Our conscious awareness is thus highly selective and geared toward the behaviors that enabled small groups of humans to cooperate around common causes across hundreds of thousands of years of evolution. We prefer speed and simplicity to slowness and complexity; we are most influenced by the immediate and the local; we tend to see things in terms of patterns and narratives, and these patterns and narratives reflect what we already know; we extend these patterns into our accounts of the past and the future.
Do you trust this person or not? Do you take a risk in this situation? What do you enjoy, and why? Feelings flush our bodies and brains before we are consciously aware of what is going on, allowing us the possibility of decision and preference in the first place. In psychological terms, these emotional reactions inform a kind of mental shortcut or rule of thumb, allowing us to make decisions without expending too much time or energy on deliberation. Shortcuts like this are known as heuristics.
These heuristics replace a complex question with one that’s amenable to a quick, simple, and instinctual solution. Instead of evaluating the relative merits of every single dish on an extensive menu, we go with what feels familiar and appealing. Instead of laboriously weighing the merits of different political candidates, we go with a feeling based on easily recalled scraps of media coverage. In both cases, we’re vulnerable to manipulation and error: harmlessly, in the first instance, but with potentially more profound consequences in the second.
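The substitution at work here can be sketched in code. Below is a toy illustration, with invented dish names, attributes, and weights: the "exhaustive" chooser weighs every attribute of every option, while the heuristic chooser swaps in the simpler question "what feels familiar?" and consults a single attribute.

```python
# Toy menu: every name, attribute, and number here is invented for illustration.
MENU = {
    "margherita pizza": {"familiarity": 0.9, "novelty": 0.2, "price": 12},
    "sea urchin linguine": {"familiarity": 0.2, "novelty": 0.9, "price": 24},
    "caesar salad": {"familiarity": 0.8, "novelty": 0.1, "price": 9},
}

def exhaustive_choice(menu, weights):
    """Slow path: score every dish on every weighted attribute."""
    def score(attrs):
        return sum(weights[k] * attrs[k] for k in weights)
    return max(menu, key=lambda dish: score(menu[dish]))

def heuristic_choice(menu):
    """Fast path: replace the hard question with 'what feels familiar?'"""
    return max(menu, key=lambda dish: menu[dish]["familiarity"])

print(heuristic_choice(MENU))  # picks the most familiar dish: margherita pizza
```

The heuristic is cheap and usually serviceable, but it answers a different question than the one posed, which is exactly where the vulnerability to manipulation and error enters: whoever controls what feels familiar controls the choice.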
What I take from this is that our decisions matter much less than our options — and that what matters most is how we transform the world into questions in the first place. What’s most interesting about a menu is not what people choose, but what the process of designing a meaningful menu entails. By the time the casting of votes begins, the election result will be what it will be; it’s everything leading up to that point that matters.
Similarly, when you’re building a tool or a system, you need to debate the assumptions and implications of what you are doing at the design stage — and to set yourself tests that are possible to fail. As entrepreneurs are fond of muttering, failure teaches us how to plan better the next time, while planning improves the quality of our failures. But this is only a virtuous cycle if we can both commit to a plan of action and face up to its consequences. Time and experience are the best tutors.
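What a "test that is possible to fail" means can be made concrete. Here is a hypothetical illustration (the `normalize` function and its assumptions are invented for the example): the first assertion passes no matter what the code does, so it can teach us nothing; the second encodes a real, falsifiable assumption about the design.

```python
def normalize(scores):
    """Scale a list of numbers so they sum to 1."""
    total = sum(scores)
    return [s / total for s in scores]

# Vacuous: this cannot fail, so it tests nothing about our design.
assert normalize([1, 3]) is not None

# Falsifiable: encodes the design assumption that outputs sum to 1.
result = normalize([1, 3])
assert abs(sum(result) - 1.0) < 1e-9
assert result == [0.25, 0.75]
```

A test worth writing is one whose failure would force us to face a consequence of our assumptions, which is precisely the commitment the paragraph above describes.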
A Well-Stocked Box of Tools
Here’s one of the most useful generic pieces of advice I carry around in my head:
Don’t make the perfect the enemy of the good.
I’m not sure where I first heard this, but it captures the everyday truth that obsessing over perfection is a seductive excuse for both failing to start something and refusing to finish it. It also captures a philosophical point that bears repeating: The quest for pristine, logical perfection is often not only unachievable but also destructive. It takes us away from the world and each other, and along the way helps to give good thinking a bad name.
What should we do instead? Here’s a second piece of advice: “We must remove the rigidity of thought. We must leave freedom for the mind to wander about….”
This time, I do know where it comes from: the physicist Richard Feynman, quoted in James Gleick’s biography Genius. In someone else’s mouth, the words might have been little more than banality. Feynman, however, knew and cared a great deal about the subject of his advice: how children should be taught mathematics and what it means to develop an enquiring mind.
Rather than obsessing over definitions and set methods for solving problems, Feynman believed that students should take a flexible, open-ended approach. If a method produced the right answer, great. If it didn’t, they should try something else, and then something else again, letting practical experience be their guide.
Feynman’s approach to problem-solving relied on having what he referred to as a well-stocked mental “box of tools,” together with an attitude rooted in engineering: a relentless interest in following wherever real-world results led. I love this idea of acquiring mental tools adequate to actuality — or, at least, less inadequate — because it dispenses with the paralysis of perfection. We don’t need to be right about everything, to sum it all up, or to anticipate every possibility in order to think well. Rather, we need to set about becoming less deceived, together.
When I think about what it means to think well, this is where I begin: with a practical project of error reduction, rooted in our shared reality.
Surely You’re Talking Rubbish
The contemporary philosopher Daniel Dennett is sympathetic to Feynman’s view. His 2013 book, Intuition Pumps and Other Tools for Thinking, both echoes Feynman in its title and references him in its introduction, before setting out 77 “tricks of the trade” for better equipping the mind. They range from anti-rhetorical alerts to illustrative thought experiments.
Here, for example, is a neat takedown of absolutism in belief and understanding:
A young child is asked what her father does, and she answers, “Daddy is a doctor.” Does she believe what she says? In one sense, of course, but what would she have to know to really believe it?…If understanding comes in degrees, as this example shows, then belief, which depends on understanding, must come in degrees as well.
In other words, whether someone claims to believe something or not — that their parent is a doctor, that God exists — matters less than what, precisely, their version of this belief entails.
A thinking tool of Dennett’s I often invoke is one of the simplest: Watch out for anyone who introduces something they intend to critique with the word “surely.” “Often,” Dennett writes, “the word ‘surely’ is as good as a blinking light locating a weak point in the argument” — because it attempts to dismiss someone else’s point of view without engaging beyond incredulity. “But surely it is nothing other than a biological fact about people,” begins his example of a guilty sentence — an attack on Dennett’s own theory of consciousness by the author Ned Block.
By asserting something contestable as a “biological fact,” Dennett suggests that Block has committed a category error: Assuming that the point at issue is so self-evident that it requires no evidence or support beyond assertion. This doesn’t mean that it’s invariably illegitimate to take something as self-evident, but rather that the category of genuinely self-evident things is smaller than we like to think. Common sense isn’t always sensible, and facts never come unaccompanied by theory. For this reason, we need to aim at as full a transparency as possible about our key assumptions — to replace each “surely” with an apparatus of explanation.
Another way of putting this is that meaningful engagement between ideas relies on a kind of contract: the agreement that reasons beyond “it’s obvious” must be given for accepting or rejecting what someone says. If we’re unwilling or unable to offer such reasons, our interactions risk becoming a more or less weaponized exchange of assertions: a kind of game in which victory belongs to the most cunning, violent, or stubborn player, independently of the claims made on each side (I bribe or flatter you; you hit me until I give in; my angry mob attacks your angry mob; whoever shouts loudest and longest wins).
When commentators lament the dearth of reasoned debate online or in contemporary politics, it’s this kind of gamesmanship they have in mind — one explicitly uninterested in defending assertions on their own merits.
More generally, it’s for this reason that teaching people how to think well is usually treated as synonymous with teaching people how to be more reasonable. Reason, skepticism, and objectivity are good; unreason, irrationality, and bias are bad. Surely no reasonable person could disagree?
Stop Being So Logical
In fact, I believe this commitment to dissecting the workings of arguments is only the beginning, because it’s here that we risk another category error—that of cutting off the business of logical deduction from experience. Reasoning may allow us to justify why we think something or why someone else ought to agree with us, but this doesn’t mean it can lead us toward a single, ultimate truth. Just as often, it helps us entrench our disagreements while distracting us from mechanisms for moving beyond them.
Consider what happens when a thoroughly reasonable exchange of ideas takes place between two people who profoundly disagree with one another. Does the fact that each person offers detailed and meticulous reasoning in support of their view mean that once they have fully considered each other’s ideas, they will then converge on the most sensible shared conclusion? No. What’s much more likely is that skillful reasoning enables each to offer a rebuttal of the other’s argument — or, at the least, to demur until they can amass further evidence.
The author and philosopher Julian Baggini skewers this neatly in his 2016 book, The Edge of Reason, noting that “when, for instance, an atheist comes across a clever new version of an argument for the existence of God which she cannot refute, she does not say ‘Ah! So now I must believe in God!’ Rather, she says, ‘That’s clever. There must be something wrong with it. Give me time and I’ll find out what that is.’ Similarly, a theist will not lose her belief just because she cannot refute an argument for atheism. Rather, that argument will simply become a challenge to be met in due course.”
Baggini goes on to quote Max Planck’s dictum about the progress of science and its entwining with the lives of those researching it: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
Even in the most abstract of theoretical pursuits, reasoning is more often an extension of the will than a path to change. Two perfectly reasonable lines of thought starting from two different sets of assumptions will end, perfectly reasonably, in two different places. Unless we can grasp how it’s possible to think and feel profoundly differently about what appear to be shared questions — unless we can trace the many chains of belief and understanding behind them — the best we can hope for is tolerance.
Well, So What?
What I’m trying to say is that our thoughts do not really belong to us — not in the individual, analytical sense that most of us assume. They belong to our circumstances, our habits, our history, and our opportunities; to the systems granting or denying us options; to the people we have been and are in the process of becoming. We do not enjoy unfiltered access to our own reasons or reasoning, let alone other people’s, and yet we are able to both make ourselves understood and describe our common circumstances. We cannot do either of these things perfectly, and the final word will never be spoken — but it’s for this very reason that our best hope is to open ourselves to others and to experience.
I promised practical advice for thinking at the start of this essay, and here is some that I’ve come to rely upon. Build habits, build community, gather resources. Don’t even think about doing it all alone. Don’t get hung up on originality, or being the first, or being the best — just try to find a way to be better than you have previously been. Ask what works for you, or what seems to work, and don’t be afraid to mix your methods. Don’t get shackled to your past. Cut your losses, change your mind, allow yourself to be shocked into novelty. Pick and mix from the best that’s out there — the books, the conversations, the encounters — and accept that luck plays a larger role in almost everything than any of us might wish. Embrace serendipities. Don’t believe for a moment that sticking to a plan is an adequate strategy for life or discovery.
At the dawn of Western philosophy, Plato’s Socrates distinguished himself from self-appointed wise men because, unlike them, he knew the limits of his knowledge — “what I do not know I do not think I know.” Ignorance is a fine foundation for philosophy, yet even Plato goes too far. None of us can know what we do not know, not really. The vastness and strangeness of it is too great, too far beyond the frames we hang upon the world.
Here’s one thing I learned from parenthood that surprised me: It’s possible to wake up one morning loving someone else more than yourself — to be prepared to kill for them — and yet to seem unchanged. Don’t be fooled. Reasons are masks as well as explanations: stories we tell about the people we used to be. I have been transformed by experience and by other people more often and entirely than I care to admit — yet it’s only by admitting this that I can start to appreciate how much change remains possible.
Live in awe of both your ignorance and your knowledge. Cherish the absolute unlikeliness of being and the fact that none of it could be any other way.