When I look back at the first quarter-century of my life in school, it’s hard not to be astounded at the sheer range of subjects I was taught: grammar, chemistry, algebra, European history, postmodern literary theory, film studies, and countless others. We all have a similar list, with some variation at the margins. But contemplating all those courses a quarter-century later, as a 50-year-old, what really strikes me is what was missing from the list.
In all those years at school, not once did I take a class that taught me how to make a complex decision, despite the fact that the ability to make informed and creative decisions is a skill that applies to every aspect of our lives: our work environments; our domestic roles as parents or family members; our civic lives as voters, activists, or elected officials; and our economic existence managing our monthly budget or planning for retirement.
I’m not the sort of person who gripes about all the useless trivia I learned in school; I’ve made a career out of finding meaning in the obscure realms of different disciplines. But I wish at least some of that time in the classroom had been dedicated to the art of deciding. This strange omission has become something of an obsession for me, because for the past eight years I’ve been researching and writing a new book about complex, long-term decision-making called Farsighted. The book is a survey of an emerging multidisciplinary field of research that has given us a great deal of new information about how to make better choices at important crossroads in our lives, choices that may have consequences that reverberate for years, even decades: whether to change careers, or buy a house in the suburbs and leave the city, or launch a new product in a new market.
Most of the important research in this emerging field has been conducted on small- to medium-sized group decisions: a group of military advisers weighing different options for an invasion; a community board trying to decide on the proper guidelines for development in a gentrifying neighborhood; a jury determining the guilt or innocence of a fellow citizen. For good reason, these sorts of decisions are formally described as “deliberative” decisions. When we first encounter the accused burglar at a jury trial, we may well have an instinctive sense of his or her guilt or innocence, formed from a quick read of demeanor or facial expression, or from our own preexisting attitudes toward crime and law enforcement. But systems engineered to promote deliberative decision-making are specifically designed to keep us from falling back on those snap assessments, precisely because they are not likely to steer us toward the correct decision. We need time to deliberate, to weigh the options, to listen to different points of view before we render a judgment.
A course on decision-making need not rely exclusively on social psychology experiments to cultivate decision-making skills. Recent history abounds with case studies where complex decisions were made by groups of people who consciously adopted strategies and routines designed to produce more farsighted results. Take, for instance, the sophisticated decision process that led to the raid on Osama bin Laden’s compound in Pakistan. We have a lot to learn from studying decisions like that, because we can apply those techniques to our own choices and use that knowledge to evaluate the decision-making skills of our leaders and colleagues and peers.
You almost never hear a political debate — or a shareholder meeting — where one of the candidates or executives is asked how he or she goes about making a decision, but in the end, there may be no more valuable skill for someone in any kind of leadership position. Courage, charisma, intelligence — all the usual attributes we judge when we consider voting for someone pale in comparison to the one fundamental question: Will he or she make good choices when confronted with a complex situation? Intelligence or confidence or intuition can take us only so far when we reach one of those difficult crossroads. In a sense, individual attributes are not sufficient. What a “decider” — to use George W. Bush’s much-mocked term — needs in those circumstances is not a talent for decision-making. Instead, what he or she needs is a technique — a specific set of steps for confronting the problem, exploring its unique properties, weighing the options. That technique is something that can be taught.
It is true that the brain science and philosophical implications behind the way we decide will regularly appear on the syllabi of Cognitive Science or Psych 101, or in philosophy electives on, say, the utilitarians. And business schools regularly feature entire courses on the topic, most of them focused on administrative or executive decisions. But you will almost never find a required course dedicated to the subject in even the most progressive high school. Are there more important skills than the ability to make hard choices? I can think of a few rivals: creativity, empathy, resilience. But surely complex decision-making has to rank near the top of the list. It is at the very heart of what we mean when we use words like “wisdom.” So why isn’t it an anchor tenant in our schools?
The nice thing about a field like decision science or decision theory — or whatever name you want to give to it — is that the field is a sort of intellectual chameleon: It plays well in a highbrow context and in a pragmatic one. There’s a deep well of philosophical literature and a growing body of neuroscience research that wrestle with the problem, but it’s also a problem with immediate practical utility for everyone. Who doesn’t want to make better choices?
There’s a pedagogical argument for this approach as well. Framing an entire course around the farsighted decision actually has the potential to light up interest in other fields that can sometimes seem dry when they are quarantined off in their traditional disciplinary silos. One of the topics I cover is the discovery of the brain’s “default network,” the regions of the brain that are active during what we colloquially call daydreaming, where the mind wanders off and imagines future scenarios and outcomes. Now, the default network might come up as a sidebar in a sophomore biology survey during the unit on neurology. In that context, it’s just another set of facts to memorize: Today, it’s the default network; tomorrow, we’re covering neurotransmitters; next week, we move on to the amygdala. But put the default network in a class that’s explicitly designed to teach you how to make better decisions, and suddenly the whole idea of daydreaming as a cognitively rich activity takes on new relevance. You don’t have to be planning a career as a brain surgeon to find it useful to learn about this strange superpower that turns out to be central to our uniquely human ability to imagine long-term outcomes from our actions.
What fields would such a syllabus incorporate? Certainly it would involve the study of history, moral philosophy, behavioral economics, probability, neurology, computer science, and literature. The course itself would be a case study in the power of diverse perspectives. But beyond the multidisciplinary sweep, students would learn a series of techniques that they could then apply to their own lives and careers: how to build a nuanced map of a complex decision; how to design a scenario plan and a premortem; how to build a weighted values model (a kind of updated version of the classic pro-and-con list). They’d learn the importance of sharing private information when deliberating in diverse groups and the value of measuring uncertainty. They’d learn to seek out undiscovered options and to avoid the tendency to fall back on narrow, initial assessments. They’d learn the importance of being other-minded in making a hard choice and how reading great literature can help enhance that faculty.
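To make the weighted values model concrete: it is, at heart, a pro-and-con list with arithmetic. You weight each factor by how much it matters to you, score each option on each factor, and compare the weighted totals. The following is a minimal sketch; the factors, weights, and scores are invented for illustration (loosely echoing the city-versus-suburbs choice mentioned earlier), not drawn from any formal decision-science curriculum.

```python
# Weights reflect how much each factor matters (here they sum to 1.0).
weights = {"cost": 0.5, "commute": 0.3, "space": 0.2}

# Score each option on every factor, from 1 (poor) to 10 (excellent).
options = {
    "stay in the city":    {"cost": 3, "commute": 9, "space": 2},
    "move to the suburbs": {"cost": 7, "commute": 4, "space": 9},
}

def weighted_score(scores):
    """An option's total is the weighted sum of its factor scores."""
    return sum(weights[factor] * score for factor, score in scores.items())

for name, scores in options.items():
    print(f"{name}: {weighted_score(scores):.1f}")
```

The value of the exercise lies less in the final number than in what it forces you to articulate: which factors you actually care about, and by how much.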
The other case for bringing decision-making into the classroom is that it provides a valuable bridge between the sciences and the humanities. When you read philosophy in the context of the promise and peril of future A.I. algorithms that may augment human decisions, you can see immediately how seemingly abstract ideas about logic and ethics can have material effects on our technological future. When you read literature as an exercise in improving our ability to make farsighted decisions, you can appreciate the way novels mirror the scientific insights that arise from experimental studies, in their shared reliance on the power of simulation to expand our perspectives, challenge our assumptions, and propose new possibilities.
But the most important argument for instating decision-making as a required course for high school students is this: No matter what you do in your life, no matter what career path you take, the ability to make the right choice when it really matters is a skill that will serve you well for the entirety of your adult life. No doubt there are a thousand electives out there — in high schools and college humanities programs, not to mention business schools — that dabble in some of these decision-making themes. It’s time we brought them into the core.