Are You Certain?

Howard Gross
Communicating Complexity
8 min read · Jan 10, 2024


Most people have a hard time with uncertainty. That is because, of all the highly complex systems in the world, few are more hardwired than the human brain. Humans have been programmed for millennia to try to control their environment, but the unpredictability of uncertain circumstances makes that extremely difficult. When faced with not knowing what will happen in the future, the mind perceives such ambiguity as a threat, inducing both psychological and physiological stress. That is unfortunate, since there is a lot of uncertainty going around these days, with more sure to come.

Uncertainty is an unwelcome by-product of complexity, which is neither predictable nor controllable. Unlike risk, where possible outcomes are known and their probabilities can be discerned given appropriate time, thought, and effort, uncertainty is often indeterminable. There is simply no way of knowing, and the impact can be overwhelming.

In its 2021–2022 Human Development Report, the United Nations Development Program described the emergence of an “uncertainty complex” unlike anything seen in history. According to the organization, more than six in seven people worldwide are insecure; and insecure people are less trusting. Only 30 percent of the adult population believe others can be trusted — the lowest score ever recorded.

The reasons for such apprehension are many. More citizens of the world than ever (4.2 billion) will go to the polls in 2024 to choose their leaders; among the contenders is Donald Trump, who may return to the White House for an even more dysfunctional second term. Also in America, the Supreme Court will decide the fate of voting rights, consumer rights, taxation, the Internet, abortion again, and possibly the presidential election. National economies are beset by inflation, inequality, and spiraling sovereign debt. Geopolitical conflicts like those in Ukraine and Gaza have the potential to expand beyond their borders. Both global temperatures and billion-dollar weather-related disasters have surpassed all previous levels, and will continue to rise. And the use of artificial intelligence — good and bad — will accelerate.

Compounding Conundrums

On their own, each of these dilemmas creates uncontrollable challenges that make outcomes tough to predict. Together they constitute what is known as a polycrisis, a term conceived in 1999 by complexity theorists Edgar Morin and Anne Brigitte Kern to describe multiple, compounding predicaments. The World Economic Forum embraced the concept last year, declaring that “economic, global, political, and climate crises were converging to create tensions and a sense of instability to a degree the world had not experienced in some years.” Historian Adam Tooze, a professor at Columbia University and director of its European Institute, adds that the matter is not just a collection of crises but a situation “where the whole is even more dangerous than the sum of the parts.”

Complex Global Challenges

Critics of the notion argue that society has encountered similar circumstances in the past and has always managed to endure. The Greatest Generation, for example, came through back-to-back global depression and world war. Baby Boomers grew up in an era of cold war, civil rights struggles, and the specter of nuclear destruction. Yet while complexity has dramatically increased, the brain’s basic wiring remains pretty much unchanged, responding to uncertainty with many of its most primeval processes. The kind of comprehensive, abstract thinking required in modern times is still relatively new in human evolution.

Accelerating Change

Nowhere is this more evident than in the relationship with technology. Systems and tools have historically advanced at exponential rates, a phenomenon generally referred to as accelerating change. Each technological improvement creates the next generation of technology faster than before, with even faster and more profound changes still to come. Case in point: humanity currently creates an estimated 2.5 quintillion (a 1 followed by 18 zeros) bytes of data every day. At that rate, it would take someone 181 million years to download all of the data on the Internet. By 2025, the amount of daily new data is expected to grow by a factor of more than 180.
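The scale of those figures is easier to grasp with a little arithmetic. The sketch below is illustrative only: the 2.5-quintillion and 180× numbers come from the paragraph above, while the 100 Mb/s download speed is an assumption introduced here for the sake of the calculation.

```python
# Back-of-the-envelope check on the data-growth figures above.
# Assumption: a sustained consumer link of 100 Mb/s (12.5 MB/s).

BYTES_PER_DAY = 2.5e18        # ~2.5 quintillion bytes created daily
GROWTH_FACTOR = 180           # projected increase in daily data by 2025

projected_daily = BYTES_PER_DAY * GROWTH_FACTOR
print(f"Projected daily data: {projected_daily:.1e} bytes")  # 4.5e+20

link_bytes_per_sec = 12.5e6   # 100 Mb/s, an assumed download speed
seconds = BYTES_PER_DAY / link_bytes_per_sec
years = seconds / (3600 * 24 * 365)
print(f"Downloading one day's data at 100 Mb/s: ~{years:,.0f} years")
```

Even a single day’s output, at that assumed speed, would take millennia to download — which is why the 181-million-year figure for the whole Internet is not as fanciful as it sounds.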

But what happens when individuals and organizations can no longer keep pace and uncertainty becomes paralyzing? One option is to delegate that responsibility to increasingly sophisticated lines of ones and zeros.

Artificial Intelligence

Artificial intelligence (AI) is the science of enabling machines to process information and perform tasks as well as, if not better than, humans. There are computers today that can do calculations in a second that would take a person more than 31 billion years. A broad field of study, AI comprises multiple components, the most recognized of which are large language models (LLMs) like ChatGPT and Microsoft’s Copilot that can simulate human conversation. These and other systems present both possibilities and perils, the consequences of which are anything but certain.
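The “31 billion years” comparison checks out under some simple assumptions, sketched below: an exascale machine performing 10^18 operations per second, and a person doing one calculation per second. Both rates are assumptions for illustration, not figures from the text.

```python
# Rough check of the machine-vs-human comparison above.
# Assumptions: machine at 1e18 ops/sec (exascale), person at 1 op/sec.

machine_ops_per_sec = 1e18
human_ops_per_sec = 1.0

seconds_for_human = machine_ops_per_sec / human_ops_per_sec  # 1e18 seconds
years = seconds_for_human / (3600 * 24 * 365)
print(f"A person would need ~{years / 1e9:.1f} billion years")  # ~31.7
```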

Many smart and accomplished people believe AI has the potential to significantly improve productivity, boost scientific research, and augment creativity. Some, however, like venture capitalist and self-proclaimed “techno-optimist” Marc Andreessen, contend these achievements can only be attained by setting AI free to move as fast as possible, unencumbered by formal restrictions. “We are, have been, and will always be the masters of technology, not mastered by technology,” he proclaims. Moreover, “technology must be a violent assault on the forces of the unknown, to force them to bow before man.” Anyone who disagrees is a “liar.”

Many other experts have cautioned against such zealotry. They worry that artificial intelligence can endanger national security, threaten jobs, perpetuate prejudices, seriously subvert the online infrastructure, and far worse. A recent survey of more than 2,700 AI researchers found that 58% of them see at least a 5% chance of human extinction. What is more, over 350 scientists, engineers, and technology leaders — including representatives from OpenAI, Microsoft, and Google — signaled their misgivings in an open letter urging that “mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

They are hardly the only ones concerned. According to polling by the Artificial Intelligence Policy Institute, a majority of American voters (83%) fear AI could cause a “catastrophic” event; and 72% want to slow down future AI development. Eighty-two percent also distrust companies to police themselves, preferring federal regulation by a three-to-one margin. Yet despite the public’s disquiet and doubts, AI executives show no inclination to hit the brakes. To the contrary, spurred on by the arms race nature of the industry and the vast sums of money to be made, it is damn the torpedoes, full speed ahead.

Responding With Resilience

If it is premature, and potentially disastrous, to trust artificial intelligence to manage complex problems, and humans are behind the curve in effectively dealing with uncertainty, what other options are there? One alternative is for people to enhance their capacities to adapt to such disruptions. In other words, to become more resilient.

The late Hans Rosling, a former professor of Global Health at Sweden’s Karolinska Institute, believed resilience is closely tied to cognitive processes, and that people think things are getting worse when, in fact, they are not. After testing thousands of subjects on what they knew about the state of the world, he concluded that most scored lower than chimps would have, because their natural instincts beget a form of ignorance he termed “outdated knowledge.”

Humans have evolved to conserve cognitive resources. Accordingly, it is more expedient to think in automatic modes like heuristics and generalizations, mental shortcuts that enable instant judgments. At best these can be overly simplistic, and at worst seriously inaccurate. Exacerbating the situation are digital algorithms that continually feed people information and ideas they already know and believe.

Updating knowledge, on the other hand, takes far more mental effort. But adapting to an ever-changing information landscape by understanding biases, questioning agendas, and learning to distinguish credible from unreliable sources is a more thoughtful way to navigate uncertainty.

Fighting Fear

Fear is also an instinct Rosling believed engenders ignorance. When people encounter the unknown, they naturally assume the worst-case scenario; and when information is not available, they fill in the blanks with bad tidings. Imagining doom is an ancestral state of mind. Eschatology is an age-old doctrine among the Christian, Jewish, and Muslim faiths that ponders the end times. A more radical dogma, known as apocalypticism, foretells the imminent demise of humankind.

Of course, people can just read the news. News can influence attitudes through multiple mechanisms that interface with cognitive biases, more readily steering focus toward what is wrong than what is right. Research has found that exposure to negative headlines — whether through traditional news outlets or social media — can undercut mental health. “Fears that once helped keep our ancestors alive,” noted Rosling, “today help keep journalists employed.”

If fear is an immediate and irrational reaction to uncertainty, reflection is a more practical response. Complex causes may be uncontrollable, but they do change over time. Ari Wallach is a futurist and the founder of Longpath Labs, an initiative dedicated to promoting long-term thinking. He reckons civilization is currently deep in a moment of continuous change, which he defines as an “intertidal period.” In nature, an intertidal zone is where the ocean meets the land. Located along marine coastlines, it is subject to constant and sometimes extreme variations as tides rise and fall.

Societies, too, experience ebbs and flows that demand cognitive flexibility and new mental models. The same can also be said of crises. Though often perceived as negative events, they may present opportunities for resilience by exposing underlying problems, thus triggering a reevaluation of priorities, strategies, and assumptions to provide a clearer picture of what needs to be fixed or improved.

Uncertainty is a fact of life, and forecasting what lies ahead is the province of futurists, pundits, and pollsters. If history is any indication, most will be wrong. For its part, OpenAI’s ChatGPT is more ambivalent than its human counterparts, explaining that any prognosis “involves assessing various global factors and trends, some of which are predictable, while others are less so.” Regardless, it is likely that this year will be no more certain than last, and possibly far less. Whatever the prospect, the human race will have to find a way to deal with it. “For all of its uncertainty,” observed the late Texas congresswoman Barbara Jordan, “we cannot flee the future.”
