Why It’s So Hard to Make Sense of a Complex World

Howard Gross
Communicating Complexity
8 min read · Nov 2, 2020


This is the second in an occasional series of pieces on understanding complexity

Stop Making Sense is the title of a 1984 concert film featuring the iconic new wave music group Talking Heads. It can also serve as a motif for much of the thinking in this early part of the 21st century. In a year overwhelmed by a global pandemic, an unusually bellicose election, racial strife, and economic angst, thoughtful reflection is in short supply. This is, in part, the consequence of our innate inability to fully decode complex ideas; and, in part, the work of those inclined to exploit that weakness.

Of all the tools and technologies on which we regularly rely, perhaps none is more hardwired than our own brains. They were initially programmed by our primal ancestors to endure hostile environments, and many of their most essential survival impulses have since become intellectual handicaps. Collectively referred to as cognitive biases, these systematic errors in judgment cloud our impressions of the world, prejudicing how we perceive, process, and act on information. Such flaws also leave us vulnerable to a host of questionable influences.

[Mis]information Overload

There are limits to how much information people can absorb. Still, we are continuously bombarded with far more data than we can possibly comprehend. IBM estimates that humans currently produce some 2.5 quintillion (a one followed by 18 zeros) bytes of digital data daily, with the vast majority created in just the past two years. One outcome of both the speed and scope of this barrage, according to joint findings by several European research institutions, is that our collective attention span — the basis for public opinion — is becoming both shorter and narrower. Additionally, studies out of the University of Birmingham in England have determined that as we gorge on information, we have trouble recalling what we have digested.

To make matters worse, much of this information is intentionally misleading. In addition to managing the Covid-19 pandemic, the World Health Organization (WHO) has had to deal with what it calls an “infodemic”: an overabundance of information amid the crisis, much of it false. The nonprofit organization Avaaz reports that over the past year misinformation networks in at least five countries generated more than three billion views on Facebook. The top ten deceptive “superspreaders” garnered nearly four times the exposure of legitimate sources like WHO and the Centers for Disease Control and Prevention (CDC).

Sadly, many of WHO’s concerns about the impact of such lies have come to pass, including making it difficult to identify accurate health information; breeding indifference and distrust toward health professionals and their messages; and leading people to take potentially dangerous advice.

Even more disconcerting is the fact that some of the designers of deceit reside at the highest levels of authority. Here in the United States, Cornell University researchers identified Donald Trump as the single most significant source of Covid-19 misinformation, accounting for about 38 percent of misleading claims. Beyond America’s borders, Russia interfered in both the 2016 and 2020 elections, while China, Iran, and Saudi Arabia are among the nations exploring ways to export their propaganda operations to the U.S.

Birds of a Feather

Because we are social creatures, our perspectives are not entirely our own. Aside from the inner voices that help us interpret information, our outlooks are defined by the communities we inhabit. The more integral they are to our existence, the more profoundly they shape our beliefs and biases.

Homophily is the tendency to associate with others like us. It is an instinctive bond based on age, gender, and race, or on where we live, work, worship, or learn. Whether in our personal or professional lives, it is often an advantage to surround ourselves with like-minded folks, as it can facilitate communication and increase the likelihood of cooperation. Nonetheless, it can also beget a redundancy of ideas — aka groupthink — which leads to consensus without critical reasoning, thorough examination of facts, or consideration of consequences. In a recent study of Washington political journalists, for example, researchers at the University of Illinois found that news people “may be operating in even smaller, more insular microbubbles than previously thought,” resulting in possible blind spots.

An ancestral and more extreme form of allegiance is tribalism, an unyielding loyalty to a group or ideology that provides physical, intellectual, and emotional security for its followers, often in opposition to outsiders. Nowhere is this more evident these days than in the political polarization that is fracturing American society. After reviewing scores of studies on partisanship, scientists at more than ten major universities identified what they believe is a dangerous variation of polarization. “The current state of political sectarianism produces prejudice, discrimination and cognitive distortion,” lead author Eli Finkel told Northwestern Now. “Along the way, it makes people increasingly willing to support candidates who undermine democracy and to favor violence in support of their political goals.”

The drivers of such polarity, however, may not be solely social. Recent research suggests that divergence in political thinking is also linked to disparities in the brain. In one instance, a University of California, Berkeley study found that the neural responses of liberals and conservatives to trigger words contrast considerably when exposed to language related to risk, emotion, and morality. Similarly, the largest functional brain imaging study of neuropolitics determined that nonpartisans use their gray matter differently than their partisan counterparts.

Divide and Conquer

Differences of opinion are characteristic of healthy societies, but not when they arise from distorted perceptions of reality. One way we try to make sense of complex matters is to seek out and embrace information that confirms our beliefs, no matter how illogical. Another way is to surround ourselves with people who share the same biases. Social media makes it easier to do both.

Despite rather dubious claims by some in government that social networks overly censor information, particularly from conservative sources (others contend that liberals are the real victims), their machinations are far more subtle. From behind the obscurity of their algorithms, sites like Facebook eagerly deliver content that not only corroborates but amplifies biases, often shutting out dissenting points of view. At the same time, they make it easier for citizens to sequester themselves among “friends.” Whereas the tribes of the past were limited by the proximity of their members, the ubiquity of social media enables users to link up with kindred spirits anywhere on the planet.

While not as pervasive as their social peers, members of the mainstream media can still pigeonhole their audiences by targeting their programming to specific groups of viewers. The overriding objective, writes former MSNBC producer Ariana Pekary in the Columbia Journalism Review, is to boost ratings: “MSNBC calculates that the ideas of the far left will rate. Fox News calculates that the ideas of the far right will rate. And CNN calculates that those two teams arguing with each other will rate.”

Look in the Mirror

As compelling as it is to identify the various forces manipulating our awareness, if we are going to point fingers, we should do so looking in a mirror. When confronting the dual risks of complexity and uncertainty, we generally have two options. We can respond impulsively, a vestige of our evolutionary urge to survive. It is what psychologist Daniel Kahneman calls “thinking fast.” Or we can “think slow” by examining circumstances and weighing available evidence. Online, we often opt for the former.

A study conducted by researchers at the Massachusetts Institute of Technology and the University of Regina in Canada suggests that many people shared misinformation about Covid-19 because they failed to sufficiently determine whether the content was accurate. On the other hand, those who took the time to evaluate what they read were less likely to pass on bad information. Moreover, note the study’s authors, the structure of social media — with its emphasis on rapid browsing of headlines, true or false, and recognition for those posting attention-grabbing news — also sidetracks accuracy.

Even when we think we understand complex phenomena we may be fooling ourselves into believing we know more than we actually do. This is known as the illusion of explanatory depth, a bias that thrives in the online environment where a reliance on technology can lead to an unwarranted conceit of expertise. In a series of experiments reported by the American Psychological Association, participants who searched for information on the Internet had an inflated sense of their own knowledge even when they couldn’t find what they were looking for. And as we increasingly outsource our intellect to an expanding array of digital systems, we risk forfeiting our ability to effectively think for ourselves.

It Can Get Worse

It is no surprise, then, that we find it daunting to wrap our minds around many of today’s complex issues; and it will get even harder. This year will end worse than it began. Coronavirus cases and subsequent deaths are projected to rise sharply over the next several months; Americans may be more divided after the election, no matter who wins; sanctioned violence against people of color shows few signs of abating; and millions of people are sliding back into poverty, including scores of women who have taken the brunt of the disruption.

Although some in the media — social and mainstream — have taken steps to address the steady stream of deliberate fiction, their business models still rely on keeping users engaged at almost any cost. Cognizant of this fact, barely half of consumers queried by the Pew Research Center for Journalism and Media believe journalists act in the best interest of the public. The Center also found that despite their many differences, a majority of both Democrats and Republicans distrust social sites like Facebook, Twitter, and Instagram as sources of political news.

What consumers are far less conscious of is the role of technology. Generative text systems are language programs capable of composing all manner of content, from tweets to long-form articles, that mimics writing by humans. For their part, deepfakes can imperceptibly doctor videos by replacing or manipulating someone’s likeness, as in the case of Nancy Pelosi’s alleged intoxication. Leveraging techniques from artificial intelligence and machine learning, these innovations are advancing so rapidly that their fabrications will soon be indistinguishable from the real thing.

Start Making Sense

Getting back to the mirror, that is where we can begin to tackle this seemingly insurmountable problem. We may not be able to discard biases embedded over millions of years of evolution. Nor can we dissuade those who benefit by manipulating our grasp of reality. But we can give more thought to how we think.

At the very least, we can start by recognizing the difference between knowing and understanding. The former is static and encompasses facts and ideas about which we are confident. The latter is fluid and requires continually seeking and assimilating new information. Here too, context is critical and enables us to make appropriate connections among multiple elements. It also helps to consider issues from the full spectrum of possibilities. And it demands we admit that we don’t, and probably won’t, have all the answers.

At a time of swift and ceaseless change, it seems impossible to “think slow.” But if we hope to deal with our ongoing predicament, we have little choice. Otherwise, it will be, in the words of Talking Heads frontman David Byrne, “same as it ever was.”
