To Protect Democracy We Need to Upgrade our Cognitive Immunity

Nick Monaco
Jan 9

By Marina Gorbis, Executive Director of the Institute for the Future, and Nick Monaco, Director of the Institute for the Future's Digital Intelligence Lab.

We are living in an unprecedented information environment. Our attention is monopolized and fractured by a multitude of devices, applications, websites, and notifications. Manipulation has never been so easy or so refined. And mass data surveillance enables exquisitely accurate targeting of manipulative information. Traditional strengths of democratic systems — diversity and freedom of expression — are making us particularly vulnerable to a growing army of media and information manipulators.

Viewing such manipulators as parasites attacking democratic societies’ cognitive immune systems might be the best way for us to come up with defense strategies. Complex biological and ecological systems are under constant attack from intruders on the lookout for weaknesses in the system they can exploit in order to grow and propagate. Under slowly changing conditions, the system relies on immunity it has developed to ward off such attacks. But under conditions of abrupt change, established defense mechanisms are no longer effective against new forms of attack.

In fact, biological vulnerability is an apt metaphor for our body politic today. Dramatic technological changes are providing fertile ground for various types of media pathogens to undermine our democracies. They include everything from rapidly evolving armies of propaganda bots posing as humans to doxxing, hacking, trolling, and other dirty tricks used to damage reputations and attack opponents. These new forms of attack threaten our individual and collective cognitive immune systems — our abilities to distinguish truth from lies, important from marginal, real from fake, scientific proof from wishful thinking. Ultimately, a society with a compromised cognitive immune system cannot make good choices that ensure the well-being of its citizens.

Read the full map here.

The new media parasites exploit unique characteristics of global communications platforms and our own cognitive biases — cognitive adaptations we have evolved over millennia of living together as we try to make sense of the world, collaborate, and synchronize our actions. They include confirmation bias — our tendency to selectively seek and interpret information that confirms our existing beliefs. In-group bias makes us treat members of our own “tribe” more favorably, trusting information from “people like us” while discounting information coming from “others.” The bandwagon effect causes us to accept beliefs and values as more people jump on board — it is easier to join a crowd than to start one.

In order for us to safeguard a healthy body politic, our cognitive immune system needs an upgrade. But if we limit the upgrade to technological solutions, such as relegating the moderation of coordinated harassment to algorithms and sophisticated machine-learning systems, we are going to fail. We need to acknowledge that technological solutions perpetuate the arms race between “good” and “bad” actors. To upgrade our cognitive immune system, we need to recognize the complex system of technologies, institutional arrangements, beliefs, individual predispositions, and many other factors that shape social cognition, and we need to pull multiple levers. We call these levers “immunity activators”; here are some of the most important:

● Independent Media Platform Review Bodies: We need to establish independent and credible oversight bodies for information and media platforms, which have become part of the critical communications infrastructure. Like basic utilities, they should be subject to public oversight and regulation.

● Public Media Platforms: Private social media platforms are built on the same core technologies as commons-based ventures like Wikipedia. However, commons-based and public platforms do not have access to the capital they need and must rely on donations, grants, and relatively small philanthropic investments. We need to create and incentivize greater capital flows to public and pro-social media platforms.

● Data Ownership and Governance Rules: Treating data as a personal or public asset would significantly change the economics and operating principles of current social media businesses, removing some of the incentives for manipulation.

● Education Beyond Media Literacy: In addition to teaching students how media platforms work, we should raise awareness of our own cognitive biases and how they might shape our thinking. Our educational institutions should produce literate individuals with strong critical thinking skills, including knowledge of history and an ability to connect the past, the present, and the future.

● Early Warning Systems: We need to close the knowledge gap between technology creators and policymakers. Instead of policy having to catch up with the unintended and sometimes nefarious impacts of technologies, policymakers need to be able to anticipate longer-term consequences before such technologies scale. This would help us develop policies and regulatory frameworks that prevent damage before it happens.

We need to act on multiple fronts and acknowledge the complex dilemmas we will have to navigate along the way: free expression vs. protecting vulnerable populations, cognitive cohesion vs. cognitive diversity, rapid innovation vs. deliberation. In the same way that a biological immune system can backfire or overreact, a heightened cognitive immune system can sometimes damage the very system it is designed to protect. We have to accept these as dilemmas — things we have to live with and learn to manage.

In the paper The Biology of Disinformation, our IFTF colleagues note that “the power of both biological and media viruses reveal less about themselves than they do about their hosts.” Today’s media parasites are revealing vulnerabilities in our cognitive immune system, and the only way to deal with the attack is to upgrade that system for the new information environment and its media manipulators.

To read the full report and map, visit the Digital Intelligence Lab at the Institute for the Future.
