The Growing Urgency to Decode Information Disorders
Integrating diverse expertise is necessary if business leaders are to understand the risks of information disorders and prepare for them.
This is the first essay in a series to help business leaders assess and improve their understanding of information disorders, with disinformation as just the tip of the iceberg. No longer relegated to political campaigns and fringe ideologies, information disorders will generate new risks across a complex information supply chain.
In 1997, New Line Cinema released ‘Wag the Dog,’ an award-winning satire about building alternative media realities for political gain. The film’s plot revolves around manufacturing a foreign crisis to divert public attention from a presidential sex scandal. Last year, ‘Wag the Dog’ was a prescient reference point as people tried to make sense of a world awash in fake news, conspiracy theories, and sensational headlines.
I’ve been thinking about how producers create such films, though in a different context: not as a Dustin Hoffman wannabe looking to manufacture misleading narratives, but as a communicator and citizen interested in protecting people and organizations from them.
The motion picture production model is an apt framework for approaching disinformation research and response; both require activating wide-ranging, specialized talent to adapt to a fluid, constantly changing context.
How so? Studios match stars and hundreds of niche specialists for each project, assembling a collective that stitches together scenes into a cohesive whole.
A “wicked” problem with no shortcut.
Sectors as diverse as automotive, healthcare, tech, video game development, and beyond follow Hollywood’s lead in the design and execution of complex projects. It’s now time for experts in information disorder to bring together similar superteams.
The nature of today’s information environment requires multidisciplinary specialists — in varying configurations and phases — to address the full range of today’s (and tomorrow’s) information conflicts. We need education experts. Experts in AI. Experts in signal spotting. Crisis experts. Experts in information flow. Business-as-usual, single-agency models won’t cut it.
An “algorithm” or analytics platform cannot solve the challenge.
Addressing this challenge will demand both near- and long-term business resilience. There simply isn’t a shield that can protect organizations from disinformation attacks. Organizations must close the process gaps and intelligence deficits that stand in the way of a new way of thinking about and managing cybersecurity and communications risks. That is both a knowledge and a design challenge.
It’s a high-stakes, organic one; the mechanics of how harmful information spreads are still relatively new and change constantly. And practically speaking, too few people are sufficiently skilled in this new field of intelligence and risk mitigation. Fortunately, cohorts within academia, public policy, technology, and national defense have formed over the last decade to build the necessary expertise. They decode the inner workings of narrative hijacks, evolving tactics, bot networks, toxic cohorts, deepfake technologies, and synthetic amplification.
The equivalent of an integrated studio model can bring multidisciplinary teams together to bridge the widening knowledge and response gap. In simple terms, the required competencies include:
- Language and practices: Understanding the difference between misinformation, disinformation, and narrative conflict.
- Systems at work: Visualizing how harmful narratives form, spread, and extend from digital networks into mainstream media and word-of-mouth propagation.
- Business vulnerabilities: Assessing the risks and data deficits that hostile actors exploit to advance their agendas.
- Media placement: Preventing brands from funding, appearing alongside, or engaging with risky content and unsuitable environments.
- Disruptive methods: Monitoring the evolution of newer phenomena like deepfake audio, video, and images, and automated text generation created to inflict harm.
If you don’t think you’re at risk, think again.
According to Joan Donovan, research director of Harvard University’s Shorenstein Center on Media, Politics and Public Policy, “Disinformation has become an industry, which means the financial incentives and the political gains are now aligned.”
That spells trouble beyond politics. Leaders must move quickly to understand the stakes. This includes how disinformation campaigns work, ideologies or incentives behind narrative conflicts, and technical resources to help strategists prepare for (and when possible, avoid) potentially catastrophic effects.
In just a year, information disorder has gone from an issue primarily facing journalists and politicians to a crisis seizing public and corporate attention worldwide. Groups like QAnon now have elevated visibility well beyond politics, and vaccination disinformation extends far beyond the healthcare sphere.
We’ve also seen CEO statements that were never made, company-hosted events that never existed, and frightening scandals based on videos stripped of their context. Whether the vector is misinformation, narrative conflict, deepfakes, or brand-safety failures from unfortunate ad placement, the danger to companies and the communities they serve is too big to ignore.
Wasim Khaled, the founder of Blackbird AI, a platform for disinformation detection, describes the threat landscape: “Hostile nation-states, unethical entrepreneurs, and amateur anarchists have upended elections, damaged corporate and personal reputation. Threat actors create bot networks, impersonate news outlets and generate synthetic public outcry to create distrust among target audiences. The goal of disinformation is to create confusion and division.”
Information disorder is not only caused by bad actors and foreign opportunists. It spreads because of unwitting people like you and me, and we all have a responsibility. Companies included.
Much like movie studios had to imagine a system connecting disparate professionals, we must design new collaboration frameworks to address this infodemic.
The next essay will outline a design framework and cohorts that can help address it.