“Don’t answer the question you were asked, answer the question you wanted to be asked.”
It was my first day of public-relations training, and this was the first take-away from the very first session.
My instructor, a fast-talking Washington PR flak with a paper coffee cup glued to his right hand, went on to explain that nobody steps into the public arena just to have a friendly chat.
Instead, they put themselves out there in the public eye, leaving themselves open to ridicule on talk radio and 24-hour cable network news, to expose as many people as possible to specific messages that have been carefully crafted in advance.
With this surprising reframe, much of the absurdity I’d seen and heard in the three-ring media circus for most of my young adult life suddenly made sense.
The back-and-forth squabbling… the talking past each other… trying to shift the topic of debate… “you didn’t answer my question…” an interview guest using some gratuitous transitional phrase like “to be perfectly clear” or “with all due respect” to bridge to a separate issue entirely…
Every experienced professional who appears publicly on television and radio, everyone who’s interviewed for newspapers and magazines, is answering the questions they’re prepared to answer instead of the questions they were asked. And many of them have taken a training course similar to this one.
It’s been nearly a decade since I finished my introduction to the world of public relations, but I still remember the way my instructor held his coffee cup steady at eye level and pressed play on his laptop.
“Are you doing battle in the information wars?” he said. “Then you’d better put on your armor.”
You’ve got to be prepared, in other words; you’ve got to rehearse before you sit down in front of the microphone, the camera, the keyboard. Otherwise you’re going to make a mistake.
Mistakes don’t happen often, but when they do, the polished discourse we’ve become accustomed to—a discourse like two skilled fencers dancing back and forth—comes to a halt; an unqualified hit is scored and we realize, suddenly, how much of what we see and hear is actually a kind of improvised professional theater.
Do you recall, for example, when former New York mayor Rudy Giuliani, acting as President Trump’s personal lawyer, acknowledged on Fox News in May 2018 that candidate Trump had reimbursed his lawyer, Michael Cohen, the $130,000 in hush money used to pay off his former mistress? The unforced error contradicted both the President’s loud denials that he knew anything about the payment and Cohen’s own statements.
And what about acting White House Chief of Staff Mick Mulvaney’s jaw-dropping confirmation, in front of the press in October 2019, that military aid to Ukraine was indeed tied to President Trump’s demand for an investigation into Joe Biden? The screw-up, which seemed to provide the “smoking gun” evidence of the quid pro quo President Trump had vociferously denied for months, supplied fantastic fodder for the impeachment hearings that followed.
Both episodes were unqualified disasters with legal implications for the Trump Administration.
And both speakers could have easily avoided these blunders with better training, better preparation, and — of course — the answer to a question they wanted to be asked, not the question they really were asked, ready on the tip of their tongue.
They’re Called Bullets for a Reason
By now almost everyone’s familiar with the phrase “talking points,” which refers to a series of pithy bullet-pointed statements that one can either read from, reproduce in writing, or—in the case of someone making a public appearance on television—memorize in advance and then recite confidently in front of the camera.
This is how you answer the question you wanted to be asked, not the question you were really asked.
Talking points are the linguistic bullets of the information wars, the basic weapon that repels the attacks of the enemy while (hopefully) scoring some hits for your team in the process.
But what many people don’t grasp is the amount of work and coordination that often goes into these pithy statements, and the extent to which they reflect strategic messages that have been carefully crafted and tested for maximum impact.
Let’s stop talking about all this in the abstract, though.
Let’s look at one type of scenario where talking points are commonly created and used.
The science of risk communication is one area where many examples have been made public, so it’s a good place to focus.
What is risk communication?
It’s the job of talking to the public about really stressful, potentially life-threatening situations: natural disasters, terrorist bombings, public transportation accidents, and the like.
In my research for this article I discovered one fascinating document in the risk communication oeuvre, authored by the Center for Risk Communication in New York, which provides templates for risk communicators to develop talking points in “high concern situations.”
According to this document, risk communicators should break their core message down into three talking points that express Compassion, Conviction, and Optimism.
This “CCO” template provides a risk communicator with the following prompts to match each of the three suggested talking points:
- “I am very sorry…” (Compassion)
- “I believe …” (Conviction)
- “In the future…” (Optimism)
Does it sound familiar?
I am very sorry to hear about the train derailment. I believe our government can get to the bottom of what happened. In the future, our trains will be safer for everyone.
Or how about:
I am very sorry so many people died in the building. I believe what happened was not an accident. In the future, we will know who did this.
I’m very sorry you felt it necessary to rob a bank. I believe you’re still a good person. In the future, you’ll look back and realize you made a mistake.
It’s a little spooky, right?
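Mechanically, the CCO template is nothing more than a fill-in-the-blank exercise. Here is a minimal sketch in Python; the function and parameter names are my own invention, and only the three prompts come from the template described above:

```python
# Illustrative sketch of the CCO (Compassion, Conviction, Optimism)
# template. The function and parameter names are hypothetical; only
# the three prompts come from the template itself.

def cco_message(compassion: str, conviction: str, optimism: str) -> str:
    """Assemble a three-part talking point from the CCO prompts."""
    return " ".join([
        f"I am very sorry {compassion}.",  # Compassion
        f"I believe {conviction}.",        # Conviction
        f"In the future, {optimism}.",     # Optimism
    ])

print(cco_message(
    compassion="to hear about the train derailment",
    conviction="our government can get to the bottom of what happened",
    optimism="our trains will be safer for everyone",
))
```

Swap in any tragedy you like and the machine produces a serviceable statement of concern, which is precisely the point.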
Well, to get a sense of the massive planning effort that goes into the development of talking points like these, I recommend browsing a risk communications document published in early 2006 by the Department of Health and Human Services entitled “Pandemic Influenza Pre-Event Message Maps.”
Topping out at 74 pages, this official government document is brimming with hundreds of messages intended to calm the public’s fears and answer their questions during — you guessed it — a pandemic.
It’s called a “pre-event” document because it was written before anything bad happened. That’s what makes risk communications so fascinating and so dark: you don’t need to wait for a tragedy to happen in order to come up with things to say to calm people down about it. Because the field rests on tested, empirical research, you can follow a formula to craft your messages in advance.
In other words, the talking points created from the above documents don’t simply reflect one person’s opinion about how to communicate publicly in a crisis — they’re the product of scientific testing meant to identify the most effective known ways to communicate.
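To give a feel for what a single “message map” contains, here is a minimal sketch of one record: an anticipated question from the public paired with a few short, quotable key messages. The field names and wording are hypothetical, not taken from the HHS document:

```python
# Minimal sketch of one record in a message map: an anticipated question
# from the public, paired with a few short, quotable key messages.
# Field names and wording are hypothetical, not from the HHS document.

from dataclasses import dataclass, field

@dataclass
class MessageMap:
    question: str                # the question communicators expect to face
    key_messages: list[str]      # pre-crafted talking points, usually three
    supporting_facts: list[str] = field(default_factory=list)

flu_map = MessageMap(
    question="Is it safe to go to work during a pandemic?",
    key_messages=[
        "Public health officials are monitoring the situation closely.",
        "Simple precautions significantly reduce your risk.",
        "Guidance will be updated as conditions change.",
    ],
)
```

Multiply one record like this by hundreds and you have a 74-page playbook, ready before the first case is ever reported.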
The Role of Validators
Risk communications is just one subset of the much larger discipline of Communication Sciences.
Political communications—the field that looks at messages disseminated by politicians, parties, and organizations—is another much-studied area.
But rather than attempting to calm people and convey information during a crisis, the field of political communications is focused on energizing people with a call-to-action or influencing them to change their opinions.
And this is where things go from being just a little spooky to being downright freaking haunted.
Because coming up with some slick, scientifically validated talking points doesn’t mean you’ve won the public battle.
You also need to boost the signal. Get the message out there.
You need to cut through the hundreds of distractions people face in their everyday lives and find some way to get your talking points hooked into their heads.
To accomplish this, political parties, corporations, government agencies, and other special-interest groups use talking points in a coordinated way, distributing their pre-developed messages to allies called “validators” in order to blanket media channels with different versions of the same basic information.
This wouldn’t be a problem if the connections between groups and their validators were clearly disclosed or, at the very least, easy to infer.
Unfortunately, validators often have secret agreements with message originators, and these agreements are not readily known.
When a Democrat puts forward legislation that is publicly and vocally supported by a Democratic-aligned business group, for example, there is no presumption of independence, because both share a party affiliation.
But what about a special-interest magazine sharing a Tweet from a government agency or private corporation without disclosing that they have an agreement to share that information?
Or what about a seemingly independent Medium writer who has a secret agreement to share and “validate” talking points from the Republican Party, Russian disinformation groups, or both?
These connections may be ethically suspect, but in most cases they’re not actually illegal. The appearance of independence confers legitimacy on ideas, and manufacturing legitimacy is the key to most modern information campaigns.
One freaky modern strategy that relies on supposedly independent validators who are secretly coordinated by a central campaign manager is known as astroturfing.
Astroturfing is a clever name, a twist on the “grassroots” concept of activism whereby a groundswell of popular support touches off a shift in consciousness. But as the name implies, “astroturfing” is a fake groundswell. It’s the opposite of grassroots activism because all the messages and activities of a supposedly organic movement are centrally controlled and planned, with shell organizations and secret validators lined up to create the illusion of mass agreement and popular interest.
The “Reopen America” protests of Spring 2020 — in which armed domestic terrorists calling themselves “militia” took over state capitol buildings and blocked traffic with caravans of vehicles, all with the purpose of reopening the locked-down United States before the coronavirus pandemic had been definitively crushed — are a recent example of an astroturfing campaign in action.
As documented in several fascinating articles about the protests, security researchers who analyzed the protestors’ electronic footprints found that many of these supposedly organic groups could be traced back to existing organizations and individuals with connections to the White House, conservative lobbyists, and gun-rights organizations.
On account of this successful astroturfing campaign, we’ll probably never know how many lives were lost because a small minority of Americans were manipulated by business and political interests to protest against the coronavirus lockdowns just as they were really starting to work, thereby providing state governments with the needed political cover to enter “phased reopening” without any semblance of a national strategy in place.
Enter the Echo Chamber
What is the ultimate goal of all this artifice, all this jockeying for information supremacy?
It’s easy to get sucked into advocacy for its own sake, never stopping to think what victory in the information wars actually looks and feels like.
But as I wrote about in my essay “The Jackpot of the Availability Cascade,” good talking points are like a virulent infection.
They spread from one person to another, one mind to another, and if they are well-engineered and correctly dispersed they can infect enough people to change social, economic, and political behavior.
The similarity between “information viruses” and real viruses that are released into the world and then jump from one host to another has not been lost on researchers and communications professionals.
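To make the analogy concrete, here is a toy susceptible/infected spread model in Python. It is a sketch under assumed parameters — every name and value here is mine, not drawn from any of the research mentioned in this piece:

```python
# Toy susceptible/infected (SI) model of a talking point spreading
# through a population. All names and parameter values are
# hypothetical, chosen for illustration only.

def spread(population: int, seeds: int, contact_rate: float,
           adopt_prob: float, days: int) -> list[float]:
    """Return the fraction of message 'carriers' in the population per day."""
    infected = float(seeds)
    susceptible = float(population - seeds)
    history = [infected / population]
    for _ in range(days):
        # Each carrier exposes contact_rate people per day; an exposure
        # converts a susceptible person with probability adopt_prob.
        exposures = infected * contact_rate
        new_adopters = min(susceptible,
                           exposures * adopt_prob * susceptible / population)
        infected += new_adopters
        susceptible -= new_adopters
        history.append(infected / population)
    return history

# Ten initial carriers of a talking point in a town of 10,000.
curve = spread(population=10_000, seeds=10, contact_rate=5,
               adopt_prob=0.1, days=30)
```

Under these modest assumptions — each carrier exposes five people a day and one exposure in ten converts — ten initial carriers saturate the whole population within about three weeks, which is exactly the dynamic a coordinated campaign tries to engineer.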
Unfortunately, those who seek to spread fake news and misinformation have also noticed the parallels, and they have seized upon the mass communication power of the Internet to boost the reach and the effectiveness of their manipulative talking points.
To understand the true scope of what is happening here, we can consider the real example provided by UK-based game producer Ndemic Creations, which collaborated with the professional fact-checking organizations Full Fact (UK) and PolitiFact (USA) to add a “fake news” scenario to its best-selling real-time mobile strategy game Plague, Inc.
In Plague, Inc. players create and evolve a pathogen to spread around the world and wipe out humanity. The game has been praised for its complex, realistic variables which impact the severity of the pathogen and how rapidly it spreads between countries.
For the “fake news” scenario, players create and evolve a conspiracy theory and spread it throughout the world until nobody is any longer informed about the truth.
If the term “echo chamber” has some familiarity but little concrete imagery for you, a few examples from the game should bring into colorful perspective the way seemingly unconnected people and sources can act in concert to provide the illusion of widespread agreement about dubious information and points of view.
Among the realistic validation tactics that can be used to spread misinformation, Plague, Inc. allows players to:
- hire fake experts in relevant fields to “come out in support of misinformation,”
- convince “established voices across mainstream media to explicitly produce content to lend credence to misinformation,”
- “fund teams to target users of social media with data-driven ads” and “target those most vulnerable to misinformation and most likely to spread it further,”
- “utilize bots to automatically boost accounts spreading misinformation with large numbers of followers, lending them credence,”
- “sign up influential personalities, journalists, and writers to support misinformation,” and
- become “part of a lifestyle/identity and develop strong emotional bonds which help people passionately believe in misinformation and reject contradictory information.”
Ironic, isn’t it?
To launch a fake conspiracy theory you first need to create an actual conspiracy.
And yet average Americans — even well-educated ones at every point on the political spectrum — don’t seem to realize the extent to which thousands of organizations, affiliations, and secretive communities use tactics like those simulated by Plague, Inc. to spread their misleading talking points far and wide.
I’m not saying that we need to do away with talking points.
For individuals, talking points are a great way to avoid embarrassment and get the point across.
I use them myself for all kinds of things — phone calls with a utility company, for example, and tough meetings at work when I want to make sure I ask the right questions.
When you zoom out and look at the big picture, however, the use of talking points at mass scale begins to look dangerously like dystopian brainwashing — especially in an environment like the United States, where speech can be purchased and there are few limitations on what can be said.