Illusory Truth, Lies, and Political Propaganda

Joe Pierre
Published in Curious
Aug 13, 2020
Source: pngegg

“If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer… And a people that no longer can believe anything cannot make up its mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. And with such a people you can then do what you please.”
— Hannah Arendt

“The truth is always something that is told, not something that is known. If there were no speaking or writing, there would be no truth about anything. There would only be what is.”
— Susan Sontag, The Benefactor

The Illusory Truth Effect

Source: Pixabay

Many of us are familiar with the quotation, “Repeat a lie often enough and people will eventually come to believe it.”

Fittingly, the adage — often attributed to the infamous Nazi propaganda minister Joseph Goebbels — is itself true and has been validated by decades of research on what psychology calls the “illusory truth effect.” First described in a 1977 study by Temple University psychologist Dr. Lynn Hasher and her colleagues, the illusory truth effect occurs when repeating a statement increases the belief that it’s true, even when the statement is actually false.[1]

Subsequent research has expanded what we know about the illusory truth effect. The effect doesn’t only occur through repetition but can happen through any process that increases familiarity with a statement or the ease with which the brain processes it (what psychologists in this context refer to as a statement’s “fluency”). For example, the perceived truth of written statements can be increased by presenting them in bold, high-contrast fonts[2] or by expressing aphorisms as rhymes.[3]

According to a 2010 meta-analytic review of the truth effect (which applies to both true and false statements),[4] while the perceived credibility of a statement’s source increases perceptions of truth as we might expect, the truth effect persists even when sources are thought to be unreliable and especially when the source of the statement is unclear. In other words, while we typically evaluate a statement’s truth based on the trustworthiness of the source, repeated exposure to both information and misinformation increases the sense that it’s true, regardless of the source’s credibility.

The illusory truth effect tends to be strongest when statements are related to a subject about which we believe ourselves to be knowledgeable,[5] and when statements are ambiguous such that they aren’t obviously true or false at first glance.[4] It can also occur with statements (and newspaper headlines) that are framed as questions (e.g. “Is President Obama a Muslim?”), something called the “innuendo effect.”[6]

But one of the most striking features of the illusory truth effect is that it can occur despite prior knowledge that a statement is false[7] as well as in the presence of real “fake news” headlines that are “entirely fabricated…stories that, given some reflection, people probably know are untrue.”[8] It can even occur despite exposure to “fake news” headlines that run against one’s party affiliation. For example, repeated exposure to a headline like “Obama Was Going to Castro’s Funeral — Until Trump Told Him This” increases perceptions of truth not only for Republicans but Democrats as well.[8] And so, the illusory truth effect occurs even when we know, or want to know, better.

In summary, psychology research has shown that any process that increases familiarity with false information — through repeated exposure or otherwise — can increase our perception that the information is true. This illusory truth effect can occur despite being aware that the source of a statement is unreliable, despite previously knowing that the information is false, and despite it contradicting our own political affiliation’s “party line.”

Big Brother is Watching; Source: The Thought Police, licensed under the Creative Commons Attribution-Share Alike 4.0 International license.

Illusory Truth and Political Propaganda

In the current “post-truth” era of “fake news” and “alternative facts” (see my previous blog posts, “Fake News, Echo Chambers & Filter Bubbles: A Survival Guide” and “Psychology, Gullibility, and the Business of Fake News”), the illusory truth effect is especially relevant and deserves to be a household word.

That said, the use of repetition and familiarity to increase popular belief and influence behavior is hardly a new phenomenon. Catchy slogans and songs, regardless of their veracity, have always been a standard and effective component of advertising. “Puffing,” for example, is an advertising term for exaggerated or baseless claims about a product that, even at the risk of false advertising litigation, no doubt often remain profitable in the long run.

In politics, repeating misinformation and outright lies has been a powerful tool for swaying public opinion since long before the illusory truth effect was ever demonstrated in a psychology experiment. In Nazi Germany, Adolf Hitler famously wrote about the “big lie” — a lie so outlandish that it would be believed on the grounds that no one would think anyone could lie so boldly — as a tool of political propaganda. Goebbels, the head of Nazi propaganda quoted earlier, is said to have likewise favored the repetition of lies in order to sell the public on Hitler and the Nazi party’s greatness.

Accordingly, the political philosopher Hannah Arendt characterized the effectiveness of lying as a political tool in her seminal post-war classic The Origins of Totalitarianism:

“Society is always prone to accept a person offhand for what he pretends to be, so that a crackpot posing as a genius always has a certain chance to be believed. In modern society, with its characteristic lack of discerning judgment, this tendency is strengthened, so that someone who not only holds opinions but also presents them in a tone of unshakable conviction will not so easily forfeit his prestige, no matter how many times he has been demonstrably wrong. Hitler, who knew the modern chaos of opinions from first-hand experience, discovered that the helpless seesawing between various opinions and ‘the conviction that everything is balderdash’ could best be avoided by adhering to one of the many current opinions with ‘unbendable consistency.’

…the propaganda of totalitarian movements which precede and accompany totalitarian regimes is invariably as frank as it is mendacious, and would-be totalitarian rulers usually start their careers by boasting of their past crimes and carefully outlining their future ones.”

In the novel 1984, George Orwell likewise portrayed a fictitious dystopia inspired by the Soviet Union under Stalin in which a totalitarian political party oppresses the public through “doublethink” propaganda epitomized in the slogan, “War is Peace, Freedom is Slavery, Ignorance is Strength.” “Doublethink,” Orwell wrote, consists of “the habit of impudently claiming that black is white, in contradiction of the plain facts…it means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary.” In 1984, this is achieved through a constant contradiction of facts and revision of history to the point that people are left with little choice but to resign themselves to accepting party propaganda:

“The party told you to reject the evidence of your eyes and ears. It was their final, most essential command.”

Needless to say, “doublethink,” along with Orwell’s description of “newspeak,” gave rise to the modern term “doublespeak,” defined by Merriam-Webster as “language used to deceive usually through concealment or misrepresentation of truth.”

From Russia’s “Firehose of Falsehood” to Trump’s “Alternative Facts”

“Propaganda is the art of making up the other man’s mind for him.”

— Raymond Dodge (1920)

Doublespeak was a well-known propaganda tool of the Soviet Union (the official newspaper of the Soviet Communist Party was called Pravda, or “Truth”) and continues to be one in Russia today. Researchers at RAND have described the Russian model for propaganda as the “firehose of falsehood” due to a core strategy that hinges upon the “shameless willingness to disseminate partial truths or outright fictions” in a way that is rapid, continuous, and repetitive.[9] They cite the illusory truth effect in explaining how this strategy has been so surprisingly effective, through a barrage of disinformation and a “hit first” mentality that breeds familiarity and a false sense of credibility. They also note that the strategy involves the use of multiple sources without a consistent message (e.g. state-controlled media outlets as well as the now-infamous “Russian web brigades,” a.k.a. “troll farms,” tasked with “setting Americans against their own government”), further exploiting how source ambiguity can strengthen the illusory truth effect.

According to a Vox interview by Sean Illing with Peter Pomerantsev, a Soviet-born journalist and author of This Is Not Propaganda: Adventures in the War Against Reality, a noticeable shift in propaganda strategy emerged in post-Soviet Russia, one focused not so much on presenting falsehoods per se as on breeding disbelief in facts for its own sake:

“20th century Soviet politicians lied a hell of a lot, but they always made their lies sound very respectable, as if they were the truth… [In Russia today, it’s] not about proving something, it’s about casting doubt. Putin isn’t selling a wonderful communist future. He’s saying, we live in a dark world, the truth is unknowable, the truth is always subjective, you never know what it is, and you, the little guy, will never be able to make sense of it all — so you need to follow a strong leader.”

Which brings us, of course, to the striking similarity of Russia’s propaganda machine to the exploitation of the illusory truth effect in US politics today. As Illing points out:

“…we’re experiencing a brand of reality-bending politics that really began in post-Soviet Russia. It’s a politics built on a distinctive form of propaganda, the goal of which is to confuse, not convince.”

Pomerantsev describes how he has witnessed Russian propaganda strategies spread across the globe over the past decade:

“What was different about these new Russian politicians is that they just didn’t play this factuality game at all. They didn’t care about facts and didn’t pretend to care. So you couldn’t really call out their lies because they were never playing that game. And this is exactly what you see Trump doing right now.

…it’s spread from Russia across the world… The same kind of politics I saw in Russia years ago is the same kind of politics I’m seeing now in the UK and Brazil and the Philippines and the US.”

Indeed, the parallels between Russian propaganda and President Trump’s presentation of “alternative facts” are impossible to ignore. Numerous authors, not to mention the half of the country that didn’t vote for him, have lamented how President Trump and his administration have distorted facts as a means of deliberate propaganda (see my previous blog post “The Death of Facts: The Emperor’s New Epistemology”).

Sarah Kendzior, author of the forthcoming book Hiding in Plain Sight: The Invention of Donald Trump and the Erosion of America, hosts a podcast with Andrea Chalupa that frequently reminds listeners that the Trump administration is part of a “transnational crime syndicate masquerading as a government.” The podcast’s title, Gaslit Nation, refers to their assertion that the Trump administration is “gaslighting” America in precisely the way that Arendt, Orwell, and Pomerantsev have described: by repeatedly contradicting the facts and claiming that black is white. This assertion is supported by independent databases maintained by PolitiFact and The Washington Post that tally President Trump’s false claims. According to The Washington Post’s Fact Checker, President Trump has made 15,413 false or misleading statements (and counting) since taking office. Many of these, such as the false claim that he created “Choice” for Veterans (the federal program that allows Veterans to access care outside of the Veterans Health Administration was passed during the Obama administration), have been repeated again and again, to the point that some people no doubt believe them.

2020, it seems, is the new 1984.

A recent article in The Atlantic by Peter Nicholas details how President Trump has promoted and relied upon falsehoods in the form of conspiracy theories for “political and personal ends,” such as the claim that it was Ukraine and not Russia that interfered with the 2016 presidential election. It quotes Rice University history professor Douglas Brinkley as saying that until now, “we’ve never had a president who trades in conspiracy theories, who prefers lies instead of fact.”

Of course, Trump was promoting conspiracy theories well before he took office. In 2011, he became the “virtual spokesperson” of the “birther” conspiracy theory about President Obama. A national survey from that year revealed that 24 percent of Americans believed it, with an additional 24 percent neither agreeing nor disagreeing.[10] In retrospect, this was not only a powerful illustration of the illusory truth effect in action but a harbinger of things to come.

Political scientists Nancy Rosenblum and Russell Muirhead, authors of A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy, have argued that the Trump administration has since decoupled “theory” from “conspiracy theory,” repeatedly calling the facts into question in favor of contradictions based on unsubstantiated rumor:

“There is no punctilious demand for proofs, no exhausting amassing of evidence, no dots revealed to form a pattern, no close examination of the operators plotting in the shadows. The new conspiracism dispenses with the burden of explanation. Instead, we have innuendo and verbal gestures: “A lot of people are saying…” Or we have the assertion: “Rigged!” — a one-word exclamation that evokes fantastic schemes, sinister motives, and the awesome capacity to mobilize three million illegal voters to support Hillary Clinton for president. This is conspiracy theory without the theory. What validates the new conspiracism is not evidence but repetition.”

Whether or not the authors are aware of it, this passage perfectly illustrates the illusory truth effect. President Trump tends to employ short phrases of uncertain attribution (“A lot of people are saying…”) that contradict the facts, repeating them ad infinitum (e.g. “Stable genius!” “Rigged!” “No collusion, no obstruction!” “Fake news!” “Witch hunt!” “Read the transcripts!”) until even members of the opposing political party who know the statements are lies might come to believe them on some level. Although President Trump has been ridiculed for speaking at a fourth-grade reading level and for relying on Twitter to communicate not only with the American public but with foreign governments, it may be that his vocabulary, syntax, and preferred communication medium are intentional, if not taken directly from a page in the Russian propaganda playbook.

Restoring “Truth, Justice, and the American Way”

Superman Statue; Source: Joseph Novak/Flickr/Creative Commons Attribution Generic 2.0 License

The illusory truth campaigns of Russia and Trump aren’t about one big lie so much as they are about the death of truth by a thousand smaller but equally audacious untruths. The Arendt quotation at the top of the page bears repeating:

“If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer… And with such a people you can then do what you please.”

Comparing Rosenblum and Muirhead’s perspective with that of Pomerantsev, the tactics of Russia and the Trump administration may be similar not only in form but in ultimate objective. With former Trump campaign executive and White House chief strategist Steve Bannon calling for the “deconstruction of the administrative state,” eroding the very concept of truth seems to be part of a larger effort to tear down the authority of previously respected institutions or, as journalist Jesse Singal put it, to “undermine order itself.”

Although the erosion of truth and the rejection of expertise is often a core component of populist movements, Arendt and scholars of her work have highlighted that the path from populism to authoritarianism, totalitarianism, and fascism can be short, direct, and downhill (see here and here for accounts of how this has happened in recent history around the world). Whether that goal is Bannon’s, Trump’s, or Putin’s and how they are connected remains unclear. What is clear is the threat that goal poses to “truth, justice, and the American way.”

What can we do to counter this threat and how can we reverse or prevent the illusory truth effect? Researchers at RAND are pessimistic about “traditional counterpropaganda efforts,” noting that we shouldn’t “expect to counter the firehose of falsehood with the squirt gun of truth.”[9] Indeed, recent evidence suggests that there are several barriers to countering misinformation. One is that successfully correcting misinformation may require more cognitive ability than many voters have, regardless of political affiliation.[11] Another is that the illusory truth effect seems to include not only faulty perceptions of truth, but also a diminished sense that repeating disinformation is unethical[12] such that we risk becoming inured to the pervasiveness of lying, with liars in turn no longer bothering to deny that they’ve lied.

The internet also represents a major obstacle to countering the illusory truth effect. While the internet’s unprecedented democratization of knowledge has fueled populist movements across the world, the hope that truth would “rise to the top” has not been realized. On the contrary, research has shown that online disinformation spreads “farther, faster, deeper, and more broadly” than the truth.[13] Together with the illusory truth effect, the internet has allowed authoritarian political movements to spread rampant disinformation and erode trust in not only the “fake news” media but the concept of truth altogether by delegitimizing objective reality.

Still, recent research provides us with some pointers on how we might stem the noxious impact of disinformation. In one recent experiment, encouraging young people to act like “fact checkers” mitigated the illusory truth effect.[14] Similar studies have shown that “inoculation strategies,” which warn people about likely exposure to misinformation and beat it to the punch, can reduce susceptibility to being taken in by disinformation.[15] Based on such findings, RAND researchers suggest that forewarning people about how propagandists exploit the illusory truth effect to manipulate audiences is likely to be more effective than specific refutations.[9] And so…

The illusory truth effect is real.

It is commonly exploited as a tool of political propaganda.

Consider yourself warned.

This article was originally published in two parts on my blog, Psych Unseen, at Psychology Today.

References

1. Hasher L, Goldstein D, Toppino T. Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior 1977; 16:107–112.

2. Reber R, Schwarz N. Effects of perceptual fluency on judgments of truth. Consciousness and Cognition: An International Journal 1999; 8:338–342.

3. McGlone MS, Tofighbakhsh J. Birds of a feather flock conjointly (?): Rhyme as reason in aphorisms. Psychological Science 2000; 11:424–428.

4. Dechêne A, Stahl C, Hansen J, Wänke M. The truth about the truth: a meta-analytic review of the truth effect. Personality and Social Psychology Review 2010; 14:238–257.

5. Arkes HR, Hackett C, Boehm L. The generality of the relation between familiarity and judged validity. Journal of Behavioral Decision Making 1989; 2:81–94.

6. Wegner DM, Wenzlaff R, Kerker RM, Beattie AE. Incrimination through innuendo: can media questions become public answers? Journal of Personality and Social Psychology 1981; 40:5:822–832.

7. Fazio LK, Brashier NM, Payne BK, Marsh EJ. Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General 2015; 144:993–1002.

8. Pennycook G, Cannon TD, Rand DG. Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General 2018; 147:1865–1880.

9. Paul C, Matthews M. The Russian “firehose of falsehood” propaganda model: Why it might work and options to counter it. Santa Monica, CA: RAND Corporation, 2016. https://www.rand.org/pubs/perspectives/PE198.html

10. Oliver JE, Wood TJ. Conspiracy theories and the paranoid style(s) of mass opinion. American Journal of Political Science 2014; 58:952–966.

11. De keersmaecker J, Roets A. ‘Fake news’: Incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions. Intelligence 2017; 65:107–110.

12. Effron DA, Raj M. Misinformation and morality: Encountering fake-news headlines makes them seem less unethical to publish and share. Psychological Science 2019.

13. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science 2018; 359:1146–1151.

14. Brashier NM, Eliseev ED, Marsh EJ. An initial accuracy focus prevents illusory truth. Cognition 2020.

15. Cook J, Lewandowsky S, Ecker UKH. Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS ONE 2017; 12(5):e0175799.


Dr. Joe Pierre is a professor of psychiatry at UCSF and author of the Psych Unseen blog at Psychology Today. Twitter @psychunseen.