Non-controversial statement: During elections there will be people saying things that might not, strictly speaking, be 100% true. For instance, an unscrupulous politician might try to convince you that “Whales speak French” to spin his amateur whaling hobby as anti-immigration credentials. Stranger things have happened.
Fortunately in that eventuality there will also be a legion of people ready to stand up and say, “Um, excuse me. The Office for National Statistics data doesn’t show that a single whale in British waters speaks French, let alone ALL whales”. We’ll all laugh derisively at the idea that anyone could get away with such blatant untruths.
This piece is about who has the last laugh.
The basic British approach to truth in politics is that everyone should just “fight it out”. Lord Bingham in R (Animal Defenders International) v Secretary of State for Culture, Media and Sport gave a nice summary of this take on democracy:
The fundamental rationale of the democratic process is that if competing views, opinions and policies are publicly debated and exposed to public scrutiny the good will over time drive out the bad and the true prevail over the false. It must be assumed that, given time, the public will make a sound choice when, in the course of the democratic process, it has the right to choose. But it is highly desirable that the playing field of debate should be so far as practicable level. This is achieved where, in public discussion, differing views are expressed, contradicted, answered and debated. (2008:15)
The underlying idea is that bad information, whether maliciously or innocently entered into the debate, can be corrected with good information. In an active and vigorous political culture, lies will be punished and truth will rise to the top. In the political marketplace the voters are savvy shoppers.
Regrettably there’s a problem with this. It turns out that correcting the facts people have learned is incredibly difficult. Once someone has learned a fact, that fact is both hard to remove and (scarier still) attempts to do so can reinforce the initial incorrect fact. When you set out to debunk something there is a real risk of actually making the situation worse.
The big obstacle is that “any attempt to explicitly discredit false information necessarily involves a repetition of the false information” (Schwarz et al. 2007: 146), and repeating a fact increases its fluency. This is the case even if the context of the repetition is “this fact isn’t true”. As Yoon and colleagues put it, “when people find a claim familiar because of prior exposure but do not recall the original context or source of the claim, they tend to think that the claim is true” (Yoon et al. 2005: 714).
This effect isn’t limited to repetition and extends to any technique which makes a fact more fluent. Reber and Schwarz found that statements presented in an easier-to-read format were more likely to be accepted as true by participants (Schwarz et al. 2007: 146). Vindicating the history of political slogans, McGlone and Tofighbakhsh (2000) compared the judged truthfulness of two phrases with essentially identical meanings (“woes unite foes” and “woes unite enemies”) and found that the rhyming phrase was more likely to be accepted as true than the one that didn’t rhyme.
Even if your fact check is initially convincing, the effect can quickly wear off. While a person may initially seem to understand that a fact is false after being exposed to counter-information, given time the context (“This isn’t true”) will decay while the initial fact (“Whales speak French”) has been reinforced. For example, a standard “Facts & Myths” flyer on vaccination information had an incredibly short period of effectiveness: after just 30 minutes the flyer had impaired “participants’ attitudes towards vaccination intentions, relative to controls who read no flyer at all” (Schwarz et al. 2007: 149).
Studies examining ‘adwatches’ (segments on US news programmes checking the factuality of political adverts) similarly found that “the effects of the adwatches on interpretation of the ad seem to decay quickly. Accuracy in interpretation of the ad drops off as time elapses between the adwatch and the posttest” (Cappella & Jamieson 1994: 358). Importantly, this isn’t just a case of the adwatch being forgotten, as “people remembered what the adwatch said and retained its attitudinal implications but not the inferences they were supposed to draw” (Cappella & Jamieson 1994: 358–359).
There seems to be an age effect involved in recollection of truth (which is particularly concerning given the turnout disparity between older and younger voters). Yoon et al. found that “After 3 days, older adults misremember 28% of false statements as true when they were told once that the statement was true but 40% when told three times that the statement was false”, with no “parallel tendency to misremember true information as false” (2005: 718). This effect is explained by a decay in contextual cues surrounding the information that occurs faster in older adults: “after a long delay, older adults are more likely than younger adults to have experienced a decline in context memory but not in familiarity” (Yoon et al. 2005: 714). Political information runs into further hurdles, as acceptance or rejection of information can depend on a person’s existing partisan allegiance. In a US context Berinsky found that “Republicans and Conservatives accept Democratic-based rumors and reject Republican-based rumors”, with the reverse partisan relationship also existing to a weaker extent (2012: 16).
This is all on the face of it a bit fatal to our idea that public discourse can be an effective sorter for truth. Fortunately this field does point the way towards some techniques that might be more effective.
One suggested approach is to lead with the correction and bury the original fact below it, as Cook and Lewandowsky advise in their Debunking Handbook: “the often-seen technique of headlining your debunking with the myth in big, bold letters is the last thing you want to do. Instead, communicate your core fact in the headline. Your debunking should begin with emphasis on the facts, not the myth.” (2011: 2) If possible the incorrect information shouldn’t be mentioned at all: Berinsky found that simply providing the ‘correction’ alone led to higher rates of rumour rejection than presenting the rumour and correction together (2012: 46–47).
With political information, the expectation of partisan bias can also be deployed to increase rejection rates. Berinsky found that countering a rumour “with statements from an unlikely source can, under the right circumstances, increase the willingness of all citizens to reject rumors regardless of their own political predilections.” Addressing the ‘death panel’ rumour with a counterargument posed by a Republican politician increased the rate at which it was rejected by both Republicans and Democrats (2012: 1).
So the situation isn’t hopeless, but it is pretty bleak. Voters, far from judging conflicting facts objectively, are subject to powerful metacognitive effects that filter and change what they perceive as true depending on factors that are irrelevant to the debate at hand. This problem can be mitigated but never completely avoided.
What language do whales speak again?
R (Animal Defenders International) v Secretary of State for Culture, Media and Sport [2008] UKHL 15
Berinsky, A. J. (2012) Rumors, Truths, and Reality: A Study of Political Misinformation. [online]. Available from: http://web.mit.edu/berinsky/www/files/rumor.pdf (accessed 15 July 2012)
Cappella, J. & Jamieson, K. (1994) Broadcast Adwatch Effects: A Field Experiment, Communication Research, 21 (3), 342–365
Cook, J., Lewandowsky, S. (2011), The Debunking Handbook. St. Lucia, Australia: University of Queensland. November 5. ISBN 978–0–646–56812–6. [http://sks.to/debunk]
McGlone, M. S., & Tofighbakhsh, J. (2000). Birds of a feather flock conjointly (?): Rhyme as reason in aphorisms. Psychological Science, 11, 424–428.
Schwarz, N. et al. (2007) Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127–161.
Yoon, C. et al. (2005) How Warnings about False Claims Become Recommendations. Journal of Consumer Research, 31, 713–724.