The Fisher King: Solving Climate Change in a Post-Truth World

[This talk was originally given at MetaDada on May 5th, 2018 and again on May 8th, 2018 at Up.front. This is an article version adapted from the talk’s transcript]

In Arthurian legend, the Fisher King was pierced through the thigh by a spear and left with an unhealable wound. Impaired and unable to hunt, he spends his days fishing instead, and waits — dying — while his kingdom also wastes away. But the king has a secret: He’s the last in a long line charged with guarding the Holy Grail. If only someone would come along and ask the right question to earn the grail, he’d be healed — and so would his kingdom.

Psychological studies posit that belief in conspiracy theories, such as the idea that climate change is a hoax, generates “self-defeating”[1] outcomes, serving to suppress feelings of autonomy and willingness to act, among other things. The result is what one might consider a deeply compromised individual, resigned to impotence in the face of forces they deem more powerful than themselves — much like the Fisher King.

Increasingly, social media exacerbates these tendencies by supporting the proliferation of user-specific filters and social exclusion, as well as exposure to increasingly extreme conspiracy content via recommendation algorithms.

With an issue as pressing as climate change, we cannot afford inaction. Given the influence of algorithm-driven platforms like social media on belief systems, how might we ‘engage the Fisher King, ask the right question — and restore the land?’

There are three different claims pertaining to climate change skepticism:

  1. Climate change isn’t real
  2. Climate change isn’t caused by humans
  3. The magnitude of climate change is exaggerated

A 2013 poll found over a third of Americans believe that global warming is a hoax. And it’s not only the general public: Prominent politicians have expressed climate change skepticism. Senator James Inhofe, who served as chairman of the U.S. Senate Committee on Environment and Public Works, famously said in the Senate that global warming is a hoax and that he was “going to expose the most powerful, most highly financed lobby in Washington, the far left environmental extremists.” In 2012, he published a book, “The Greatest Hoax: How the Global Warming Conspiracy Threatens Your Future.”

Scott Pruitt, the current Administrator of the Environmental Protection Agency in the U.S., rejects the notion that human-caused carbon dioxide emissions are a primary contributor to climate change and has falsely claimed there is no scientific consensus on climate change.

And of course, there’s U.S. President Donald Trump, who’s tweeted climate change skepticism and conspiracy theories over 115 times.

Around this time last year, it was reported in Politico that K.T. McFarland, the U.S. deputy national security adviser, had given Trump a printout of two Time magazine covers. One, supposedly from the 1970s, warned of a coming ice age; the other, from 2008, was about surviving global warming.

Trump allegedly “quickly got lathered up about the media’s hypocrisy.” However, the 1970s cover was fake, part of an internet hoax that’s circulated for years. Staff reportedly intervened before Trump tweeted or talked publicly about it.

It’s clear there’s a strong draw to endorse climate change conspiracy theories. But why?

“Research suggests that people may be drawn to conspiracy theories when — compared with nonconspiracy explanations — they promise to satisfy important social psychological motives that can be characterised as epistemic (the desire for understanding, accuracy, and subjective certainty), existential (the desire for control and security), and social (the desire to maintain a positive image of the self or group).”[2]

Epistemic motives

Research suggests belief in conspiracy theories is stronger:

  • When the motivation to find patterns in the environment is experimentally heightened
  • Among people who habitually seek meaning and patterns in the environment
  • When events are especially large in scale or significant and leave people dissatisfied with mundane, small-scale explanations
  • When people experience distress as a result of feeling uncertain

Conspiracy theories appear to provide internally consistent explanations that enable people to preserve beliefs in the face of uncertainty and contradiction.

Furthermore, the need for cognitive closure is associated with beliefs in conspiracy theories for events that lack clear official explanations.

How well do conspiracy theories satisfy the epistemic motives that draw people to them?

Recent experiments reveal that presenting people with persuasive cases for conspiracy theories about climate change actually increases their levels of uncertainty.

Existential motives

Research suggests people are more likely to turn to conspiracy theories when they:

  • Are anxious
  • Feel powerless
  • Lack sociopolitical control
  • Lack psychological empowerment

People resort to conspiracy theories for compensatory satisfaction when their need to feel safe in their environment — and to exert control as autonomous individuals and as members of collectives — is threatened. Additionally, conspiracy theories may promise to make people feel safer as a form of cheater detection, in which dangerous individuals are recognised and the threat they pose is managed.

Furthermore, experiments have shown that conspiracy belief is heightened when people feel unable to control outcomes and is reduced when their sense of control is affirmed.

How well do conspiracy theories satisfy the existential motives that draw people to them?

Experimental exposure to conspiracy theories appears to immediately suppress people’s sense of autonomy and control and makes them less inclined to take actions that, in the long run, might boost their autonomy and control. Specifically, they are less inclined to commit to organisations and to engage in mainstream political processes such as voting and party politics.

Additionally, people were effectively persuaded by pro-conspiracy material but were not aware that they had been persuaded and falsely recalled that their pre-exposure beliefs were identical to their new beliefs.

Social motives

Research suggests people are more likely to turn to conspiracy theories when they:

  • Experience ostracism
  • Have low status e.g. because of income
  • Are on the losing side of political processes
  • Have prejudice against powerful groups and those perceived as enemies
  • Are narcissistic — an inflated view of oneself that requires external validation and is linked to paranoid ideation

Conspiracy theories may validate the self and the in-group by allowing blame for negative outcomes to be attributed to others. Thus, they may help to uphold the image of the self and the in-group as competent and moral but as sabotaged by powerful others.

Furthermore, conspiracy belief is also predicted by collective narcissism — a belief in the in-group’s greatness paired with a belief that other people do not appreciate it enough.

How well do conspiracy theories satisfy the social motives that draw people to them?

A feature of conspiracy theories is their negative, distrustful representation of other people and groups. Thus, it’s plausible that they are not only a symptom but also a cause of feelings of alienation.

Experiments show that exposure to conspiracy theories decreases trust in governmental institutions, even if the conspiracy theories are unrelated to those institutions. It also causes disenchantment with politicians and scientists.

In summary, despite the allure of conspiracy beliefs in satisfying certain epistemic, existential, and social motives, they may ultimately thwart those motives further.

So, we know “it’s possible that conspiracy belief is a self-defeating form of motivated social cognition.”[3] But, what happens when exposure to conspiracy theory content is algorithmically supported by our favourite social media networks?

While viewing YouTube videos of Donald Trump rallies during the 2016 American presidential election campaign, digital media theorist Zeynep Tufekci noticed the platform started to recommend and autoplay videos that featured “disturbing content” such as white supremacist rants and Holocaust denials.

As an experiment, she created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders. Quickly, she was being directed to leftish conspiratorial videos, including arguments about the existence of secret government agencies and allegations that the U.S. government was behind the September 11th attacks. As with the Trump videos, YouTube was recommending content progressively more extreme than what she had started with.

Tufekci even experimented with nonpolitical topics and the results were the same — a vector towards more extreme content.

“Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.”
— Zeynep Tufekci

A likely reason for this is the intersection of artificial intelligence and tech giant KPIs, generating ethics-agnostic algorithms that optimise for user engagement. And what’s engaging users? Content that’s more extreme than what they started with — even to the point of being incendiary.
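To make that mechanism concrete, here is a deliberately simplified, hypothetical ranker. Every name, field, and number below is invented for illustration — this is not YouTube’s actual system — but it shows the core point: if the objective is engagement alone, nothing in it rewards accuracy.

```python
# Hypothetical sketch of an engagement-only ranker (illustrative, not any real platform's code).
# Each candidate video gets a score from predicted watch time and click probability;
# nothing in the objective penalises misleading or extreme content.

from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # output of an engagement model (invented numbers)
    predicted_click_prob: float     # output of a click model (invented numbers)
    is_misleading: bool             # known to fact-checkers, but never used below

def engagement_score(c: Candidate) -> float:
    # KPI: expected minutes watched per impression. Truthfulness never enters.
    return c.predicted_click_prob * c.predicted_watch_minutes

def recommend(candidates: list[Candidate], k: int = 3) -> list[Candidate]:
    return sorted(candidates, key=engagement_score, reverse=True)[:k]

candidates = [
    Candidate("Mainstream climate report", 4.0, 0.10, False),
    Candidate("Scientists EXPOSED: the warming hoax", 9.0, 0.22, True),
    Candidate("IPCC summary explained", 5.0, 0.08, False),
]

for c in recommend(candidates):
    print(c.title)
# The misleading but 'sticky' video ranks first purely on engagement.
```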

The Wall Street Journal conducted an investigation of YouTube content with the help of Guillaume Chaslot, a former Google employee and YouTube recommendation algorithm whistleblower. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were observed with a wide variety of material.

Furthermore, in the lead-up to the 2016 election, Chaslot created a program to keep track of YouTube’s most recommended videos as well as its recommendation patterns. He found that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.
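Chaslot’s actual code isn’t reproduced here, but the idea behind such a tracker can be sketched roughly as follows. The `get_recommendations` function is a stub — in practice it would be implemented by scraping the watch page or via an API client — and the walk itself is just a breadth-first traversal that counts which videos keep resurfacing.

```python
# Rough sketch of following recommendation chains from a seed video,
# in the spirit of Chaslot's tracking program (not his actual code).

from collections import Counter

def get_recommendations(video_id: str, limit: int = 5) -> list[str]:
    """Stub: return the IDs of the top recommended videos for video_id."""
    raise NotImplementedError("implement via scraping or an API client")

def walk_recommendations(seed: str, depth: int = 3) -> Counter:
    """Breadth-first walk of the recommendation graph, counting how often
    each video is recommended within `depth` hops of the seed."""
    counts: Counter = Counter()
    frontier = [seed]
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            recs = get_recommendations(vid)
            counts.update(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts

# Comparing the counts from a pro-Clinton seed with those from a pro-Trump seed
# would reveal which videos the algorithm funnels both audiences towards.
```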

Whereas in the past, belief in conspiracy theories may have strictly been the result of the epistemic, existential, and social motives we explored earlier, now, mere human curiosity online can lead users down a rabbit hole of increasingly extreme content.

And, we’re all vulnerable.

In a study, it was found that participants who were exposed to a climate change conspiracy video were significantly less likely to think there is widespread scientific agreement on human-caused climate change, less likely to sign a petition to help reduce global warming, and less likely to donate or volunteer for a charity in the next six months. These results point to the socio-cognitive potency of conspiracies and show that exposure to popular conspiracy theories can have unsavoury societal consequences.[4]

It’s been well documented that major social media platforms, like Facebook, employ algorithms which support the growth of user-specific filters and social exclusion.

Yet, it’s important to distinguish between the two phenomena at play here — which both serve as mechanisms that exclude information — epistemic bubbles and echo chambers. The former is when one doesn’t encounter a contrary opinion. The latter is what happens when one doesn’t trust people from the ‘other side’.

At an algorithmic level, social media platforms can create epistemic bubbles, which keep users isolated by mainly exposing them to content posted by their affinity network, meaning content which confirms their worldview. This stems from the high level of personalisation available on these platforms: a mutually reinforcing cycle between the actions users take and the user categorisations derived from them (which serve to optimise social media ads services via demographic, interest-related, and behavioural targeting).
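To make the feedback loop concrete, here is a toy simulation. All numbers are invented and it resembles no platform’s real models, but it shows the dynamic: a user who engages slightly more with one topic is shown more of it, which produces more engagement with it, which narrows the feed further.

```python
# Toy simulation of the personalisation feedback loop (all values invented).

topics = ["climate science", "climate conspiracy"]
interest = {"climate science": 0.45, "climate conspiracy": 0.55}  # initial slight lean
feed_share = {t: 0.5 for t in topics}  # platform starts with a balanced feed

for step in range(10):
    # Platform serves content in proportion to its current model of the user...
    clicks = {t: feed_share[t] * interest[t] for t in topics}
    total = sum(clicks.values())
    # ...then re-estimates the user's preferences from what they clicked.
    feed_share = {t: clicks[t] / total for t in topics}

print(feed_share)
# After a few iterations the feed is dominated by the topic the user leaned
# towards initially: an epistemic bubble emerging from personalisation alone.
```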

Want to know how Facebook’s categorised you? Click on “Settings” within your Facebook account, then on “Ads”, and explore the sections under “Your ad preferences”.

When the predisposition to associate with those similar to us, called homophily, is exacerbated, epistemic bubbles can become echo chambers: breeding grounds for conspiratorial thinking that actively work to discredit legitimate voices of opposition.

‘Post-truth’ is a political culture in which debate is largely framed by appeals to emotion and, importantly, by repeated assertions for which factual rebuttals are ignored. One digital manifestation of post-truth culture is the creation and spread of memes that feature factually incorrect information, or information taken out of context, serving as a powerful form of propaganda.

Climate change skepticism memes will often leverage logical fallacies — and outright lies — to stir extreme emotions like outrage.

This shift towards post-truth has many causes, but chief among them is social media’s rapidly updating timelines combined with the 24-hour news cycle, which together lead to information overload. False balance is another contributor: the contrived framing of two opposing perspectives as equally credible, despite what the evidence may say, for the sake of ‘balance’, a traditional journalistic value. Balance for the sake of balance, however, is woefully out of place in today’s media landscape, as it assumes adherence to other traditional journalistic values, such as an earnest effort to remain objective as well as rigorous fact-checking.

Unsurprisingly, people are confused: A study revealed that conspiracy posts, as compared to science posts, were more likely to be liked and shared by users on Facebook.[5]

Let’s track back to the psychology of conspiracy theories we explored earlier, and ask:

How might post-truth culture and social media trigger the epistemic, existential, and social motives behind endorsing conspiracy theories?

Research suggests belief in conspiracy theories is stronger when the motivation to find patterns in the environment is experimentally heightened, among people who habitually seek meaning and patterns in the environment, when events are especially large in scale or significant and leave people dissatisfied with mundane, small-scale explanations, and when people experience distress as a result of feeling uncertain:

Globalisation, the internet, and social media have vastly broadened the scale of the news we receive. Climate change is a particularly large-scale issue — perhaps the largest — in that it affects all of civilisation. And its scale is matched only by its significance: climate change poses a serious existential threat to all of humankind.

Furthermore, operating in a post-truth culture, people are overwhelmed with information and unclear about what’s true and what’s not.

Research suggests people are more likely to turn to conspiracy theories when they are anxious, feel powerless, lack sociopolitical control, and lack psychological empowerment:

The link between social media use and anxiety and other psychological disorders has been well-documented. Specifically, the compare-and-despair factor: Exposure to our friends’ carefully curated lives online can lead to anxieties around personal failure. Additionally, the fear of missing out can be triggered when exposed to content pertaining to events one was not invited to. It’s also been hypothesised that people who use social media more than average are actually more anxious to begin with, generating an unhealthy cycle.

Research suggests people are more likely to turn to conspiracy theories when they experience ostracism, have low status e.g. because of income, are on the losing side of political processes, have prejudice against powerful groups and those perceived as enemies, and are narcissistic — an inflated view of oneself that requires external validation and is linked to paranoid ideation:

Spending a lot of time alone, in front of a computer, is likely a reality for many people who are socially excluded. In order to feel included, users might turn to online communities, where anonymity can act as a social lubricant.

“On the Internet, nobody knows you’re a dog” is an adage about internet anonymity which began as a cartoon caption by Peter Steiner published in The New Yorker in 1993. The cartoon highlights the idea that signals contained within one’s physical appearance and presentation, like wealth, can be absent from online interactions and thus insulate the user from possible prejudice. Optimistically, if naively, the internet might be seen as a powerful democratising force, giving everyone a space to voice their opinions and have their opinions amplified. Thus, engaging online might be especially appealing to those looking to transcend ostracism, their perceived low status, etc.

As we’ve explored, post-truth culture and social media are uniquely poised to trigger the epistemic, existential, and social motives behind endorsing conspiracy theories.

Remember the Fisher King? Here’s how the rest of the story goes:

Sir Perceval comes across a strange, ruined land and meets the Fisher King, who invites him to stay at his castle. While there, he witnesses a procession: Youths carrying a lance and a maiden with a golden cup. Perceval wants to ask about these objects, but instead remains silent for fear of offending the king. The next morning, Perceval wakes up and discovers everyone is gone. He leaves the castle, which then disappears. Later, he encounters a woman who tells him the cup was in fact the Holy Grail and chastises him for failing to ask the Fisher King whom the grail served, as this question would have earned him the grail and healed the king and his lands. If only Perceval had simply overcome his chivalric training.

When it comes to our own dying lands, can we apply some of these same lessons? And, how can we overcome our own social conditioning and homophily? Hard as it might be, maybe it looks something like destigmatising people who believe in conspiracy theories and engaging with them empathetically.

One need not endorse conspiracy theories for this to happen. It was found that users exposed to a conspiracy theory, and then asked whether they believed in it, reported they were less likely to believe in the conspiracy than the control group. Thus, asking questions could act as a subtle correction to a conspiracy theory.[6]

A study offers five suggestions on how scientists and environmentalists ought to respond to climate skeptics:[7]

  • Use lay terminology to better interact with the public and avoid charges of elitism
  • Better elucidate issues of scientific uncertainty and climatic unpredictability by focusing on the known elements, including known minimum risk
  • Ensure peer review processes are transparent, accountable, and welcoming of healthy skepticism
  • Scientists must also become debaters and public speakers, by developing rhetorical skills
  • Scientists must be prepared to counter criticism actively rather than letting their research offer its own defence

I posit these approaches may be successful because they undermine some of the key motivations for people to endorse conspiracy theories: social exclusion, but also prejudice against powerful groups and those perceived as enemies, uncertainty, powerlessness, and a lack of sociopolitical control and psychological empowerment. I would also emphasise the need for these approaches to be handled with radical empathy, and even optimism.

Furthermore, it was found that an echo chamber doesn’t destroy its members’ interest in the truth; it merely manipulates whom they trust, changing which sources and institutions they accept as trustworthy. This is encouraging because it means there might be a way to empower people with strategies for successfully parsing truth from fiction and for identifying trustworthy sources of information.

For this, I propose increased media literacy.

Media literacy scholar Todd Wiebe carefully notes this involves more than just knowledge of how to search. Information literacy draws on a repertoire of critical inquiry skills. It involves knowing there are different types of information, each with its own origin and purpose, and habitually evaluating, questioning, and verifying what you find.[8]

Media literacy can also involve increased faith in credible news sources, especially those which regularly debunk fake news such as PolitiFact and Snopes.

Here’s an example of a widely circulated meme which was debunked on Snopes:

The top image in the meme, falsely labelled as a lithium mine, actually shows a copper mine, while the bottom photograph shows a type of oil sands drilling site that isn’t really comparable to a copper mine. Instead of selecting a picture of an open tar sand pit, the creator of this image chose a ‘cleaner’ photo depicting an oil sands facility which operates deep underground with little surface disruption.

How can we educate the public about media literacy? One obvious answer would be to add it to school curriculums or even just have an increased focus on critical thinking skills.

We could also attack the problem at its source, integrating media literacy directly into social media platforms, perhaps even as a necessity for using them. Such an endeavour would, of course, be up to the social media companies to adopt, but it’s conceivable as an act of corporate social responsibility. The incentives are there, especially in light of some of the more self-reflective dialogues — or perhaps less generously, mea culpa and virtue signalling as a PR strategy — emerging from large platforms like Facebook.

Following the 2016 U.S. presidential election, there was increased speculation that social media platforms like Facebook may be supporting the spread of ‘fake news’, creating misinformed voters, and ultimately, hindering democracy. Mark Zuckerberg released the following statement:

“We don’t want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here.”
— Mark Zuckerberg

Indeed, more can be done — at an algorithmic level specifically. What if instead of supporting epistemic bubbles, social media algorithms opted for increased heterophily? And, this new heterophilic content algorithm could take a few forms. One might optimise for randomness. Another could try to bridge the partisan divide, specifically, by a regression towards the centre in terms of political content exposure.
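As a thought experiment, here are minimal sketches of both variants. The `Item` fields, scores, and parameters are invented for illustration and are not any platform’s actual ranking signals; each item simply carries a political lean in [-1, 1], where 0 is the centre.

```python
# Hypothetical sketches of the two heterophilic re-rankers proposed above
# (illustrative only; not any platform's actual algorithm).

import random
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    lean: float        # -1 far left ... 0 centre ... +1 far right
    relevance: float   # personalised relevance score

def rerank_with_randomness(items: list[Item], explore: float = 0.3) -> list[Item]:
    """Mostly keep the personalised order, but swap a fraction of slots
    with randomly chosen items to break the epistemic bubble."""
    ranked = sorted(items, key=lambda i: i.relevance, reverse=True)
    for idx in range(len(ranked)):
        if random.random() < explore:
            swap = random.randrange(len(ranked))
            ranked[idx], ranked[swap] = ranked[swap], ranked[idx]
    return ranked

def rerank_towards_centre(items: list[Item], pull: float = 0.5) -> list[Item]:
    """Penalise items in proportion to how far they sit from the political
    centre, nudging the user's average exposure towards lean = 0."""
    return sorted(items, key=lambda i: i.relevance - pull * abs(i.lean), reverse=True)
```

Either variant trades a little personalised relevance for broader exposure; how much to trade is a product and policy decision as much as an engineering one.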

And, we might yet have a solution for YouTube’s “radicalising” recommendation algorithm: It was found that even when exposed to factually incorrect information, the presence of correcting information in a ‘related stories’ section resulted in a decrease in factually incorrect user beliefs.[9]

The truth is that mass opinion doesn’t lead policy; rather, it’s the other way around. And political predisposition is the most influential factor in determining belief in climate change. Thus, in addition to the aforementioned strategies, a pragmatic approach to climate change is proposed when communicating with policymakers.

The Niskanen Center, a libertarian-leaning Washington, DC, think tank, is trying to build support for the passage of an ambitious federal carbon tax through discussions with Washington insiders, with a particular focus on Republican legislators and their staff. They’re doing so by crafting fact-based arguments and policies designed to appeal specifically to their political interests. They also identify mutually beneficial strategies: Many of the same steps which will cut greenhouse-gas emissions will also promote technological innovation, energy independence, national security, air quality, health, jobs, etc.

The story of the Fisher King and Perceval was never finished, so the outcome of the knight’s quest isn’t known. Most critics believe he would have been ultimately successful, and most days, so do I. But I’m also haunted by the Fisher King. I think about that cut bleeding out; about the king’s lands dying relentlessly, and forever.

And, we don’t have forever.

There’s an increasing urgency for us to agree and act on climate change. But first, we need to do the hard work of lifting others — and ourselves — out of self-defeating behaviours. Radical empathy, media literacy, algorithmic change, and a top-down policy approach could indeed usher in a more open and inclusive social media culture — one which empowers its users online and off.

  1. ^ Douglas, Karen M., Sutton, Robbie M. & Cichocka, Aleksandra. (2017). “The Psychology of Conspiracy Theories”. Current Directions in Psychological Science, (pp 538–542).
  2. ^ Douglas, Karen M., Sutton, Robbie M. & Cichocka, Aleksandra. (2017). “The Psychology of Conspiracy Theories”. Current Directions in Psychological Science, (pp 538–542).
  3. ^ Douglas, Karen M., Sutton, Robbie M. & Cichocka, Aleksandra. (2017). “The Psychology of Conspiracy Theories”. Current Directions in Psychological Science, (pp 538–542).
  4. ^ van der Linden, Sander. (2015). “The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance”. Personality and Individual Differences, (pp 171–173).
  5. ^ Bessi, Alessandro, Coletto, Mauro, Davidescu, George Alexandru, Scala, Antonio, Caldarelli, Guido & Quattrociocchi, Walter. (2015). “Science vs Conspiracy: Collective Narratives in the Age of Misinformation”. PLOS ONE.
  6. ^ Einstein, Katherine Levine & Glick, David M. (2014). “Do I think BLS data are BS? The consequences of conspiracy theories”. Political Behavior, (pp 679–701).
  7. ^ Bricker, Brett Jacob. (2013). “Climategate: A Case Study in the Intersection of Facticity and Conspiracy Theory”. Taylor & Francis Online, (pp 218–239).
  8. ^ Wiebe, Todd J. (2016). “The Information Literacy Imperative in Higher Education”. Association of American Colleges & Universities.
  9. ^ Bode, Leticia & Vraga, Emily K. (2015). “In Related News, That Was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media”. Journal of Communication, (pp 619–638).