Do you think fake news can be killed with the truth? Think again

Published in ART + marketing · Dec 8, 2016

By Sharon Pian Chan and Andre Alfred

The presidential election exposed deep divisions in the country: among families and friends, in the workplace and in the classroom.

BuzzFeed’s recent findings about the power of fake news are particularly troubling. The 20 most-read fake stories got more traffic than the top 20 stories reported by credible news organizations that verify facts and validate stories.

In fact, people writing fake news are making more money than journalists committed to reporting the truth, according to Seattle Times columnist Danny Westneat, who talked to the people behind a fake news site in Seattle called Bipartisan Report.

Fake news sent a man with an assault rifle to a pizza shop in Washington, D.C., searching for a fictional child sex ring connected to Hillary Clinton. (Check out The Washington Post’s story.)

What are the forces behind the creation and, let’s face it, widespread consumption of lies?

We decided to combine our engineering and journalism backgrounds to take a deeper dive into how fake news feeds on itself, using systems thinking, a tool we have been studying at MIT for the past year.

Our goal is to spark a more constructive dialogue about societal problems.

Sharon is an editor at The Seattle Times. Andre works at Microsoft and manages complex information technology systems. Both of us are also full-time students, pursuing Executive MBAs at MIT’s Sloan School of Management.

What we found is this: The truth is not enough to kill fake news.

Gird yourself for some math and science and we’ll show you why.

Systems thinking on fake news

So what is systems thinking? System dynamics is a method of dissecting the forces behind systems, whether social, managerial, economic or ecological. It can be used as a tool for examining the complex causes behind seemingly intractable problems such as climate change, homelessness and terrorism, and for identifying possible interventions.

The basic building block of a systems map is a causal loop. Here is a causal loop showing how the demand for partisan news sources drives Facebook filtering usage and reinforces confirmation bias filtering. Confirmation bias is how human brains react to information that contradicts our beliefs: rather than forcing us to reconsider those beliefs, the contradictory information causes us to cling to them even more passionately.

The plus signs indicate that when the variable at the source of an arrow increases, the variable at the target of the arrow increases as well: as “demand for partisan news sources” rises, so does “Facebook filtering usage.” As “Facebook filtering usage” increases, “confirmation bias filtering” increases. Confirmation bias filtering then drives the demand for partisan sources even higher. Because the loop amplifies itself, it is a reinforcing loop, which we’ve labeled “R.”

This loop shows that fake news persists not because of Facebook, but because of how human brains behave.

We identified a second reinforcing loop in the diagram below:

As Facebook filtering usage increases, hyperpartisanship increases, which in turn increases confirmation bias filtering.

These two loops reinforce one another, which leads to exponential growth across both loops.
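If you want to see how reinforcing loops produce exponential growth, here is a toy simulation in Python. It is a rough sketch of our own, not the model behind our map, and the 10 percent gain per step is an illustrative assumption.

```python
# A toy model of the reinforcing loops above. The variables mirror the map;
# the gain per step is our own illustrative assumption.

partisan_demand = 1.0     # demand for partisan news sources
filtering_usage = 1.0     # Facebook filtering usage
confirmation_bias = 1.0   # confirmation bias filtering

GAIN = 0.1  # assumed fractional boost passed along each arrow per step

for step in range(10):
    filtering_usage += GAIN * partisan_demand    # demand -> filtering (+)
    confirmation_bias += GAIN * filtering_usage  # filtering -> bias (+)
    partisan_demand += GAIN * confirmation_bias  # bias -> demand (+), closing the loop
    print(step, round(partisan_demand, 2), round(filtering_usage, 2),
          round(confirmation_bias, 2))
```

Each variable grows a little faster on every pass around the loop, which is the signature of a reinforcing loop.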

Stocks and flows

The next building blocks of a systems map are stocks and flows. If you’re an engineer like Andre, think of them as a transformation of state at a variable rate of change.

If, like Sharon, you’re not an engineer, then think of stocks as bathtubs and flows as the faucets and drains that increase or decrease the water in each tub.

Here is a stock and flow diagram of fake news and real news.

A stock is indicated by a rectangle. A flow is indicated by a valve. Fake news is a stock. The flow that determines the stock of fake news is the fake news creation rate.

Relevant quality news is a stock. The flow that determines the stock of relevant quality news is the journalism creation rate.

The stocks of fake news and real news flow into the stock of total media.
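To make the bathtub analogy concrete, here is a small sketch of how stocks accumulate their flows over time. The rates are made-up numbers for illustration, not values from our map.

```python
# A toy stock-and-flow ("bathtub") model. Each stock accumulates its inflow
# over time; both creation rates are made-up illustrative numbers.

DT = 1.0                         # time step, say one week
fake_news = 0.0                  # stock: fake news in circulation
quality_news = 0.0               # stock: relevant quality news

fake_news_creation_rate = 12.0   # flow into the fake news stock (assumed)
journalism_creation_rate = 8.0   # flow into the quality news stock (assumed)

for week in range(10):
    fake_news += fake_news_creation_rate * DT
    quality_news += journalism_creation_rate * DT
    total_media = fake_news + quality_news   # both stocks feed the total media stock
    print(week, fake_news, quality_news, total_media)
```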

Now you put it all together into a systems map

When we add the causal loop diagrams to the stock-and-flow diagram, we can see the relationships among the demand for partisan news sources, Facebook filtering and confirmation bias, and how the system feeds on itself to drive the creation of fake news.

The demand for partisan news sources creates a third reinforcing loop that drives up the fake journalism creation rate. The fake news created then increases the confirmation bias filtering.

Demand for partisan news sources drives down the demand for truthful journalism, as the map shows. As that demand goes down, the number of journalists producing truthful journalism decreases, and the amount of truthful journalism keeps shrinking.

This is why fake news cannot be taken down by the truth. The multiple reinforcing loops keep feeding the creation of fake news.
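To see why, we can put the reinforcing loops and the stocks together in one toy simulation. Again, the numbers and functional forms are our own simplifying assumptions, not the Vensim model; the point is only the qualitative behavior.

```python
# A toy version of the full map. Fake news output grows with partisan demand
# (the reinforcing loops), while quality journalism output shrinks as demand
# for truth shrinks. All rates and gains are illustrative assumptions.

DT = 1.0
fake_news, quality_news = 10.0, 10.0
partisan_demand, truth_demand = 1.0, 1.0
LOOP_GAIN = 0.15   # assumed strength of the reinforcing loops
DECAY = 0.05       # assumed rate at which demand for truth erodes

for week in range(20):
    partisan_demand *= 1 + LOOP_GAIN   # reinforcing loops amplify partisan demand
    truth_demand *= 1 - DECAY          # partisan demand crowds out demand for truth

    fake_creation_rate = 5.0 * partisan_demand       # fake output follows partisan demand
    journalism_creation_rate = 10.0 * truth_demand   # quality output follows demand for truth

    fake_news += fake_creation_rate * DT
    quality_news += journalism_creation_rate * DT

print(round(fake_news), round(quality_news))  # the fake news stock ends several times larger
```

In this sketch, quality journalism starts out being produced at twice the rate of fake news, yet the reinforcing loops overwhelm it within a few months. Adding more truth changes the starting point, not the dynamics.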

So what’s to be done? The counterforce is balancing loops. We would like to explore those in our next post, with your help. We want your ideas for solutions that would make this system better for society at large. Email us at mitloops@outlook.com

Thank you to MIT System Dynamics Professors John Sterman and Nelson Repenning for their feedback on the creation of this map. The maps were drawn with Vensim, a software package made by Ventana Systems, Inc.

Want more system dynamics? The field was created by MIT Professor Jay Forrester, who passed away in November at the age of 98. His New York Times obituary gives a good overview of what system dynamics is about and how it gave rise to computer modeling that examines the behavior of social systems.

Want even more? Sign up here for email alerts when we write about systems thinking again.

Correction: In an earlier version of this story, the systems map incorrectly showed a negative correlation between “demand for independent journalism” and “investment in independent journalism.” The map has been updated to show that the two are positively correlated.

Sharon Pian Chan and Andre Alfred are Executive MBA students at MIT Sloan School of Management. Sharon is a Deputy Managing Editor at The Seattle Times. Andre is a Principal Program Manager at Microsoft specializing in Cybersecurity. The views expressed are his own.
