OSPI Explains: Misinformation, Disinformation, and How to Stop the Spread

Misinformation and disinformation can be found across the internet and social media, often blending in with reliable information. So how can you tell what’s true?

There are practices that all content consumers can put in place to safeguard against misinformation and disinformation. And it’s increasingly important for everyone to take part in stopping the spread of falsehoods and questionable content.

Below, the Office of Superintendent of Public Instruction (OSPI) defines misinformation and disinformation, explains what to do when you encounter questionable information, and describes the tactics that content creators use to pass off falsehoods as reliable.

What is misinformation?

Misinformation is false or misleading information.

People who share misinformation often do so unintentionally and don’t realize that the content is incorrect. The spread of misinformation commonly begins with the intentional efforts behind disinformation: content created deliberately to mislead is then passed along by people who believe it is true.

Some examples of misinformation include:

  • Decontextualization — taking text or an image out of its original context and putting it into a new, false context, thus changing the original meaning
  • Sponsored content — advertising or marketing content disguised as unbiased and factual
  • Rumors — gossip and other unverified stories

What is disinformation?

Disinformation is false or misleading information that is deliberately spread for a malicious purpose.

People who spread disinformation are intentionally misleading audiences, attempting to discredit a person or organization, or generating fear or anger. Disinformation harms the public by promoting distrust in verified sources of factual content.

Some examples of disinformation include:

  • False studies or articles — publishing false information under the guise of credible academic journals or sources
  • Fake news — purposefully fabricated content that mimics the format of reputable news
  • Deepfakes — images, videos, or audio created with machine learning or artificial intelligence to depict people or events that never actually existed
  • Conspiracy theories — spreading false information to grow distrust in government and democratic institutions

What should I do when I encounter information that seems questionable?

Disinformation campaigns play into our psychology. It’s not possible to sort through the massive amounts of information in our world, so our brains rely on cognitive shortcuts. These shortcuts lead us to assume that if something is familiar, it must be good and trustworthy. Our brains also tend to believe information that we’ve been exposed to in the past.

To avoid getting swept up by misinformation and disinformation, we must stop, investigate the source, find better coverage, and trace the original content.

This is known as the SIFT method. Created by research scientist Mike Caulfield, this method encourages us to slow down, sidestep those cognitive shortcuts, and take a closer look at the information presented.

Here’s how to practice the SIFT method:

  • Stop. Ask yourself some questions: Do you know this website or source of information? What is the reputation of both the claim and its source? If you don’t have this information, proceed with the rest of the steps in the method.
  • Investigate the source. The idea is to know what you’re reading or viewing and where it’s coming from. Do some quick searching to learn whether the source produces truthful content, whether it has any political biases, and how it’s funded.
  • Find better coverage. Leave the source you originally landed on and search for others. This is also known as lateral reading, which helps you find more reliable sources and a better variety of them.
  • Trace the original content. When you encounter quotes, images, or videos that have been taken out of context, find their original source to learn what was presented before or after that content.

What tactics do agents of disinformation use?

Some of the common tactics found in disinformation include impersonation, exploiting emotion, polarization, amplifying conspiracies, discrediting opponents, and trolling.

Impersonation occurs when someone imitates a reputable source, for example by using a name similar to that of an existing organization or by inventing a new name that sounds convincing.

Disinformation agents will also exploit our most basic emotions — namely, fear and anger. Negative information is more likely to go viral because it automatically activates our threat responses.

Polarization happens when issues are presented as clear-cut and can often be politically motivated. Disinformation agents will choose a side and demonize their target as much as possible.

Conspiracy theories, which attempt to explain significant events as secret plots, can shape an individual’s entire worldview. Disinformation agents capitalize on this by aligning their content with conspiracy theories, ultimately manipulating audiences to lose trust in the institutions that protect them.

Disinformation agents will also discredit their opponents — namely, reliable sources that make truthful claims. They attack the sources themselves with fully fabricated claims that nonetheless lead audiences to spread the disinformation further.

Trolling is when social media users create posts or comment on posts with the sole intention of upsetting others. Trolling can include the types of disinformation mentioned above, such as fake news and deepfakes.

To learn more about these tactics, play the Bad News game developed with scientists from Cambridge University.

How does Washington state prepare students for encountering questionable information?

OSPI has supported Washington schools in teaching media literacy and digital citizenship since 2008, when the State Legislature tasked OSPI with developing learning standards for technology literacy and technology fluency (RCW 28A.655.075). The current OSPI project to review the state’s learning standards is looking at ways to integrate media literacy and digital citizenship into the learning standards for other content areas, such as English language arts.

School districts are required to provide opportunities for students to develop the knowledge and skills needed to integrate technology literacy and technology fluency — including information skills and digital citizenship. To support educators in their teaching, OSPI offers professional development opportunities and open educational resources (teaching and learning materials that are available in the public domain for free use or adaptation) on media literacy and digital citizenship.

This information was compiled by Chelsea Embree, Director of Publications and Engagement Strategy at OSPI, and Zac Murphy, Director of Multimedia and Information Strategy at OSPI, with support from OSPI staff. You can contact the Communications Team at commteam@k12.wa.us.

--

Office of Superintendent of Public Instruction

Led by Supt. Chris Reykdal, OSPI is the primary agency charged with overseeing K–12 education in Washington state.