12 Keywords: Unlocking Internet Manipulation

Users are turning into amateur sleuths amid a muddied digital landscape, always on the search for the “real” truth.

Miriam Attal
Foundation for a Human Internet
4 min read · Nov 24, 2020


Extremists, conspiracy theorists, and hate groups have dozens of strategies to spread their message online, and most of us have seen them a time or two. What makes their tactics so hard to combat is that manipulators know how the Internet works — its structure and rules — and they purposefully exploit them to their advantage.

Graphic by Olivia Velten-Lomelin

But, as we’ll see, manipulation is a two-way street, and activists are starting to take up these strategies.

Understandably, it can get a little overwhelming. Here are some lesser-known terms to get you discussing media manipulation like a pro:

Keyword Squatting: Creating content or accounts around specific SEO terms to manipulate the search results for those terms. Online traffic is directed to these pages instead of to authentic content.

In the wake of Stephen Hawking’s death, the March For Our Lives Facebook page changed its name to “RIP STEPHEN HAWKING 2018.” Users who searched for the physicist were redirected to this “imposter” content.

Taking Over Hashtags: Similar to keyword squatting, manipulators flood a hashtag to make it trend with counter content, often through fake accounts.

Recently, LGBTQ+-positive photos drowned out hate content from the violent far-right group the Proud Boys by flooding the #ProudBoys hashtag.

Astroturfing: Concealing campaign operators to make the campaign seem grassroots with widespread support, often by way of multiple online identities and fake groups.

Data Voids: Search terms with few results, which can be populated with curated content to push a narrative. For example, if the search “is Obama Muslim?” has no results, a manipulator can create content affirming the claim, unchallenged by any contradicting narrative.
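To make the mechanism concrete, here is a toy Python sketch of a data void. The search index, titles, and query are all hypothetical illustrations, not real search-engine internals: the point is only that when no authentic content matches a query, whoever fills the void owns the entire result page.

```python
# Toy model of a "data void": a tiny search index where a niche query
# matches nothing until a manipulator seeds content for it.
# All documents and queries here are hypothetical illustrations.

def search(index, query):
    """Return all documents whose keyword string contains the query."""
    return [doc for doc in index if query in doc["keywords"]]

index = [
    {"title": "Obama biography", "keywords": "obama president biography"},
]

void_query = "obama secret religion"

# The niche query is a data void: no authentic content matches it.
print(search(index, void_query))  # -> []

# A manipulator fills the void; their page is now the ONLY result,
# so searchers see a single, uncontradicted narrative.
index.append({"title": "Misleading page", "keywords": "obama secret religion"})
print(search(index, void_query))  # -> only the manipulator's page
```

The asymmetry is the whole trick: authentic publishers have no reason to write about an obscure query, so the first party to target it faces zero competition.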

Deepfake: Media generated by machine learning to mimic human faces, bodies, or voices, often imitating real figures. Deepfakes require advanced technology to produce and are therefore expensive.

Video comparing a deepfake and cheapfake of Nancy Pelosi. Source: The Guardian

Cheapfake: Media altered through affordable and commonplace technology, like iMovie or Photoshop.

Information Laundering: A three-step process for making false information appear credible:

a. Placement: Post false content online

b. Layering: Spread the disinformation to credible sources

c. Integration: Disinformation becomes part of mainstream and credible sources

Trading Up the Chain: How do layering and integration work? By trading up the chain. Content is sown in less credible sources (like small blogs), then amplified by increasingly credible sources until it appears valid.
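The laundering chain above can be sketched as a toy simulation. The outlets and credibility scores are hypothetical numbers chosen for illustration; the sketch only shows the shape of the process, where a claim's apparent credibility tracks the most credible outlet that has repeated it.

```python
# Toy simulation of "trading up the chain": a claim planted in a
# low-credibility outlet is republished by progressively more credible
# ones. Outlets and tier scores are hypothetical illustrations.

chain = [
    ("anonymous blog",    0.1),   # placement: seed the false content
    ("partisan site",     0.4),   # layering begins
    ("local news outlet", 0.7),
    ("national paper",    0.9),   # integration: mainstream coverage
]

def trade_up(chain):
    """Yield (outlet, apparent_credibility) as each tier republishes.

    The claim never gets truer; it just inherits the credibility of
    the best outlet that has carried it so far.
    """
    apparent = 0.0
    for outlet, credibility in chain:
        apparent = max(apparent, credibility)
        yield outlet, apparent

for outlet, apparent in trade_up(chain):
    print(f"{outlet}: claim now looks {apparent:.0%} credible")
```

Note what the `max` captures: even if a mid-tier outlet later debunks the story, the original claim has already borrowed the legitimacy of every source that repeated it.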

Misinfographic: Infographics with false facts or data, sometimes borrowing established brand aesthetics to appear authentic.

Misinfographic on Instagram with false information about child sex trafficking

On Instagram, infographics tagged #SaveTheChildren raise awareness about child sex trafficking. However, these posts stem from the QAnon conspiracy theory that Democrats and celebrities run a child sex trafficking ring. Not only are these infographics full of false statistics, but they also took over the hashtag from non-profits that use it to raise awareness about child emergency relief.

Evidence Collages: Collages of screenshots and text compiled into a document and presented as evidence. This encourages sleuthing, but often ends with online harassment and false claims.

After the Unite the Right rally, evidence collages abounded on conservative Reddit in an attempt to identify the “leftist” driver of the car attack that killed protester Heather Heyer. This collage misidentified an innocent man, who suffered online harassment even after he was cleared.

Evidence collage from the Unite the Right Rally. Identifying information has been obscured. Source: mediamanipulation.org

Viral Sloganeering: Creating short, memorable phrases to spread an idea. These slogans highlight or sow social divisions and spread virally through memes, hashtags, posters, and videos. They are especially useful when combined with keyword squatting, or when they fill data voids.

This is a very popular tool. To name a few: #SaveTheChildren (QAnon), #LockHerUp (targeting Hillary Clinton), or #SendHerBack (targeting Ilhan Omar). “It’s OK to be white” flourished on 4chan, turning into a flier campaign. This was taken up by white supremacists and continues to be a popular slogan.

Butterfly Attack: Like butterflies that mimic the fluttering patterns of other species, manipulators copy the social behaviors of a group and infiltrate their communities to sow disinformation and divisive rhetoric.

Many of these strategies rely on gaining exposure and attention, because coverage by the mainstream media serves any manipulator’s cause (whether you agree with them or not). Users are turning into amateur sleuths amid a muddied digital landscape, always on the search for the “real” truth. The problem is, when we don’t know what to trust, we turn to our own emotions and biases.

But here’s what we can do: question the Internet infrastructure that allows these strategies, use tools that ensure one digital identity per person, and encourage journalists to practice strategic silence. It may be tempting to cover the absurdity of new and viral misinformation, but mostly that just fuels the fire.

What’s humanID?

humanID is a new anonymous online identity that blocks bots and social media manipulation. If you care about privacy and protecting free speech, consider supporting humanID at www.human-id.org, and follow us on Twitter & LinkedIn.

All opinions and views expressed are those of the author, and do not necessarily reflect the position of humanID.


Miriam Attal
University of Michigan 2020 | Communications & Media Studies | Research and Marketing at humanID.