Can the Tools of Fake News Fight Fake News?

Michael Braithwaite
Published in The Tilt
Oct 30, 2017 · 10 min read

By Any Other Name

Propaganda has gotten quite the profile boost in 2017, and a whole new brand as Fake News with an accompanying trending hashtag filled to bursting with misinformation, disinformation, and angry confusion. That’s the key to propaganda’s makeover: tech-driven tools that make a huge difference in amplification and scale, and that have in turn enabled the “post truth” era.

This story starts with your data. Each one of us constantly leaks massive amounts of data about ourselves, data that is collected by servers all over the world, mined, and exploited for countless purposes. Think adtech is only used to slip luxury-sheet ads into your Instagram feed? Think again.

While there are an incredible number of factors contributing to the propaganda makeover, I want to talk specifically about Facebook because of its immense digital reach and the fact that it’s one of the primary tools being used by the fake news machine. In particular, Facebook’s like (installed on sites far and wide), share, and reaction functions don’t just help fake news spread, but provide much of the big data infrastructure that feeds the 21st Century’s fake news business.

Disclaimer: I’m not a data scientist, I’m a writer and human rights storyteller. Because I’m not a data scientist, I highly recommend clicking through the links to actual data experts I provide throughout this article in order to get a deeper understanding of what’s driving the fake news phenomenon and how. Because I’m focusing here on the roots of fake news — namely, weaponized data and anemic-to-nonexistent privacy laws — I don’t discuss media literacy or individual privacy strategies.

Now. Into the abyss.

It’s a Great Time to be Propaganda

Unlike the propaganda of old featured in movies about the big wars of the 20th Century, the propaganda/fake news of the 21st Century is insidious and constant, fed by big data and enabled by the well documented, extensive distrust of government, media, business, and even NGOs. This distrust is the fallout of nearly a century of neoliberal thought and economic practices that have reordered the world into market-based, competition-fueled groups of economic “winners” and a whole lot of “losers.” Fake news rides this wave of mass distrust and growing outrage and mixes it with something new: digital dissemination that capitalizes on human tribe mentalities — “if someone I know and trust shared this, then it must be true; if someone I know and love is outraged by this, then I’m outraged.”

I’m part of the newly distrusted NGO category. I’ve spent the last decade writing for publications and working with foundations and organizations of every stripe to find ways to leverage language for the greater good in the form of facts-turned-compelling-stories, all with the goal of holding governments and various public systems accountable. The fake news phenomenon represents a significant systemic problem for those of us fighting for more justice for more people because it’s turned the simple notion of facts on its head. Take recent examples from Southeast Asia, where government leaders are following Trump’s lead and using fake news as a weapon:

The region already has a scourge of fake local news stories clogging Facebook and messaging apps. But political leaders’ unverified claims and lack of supporting evidence are making it even harder for news outlets and citizens to distinguish the real news from the fake.

[In the Philippines] a pro-government online group, the Duterte Cyber Warriors, collects the names of Facebook users who criticize the government and name them purveyors of false information. Their crowdsourced complaints have pushed Facebook to suspend the accounts of television anchor Ed Lingao and The Philippine Daily, which was taken down by the social network three times.

Note the prevalence of Facebook. Note that Facebook also dominates the adtech world.

Adtech is Your New Truth

Source: David Carroll, New School (https://docs.google.com/presentation/d/1KfDEskzpA2LBtUo2REnmLNyZGNpEJJFma4URLIum604/edit#slide=id.p)

It’s an advertiser’s world and the game is won by those who can get their facts or “facts” to the top of the search results, into the most newsfeeds, shared the widest, and Facebook-reacted to with a passion, because that’s what creates more data to mine, exploit, and create behavioral predictions from.

Much of my work is focused on helping human rights defenders get their messages in front of the audiences that can help them leverage change. In the last five years, that strategy has moved increasingly into the realm of Facebook ads. This is in part because the days of reaching large audiences through Facebook for free are over by the platform’s own design, but also because Facebook collects an incredible amount of data on its users that it then makes available to advertisers. For $100, you can use this data to target the precise demographic you want to reach, and within a few weeks 30,000 of the right people have seen your advocacy video, new report, etc.
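The arithmetic behind that reach is simple cost-per-thousand-impressions (CPM) math. As a back-of-the-envelope sketch (the CPM figure here is hypothetical, not Facebook’s actual rate, which varies widely by audience and season):

```python
def estimated_impressions(budget_usd: float, cpm_usd: float) -> int:
    """Estimate ad impressions from a budget and a cost per 1,000 impressions."""
    return int(budget_usd / cpm_usd * 1000)

# A hypothetical CPM of $3.30 turns a $100 budget into roughly 30,000 views.
print(estimated_impressions(100, 3.30))  # → 30303
```

The point is that targeting, not budget, is what adtech really sells: the same $100 buys wildly different audiences depending on how precisely the demographic is sliced.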

We now know just how much these social ads also played a critical role in helping to spread fake news/propaganda in last year’s U.S. election. I refer you to Berit Anderson, the CEO and editor-in-chief of Scout.ai, which explores the near-term implications of technology by combining analysis and science fiction:

By leveraging automated emotional manipulation alongside swarms of bots, Facebook dark posts, A/B testing, and fake news networks, a company called Cambridge Analytica has activated an invisible machine that preys on the personalities of individual voters to create large shifts in public opinion.

Creepy!

As the Research Director at Columbia’s Tow Center for Digital Journalism, Dr. Jonathan Albright has also spent more hours than I care to imagine researching who and what actually hacked the election. If you want to really go down the rabbit hole of just how much adtech is now in charge of the information we get, check out his enlightening/disturbing findings. Here’s a choice selection:

Scores of highly sophisticated technology providers — mostly US-based companies that specialize in building advanced solutions for audience “identity resolution,” content tailoring and personalization, cross-platform targeting, and A/B message testing and optimization — are running the data show behind the worst of these “fake news” sites.

These firms and the technologies they design specialize in creating our life bubbles — and it’s not just based on a few links we click on or an article we share. It’s based on everything we buy, what we talk about, when we talk about it, everywhere we go, what time we eat lunch, what size of Starbucks we order, the GPS location where we read a political news story, the way we hold our phones, and the people we associate with. Online and offline. It’s based on the way we live.

Yikes, right? Next stop, The Matrix.
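The “A/B message testing and optimization” Albright describes boils down to a simple loop: show several message variants to small slices of an audience, measure which one draws the most engagement, then spend the remaining budget amplifying only the winner. A deliberately simplified sketch, with entirely hypothetical variant names and numbers:

```python
# Hypothetical results from a small test audience:
# (message variant, impressions served, clicks received)
test_results = [
    ("outrage-framing", 1000, 82),
    ("fear-framing", 1000, 54),
    ("hope-framing", 1000, 31),
]

# Select the variant with the highest click-through rate; the real systems
# run this loop continuously, per audience segment, at massive scale.
winner = max(test_results, key=lambda r: r[2] / r[1])
print(winner[0])  # prints: outrage-framing
```

Nothing in the loop cares whether a message is true, only whether it performs, which is why emotionally provocative framings tend to win.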

So if adtech is the new court in charge of shaping public opinion, can we advocates and storytellers use the same strategies to sway public opinion towards justice?

That was the question I had in mind when I attended the Reality Jamming session at the New School’s NYC Media Lab several weeks ago. Dr. Albright was there, along with Joan Donovan from Data + Society; Matt Jones from the Columbia Department of History; and Sam Thielman, a journalist at Talking Points Memo.

We Are All Fake News in a Mirror in a Mirror Forever

Source: New Media Advocacy Project (https://www.instagram.com/p/BVFgnlEhTu9/?taken-by=newmediaadvocacyproject)

Both Donovan and Albright agreed with the consensus among data scientists: Big data is at the center of the fake news firestorm, and there are two big fuel sources feeding the fire.

The first is the total lack of data privacy inherent to our capitalist tech culture (clue: in the U.S., your privacy and personal info are regulated by the Federal Trade Commission, making them a commodity). The lack of any right to data privacy means that all of our information, down to our daily movements, is up for grabs to anyone with the funds and skill set to interpret it and turn it into grist for the weaponized adtech mill.

Second, Albright’s research found that at the center of the fake news machine is Facebook’s like and share empire. After spending years installing their like button across nearly every site on the internet (look to the left of this screen—oh the irony!), Facebook has essentially helped to create a shadow network of trackers and behavioral data collection infrastructure that gathers deep data and behavioral insights on:

1. The millions of naive people who can be coerced into creating, distributing, and defending this network of “fake news” and psy-propaganda; and

2. The millions of people who end up fighting it on Facebook, Twitter, in person, in the news, op-eds, news comments, on TV, in media, popular music — and especially — with their own wallets.
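Mechanically, this shadow network works because any page that embeds a third-party widget, like a like button, makes the visitor’s browser send a request to the widget’s host carrying that host’s cookie along with the embedding page’s URL. A toy sketch of what the widget host’s server-side logs let it reconstruct (all cookie IDs and sites are hypothetical):

```python
from collections import defaultdict

# Each widget load reveals: (tracker cookie ID, URL of the embedding page).
# These log entries are invented, purely for illustration.
widget_requests = [
    ("cookie-123", "https://example-news.com/politics/story-1"),
    ("cookie-123", "https://example-health.org/anxiety-symptoms"),
    ("cookie-123", "https://example-shop.com/baby-strollers"),
    ("cookie-456", "https://example-news.com/sports"),
]

# No login required: the third-party cookie alone stitches visits
# across unrelated sites into a single browsing profile.
profiles = defaultdict(list)
for cookie_id, page_url in widget_requests:
    profiles[cookie_id].append(page_url)

print(len(profiles["cookie-123"]))  # → 3 pages tied to one visitor
```

Multiply this by a like button on “nearly every site on the internet” and you get the behavioral data collection infrastructure Albright describes.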

In short, all that perpetual Facebook outrage and opinionating that we’ve come to think of as a core part of our social media culture is in fact enabling and worsening the fake news problem by providing ever more data to be mined and exploited in increasingly sophisticated ways. To put it bluntly: You are fake news. We all are.

How do you like Uncle Jim’s outrage now?

Source: http://weknowmemes.com/tag/facebook-big-brother-comic/

The entire panel agreed that the only potential one-stop-shop solution to the problem of fake news and online propaganda is, somewhat counter-intuitively, to totally stop playing the social networking game, Facebook in particular.

Stop arguing and sharing politically oriented thoughts and news content on social networking sites. Stop liking things on Facebook. The shadow is large, and it doesn’t matter whether posts are private or public: whether it’s a Facebook message, a group Snap, or a link shared in a private subreddit, it’s all being captured. All of it.

In other words, data starve the most popular source for weaponized adtech. This is an unpopular stance in a society that equates vocalization on social platforms with activism and advocacy.

Let’s also not forget that the flip side of data-starving Facebook to dry up fake news is that many people who are marginalized and/or in repressive countries would lose one of the few avenues they have for publicly raising their voices and advocating for themselves in the face of government-sponsored fake news. For all of the clear problems with our current big-data propaganda problem, social networks remain one of the only avenues for many human rights defenders, grassroots activists, and publicly maligned communities to counter narratives that literally put their lives at risk.

So what if we rights-focused storytellers don’t data starve, but instead get bigger and better about using the same data mining and exploitation strategies used by fake news to sway public opinion towards justice? Can we use the tactics that work in culture jamming to subvert fake news in all of its forms?

That brings us to both a logistical and an ethical crossroads. Logistically, fighting the battle on equal terms would require creating a massive data collection infrastructure for an army of white hat fake newsers.

Ethically, it’s important for human rights advocates to consider the implications of rallying around things like the universal right to privacy while simultaneously mining people’s data, even if it’s to reach them with the information they need to protect their own rights. But like all great quandaries, this one has a flip side.

I’ve worked on a number of campaigns, in Colombia especially, where the only way to get the truth about human rights advocates to the same audiences the government targets to smear those advocates is… targeted Facebook ads. Here in the States, social platforms played a big role in helping Black Lives Matter get their message out without the discriminatory lens of mainstream media; they helped the Standing Rock Sioux Tribe build public support and garner media attention in its fight against the Dakota Access Pipeline.

Hello, paradox, you old friend.

IRL: The Wave of the Future?

Source: New Media Advocacy Project (https://www.instagram.com/p/BUAkyHXh60t/?taken-by=newmediaadvocacyproject)

In the end, the answer to my question is, sadly, no. No, you can’t use the tactics of fake news to fight fake news because doing so only adds to the problem of big data.

That’s why groups like Access Now and Privacy International advocate that governments develop comprehensive frameworks to protect people’s fundamental rights to privacy and data protection. Access Now cites the EU’s recently adopted General Data Protection Regulation (GDPR) as an example because it:

enhances users’ rights, and it includes a reference to a “right to explanation” in the law. This language is aimed at ensuring that we are informed about the logic of the algorithms used to make decisions about us. This right-to-explanation concept is intended to improve transparency and accountability for machine-assisted decision-making, but how it impacts human rights will depend on how national courts across Europe and the European judicial institutions interpret it.

We’ve all grown accustomed to associating impact with stats like engagement and reach. However, as big data, adtech, and AI increasingly become weaponized by governments and other groups with any number of motivations, rights-focused media makers and storytellers must grapple with the fact that using digital spaces to reach audiences also adds more data fuel to the fake news firestorm.

That means the time is also ripe for us to explore alternative modes of media dissemination and distribution. One place we might begin is what I’ll call the surface web, already known as… Real Life. As much as we’ve all been convinced that we have to have an online presence, the big secret of this technological moment might in fact be a return to face-to-face interactions. At least in part.

In the coming months, I’ll explore the viability of non-social distribution in the form of mobile screening kits, encrypted media dissemination strategies, video halls, pirated DVDs, and even pop-up screenings on the back of motor bikes.

Advocacy tours, anyone?


Writer, Human Rights Advocate, Narrative Change Strategist, Editor for The Tilt.