Your Facebook Is Making Fake News Money And You Look Stupid

And That’s What You Wanted

All the conspiracy theories that are fit to post online

After last week’s election upset, there’s been a lot of finger-pointing over who’s to blame for the rise of the only person to be elected president without any experience in government or the military.

Is it the DNC? The Electoral College? The Supreme Court’s 2013 ruling that dismantled a key provision in the Voting Rights Act that led to the closure of thousands of polling locations?

Yeah, probably, but let’s just take a second to look at how your Facebook feed influenced the information you were most likely seeing. You’ve probably seen a lot of think pieces this week on the red vs. blue social feeds and wondered where that all comes from.

Spoiler Alert: It’s all curated, mostly biased, and probably super fake.

Back in October, a BuzzFeed News analysis found that over a two-week period, 38% of posts published on three popular right-wing Facebook pages contained false or misleading information. For three left-wing pages, the number was 20%.

So, how are people supposed to know what’s a real news site and what’s fake? They both use Facebook the same way — to reach people who will share and repost these stories for page views and advertising dollars.

And that’s exactly what’s been happening. 67% of people use Facebook as a news source, and Twitter is second at 45% — that’s a huge share of people reading articles through a gatekeeper like Facebook.

During an election, publishers push their articles onto these platforms hoping to lure casual readers back to their websites and, if everything goes according to plan, turn those readers into subscribers.

And it works! Social drives traffic: consumers are more likely to visit a site after reading its content on a social platform, because these platforms let users keep up with current events in a fast-changing world. News now follows users wherever they go, as these companies devote entire teams to maintaining their mobile apps for an optimal user experience.

But here’s the kicker: in a report from last March, the research group Digital Content Next found that 43% of those surveyed said they did not know exactly who created the content they shared.

However, these same users also value trustworthy content (85% of them, in fact) and want to see more high-quality content in their feeds.

So….can we have both?

No, not really. At least, not in the current iteration of social feeds. Although Google has pledged to stop serving ads on fake news sites, it has been quiet about whether it will stop promoting fake news in its search results or AMP pages. Facebook’s chief executive, Mark Zuckerberg, has denied that the site influenced the outcome of the election, despite swirling criticism and Facebook’s own employees asking one another if, or how, they shaped the minds of voters during the presidential election.

Facebook can’t brag that it influences what people buy and then deny it changes how people vote or follow current events.

Social media is addictive behavior, and social platforms are now the natural evolution of information consumption. Platforms like Facebook are very good at keeping people on the platform, which reinforces the value of their services, and users value content that finds them where they already are so they can share it to their feeds.

People aren’t going to stop sharing articles from random sites just because those articles carry a bias, and to be frank, it’s not solely on readers to know where this information is coming from. Quality control should rest on the shoulders of sites like Facebook, which should filter the fake news from the factual news.