Facebook Is Trying To Kill ‘Clickbait,’ And That’s Weird
The people behind Facebook’s algorithms, like the editors of old newspapers deciding on right rails, believe they know what’s best for you when it comes to your media consumption.
To that end, the company made an interesting announcement Thursday outlining a renewed effort to diminish “clickbait” in your News Feed. You probably missed it unless you’re a media wonk, because Facebook makes no effort to publicize updates such as these via its core product (which exists to deliver information).
…We’ve heard from people that they specifically want to see fewer stories with clickbait headlines or link titles. These are headlines that intentionally leave out crucial information, or mislead people, forcing people to click to find out the answer. For example: “When She Looked Under Her Couch Cushions And Saw THIS… I Was SHOCKED!”; “He Put Garlic In His Shoes Before Going To Bed And What Happens Next Is Hard To Believe”; or “The Dog Barked At The Deliveryman And His Reaction Was Priceless.”…
…With this update, people will see fewer clickbait stories and more of the stories they want to see higher up in their feeds…
… websites and Pages who rely on clickbait-style headlines should expect their distribution to decrease. Pages should avoid headlines that withhold information required to understand what the content of the article is and headlines that exaggerate the article to create misleading expectations.
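Facebook's post doesn't explain how it identifies these headlines, but one can imagine a crude version of such a detector. The sketch below is purely hypothetical — the phrase list, scoring rule and threshold are all invented for illustration and bear no relation to Facebook's actual system:

```python
# Hypothetical sketch of a phrase-based clickbait scorer. The phrase list,
# scoring rule, and threshold are invented for illustration only; Facebook
# has not published how its system actually works.

CLICKBAIT_PHRASES = [
    "you won't believe",
    "what happens next",
    "is hard to believe",
    "this one trick",
    "shocked",
    "priceless",
]

def clickbait_score(headline: str) -> float:
    """Return the fraction of known bait phrases present in the headline."""
    text = headline.lower()
    hits = sum(1 for phrase in CLICKBAIT_PHRASES if phrase in text)
    return hits / len(CLICKBAIT_PHRASES)

def is_clickbait(headline: str, threshold: float = 0.15) -> bool:
    """Flag a headline once enough bait phrases appear."""
    return clickbait_score(headline) >= threshold
```

Run against Facebook's own garlic-in-shoes example, this toy scorer flags it; a plain headline like “Facebook Will Enable End-To-End Encryption On Messenger” passes clean.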
The social network made a similar move in 2014.
I have two immediate reactions to this update: It’s good, because certain types of clickbait really are awful, and it’s weird, because it is yet another explicit acknowledgement from Facebook that it knows how you should want to consume information.
The hypothetical clickbait headlines in Facebook’s announcement are not so far-fetched, and they represent a type of online content that’s roughly equivalent to tabloid rags in the checkout lane: the information is purposefully misleading, swindling readers to make money for the outlet’s proprietors. You could take the analogy further and imagine this algorithm change as the owner of a grocery store replacing the National Enquirer with daily editions of The New York Times or The Wall Street Journal.
And that’s a decision that is worth chewing on, because, well, don’t a lot of people enjoy reading the National Enquirer even if they know it’s not good for them?
Let’s break down Facebook’s announcement just a little bit. This line:
We’ve heard from people that they specifically want to see fewer stories with clickbait headlines or link titles.
Okay, makes sense. But Facebook constantly states that its News Feed is designed, using a type of artificial intelligence called “machine learning,” to deliver content that users want to see. You can get a taste of what this means — albeit an exaggerated one — via the Wall Street Journal’s “Blue Feed, Red Feed” feature.
Anyway, the fact that Facebook users say they want to see less clickbait suggests that a lot of people are interacting with clickbait in the first place, which is what allows it to consistently surface across News Feeds. If people really hated clickbait, it wouldn’t be a “problem” on Facebook.
See how that works? In a very basic sense, more interactions on a given piece of content generally means more exposure for that content on social media; it’s the simple principle behind “viral content.”
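That feedback loop can be sketched in a few lines. This is a toy model of the principle described above — the class names, numbers and engagement formula are invented for illustration and are not Facebook’s ranking model:

```python
# Toy model of the engagement feedback loop: content is ranked by interactions
# per impression, so every click on a clickbait story raises its placement,
# which earns it more impressions and clicks. All names and numbers here are
# invented for illustration; this is not Facebook's actual ranking model.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    clicks: int = 0
    impressions: int = 0

    @property
    def engagement_rate(self) -> float:
        return self.clicks / self.impressions if self.impressions else 0.0

def rank_feed(stories: list) -> list:
    """More interactions per impression means higher placement in the feed."""
    return sorted(stories, key=lambda s: s.engagement_rate, reverse=True)

stories = [
    Story("City Council Approves Budget", clicks=5, impressions=100),
    Story("The Dog Barked At The Deliveryman And His Reaction Was Priceless",
          clicks=30, impressions=100),
]
feed = rank_feed(stories)
# The bait headline out-engages the civic story, so it ranks first -- and each
# additional click only widens that gap on the next ranking pass.
```

The point of the toy model is that nothing in the loop asks whether the content is good, only whether people interact with it, which is exactly why “people hate clickbait” and “clickbait dominates feeds” can both be true.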
My former colleague Mark Gongloff has made a version of this point, too: the social network’s algorithms are directly responsible for this problem in the first place, insofar as it really is a “problem.” People who say they want to see less clickbait on Facebook might just mean they want to see less of their most annoying online “friends” who, by interacting with that content, force it to spill over into other News Feeds. But addressing that would undercut Facebook’s mission to connect you to your family and friends.
You very well might roll your eyes at “The Dog Barked At The Deliveryman And His Reaction Was Priceless,” but who are we to judge someone who just wants the simple pleasure of watching a canine yelp at someone? 🐶 Kind of snobbish, right?
Meanwhile, clickbait-ish video content published directly to Facebook — you know, the sort of content the social network has pressured media outlets to produce — is doing just fine.
Hey, a dog and a mailman — what a weird coincidence! 1.25 million views and counting.
One last point: Because of how Facebook’s algorithms have rewarded “clickbait” in the past, legitimate media outlets have built headline strategies around a so-called “curiosity gap” that encourages people to click through to articles. This is where things get murky.
Let’s use an example I wrote while I was at The Huffington Post: “Facebook Will Allow 900 Million People To Keep Messages Secret.” I’ll make no excuses for it: I chose this headline because it’s a little vague and sort of intriguing, and because I wasn’t sure the widest possible population of my readers would understand a headline like “Facebook Will Enable End-To-End Encryption On Messenger.”
Is this clickbait? Probably not, measured against the examples listed in Facebook’s blog post, but you can see how similar principles apply — and certainly people have complained to me before that news headlines like this amount to clickbait. Will Facebook formally categorize stuff like this as clickbait? Shrug. But respected media outlets often publish far more straightforward clickbait headlines to bolster traffic numbers, pay writers and keep the lights on, so one imagines this update will affect them one way or another.
Don’t misunderstand: This new update probably will amount to a net positive for news consumers, and no one is mourning the loss of some great art form here. But as always, it’s worth examining the evolution of Facebook — a communications force without much in the way of progenitors, and one that millions of people in the United States alone use specifically to consume news.