Facebook Just Came for Low-End Publishers, and It’s Going to Get Ugly

“Guilty By Association” Just Took On A Whole New Meaning.

Blair Shepard
The Contempo Blog
4 min read · May 12, 2017

--

Earlier this week, on May 10, Facebook announced in no uncertain terms that it is coming after “low quality” web publishers. For some time, Facebook had been enforcing increasingly strict policies about what could appear in ads in the Facebook Newsfeed. Now, it has made clear that it will apply many of those same standards to organic posts.

Here’s what Facebook said:

“[W]e reviewed hundreds of thousands of web pages linked to from Facebook to identify those that contain little substantive content and have a large number of disruptive, shocking or malicious ads. We then used artificial intelligence to understand whether new web pages shared on Facebook have similar characteristics. So if we determine a post might link to these types of low-quality web pages, it may show up lower in people’s feeds and may not be eligible to be an ad. This way people can see fewer misleading posts and more informative posts.”

This statement is remarkable for a number of reasons.

First, Facebook revealed that it had actually been making subjective assessments of the content and user experience on hundreds of thousands of low-quality web pages, going far beyond its past scrutiny of how posts presented in the Newsfeed. Interestingly, this also means Facebook decided to let bad actors persist and continue to pollute the Newsfeed for some time, so that it could gather data about them.


Second, Facebook clarified that the standard by which it judges these web pages is essentially the same one it uses to judge its ads. Unpacking this a bit: pages with “little substantive content” (especially relative to the number of ads they carry), or pages accompanied by “disruptive, shocking or malicious ads,” now make up the suspect set. Per its announcement, Facebook will judge posts not just by whether they have clickbait headlines or feel spammy to users, but by whether the content experience on the other side of the click is a poor one, especially with respect to ads.
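To make that standard concrete, here is a minimal sketch, in Python, of the kind of content-versus-ads heuristic the announcement describes. The feature names and thresholds below are illustrative assumptions on our part; Facebook has not published the signals it actually uses.

    from dataclasses import dataclass

    @dataclass
    class PageSnapshot:
        """Features one might extract from a landing page (hypothetical schema)."""
        word_count: int           # substantive text on the page
        ad_count: int             # number of ad units rendered
        has_disruptive_ads: bool  # pop-ups, auto-redirects, shocking creative, etc.

    def looks_low_quality(page: PageSnapshot,
                          min_words: int = 300,
                          min_words_per_ad: int = 100) -> bool:
        """Flag pages with little substantive content relative to their ad load.
        Thresholds are invented for illustration; Facebook has not disclosed its own."""
        if page.has_disruptive_ads:
            return True
        if page.word_count < min_words:
            return True
        return page.word_count / max(page.ad_count, 1) < min_words_per_ad

    # Example: a 250-word listicle wrapped in eight ad units gets flagged.
    print(looks_low_quality(PageSnapshot(word_count=250, ad_count=8, has_disruptive_ads=False)))  # True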

Third, Facebook disclosed that the “bad” pages would form a template against which all posts will be assessed, using machine learning to train its algorithm to identify similar offenders. In other words, Facebook apparently now has high confidence that it can identify a post that is likely to lead to a poor user experience after the click, having surveyed the bad pages out there and drawn some baseline rules.
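As a rough sketch of that machine-learning step, and assuming page-level features like the ones above, one could train a simple classifier on the human-reviewed pages and then score every newly shared link against it. The toy data, features, and model choice (scikit-learn logistic regression) here are ours, not Facebook’s; the real system is certainly far more sophisticated.

    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data from the human review pass: each row is
    # [word_count, ad_count, disruptive_ads (0/1)] for a reviewed page.
    X_train = [
        [1200, 2, 0],  # substantive article, light ad load
        [900,  3, 0],
        [150,  9, 1],  # thin page buried in disruptive ads
        [200,  7, 1],
        [250,  8, 1],
    ]
    y_train = [0, 0, 1, 1, 1]  # 1 = judged low quality by reviewers

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    def quality_penalty(word_count: int, ad_count: int, disruptive: bool) -> float:
        """Probability that a newly shared page resembles the reviewed low-quality set;
        a ranking system could use a score like this to push a post lower in feeds."""
        return model.predict_proba([[word_count, ad_count, int(disruptive)]])[0][1]

    print(round(quality_penalty(180, 10, True), 2))   # high penalty
    print(round(quality_penalty(1500, 2, False), 2))  # low penalty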

This will likely carry devastating consequences for sites that have tried to game the system by luring readers with bait headlines and then delivering thin, ad-infested pages on the other side, all for the sake of additional revenue. Certain notorious offenders are likely to see their organic reach per post plummet as Facebook shoves their content to the back of the line. In many ways, this is comparable to Google assigning low scores to pages that used nefarious means to game SEO. It didn’t end well for those sites either.

On our own platform, Contempo, we have long imposed strict content and advertising guidelines that emphasize a better user experience, banning or restricting publishers who appear too eager to squeeze extra ad dollars out of a visit. We forbid slideshow formats that require one click per image, clamp down on aggressive pop-ups and malicious ad redirects, and de-prioritize sites whose high ad-to-content ratios interfere with the enjoyable consumption of content.
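Purely as an illustration (this is not our actual tooling), guidelines like these boil down to a checklist that can be applied whenever a publisher or URL comes up for review; the rule names and the ratio threshold below are hypothetical.

    # Hypothetical publisher-review checklist; rule names and the ratio threshold are illustrative.
    GUIDELINES = {
        "no_click_per_image_slideshows": lambda page: not page.get("paginated_slideshow", False),
        "no_aggressive_popups_or_redirects": lambda page: not page.get("popups_or_redirects", False),
        "acceptable_ad_to_content_ratio": lambda page: page.get("ad_to_content_ratio", 0.0) <= 0.5,
    }

    def review_publisher(page: dict) -> list:
        """Return the guidelines a page violates; repeat violators get restricted or de-prioritized."""
        return [name for name, passes in GUIDELINES.items() if not passes(page)]

    print(review_publisher({"paginated_slideshow": True, "ad_to_content_ratio": 0.7}))
    # ['no_click_per_image_slideshows', 'acceptable_ad_to_content_ratio']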

It seems those considerations are now being codified by Facebook, and we couldn’t be happier. For too long, some of the worst websites out there have been able to offer very high payments for clicks in exchange for a user experience that, frankly, was awful. On top of that, those same publishers used clickbait tactics to drive engagement, giving them a leg up on Newsfeed positioning. With Facebook now down-ranking posts from such publishers, both because of how they present and because of what they deliver, we are starting to see the effects, not only on the sites themselves but on the business pages that share their content.

This chart compares the George Takei page, which links to articles optimized for content and has kept its average reach on link posts healthy and growing through 2017, with a similarly sized page that links primarily to articles with high ad-dollar returns and has seen its reach collapse over the same period.

With Facebook now training its algorithm to disfavor those who game the system both before and after the click, better-quality publishers can finally thrive without low-quality weeds smothering their placement in the Newsfeed. We believe pages that continue to share good content and align themselves with Facebook’s goal of authentic engagement will reap the benefits of this change.
