When Facebook Sneezes, Why Do Pages and Publishers Catch the Flu?

For the third time in a month, Facebook announced a change to its Newsfeed algorithm.

Jay Kuo
The Contempo Blog
4 min read · May 31, 2017


If you’re a publisher or an influencer who shares publisher content, Facebook’s changes to its algorithm can produce traffic patterns that are downright scary. It’s easy to wonder, “Is the other shoe dropping? Are we ever going to climb out of this?” Of course, Facebook doesn’t always announce an algorithm change right away, so we’ve developed some early warning signs that often indicate one is underway. Recognizing them matters not only for our own sanity but also for adjusting how we value certain posts and how we formulate content strategy during times of massive upheaval. Fortunately, we have weathered many algorithm changes and have learned what to expect when they occur and how best to respond.

1. Organic post reach sharply declines across the board during algorithm changes.

One of the most obvious symptoms is a steep drop in organic post reach. Through each change, we recorded significant, temporary traffic dips across a variety of Facebook influencer and publisher pages, most visible on large ones (2.5M+ fans) that share a lot of content, such as George Takei’s. An algorithm change can result in drops of 40 to 50 percent, sometimes higher, across all page types. So don’t worry; it’s not just you. Facebook may actually intend to limit reach while it rolls out changes to its Newsfeed: lower reach means fewer users potentially affected by any problems the new system might encounter. Generally, we’ve seen post reach climb back to normal levels within one to two days.
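As a rough illustration of this early-warning check, here is a minimal Python sketch that flags days when a page’s organic reach falls 40 percent or more below its trailing weekly average. The function name, the window, and the threshold are our own illustrative choices, not part of Facebook’s reporting or Contempo’s actual tooling.

```python
from statistics import mean

def flag_reach_drop(daily_reach, window=7, drop_threshold=0.40):
    """Flag days where reach falls well below the trailing average.

    daily_reach: total organic reach per day, oldest first.
    Returns (day_index, pct_drop) pairs for days that look like
    an algorithm-change dip.
    """
    flags = []
    for i in range(window, len(daily_reach)):
        baseline = mean(daily_reach[i - window:i])
        if baseline == 0:
            continue
        drop = 1 - daily_reach[i] / baseline
        if drop >= drop_threshold:
            flags.append((i, round(drop, 2)))
    return flags

# Example: a page averaging ~1M daily reach that suddenly dips ~45%
history = [1_000_000] * 7 + [550_000]
print(flag_reach_drop(history))  # [(7, 0.45)]
```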

2. Test results temporarily do not correlate with existing data.

A good sign that Facebook may be tinkering with the algorithm is out-of-whack test results. For larger pages, such as George Takei’s, we typically select content that has already performed well (i.e., shown relatively high engagement) on smaller pages with similar demographics. During an algorithm change, however, we’ve noticed that typical engagement rates do not hold up and real-time data becomes fuzzy. Under normal circumstances, a strong signal on a piece from a smaller page correlates with strong performance on a larger one. During times of algorithm mayhem, that is no longer something we can presume.
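To make that concrete, here is a small sketch, assuming you have engagement rates for the same posts on a small test page and a large page, that checks whether the usual small-to-large correlation is holding. The sample data, the threshold, and the use of `statistics.correlation` (Python 3.10+) are illustrative choices, not how our pipeline is actually built.

```python
from statistics import correlation  # Pearson's r, available in Python 3.10+

def test_signal_health(small_page_rates, large_page_rates, min_r=0.5):
    """Compare engagement rates for the same posts on a small test page
    and a large page. A correlation well below its usual level hints that
    test results have stopped tracking real-world performance."""
    r = correlation(small_page_rates, large_page_rates)
    return r, r >= min_r

# Hypothetical engagement rates (interactions per person reached) for six posts
small = [0.041, 0.018, 0.055, 0.030, 0.022, 0.060]
large = [0.038, 0.015, 0.050, 0.028, 0.020, 0.058]
print(test_signal_health(small, large))  # high r here: the usual signal is holding
```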

We hypothesize this may relate to machine learning (ML). Our own platform, Contempo, utilizes ML to learn what types of content will perform well with specific audiences, improving its predictions over time and outpacing mere human intuition about which pieces of content audiences will engage with.

Here’s the catch: as efficient as ML may be at improving the rate of predictive success, feedback loops will quickly form if human input is not maintained as part of the mix. Every now and then, we have to let our own decisions inform and tweak the model’s predictive patterns if we want it to evolve beyond potential feedback loops and generate better, more reliable content recommendations.
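One simple way to keep human input in the mix, sketched below under our own assumptions (the 20 percent share, the batch size, and the function are purely illustrative, not Contempo’s actual design), is to blend editor-chosen posts into every batch the model learns from, so it never trains only on items it already favored.

```python
import random

def build_training_batch(model_picks, editor_picks, human_share=0.2, batch_size=50):
    """Mix human-curated posts into the batch used to update the model,
    so the model never learns exclusively from its own past selections."""
    n_human = int(batch_size * human_share)
    n_model = batch_size - n_human
    batch = random.sample(editor_picks, min(n_human, len(editor_picks)))
    batch += random.sample(model_picks, min(n_model, len(model_picks)))
    random.shuffle(batch)  # avoid ordering bias between the two sources
    return batch
```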

3. Facebook may be tweaking its ML whenever it adjusts the algorithm.

Since day one, Facebook has used computational models to determine what content to deliver to users based on their past behavior. But left to its own devices, the model would stagnate in self-perpetuation, showing people only what it thinks they want to see. The ML would thus have only pre-selected sets of data to review, succumbing to its own confirmation bias by continuing to exclude everything that was already excluded. So, to prevent these feedback loops, we theorize that Facebook must every now and then observe how people behave in the total absence of the algorithm, or at least in the absence of select parts of it.

If Facebook wants its machines to learn from unbiased or unfiltered human behavior, that means at least some of the time, the algorithm should not be in place. That would explain sudden drops in things like testing correlation. Normally, Facebook is pretty good at delivering the kind of content its ML believes each user wants to see. Facebook can keep engagement high, even as it shrinks organic reach, because it’s getting better at delivering content to precisely the audiences who want to see it. But turn off or dampen the filters, and the results go out of whack.
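To make the hypothesis concrete, here is a minimal sketch of the kind of holdout a ranking system could use: a small, random slice of sessions sees an unfiltered feed, and engagement on that slice gives the model unbiased signal. The 2 percent rate and every name here are our assumptions; we have no visibility into whether Facebook does anything like this.

```python
import random

def choose_feed(ranked_feed, unfiltered_feed, holdout_rate=0.02):
    """With a small probability, serve the unranked feed so engagement
    can be measured without the filter's own bias; otherwise serve the
    normal ranked feed. Returns the feed plus a label for logging."""
    if random.random() < holdout_rate:
        return unfiltered_feed, "holdout"  # unbiased observation
    return ranked_feed, "ranked"           # normal filtered experience
```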

This is, of course, just a hypothesis. We can never fully know what Facebook is doing with its machines, or even whether that decision (i.e., when to turn off the filters and let the chips fall where they may) is itself being made by a computer rather than a person. It would stand to reason, however, that with every algorithm change, Facebook would want users to interact without its strong content filters in place so that it could accurately gauge true user engagement before re-enabling the machines to learn from what just happened. If it never did this, it would fall victim to its own limited data set.

4. Don’t panic.

The bottom line is to remember that, as unnerving and frustrating as Facebook’s algorithm changes may be, their effects are likely only temporary, albeit frequent. Order always arises from the chaos, and change is the only constant. So think of Facebook as constantly creating chaos in order to get ever better at what it strives to do: put the right content in front of the right users so that people use Facebook even more than before.

Interested in learning more about how Contempo helps you find the right content? Contact us at info@the-social-edge.com or create an account HERE.

Jay Kuo

Co-founder and CCO of The Social Edge, composer of Allegiance on Broadway, appellate litigator