Facebook’s punishment of slow websites is a bad idea

Technical factors shouldn’t affect a website’s reach.

On August 2nd, Facebook announced a change in its mobile app which will reward fast websites and punish the slow:

During the coming months we’re making an update to News Feed to show people more stories that will load quickly on mobile and fewer stories that might take longer to load, so they can spend more time reading the stories they find relevant. […]
Factors such as the person’s current network connection and the general speed of the corresponding webpage will be considered. If signals indicate the webpage will load quickly, the link to that webpage might appear higher in your feed.

Facebook says it is doing this for the users’ sake: everybody is sick of slow web pages. And the change is no doubt meant to improve the user experience, but has Facebook thought this through?

A website’s (or online medium’s) spot in the Facebook News Feed, the front page many of us navigate by (and live our lives through), means a lot. The ranking, and thereby the placement in the feed, can determine whether an article (or any other type of content) reaches its readers and potential readers. I think it’s a shame to let that be affected by speed.

I just don’t understand why some sites should be punished with a lower ranking in Facebook’s News Feed. Yes, slow pages are annoying. As the Facebook employees write:

As many as 40 percent of website visitors abandon a site after three seconds of delay

But come on… let the 40 percent abandon that site. Whether you want to click through and visit a site should depend on whether or not you are interested in the content, not on the technical specifications of the site or server (or whatever other technology) powering it.

Yes, technical performance is important — but that shouldn’t decide or influence whether a website, blog or online medium/publisher reaches its audience!

Everybody can suffer from this

And Facebook’s obsession with speed can affect all site owners, large and small. For example, a lot of those publishing on smaller sites probably don’t know how to improve performance. The same goes for bloggers like me. Sure, I can deactivate some WordPress plugins and do a better job of compressing images, but I don’t have access to the server setup or the other things that can both improve and worsen a website’s performance.

Facebook has also published its ‘Best Practices to Improve Mobile Site Performance’. Take a look at that page (once you’re done reading this article, obviously): browser add-ons, analytics tools from Google, JavaScript cleanup, asynchronous loading and my personal favorite:

“Dynamically adjust the content for slower connections/devices”

Seriously… that’s not something every website owner just does.
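
To be fair, for a developer it isn’t rocket science. Here is a minimal sketch of what “dynamically adjusting content” could look like, assuming the non-standard Network Information API (navigator.connection), which isn’t available in every browser, and some hypothetical data-src-small/data-src-full attributes in the markup:

```typescript
// Minimal sketch: serve lighter images when the connection looks slow.
// navigator.connection (Network Information API) is non-standard and
// missing in several browsers, so we feature-detect and fall back.

type EffectiveType = 'slow-2g' | '2g' | '3g' | '4g';

interface NetworkInformationLike {
  effectiveType?: EffectiveType;
  saveData?: boolean;
}

function isSlowConnection(): boolean {
  const connection = (navigator as Navigator & { connection?: NetworkInformationLike }).connection;
  if (!connection) {
    return false; // No information available: assume a normal connection.
  }
  return (
    connection.saveData === true ||
    connection.effectiveType === 'slow-2g' ||
    connection.effectiveType === '2g'
  );
}

// Swap full-resolution images for smaller ones on slow connections.
// data-src-small / data-src-full are hypothetical attributes, not a standard.
document.querySelectorAll<HTMLImageElement>('img[data-src-full]').forEach((img) => {
  const { srcSmall, srcFull } = img.dataset;
  img.src = isSlowConnection() && srcSmall ? srcSmall : srcFull ?? img.src;
});
```

Simple enough if you write front-end code for a living. Not so simple otherwise.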

Imagine 60-year-old Karen who runs a site with cookie recipes. Who’s going to tell her to keep an eye on her YSlow score and limit synchronous JavaScript and overall requests?

But it gets better. Google is also famous for taking performance and speed into (some kind of) consideration when ranking websites. Moz.com writes:

Google has indicated site speed (and as a result, page speed) is one of the signals used by its algorithm to rank pages. And research has shown that Google might be specifically measuring time to first byte as when it considers page speed.

You don’t need to look any further than Wikipedia’s article on ‘Time To First Byte’ to find out that it “is a measurement used as an indication of the responsiveness of a webserver or other network resource.”

This means that it sits on the ‘backend’ side of performance, so to speak; it’s something you optimize by changing the server setup, and it has little or nothing to do with the sort of optimization that happens at the actual website layer. Which means it’s not something you and I can just fix (unless you’re a server person, of course).
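
If you’re curious what that number actually covers, here’s a rough sketch that reads it from the browser’s Navigation Timing API. It only illustrates what the metric describes; how Google (let alone Facebook) measures it in practice isn’t public.

```typescript
// Rough sketch: read Time To First Byte (TTFB) for the current page via
// the Navigation Timing API. This shows what the metric describes, not
// how Google or Facebook actually measure it.

const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];

if (nav) {
  // responseStart: when the first byte of the response arrived. Measured
  // from the start of the navigation, it includes redirects, DNS lookup,
  // the TCP/TLS handshake and the server's own think time.
  const ttfb = nav.responseStart;

  // The part the server alone is (roughly) responsible for: the wait
  // between sending the request and receiving the first byte.
  const serverWait = nav.responseStart - nav.requestStart;

  console.log(`TTFB: ${Math.round(ttfb)} ms (server wait ~${Math.round(serverWait)} ms)`);
}
```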

That is Google, though. As of this moment, nobody outside Facebook can really be sure how Facebook will measure this. Good performance is a lot of things: page speed, fully loaded time, Speed Index (a measurement of perceived performance). Even among people who work with performance there is no clear definition that says *this* is what good performance is.
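
Just to illustrate how many numbers are in play, here is a small sketch that reads a handful of them straight from the browser. Speed Index and “Fully Loaded” come from lab tools such as WebPageTest and aren’t available here; the point is only that there is no single figure to point at.

```typescript
// Sketch: a few of the performance milestones a browser can report.
// None of them is "the" performance number on its own.

window.addEventListener('load', () => {
  // Wait a tick so loadEventEnd has been recorded.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
    const paints = performance.getEntriesByType('paint');

    // Perceived speed: when something (and then something meaningful)
    // first showed up on screen.
    for (const paint of paints) {
      console.log(`${paint.name}: ${Math.round(paint.startTime)} ms`);
    }

    // The classic load milestones tell yet another story.
    if (nav) {
      console.log(`DOMContentLoaded: ${Math.round(nav.domContentLoadedEventEnd)} ms`);
      console.log(`Load event:       ${Math.round(nav.loadEventEnd)} ms`);
    }
  }, 0);
});
```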

Good luck, Karen.

Late to the party… 3rd party

Big sites mostly have their own performance under control. But they face other challenges, because a lot of the bigger sites rely on ads. And having ads on a site is much more complicated than just showing an image and some text: user profiles are matched with advertisers and campaigns, visitors are tracked, automatic auctions take place and much, much more, all of which hurts performance.

Ads have such a huge influence on the technical performance of ekstrabladet.dk (one of the biggest and busiest websites in Denmark, and also the place where I work, in the development department) and of other sites that I wrote in July last year that “Your ads performance is your performance”. Besides the ads, there is a lot of other third-party technology (scripts, video players, embedded elements etc.) that website owners have very little control over.
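
There are mitigations, of course. A common one is to load third-party tags asynchronously and only after your own content is on screen. A sketch, with a placeholder script URL rather than any real vendor, might look like this:

```typescript
// Sketch of a common mitigation: load third-party tags (ads, trackers,
// embeds) asynchronously, and only after the page's own content has
// rendered. The URL below is a placeholder, not a real ad vendor.

function loadThirdPartyScript(src: string): void {
  const script = document.createElement('script');
  script.src = src;
  script.async = true; // Don't block HTML parsing while it downloads.
  document.head.appendChild(script);
}

// Wait for the load event so readers see the article before the ad
// machinery starts spinning up.
window.addEventListener('load', () => {
  loadThirdPartyScript('https://ads.example.com/tag.js'); // placeholder URL
});
```

But that only postpones the tracking, the auctions and the rendering; it doesn’t make them lighter, and you still don’t control how heavy they are.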

And this isn’t just a problem for the big sites. Imagine the blogger who wants to have ads on his blog. Is he going to call Google and tell them that the ads are loading too slowly for him to get a decent Facebook ranking? (I’ve never heard of anyone getting good customer service out of Google, by the way…)

This means that by far the biggest performance “sinners” are out of the website owners’ reach.

Charge!

The Washington Post has, as I wrote in April (that article is in Danish), grabbed the ad bull by the horns when it comes to performance. Just look at this:

“We go to our partners and say, ‘This is how fast things need to be executed; if you don’t hit this threshold, we can’t put you on the site,’” said Jarrod Dicker, the Post’s head of ad product and technology. “We found that vendors we do use are ones that went back to their engineering teams and found out how to expedite their loads. … The vendors that haven’t been able to come to the table with faster solutions, we no longer integrate with.”

But not all online media have the weight of The Washington Post or as much clout in the market.

Publishers can’t really do anything but pass Facebook’s punishment on to the advertisers and other suppliers of poorly performing third-party technology. But that will require a coordinated effort, which I doubt we’ll see. And will it trickle all the way down to the small companies designing and building the ads that jump around carrying five times as many scripts and analytics tags as needed? I remain skeptical.

The Instant Articles ghost

I understand why Facebook wants to improve the mobile user experience, but there might be other factors at play here as well.

Facebook’s own take on speedy articles on the mobile phone (Instant Articles, articles published directly on Facebook) isn’t quite the success Zuckerberg and Co. had hoped for.

In May, Nieman Lab reported that Facebook would allow conversion of articles between Instant Articles and two competitors, Google AMP (which I am so tired of) and Apple News (which isn’t yet available in Denmark but is already a pretty good traffic source in the lightweight version we do have).

From the Nieman Lab article:

The move comes after a number of high-profile publishers have stepped back from Facebook’s distributed-content offering, preferring to direct Facebook mobile users to their websites.

Facebook’s performance demand is one way to try and tighten its grip on the media (which Zuckerberg repeatedly says he’s very fond of) and say “okay… you don’t want Instant Articles… well, look at this”, like a fourth grader nobody wants to play with.

Of course, publishers can make the ultimate performance optimization and let Facebook keep a copy of their content, which it can then show as an extremely fast-loading Instant Article and, of course, profit from.

As I have written several times, a better user experience is obviously a good idea, and performance is one of the most important factors when it comes to this.

But this is about more than just that:

This is about which types of content (and which websites and publishers) people, both as users and as citizens in a free, open and enlightened society, are shown when they open Facebook’s app, which is the starting point for a lot of people.

The user’s own decision

If people choose not to visit a site, it should be their own decision; at the very least, they should be allowed to see the link to the page. If they grow tired (maybe even angry) after waiting three seconds with nothing to look at but a white screen, they are more than welcome to return to where they came from. That doesn’t change the fact that the site in question might gain something from performance optimization, but it shouldn’t happen with a punishment from Facebook hanging over its head.

Cass Sunstein writes about Facebook’s algorithms in his new book, ‘#republic’, continuing a discussion Eli Pariser opened back in 2011 with his book ‘The Filter Bubble’. Sunstein compares Facebook to open places (like parks) where everybody has the right to give a speech and deliver their viewpoints.

If we follow and build upon Sunstein’s example, imagine a park with five spots for speakers, where the best spots are given to those who can talk the fastest and deliver their message in a very short time span, while the bad spot over in the corner (next to the bushes that all the lazy park guests use as a urinal) goes to the guy who may have something important and sensible to say but just talks way too slowly and takes forever to get to his point. Is that fair?

Online media should focus on performance, and their websites should load as quickly and reliably as possible. But we should do it because our users demand it (or benefit from it), not because Facebook threatens us into it.

Facebook’s announcement is only good news for Facebook and for performance consultants, who are no doubt adding slides to their PowerPoint presentations these days.

(This article was originally published in Danish at Medieblogger)

Read more:

» Facebook Newsroom: News Feed FYI: Showing You Stories That Link to Faster Loading Webpages [August 2nd, 2017]

» Facebook Media: Best Practices to Improve Mobile Site Performance [August 2nd, 2017]

» Moz: Page Speed & SEO

» Wikipedia: Time To First Byte

» Lars K Jensen: Your ads performance is your performance [Medium, July 5th, 2016]

» Medieblogger: Performance: The Washington Post griber annonce-tyren ved hornene [April 26th, 2017 — in Danish]

» Lars K Jensen: Why I deactivated AMP and Instant Articles [Medium, July 10th, 2017]

» Lars K Jensen: Platforms: Is journalism ready for what’s coming? [Medium, July 19th, 2017]
