How much should you worry about the Google PageSpeed test?

We were recently contacted by a client of ours who had read this post on Civil Society News and had tested their site, newly built by us, using the Google PageSpeed Insights test. Their site received a rating of 60/100 on the mobile speed test, under the headline “Nearly half of all visitors will leave a mobile site if the pages don’t load within three seconds”, and they were understandably concerned that a new site should receive such a low rating. I wrote them a response to put these results in context, and they thought it might be worth turning into an article, as many people will be using this test and may be worrying unnecessarily.

What does 60/100 actually mean?

Firstly, this isn’t a rating of your site’s actual speed, but a rating based on a variety of factors that Google considers important to site speed. It doesn’t mean that your site is only 60% as fast as it could be. When looking at site speed we use the more detailed WebPageTest, which makes it easier to break out the various factors in Google’s tests. Here it becomes clearer that, for example, the score on the Leverage browser caching test (more about this below) isn’t about raw speed, but about what percentage of a page’s assets have a short cache time, without any reference to how large the files are or how fast they actually load.

Browser caching and external services like Google Analytics

One small trade-off that becomes apparent in these results is the use of external services like Google Analytics, Google Tag Manager and Stripe, as well as tracking services such as Facebook Pixel, Twitter Website Tag and LinkedIn Insight Tag. If you embed a Twitter feed, for example, you’ll see in the Leverage browser caching section the script https://platform.twitter.com/widgets.js, which is used to embed the feed. Other Twitter scripts can get flagged here too; they’re to do with loading the tweets themselves, and after all, an out-of-date Twitter feed rather removes the point of including one in the first place. Rather more irritating, given that this is a Google test, is seeing the Google Analytics script itself flagged in this section. Again, if you want to use Analytics without quite a bit of extra custom development time, you’ll need to accept this trade-off.

The reason Google flags these is that “cached” content is stored in your browser for a length of time specified by the sending site (one minute, in the case of the Twitter script that loads the tweets), so that it doesn’t have to be downloaded again on every visit. The longer the cache time, the longer your browser waits before it tries to download that file again, and the quicker your pages will load on average.

For less-used scripts, such as the huge variety of social sharing plugins, it’s worth considering whether you really need those specific scripts or whether they could be replaced by some simple custom code (the answer is almost always yes). Ubiquitous scripts like Google Analytics (a two-hour cache time) and Facebook Pixel (20 minutes) are a different matter: they’re likely to be in visitors’ browser caches already from other sites, so they’re unlikely to be a major factor in real-world site speed.
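If you want to see where these cache times come from, you can look at the Cache-Control header each script is served with; its max-age directive is the number of seconds a browser will keep its copy before asking for it again. Here’s a rough sketch of how you might check this, assuming Node 18+ and its built-in fetch (the second URL is the standard Universal Analytics script, used purely as an example):

```typescript
// cache-check.ts: print how long browsers are told to cache a few third-party scripts.
// Assumes Node 18+ (which ships a global fetch); run with ts-node or compile with tsc.

const scripts = [
  "https://platform.twitter.com/widgets.js",
  "https://www.google-analytics.com/analytics.js",
];

async function main(): Promise<void> {
  for (const url of scripts) {
    const res = await fetch(url);
    // The max-age directive in Cache-Control is the cache lifetime in seconds,
    // set by the sending site; the browser can't extend it.
    const cacheControl = res.headers.get("cache-control") ?? "(none)";
    console.log(`${url}\n  Cache-Control: ${cacheControl}\n`);
  }
}

main().catch(console.error);
```

Running something like this against your own page’s assets makes it obvious which scripts you control (and can give long cache times to) and which are set by a third party and simply have to be lived with.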

Make sure you get image formats and compression right

Two things that too many people get wrong, which make a big difference to page load times and are very easy to control, are image format and compression. Photos should only be uploaded as JPG files, whereas graphics such as logos should be PNG files. We build all of our WordPress sites so they automatically resize and crop images to the maximum size at which they will be used, but if your CMS doesn’t do that you’ll also need to resize images before you upload them. When it comes to compression, make sure you’re running images through a simple drag-and-drop compression app before you upload them, like ImageOptim (Mac), FileOptimizer (Windows) or the ImageOptim online service. Compression is a trade-off between file size and image quality, so it’s worth spending a bit of time adjusting the settings of these apps until you find the right balance for you.
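If you’d rather script that resize-and-compress step than use a desktop app, a sketch along these lines does the same job; it assumes the sharp image library for Node, and the file names and 1600px width are just examples:

```typescript
// resize-and-compress.ts: shrink a photo to its largest display size and save it
// as a compressed JPEG before uploading it to the CMS.
// Assumes `npm install sharp`; file names and the 1600px width are placeholders.
import sharp from "sharp";

async function prepareForUpload(input: string, output: string): Promise<void> {
  await sharp(input)
    .resize({ width: 1600, withoutEnlargement: true }) // never upscale small images
    .jpeg({ quality: 80 }) // quality is the size-versus-appearance trade-off to tune
    .toFile(output);
}

prepareForUpload("hero-photo.jpg", "hero-photo-web.jpg").catch(console.error);
```

The quality setting here is exactly the trade-off described above: drop it until the saving in file size stops being worth the loss in appearance.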

First and subsequent page views

Something to consider when measuring page load times with a single test is that, if you’ve got your caching right, the first page view of a visit will always take longer than subsequent ones. This is because the files the browser caches on that first visit, and which are used throughout the site (such as the site logo, scripts and stylesheets), don’t need to be downloaded again for the subsequent pages you visit. I’m sure everyone reading this recognises that feeling of waiting a bit longer when you first land on a website, and it means that a single page load time on its own doesn’t necessarily reflect the average experience of your visitors. For this reason WebPageTest shows two sets of results on the main page after a test: the first and repeat view load times. This shows much more clearly the effect that caching is having on your site visitors.
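You can see part of this mechanism by requesting the same asset twice. While a cached copy is still fresh the browser skips the request entirely, and once it does check back it sends the validators it received the first time, so the server can answer 304 Not Modified rather than re-sending the file. A rough sketch of that second case, again assuming Node 18+ and using an example URL:

```typescript
// repeat-view.ts: illustrate why repeat views are cheaper. The second request sends
// the validators from the first response, so the server can reply 304 Not Modified
// instead of re-sending the whole file. Assumes Node 18+; the URL is just an example.

const url = "https://platform.twitter.com/widgets.js";

async function main(): Promise<void> {
  const first = await fetch(url);
  await first.arrayBuffer(); // download the body once, as a first visit would

  const headers: Record<string, string> = {};
  const etag = first.headers.get("etag");
  const lastModified = first.headers.get("last-modified");
  if (etag) headers["If-None-Match"] = etag;
  if (lastModified) headers["If-Modified-Since"] = lastModified;

  const second = await fetch(url, { headers });
  console.log(`First view:  ${first.status}`);  // 200, full download
  console.log(`Repeat view: ${second.status}`); // 304 if the server supports revalidation
}

main().catch(console.error);
```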

Where does “three seconds” come from?

It’s also worth digging a bit deeper into the “three seconds” headline from the PageSpeed test. The headline leads to a footnote that in turn points to this study, where it becomes clear that the figure is based on online shopping behaviour. (Also worth bearing in mind is that the study was commissioned by Akamai who, as a content delivery network, have a vested interest in convincing people that their sites are too slow.) Online shopping is a very different online space, in terms of behaviour, from the third sector. If all a visitor wants to do is buy a readily-available commodity for a reasonable price with minimal fuss, they have a plethora of options. A visitor to a campaigning website, on the other hand, is going to engage more closely with the content of the site, and is therefore likely to be marginally more patient. I should say very clearly here that this isn’t a reason for third sector organisations to be complacent: visitors abandoning slow sites is a real problem. However, the trade-off between spending development money on optimisation and using that money for fundraising is a key one and, unless you’re selling cheap flights with all of your traffic coming from Google searches, at a certain point the extra optimisation time is not going to be worth it.

Do you know what your actual page load times are?

The final thing to consider in light of these results is what your actual page load times are. Google Analytics by default measures page load times for a 1% sample of page views, up to a maximum of 10,000 per day. This means that sites with fewer than 100,000 page views per day tend to get very sporadic page speed data, which makes it impossible to draw any reasonable conclusions. When we build our sites we always set the sample rate to 100%, meaning that we get a consistent and stable measure of page load times against which to track performance. In the case of the concerned client who contacted us originally, the kicker to their story was that their average page load time was just 2.5 seconds!
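For reference, in the classic analytics.js (Universal Analytics) tracking snippet this is controlled by the siteSpeedSampleRate field; a minimal sketch, with a placeholder property ID, looks something like this:

```typescript
// analytics-sample-rate.ts: ask Universal Analytics (analytics.js) to record page
// timings for every page view rather than the default 1% sample.
// UA-XXXXXXX-Y is a placeholder; the ga() function comes from the analytics.js snippet.
declare function ga(...args: unknown[]): void;

ga("create", "UA-XXXXXXX-Y", "auto", {
  siteSpeedSampleRate: 100, // default is 1, i.e. a 1% sample of page views
});
ga("send", "pageview");
```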

Talk to us

There are many more technical factors when it comes to site performance, such as using a performance-tuned hosting platform and using a content delivery network (CDN) to serve assets from servers that are physically close to each visitor. A few weeks ago at The Bureau we took on a client site and within a fortnight we’d halved the average page load time from six seconds to three, with more to come. If you’d like to talk to us about building your new WordPress site, how we can make your existing WordPress site faster, or any of our other services such as branding or digital strategy, drop me a line at adrian@thebureaulondon.com.
