Need for Speed: Faster Page Load

Andrew Sellers
Jan 8, 2015

Chances are at some point you’ve closed a browser tab because “it’s taking forever to load”. Users want access to the content they’re after as quickly as possible; nobody likes waiting around.

Page speed is important: the time it takes from initiating a page view to load completion in the browser can have a huge impact on the success of a website. It even has an impact before a user knows your website exists, as Google includes site speed in its search rankings.

According to the HTTP Archive, as of December 2014 the average web page size is approaching 2 MB. Content is getting richer, and as developers we need to make an effort to reduce transfer sizes and make pages load as quickly as possible.

There are two main components that affect page load: 1) browser time, and 2) network and server time. This article mainly covers the former: how we can ensure pages are parsed as quickly as possible.

Get a Heads Up

From too many HTTP requests to the size of assets, there’s lots to consider. A good place to start is taking a look at some of the websites you’ve recently developed; this can highlight potential problem areas that could be improved.

There are some great tools that analyse the content of a web page and generate suggestions to make it faster.

The Network panel in Chrome Developer Tools is also a useful resource for evaluating performance. At the very least you can get a quick snapshot of the total number of requests, the data transferred and the total time a page takes to load.

It can also go into much more depth, providing information including detailed timing data, HTTP request and response headers, WebSocket data and more: https://developer.chrome.com/devtools/docs/network

Full Throttling

When developing locally it’s important to keep an eye on file size. A large stylesheet will always load quickly locally, but this won’t be the case for all users, as we can’t rely on their connections being fast.

To help with testing we can recreate slower environments, including Edge, 3G and even offline, using network throttling within Chrome Developer Tools; this artificially limits the maximum download throughput.

To enable throttling, open developer tools, select ‘Toggle device mode’ and then choose a connection from the preset Network dropdown.

Minimise HTTP Requests

Once the browser parses your HTML page it goes off and looks for any additional requests for images, scripts, stylesheets and so on. Each of these is another HTTP request to the server, and more requests mean a slower page load.

It’s possible to minimise the number of HTTP requests; for a start, always concatenate your JavaScript into a single script. This can be automated with Gulp using the gulp-concat task. The same can be applied to your stylesheets: Sass imports help with this massively, and gulp-ruby-sass can be used to assist with compiling.

It’s important to make sure the concatenated JavaScript and CSS files have as small a footprint as possible. The files should be minified and uglified, which reduces file size massively. Sass has an output style of ‘compressed’ for this, and Gulp can be used to uglify your JavaScript.
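Putting the above together, a minimal gulpfile sketch might look like the following. It assumes gulp, gulp-concat, gulp-uglify and gulp-ruby-sass have been installed via npm, and the source and destination paths are placeholders for your own project layout:

```javascript
// gulpfile.js — a sketch, not a drop-in build; paths are hypothetical.
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');
var sass = require('gulp-ruby-sass');

// Concatenate all scripts into a single file, then minify it,
// so the browser makes one small request instead of many.
gulp.task('scripts', function () {
  return gulp.src('src/js/**/*.js')
    .pipe(concat('app.js'))
    .pipe(uglify())
    .pipe(gulp.dest('dist/js'));
});

// Compile Sass with compressed output to keep the CSS footprint down.
gulp.task('styles', function () {
  return sass('src/scss/main.scss', { style: 'compressed' })
    .pipe(gulp.dest('dist/css'));
});
```

Running `gulp scripts styles` would then produce one minified JavaScript file and one compressed stylesheet ready for deployment.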

One argument is that including all the JavaScript or CSS for a website in one file is unnecessary, as it’s usually not all needed at once and varies from page to page. You can always keep core functionality in one file and load page-specific code per page; reducing the number of trips to the server as much as possible is usually a good idea, though.

You can even go as far as gzipping your files to make them even smaller. Your server will need to be configured to enable GZIP compression first, but compression rates can be as high as 70–90% for larger files.

For icons and smaller images a single spritesheet can be used, which avoids a separate HTTP request per image. Manually creating and maintaining these can be a painful process, but again this can be automated using gulp-sprite. Compass spriting is a good tool for generating spritesheets too.

Optimise Images

To go further with optimising images than just “File > Save for Web…”, a great app is ImageOptim. Just drop images or folders into the app and it will optimise compression parameters, remove redundant metadata and strip unnecessary colour profiles. It’s not unusual to see reductions in file size of 50% or more.

Another way to optimise images is to use gulp-imagemin; the best thing about this is that it’s all automated and can be added to your deployment task.
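As a sketch, an image-optimisation task could look like this, assuming gulp and gulp-imagemin are installed and the folder names match your own project:

```javascript
// gulpfile.js excerpt — paths are placeholders.
var gulp = require('gulp');
var imagemin = require('gulp-imagemin');

// Losslessly optimise every image in src/images and write the
// smaller copies to dist/images, ready for deployment.
gulp.task('images', function () {
  return gulp.src('src/images/**/*')
    .pipe(imagemin())
    .pipe(gulp.dest('dist/images'));
});
```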

Avoid scaling down images where possible by making sure they are created at the correct dimensions. This can be problematic for responsive sites, though, as images are commonly shared across breakpoints. The HTML picture element can help here by serving different images per breakpoint and display DPI, although at the time of writing it’s only supported in Chrome.

Limit Render Blocking Scripts

When your HTML is being parsed and the parser comes across a script, it has to stop until the script has been downloaded and executed; this is known as a render-blocking script. An easy way to avoid this is to always include your JavaScript at the bottom of the page. Adding the async attribute to external scripts is a possibility too; this won’t block the parser, but it means you can’t guarantee the order of execution.

Including scripts at the bottom of the page isn’t always possible either; for example, the Modernizr library needs to be loaded within the <head>. When this is the case, try to use a custom build including only the features you need, as this will keep the file size down.

Another option for avoiding render-blocking scripts is to remove the HTTP request altogether and inline the script. This will, however, result in a larger HTML file, and the code isn’t reusable across pages, so it’s only recommended for smaller scripts, if at all.

Remove Redundant Code

Over time the scope of a website can change, updates to the layout are made and features are revised or removed completely.

Try to keep your code base in line with these changes; any unnecessary CSS and JavaScript should be removed from the build. A quick way to check for unused CSS selectors is https://unused-css.com/; these selectors can then be removed from your code. Alternatively, this can be automated with the gulp-uncss task.
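The automated route could be sketched like this, assuming gulp and gulp-uncss are installed; the stylesheet path and the list of pages to check against are made-up examples:

```javascript
// gulpfile.js excerpt — file names are hypothetical.
var gulp = require('gulp');
var uncss = require('gulp-uncss');

// Strip any selectors from the stylesheet that aren't actually
// used by the listed HTML pages.
gulp.task('uncss', function () {
  return gulp.src('dist/css/main.css')
    .pipe(uncss({ html: ['index.html', 'about.html'] }))
    .pipe(gulp.dest('dist/css'));
});
```

Be careful with selectors that are only added dynamically by JavaScript, as a tool like this can’t see them being used.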

On a side note, make sure everything is archived within version control before removing it, as you may need it back for a future release.

Third Party Apps

There are lots of third party applications and libraries out there that can be used to add functionality and further enhance your website. Facebook Login, Google Analytics and YouTube are all good examples of this.

Be aware of the extra page weight these can bring, though. As an example, a simple YouTube or Twitter embed will add an iframe to your page, and you have no control over its size.

That’s not a reason to avoid them altogether; just be conscious of the implications they can have on page load if overused or used unnecessarily.

Overall there are lots of different areas to think about, but once you start to see the benefits that improved page speed brings, you’ll see why it’s worth it.

Get as many insights as you can and use them to fine-tune your website. Many of the recommendations can be automated using build tools such as Gulp or Grunt, and the best thing is that once these are set up they can be reused on future projects.

If you make use of the suggestions and techniques listed above, you’ll have faster pages and far fewer “it’s taking forever to load” moments.
