Making web pages go fast.

I wrote 2 things (1, 2) on here about static site generators a week or two ago, and was really surprised at the level of interest. I had some good discussions, and I think I'm probably going to do a project trying to create a really good static site generator that anyone can use. (I may do a Kickstarter for this, and I'm sticking an email notify list form at the bottom of this post.)

I got talking to a friend of mine about the article, and the whole idea of making a web site run fast (which is one of the great benefits of static site generators), and they asked for some tips, so I am going to stick those in an article here as well. This is not a definitive list, and some of these may be a little on the impractical side, so don’t have a cow if they seem overkill. The easiest, fastest way to speed up a website may, in many cases, just be to upgrade the server it’s running on, but that’s not really fun.

These ideas are in no particular order, and they’re geared to someone who runs a website but is more front-end, so I’m leaving out most of the stuff like tweaking Nginx settings, etc. I also am going to assume you can pick which of these you’re interested in, and Google further for specific software that suits your environment.

Kinda basic: Minify your HTML, CSS and JS

These 3 formats are just text, and the files inevitably contain a lot of whitespace (tabs, newlines, indentation) and comments. This stuff is helpful for the people working on the files, but a web browser doesn't need any of it.

So what you do is this: Sometime after you’ve finished editing one of these files, you use some sort of minifying software, which will strip out all the extra whitespace, and use that version to actually stick on your website. You obviously keep the original, and continue working with it when you edit your code — you just minify it every time you actually push it to your server.

To give you an idea of how well minifying works: I ran a full web page through a minifier, and it went from 13 kb to 10 kb. Saving 3 kb may not be the biggest win in the world, but if you minify all the text files you can, including Javascript and CSS, you can eke out a speed improvement, especially if you have a site with minimal images.

But guess what, this step hardly matters at all if you do the next thing I’m about to suggest.

Much more effective: Enable Gzip compression on your server

Okay, so I said I would skip server stuff, but this may actually already be enabled on your server, and if not, it’s not that tough to get working. (You can google “Check if your website is gzipped” to see if your website already has this turned on).
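To give you a flavor, if your server happens to be Nginx, turning gzip on is typically just a couple of directives in your config (a sketch; the exact list of types here is an assumption, and text/html is compressed by default once gzip is on):

```nginx
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
```

Apache and other servers have their own equivalents, so check the docs for whatever you're running.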

How this works is simple: Once this is turned on, every time your webserver sends a web page out, it compresses it in the gzip file format just before sending, which makes it a lot smaller. The reader's web browser then automatically decompresses the file and displays it. The whole process is completely transparent to your readers, but it means that web pages are a lot smaller as they travel across the internet.

How much does this compress a text file? Quite a lot: That 13 kb file I used earlier gzips down to 3.1 kb, and the larger pages I tested shrank by roughly the same proportion.

As I mentioned in the minification section, using Gzip makes minification almost pointless. When I gzipped the minified version of that 13 kb page, it reduced down to 2.9 kb, instead of 3.1 kb, so it’s pretty inconsequential.
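If you want to see the effect for yourself, you can gzip any text file from the command line (a sketch; the sample file here is deliberately repetitive, so it compresses much better than real HTML would):

```shell
# Build a repetitive sample text file, gzip a copy, and compare sizes.
for i in $(seq 1 500); do echo 'hello world, this is filler text'; done > sample.txt
gzip -kf sample.txt      # -k keeps the original; -f overwrites any old .gz
wc -c sample.txt sample.txt.gz
```

The compressed copy comes out dramatically smaller than the roughly 16 kb original, though ordinary web pages won't shrink quite that much.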

Don’t let CSS slow down your document: Inlining

This is something that could be quite easy to change, or tricky, depending on your workflow, but I’m going to describe the basic concept, and let you figure out how it fits into your site.

The first issue here: Everybody is taught to use external stylesheets, for numerous good reasons. When your site loads, however, it will usually not render (display on screen) until it has loaded all the CSS information needed. If all your CSS is inside external files (as it is on most sites), the page must load those files and process them, which slows down the page load a small amount. This may slow down the page a fair bit if, for some reason, the CSS files are on a server that is much slower than the server the HTML is on (this is probably extremely rare).

The second issue here: If you have 1 HTML page, with one or more external CSS files, the reader’s browser has to make multiple HTTP requests. This isn’t necessarily a huge, slow bottleneck, but it’s still somewhat unnecessary, and if you inline your CSS, you can reduce the number of HTTP requests, which is a win.

So you may wonder: Am I recommending that you completely give up external stylesheets and write all your CSS inline? No! That would make for a really disorganized workflow that was hard to maintain.

What you can do however, if you want the quickest page loads possible, is to continue maintaining your CSS files in the exact same way you always have, but then automate it so that when your publishing tool (your CMS or whatever you're using) actually creates your site, it sticks the CSS inline, instead of linking to an external stylesheet.

I have a tiny Jekyll blog that I did this on quite easily. Inside the template where I would normally link my external stylesheets, I instead did this:
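The snippet was roughly like the following (a sketch; I'm assuming the CSS files live in Jekyll's _includes directory, and the file names are made up):

```html
<head>
  <meta charset="utf-8">
  <title>{{ page.title }}</title>
  <style>
    {% include main.css %}
    {% include fonts.css %}
    {% include layout.css %}
  </style>
</head>
```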

You can see that the include tags are pulling in the entire contents of my CSS files, and they're being inlined into the static HTML file that Jekyll creates. So when a reader goes to that page, all the CSS loads inside the HTML file, and only 1 HTTP request is needed. But I still edit and maintain my CSS files normally, as their own files.

And maybe this goes without saying, but if you have a lot of different areas on your site, and some sections use vastly different CSS than others, try organizing your CSS files so that every single page doesn’t load every single CSS file, if it’s not needed.

Another way to reduce your HTTP requests: base64-encoded data URIs

Okay, so to me, this is one of the most fun things, but I should also mention that it kind of clashes with the CSS inlining that I already mentioned, so you need to be careful about using both at once — I’ll explain that later. But this is how this works:

If you have an image that you display on every page, like a small logo of your site’s name, the normal thing to do is to save it on the server and load it every time someone comes to your site. So for instance, I was just working on a site where I put this in my CSS:
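It was along these lines (reconstructed for illustration; the selector and file path are made up):

```css
.site-logo {
  background-image: url("/images/logo.png");
}
```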

So, when this stylesheet loads, it will go and grab this PNG file and display it, which creates 1 additional HTTP request which must complete before the page displays.

If you want, though, you can actually embed the image data directly into your CSS file, as a data URI. The image will then load with the CSS, and show up without needing an extra HTTP request. This is how my CSS looks after doing that:
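Something like this (illustrative; the selector is made up, and the base64 string is heavily truncated here, since the real one runs to thousands of characters):

```css
.site-logo {
  background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...");
}
```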

The real base64 string is long and ugly, and the whole thing must stay on one line in your CSS, without any line breaks, or it won't work.

So how do you convert your image into this base64 string? You can easily just google something like “image conversion base64 data-uri”, or if you’re on a Mac, run this command in the directory that your image is in:

openssl base64 -A -in logo.png -out CODE.txt

(that example takes an image called logo.png, and then outputs the big ugly string to a file named CODE.txt. The -A flag keeps openssl from wrapping the output across multiple lines, which matters because the string has to stay on one line in your CSS. You then paste it in after the data:image/png;base64, prefix.)

A warning about using the two preceding techniques together:

So, the tip where you inline all your CSS means that your web pages will be a bit larger in size, and it’s most useful if you have a website with a fairly small number of pages. If you anticipate people only reading 1 or 2 pages on your site, inlining the CSS is a good way to speed up those loads.

If someone is going to stick around and read 50 pages on your site, and you inline your CSS, it means they’ll load your CSS 50 times, so it’s probably better to actually use an external file, and set up browser caching correctly, so that after loading your external CSS file once, the reader’s browser doesn’t keep grabbing it over and over and over.

This is even more important though if you do choose to use base64 data-URIs to stick an image inside your CSS. If you embed an image in your CSS, and you inline your CSS, the reader will be loading that image every single time they load an HTML page. So be aware of that before you make your choice.

If you just have a simple 1 or 2 page website that you want to load as fast as possible, inlining all your CSS, and embedding an image as a base64 data-uri inside your CSS is a great way to reduce HTTP requests, but you need to understand the issues.

Set up browser caching correctly

Okay, so this is a bit more of a server-related one, but it can really speed things up to do this right.

Certain files are unlikely to change in the future; most images are a perfect example of this, along with a lot of Javascript and CSS files. For these files, it's well worth setting the Cache-Control HTTP header with a long max-age, so the reader's browser doesn't keep re-downloading those same files over and over.

How to alter these is going to vary quite a lot based on what your hosting/server setup is, so I’m going to go ahead and pass the buck and tell you to Google it and check your host’s documentation, etc. But this is very important!
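Just to give a flavor, on Nginx a long-lived cache header for static files might look like this (a sketch; the file extensions and the one-year lifetime are assumptions you should adapt to your own site):

```nginx
location ~* \.(png|jpg|jpeg|gif|svg|css|js)$ {
    add_header Cache-Control "public, max-age=31536000";
}
```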

Optimize your images as much as possible

Honestly, this article is sort of aimed at people who should understand what this means, and how to do it, so I am not going to go into the nittiest of gritties, but it’s well worth looking into this.

If you use the same images in a few different places on your site, but at different sizes, there are quite a few choices on how to resize and compress them efficiently, and some blogging/CMS software like Wordpress will do a lot of the work for you.

If you’re creating something from scratch though, and you want a quick and easy solution, look into something like Imgix, which resizes your images on the fly, based on some stuff you stick inside the img src tag.

I think I already said get a fast server

This is worth saying again though — if you host your site somewhere lousy, that has lots of downtime, then your users are going to have a bad experience.

My current go-to recommendations when people ask me about hosting are: Digital Ocean, Chunkhost, and Linode. All 3 have affordable plans for any site. (2 of these automatically assign people referral URLs, so I stuck mine in fwiw, but they would be my recommendations anyway).

Make sure your DNS servers are up to snuff

Since I’m sort of veering to server-related things, something I don’t hear a lot of people talk about is flaky DNS servers. A lot of the issues I’ve seen in the past with them is that often, a site will use the default DNS servers provided by their hosting company, without thinking about it, and occasionally, these servers will crash or have some other trouble for several hours (or more). This is rare, but when it happens, your site is essentially dead to the world.

It’s good practice to use 2 DNS providers for your domain — usually 2 servers from Company A as your main servers, and then 2 from Company B as your backup. Personally though, I am lazy, and I just use Route 53 from Amazon Web Services (AWS). I’ve been very happy with it for years, and it costs something like $6/year per domain — well worth imo.

Use a dang CDN I guess

A CDN will spread your files to many servers, all across the globe, and serve them to your readers from a location close to them. CDNs also reduce load on your own server, which lets it serve requests faster in general. There are a trillion CDN companies.

Cloudflare is very popular, and has a free tier that will suit most small projects. I found Cloudflare very easy to set up.

Amazon Cloudfront is paid, but also has a free tier if you’re new to AWS. I’ve never used Cloudfront myself (and I’ve never found AWS in general to be extremely user-friendly), but it’s a popular choice.

There is some software that will do a lot of this stuff for you

So my article is sort of geared towards hobbyists who want to know what the issues and options are, but if you’re using something like Wordpress I think there are various plugins that will accomplish some of these goals, especially caching.

Google has a nice product called PageSpeed Module that aims to solve most of this stuff. I installed it a year or two ago on a server I had running Apache, and it worked decently, but it is pretty darn technical to install for a lot of people. If the words “Install and configure a web server module” are confusing to you, then this is almost certainly something you would want someone else to do for you.

Okay that’s it for now.

Okay that’s it for now. Since I assume people reading this will probably have come from my aforementioned Static Site Generator articles (1, 2), I’m going to link to this newsletter signup where I’ll let you know when my own Static Site Generator project is announced.

I enjoy reading comments on Medium, but I slightly prefer talking on Twitter so feel free to follow me there.