The Web Needs Faster Websites.

Elliot Forbes
7 min read · Feb 28, 2016


I’ve been working on my main site for about a year now, and it’s just passed 30,000 unique visitors. Impressive, I know… but I’m proud of it as a part-time side project nonetheless. Since its inception, though, there has been one thing bugging me that I’ve been meaning to fix.

A little background.

Of the 30,000-odd visits to my site, a grand total of 2,658 came from India. That makes it my second-largest market, and after a little bit of research (one Google query) I found that the average internet speed in India is about 2 Mbps. Now, you’re probably sitting there with speeds of 10x or even 100x that figure depending on where you live, but sadly, thanks to my rural lifestyle, I’m envious of every one of you: I’m currently sitting below the Indian national average.

[Image: in-depth research on broadband speeds.]

These slow speeds mean that some sites take forever to load, and you can just about write off streaming media if you share the house with anyone else.

Another thing to consider is how long sites take to load on mobile devices. For my site it’s not a huge deal, as I don’t know many people who program while on the go, but 8% of my traffic still comes from mobile devices. Mobile speeds are considerably slower, and if you build huge, bloated sites, mobile users will hate you.

So let me ask you this: why wouldn’t I want to optimize my site for my second-largest audience and potentially earn a couple of pennies more with AdSense?

For anyone who deals with sites of substantial traffic and higher earnings, I recommend you read the study Amazon performed in 2012, which found that every 100ms of latency in their website loading time cost them roughly 1% in profit (Hacker News link: here). This really made me think about every little script I added to every subsequent site and question its worth, and that article is ultimately what led me to write this tale of how I tried to optimize my site.

The Goal

Reduce the size and load times of my site in order to provide better usability for those of us unfortunate enough to be lumbered with the equivalent of dial-up.

The Game Plan

I’m going to try to identify the slowest aspects of my site and address them one by one until I’ve got one of the fastest and smallest websites on the web.

Step 1 — Move from Apache2 to Nginx

Where better to start than the foundation your site rests upon? I’ve read a few articles on the performance benefits of Nginx and how it was initially designed as an answer to the C10K problem: how do we get a server to handle ten thousand concurrent connections?

There are loads of articles detailing the pros and cons of both Apache2 and Nginx, but I felt this article here was the tipping point for me in deciding which one to go with.

Side by side comparisons showed the following interesting results:

[Image: Apache2 load times]
[Image: Nginx load times]

So from this primitive test, load times seem to have improved by a whole 2.31 seconds. I’m taking these results with a massive pinch of salt, but it does look like an improvement. If anyone knows a far more scientific way to measure these things fairly, I’d love to hear it.
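In the meantime, a slightly more repeatable check is curl’s built-in timing variables, run a handful of times and averaged (a sketch; my own domain is standing in for whichever page you want to test):

# print connection, first-byte and total times for a single page fetch
curl -o /dev/null -s -w 'connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' https://tutorialedge.net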

Step 2 — Get Rid of Unnecessary Images and Optimize the Necessary Ones

In the initial build of my website I thought it’d be brilliant to give each article a main image with some cool graphic and the name of the article on it. In all honesty, though, these didn’t add much value to the articles I write, so I ended up taking them out. I did a quick calculation using some fancy Unix magic and found that the average size of these images was 162,356 bytes, or 162 kilobytes to be less precise. Removing them seems like a huge win for something that doesn’t really help the user. Unix magic below:

# average the size column (field 5) of ls -l output, skipping the "total" line
ls -l | gawk 'NR > 1 { sum += $5; n++ } END { print sum/n }'

I will still include images in my articles, but only those necessary for demonstrating things. Every subsequent image I add will be optimized and compressed as far as losslessly possible.

Optimizing the Necessary Images

After a quick search I found a command-line tool that will automatically look at every JPG file in a directory and compress it down as small as it can. I installed it like so:

apt-get install jpegoptim

And after making a quick backup of all my images, I ran it over everything in my directory.
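Something along these lines did the trick (a sketch rather than my exact invocation; the directory name is an assumption):

# keep an untouched copy in case the optimizer mangles anything
cp -r images/ images-backup/

# losslessly optimize every JPEG in place, stripping metadata
jpegoptim --strip-all images/*.jpg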

The end result: I managed to save about 2% on every image. Not much, but still a saving nonetheless!

Step 3 — Reduce the Load on My Server

So far I’ve moved from Apache2 to Nginx to better handle thousands of concurrent requests, but wouldn’t it make sense to reduce the number of requests actually reaching the server in the first place?

Currently my website makes 50 requests on every page load. Not all of these hit my server, as I’m using things like Disqus for my comments, but 50 still seems like a hell of a lot of requests for one measly web page. If I reduce the load on the server, it’ll be free to handle other, more important requests.

[Image: my old request list, viewed via Google Chrome’s console.]

Minification To the Rescue…

Looking at the code, I see a string of CSS includes in the head section of every page and another six JavaScript files loaded just above the closing body tag. If I concatenate and minify the CSS and the JS into two separate mega-files, I’ll in theory cut 10 file requests to my server on every page load. Against the daily average of visitors to the site, that equates to roughly 2,500–3,000 requests my server no longer has to cater for every single day.

Thankfully Laravel has a little something called Elixir, which is a pretty sweet way to define gulp tasks. The official documentation can be found here: Laravel-Elixir, and I highly recommend you check it out.
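For the curious, the kind of gulpfile Elixir gives you looks roughly like this (a sketch; the file names are assumptions, not my actual asset list). By default Elixir reads from resources/assets/css and resources/assets/js:

// gulpfile.js
var elixir = require('laravel-elixir');

elixir(function (mix) {
    // combine all stylesheets into a single public/css/all.css
    mix.styles([
        'bootstrap.css',
        'app.css'
    ]);

    // combine all scripts into a single public/js/all.js
    mix.scripts([
        'jquery.js',
        'app.js'
    ]);
});

Running gulp builds the combined files, and running gulp --production minifies them as well.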

DON’T USE CDNs FOR FILES

Seriously, I’ve seen the impact these things can have on load times and it isn’t pretty. If a web page has to query four other servers for everything it needs, it’s going to have a bad time. While minifying my files I noticed I was indeed a perpetrator of this terrible crime, so I swiftly downloaded all the necessary JavaScript and CSS files and included them in my minification script.

The Results of Minification and Image Optimization

[Image: I’m starting to doubt the validity of these tests…]

As you can see, the number of requests has been reduced by 33% in total, but the page size has actually gone up. I think it’s time to switch tools…

After checking the requests made in Google Chrome, I can confirm the request count is indeed accurate: we’ve successfully shaved off 10 requests per page load.

Enabling HTTP Caching and GZIP Compression in Nginx

So I’ve done some basic optimizations, and when checking my website on https://developers.google.com/speed/pagespeed/insights/?url=tutorialedge.net&tab=desktop I see that it’s telling me to enable browser caching. A little more research shows that this can be done through the Nginx config file on my server.

I added the appropriate config details according to this tutorial: http://serverfault.com/questions/672844/enable-caching-on-nginx
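For reference, the relevant directives boil down to something like this (a sketch, not my exact config; the expiry time and file extensions are assumptions you’d tune to your own site). These live inside the server block of the site’s Nginx config:

# compress text-based responses before sending them over the wire
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 256;

# tell browsers they can cache static assets for 30 days
location ~* \.(css|js|jpg|jpeg|png|gif|ico)$ {
    expires 30d;
    add_header Cache-Control "public";
}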

The Final Results

This is most likely going to be part one of a series of speed-optimization posts, as I don’t quite think this will satiate my desire for a faster, smaller website. Nonetheless, I have managed to considerably reduce the number of requests hitting my server, improve the way the server handles the requests that remain, optimize all my images with a command-line compression tool, and finally enable caching and compression in my Nginx configuration files.

I’m hoping that over the next few days I’ll see a slight drop in the bounce rate as more people are able to access the site faster than ever.

Follow Me For More

If you found this an interesting read then I’d be delighted if you could show your support by hitting the recommend button and following me for more technical articles.
