Marketing Site Hacks: Taking Site Speed to Its Extremes

Aleksandr Guidrevitch · darwinapps · Sep 27, 2017

Disclaimer: This trick is not applicable in some situations. See more information on limitations and drawbacks at the end of the article.

We all know that the load time of your site greatly affects both user experience and Google rankings, and most speed-up tricks are already well known: gzip everything, concatenate (chain together) scripts and CSS, optimize images, use a CDN, and so on. Google has even created a module for the most popular web servers that does all of these things automagically; it's called the PageSpeed Module.
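
As a quick illustration, the gzip part of that checklist is only a handful of NGINX directives. This is a minimal sketch; the compression level and MIME types are illustrative choices, not values taken from this article:

# Enable gzip for text-based assets (ngx_http_gzip_module); goes in the http {} block.
# Values here are examples only. Tune them for your own content.
gzip            on;
gzip_comp_level 5;     # moderate CPU/size trade-off
gzip_min_length 256;   # skip very small responses
gzip_types      text/css application/javascript application/json image/svg+xml;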

But, in reality, for most sites, putting all these improvements in place does not help much. WebPageTest will still rate you 'F' for TTFB (Time To First Byte), because TTFB has little to do with gzipping, concatenating and the rest. TTFB for dynamic pages depends on how fast your CMS (WordPress, Drupal, whatever) renders the HTML of a page. And the combined speed of your CMS, plugins, and theme is something you cannot fix without improving code, database structure, and/or hardware. However…

There is a trick

Use caching. I know caching is not exactly a new and fresh idea, but here is the novel detail of the trick:

ALWAYS serve content from cache

The full plan of attack is actually a little more sophisticated:

  1. Always serve content from the cache, AND
  2. Refresh just served content in background

That is, always send a cached version of a page to a visitor, and when done sending, fetch fresh content from the CMS and cache it for the next visitor. This way, each visitor refreshes the cache for the next visitor to the same page.

If you set cache expiration to 1 second, the site will be 1 second old all the time, but will be served at the speed of static content. In one production example, the main page of a site takes only 40 ms to serve.

How to do this?

Use NGINX. Since version 1.11.10, it has a special directive to update a cached page in the background:

Syntax: proxy_cache_background_update on | off;

Allows starting a background subrequest to update an expired cache item, while a stale cached response is returned to the client. Note that it is necessary to allow the usage of a stale cached response when it is being updated.

The full description of this directive lives in the NGINX documentation for ngx_http_proxy_module, and here are the two lines that will force your NGINX to always serve content from the cache (if caching is already configured):

proxy_cache_background_update on;
proxy_cache_use_stale updating;
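
For context, here is roughly where those two lines sit in a complete proxy-cache setup. This is a minimal sketch, not the exact production configuration behind this article: the cache zone name, cache path, sizes, domain, and upstream address are all placeholder values.

# Placeholder cache storage (inside the http {} block): 10 MB of keys,
# entries evicted after a day without requests.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=site_cache:10m inactive=24h max_size=1g;

server {
    listen 80;
    server_name example.com;                 # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:8080;    # placeholder CMS backend
        proxy_cache site_cache;
        proxy_cache_key $scheme$host$request_uri;
        proxy_cache_valid 200 301 302 1s;    # the "1-second-old" expiry described above

        # The trick itself: serve the stale copy immediately,
        # then refresh it from the backend in a background subrequest.
        proxy_cache_use_stale updating;
        proxy_cache_background_update on;

        # Optional: let only one request per key hit the backend at a time.
        proxy_cache_lock on;
    }
}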

Drawbacks and limitations

Everything comes at a price, and here it is:

  1. As I’ve already mentioned, your site will be 1-second-old all of the time.
  2. And 1 second old is the best-case scenario: a page is only refreshed when someone visits it, so rarely visited pages can serve a much older cached copy.
  3. The server load stays pretty much the same, since each page is still fetched from the backend server in the background and consumes the same resources. So, although the site may “feel” faster, the underlying speed problem has not been solved at its source.
  4. This trick is not applicable to areas of the site that serve “personalized” content; a cache-bypass sketch for such pages follows this list.
  5. This trick is not applicable if your site depends on server-side cookies; see the next item.
  6. If not implemented carefully, some regular site functions can break or malfunction (e.g., if the cookie limitation above is ignored and the site relies on a server-side cookie to hide a pop-up, the pop-up will reappear for the same user every time, whether or not they click the “x”).
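
One way to handle the personalized-content and cookie cases in items 4 and 5 is to bypass the cache entirely for requests that carry a session cookie. This is only a sketch; the cookie name is a placeholder, not something named in this article:

# Skip the cache for logged-in / session traffic (cookie name is a placeholder).
proxy_cache_bypass $cookie_my_session;   # do not answer these requests from cache
proxy_no_cache     $cookie_my_session;   # and do not store their responses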

Here is an alternative way of improving site speed: just fix what's broken.

If you’re curious about the speed of your site, test the URL at WebPageTest.org. And, if it scores a “B” or lower grade, email us today at aguidrevitch@darwinapps.com to discuss how best to attack your speed problems — free of charge.

Please let me know either in the comments or by email if you would like to see the exact NGINX configurations broken down in this article.
