Set Up Cache Policies for Service Workers to Optimize Performance

Published in Marfeel Labs · 4 min read · Sep 7, 2018

Written by Miquel | Frontend Jedi

If you’re reading this, I’m going to take the liberty of assuming you’re already familiar with Service Workers (SW) and the value they bring to both users and developers.

If not, here are some links to get you up to speed:

At Marfeel we were early adopters of Service Workers; we started using them as soon as browsers began to support them, to take advantage of the control they give over network requests, web push notifications, and, most importantly, caching.

Since Marfeel is a platform for publishers, when we decided to take on Service Workers we started with the caching system first, so we could deliver content faster and make it available offline.

The Service Worker Cache API

The SW cache is exposed through the Cache API: essentially a key-value store that uses Request objects as keys and Responses as values. The best thing about it is that developers choose exactly what to cache and when (or whether) to serve from it.
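
As a quick illustration of that key-value model (the cache name and URL below are made up for the example, not Marfeel’s):

```javascript
// A minimal sketch of the Cache API; the cache name and URL are illustrative.
async function cacheAndRead() {
  // Open (or lazily create) a named cache.
  const cache = await caches.open('demo-cache-v1');

  // Store a response: the Request is the key, the Response the value.
  const request = new Request('/logo.svg');
  const response = await fetch(request);
  await cache.put(request, response.clone());

  // Read it back later; match() resolves to undefined on a cache miss.
  return cache.match(request);
}
```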

In our case, we wanted control over the resources served from our origin, caching them according to their type.

Checking every request

To do this, we have to intercept any and all requests. So for every request issued from the application, we check whether it goes to a Marfeel domain.

Here’s a simplification of the code we have:
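
(The snippet below is a reconstruction for illustration; the domain check and the applyCachePolicy helper are stand-ins, not the exact production code.)

```javascript
self.addEventListener('fetch', event => {
  const url = new URL(event.request.url);

  // Only handle requests aimed at a Marfeel domain (illustrative check).
  if (!url.hostname.endsWith('marfeel.com')) {
    return; // not ours: let the browser follow its normal flow
  }

  // Hand the request to a caching policy chosen by resource type
  // (applyCachePolicy is a stand-in for the routing logic described below).
  event.respondWith(applyCachePolicy(event.request));
});
```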

If it’s not a Marfeel resource being requested, the browser proceeds with its normal flow.

But if it is a Marfeel resource that’s requested, we apply different strategies or policies depending on the type of resource (a sketch of each follows the list below). For example:

  • Navigation requests: Since the pages that use our solution belong to publishers, navigation requests go to articles and article sections. Here we want content to be as fresh as possible, so the strategy is to always fetch the resource from the network and cache it, so that it’s still available even when the network isn’t (for example, when the user is offline).
  • JS bundles that aren’t updated constantly: Cache first. This is quite the opposite. We serve the content from the cache so the response is immediate, but then, “in the background,” issue the request and update the cache so it’s fresh for the next access.
  • Race: In most cases, we want to request the resource from the network and the cache at the same time and return whichever responds first. That should normally be the cache, but if the resource hasn’t been cached yet or the cache is slower, you still get the network response.
  • Finally, some resources never change over time (some image assets, like logos) and can be cached as soon as the cache gets initialized, using cache.addAll().
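
Here is a rough sketch of what those policies can look like in a Service Worker (the cache name, URLs, and function names are illustrative, not Marfeel’s implementation):

```javascript
const CACHE_NAME = 'content-v1'; // illustrative cache name

// Navigation requests: network first, fall back to the cache when offline.
async function networkFirst(request) {
  const cache = await caches.open(CACHE_NAME);
  try {
    const response = await fetch(request);
    cache.put(request, response.clone());
    return response;
  } catch (err) {
    return cache.match(request); // offline: serve the cached copy if we have one
  }
}

// Stable JS bundles: serve from the cache, refresh it in the background.
async function cacheFirst(request) {
  const cache = await caches.open(CACHE_NAME);
  const cached = await cache.match(request);
  const update = fetch(request).then(response => {
    cache.put(request, response.clone());
    return response;
  });
  if (cached) {
    update.catch(() => {}); // ignore refresh failures when we already have a copy
    return cached;
  }
  return update; // nothing cached yet: wait for the network
}

// Race: ask the cache and the network at the same time, serve whichever
// answers first, and fall back across sources if one of them fails.
function raceCacheAndNetwork(request) {
  return caches.open(CACHE_NAME).then(cache => {
    const fromNetwork = fetch(request).then(response => {
      cache.put(request, response.clone());
      return response;
    });
    const fromCache = cache.match(request);

    return new Promise((resolve, reject) => {
      fromCache.then(cached => cached && resolve(cached));
      fromNetwork.then(resolve, () =>
        fromCache.then(cached =>
          cached ? resolve(cached) : reject(new Error('offline and not cached'))
        )
      );
    });
  });
}

// Static assets that never change: precache them when the SW is installed.
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME).then(cache => cache.addAll(['/logo.svg', '/icons/sprite.svg']))
  );
});
```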

So with very little code and a lot of debugging and testing, you can have a few cache policies set up.

Now let’s see what’s changed and if it really is worth it.

The results

With the previous code we achieved a couple of things:

Result #1

The first one and the one we aimed to accomplish — an improvement in performance.

Now, the resources we mentioned earlier are being cached, so most of the time, instead of retrieving them from the network, we get them from the cache. This reduces network time (to 0 s) and also network usage (0 bytes downloaded, which should be taken into account on mobile networks).

Now that the resources are coming faster, the web application’s performance will naturally increase.

Network times without Service Workers
Network times with Service Workers. Notice the “(from ServiceWorker)” label on the cached responses

Result #2

The second result is content availability when offline.

Even when the user has no internet connection, most of the resources that got cached are going to load.

This is super cool because even with network failures you can still use the product. Maybe offline isn’t the typical scenario, but imagine traveling, being on a train, or simply having a bad signal… some resources might fail even under otherwise normal network conditions, and the page would break or stop working. With this, you won’t even notice.

But, we can’t forget that…

  • Even though Service Workers and the right caching policies ultimately improve performance, we are adding overhead. Even if it’s very small, intercepting all requests and introducing some logic before reaching the network has a cost.
  • All of this only fully works on the second and subsequent loads of the page. On the first load, most of the resources have already been downloaded by the time the Service Worker starts.
  • The files will be cached, so if you want to deploy a bugfix or something like that, it will take a while to reach users unless some cache-cleaning policy is implemented (one common approach is sketched below).
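
A cache-cleaning policy is often implemented by versioning the cache name and deleting stale caches in the activate event. A sketch under that assumption (not necessarily what Marfeel does):

```javascript
// Illustrative cleanup: bump the cache name on deploys that must invalidate caches,
// then delete every cache that doesn't match the current version on activation.
const CACHE_NAME = 'content-v2';

self.addEventListener('activate', event => {
  event.waitUntil(
    caches.keys().then(keys =>
      Promise.all(
        keys.filter(key => key !== CACHE_NAME).map(key => caches.delete(key))
      )
    )
  );
});
```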
