Cache sandboxed HTTP requests with Service Worker

A Service Worker (SW) can be used to control the network layer of a web app: at a high level, it can intercept network requests and cache HTTP responses. So it’s possible to cache anything and build a truly offline web application. Here’s how this model looks.

The SW sits between your app and the network; it may talk to the browser’s cache and decide what you get back on every network request. An implementation of the above model could look like the following.

If you are not familiar with SW, read Introduction to Service Worker first.
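Before a SW can intercept anything, it has to be registered from the page. A minimal sketch (the '/sw.js' path and the feature-detecting wrapper are my own additions, not part of the original setup):

```javascript
// register the service worker from the page, guarding against
// environments where the API is unavailable
function registerSW(path) {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    // not a browser (or no SW support): resolve with null instead of throwing
    return Promise.resolve(null);
  }
  return navigator.serviceWorker.register(path);
}

registerSW('/sw.js').then((registration) => {
  // registration is null when SW is unsupported
});
```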

// sw.js
// listen for outgoing network requests
self.addEventListener('fetch', (event) => {
  event.respondWith(
    // try to find a response object in the cache
    // associated with the current request
    caches.match(event.request)
      .then((cachedResponse) => {
        // if there's a cached response, give it back
        if (cachedResponse) {
          return cachedResponse;
        }

        // if not, try to fetch it from the network
        return fetch(event.request.clone())
          .then((networkResponse) => {
            // if the response is “bad”,
            // just pass it back into the app
            if (!networkResponse || networkResponse.status !== 200 ||
                networkResponse.type !== 'basic') {
              return networkResponse;
            }

            // if the response is ok, cache it and
            // give it back into the app
            caches.open('v1') // 'v1' is an arbitrary cache name
              .then((cache) => cache.put(
                event.request, networkResponse.clone()));

            return networkResponse;
          });
      }),
  );
});

This is aggressive caching: it will cache every network request. You may not want to use it as-is, because if the data behind some end-point changes, you’ll never know. But the code itself is a good introduction to SW.
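If stale data is a concern, a network-first strategy is a common alternative: try the network, cache the fresh response, and fall back to the cache only when the network fails. A sketch, with the fetch function and cache passed in as parameters (the parameter names are illustrative, not part of the SW API):

```javascript
// network-first: prefer a fresh response; fall back to the cache offline
function networkFirst(request, fetchFn, cache) {
  return fetchFn(request)
    .then((networkResponse) => {
      // keep the cache warm for the next offline visit
      cache.put(request, networkResponse.clone());
      return networkResponse;
    })
    .catch(() => cache.match(request));
}
```

Inside a SW, fetchFn would be the global fetch and cache a Cache object from caches.open(); extracting them as parameters just makes the strategy easy to reason about.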

So SW is straightforward to work with, but what about the real world? I was building a web app that relies on a third-party JavaScript library to make HTTP requests to an API. The problem is that the library makes all of its requests from within an iframe. For security reasons, a SW cannot intercept sandboxed network requests. It is still possible to cache those responses, because in the end all the data is passed into your code anyway; however, some manual work is required. Let’s take a look at another diagram.

What is going on here? Let me break it into steps:

  1. Request data from the network via sandbox
  2. Cache the response
  3. From this point, request data from SW whenever you need it
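The three steps above can be sketched in plain code. Here a Map stands in for the Cache API, and sandboxRequest() for the third-party library call; both names are my own placeholders:

```javascript
// steps 1-2: fetch through the sandbox once, then cache the result
function loadViaSandbox(key, sandboxRequest, cache) {
  return sandboxRequest(key).then((data) => {
    cache.set(key, data); // step 2: cache the response
    return data;
  });
}

// step 3: later reads are served straight from the cache
function readFromCache(key, cache) {
  return Promise.resolve(cache.get(key));
}
```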

To make it work even better, we could request data via fake end-points controlled by the SW. It looks like back-end routing on the front-end! For every end-point in the sandbox there can be an end-point controlled by the SW. Now the algorithm looks different.

  1. Request data from SW using fake end-point
  2. If request fails (empty cache), make another via sandbox
  3. Cache the response

How is it different? Instead of two entry points (sandboxed and SW), we now have one: the SW, with the sandboxed fallback wrapped inside. Here’s how it looks on the application side.

// request data from SW using fake end-point
fetch('/fake/data')
  .then((response) => {
    // cache hit, cached response
    if (response.ok) {
      return response.json();
    }
    // cache miss, empty cache
    throw new Error('cache miss');
  })
  .catch(() => {
    // make request via sandbox
    // (sandboxRequest() stands for the third-party library call)
    return sandboxRequest('/data')
      .then((data) => {
        // cache data
        caches.open('v1') // 'v1' is an arbitrary cache name
          .then((cache) => cache.put(
            new Request('/fake/data'),
            new Response(JSON.stringify(data), { status: 200 })));

        return data;
      });
  });
To create an entry in the browser’s caches, I’m using the Request and Response constructors. These are part of the Fetch API; you can explore a detailed description of all its parts on MDN. The caching code here is very similar to the one from the first example, except that here it runs on the application side, and the request/response objects are constructed instead of reused/cloned. Now let’s see what’s new inside the SW.
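As a standalone illustration of these constructors (the URL and payload here are made up):

```javascript
// build a synthetic request/response pair, as in the snippet above
const request = new Request('https://example.com/fake/data');
const response = new Response(JSON.stringify({ items: [1, 2, 3] }), {
  status: 200,
  headers: { 'Content-Type': 'application/json' },
});
// response.json() parses the body back into the original data
```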

self.addEventListener('fetch', function(event) {

  // if it's the proper url, try to read from the cache and
  // respond in case of a cache hit
  if (event.request.url.includes('/fake/data')) {
    return event.respondWith(caches.match(event.request)
      .then((response) => {
        // cache hit, respond with cached data
        if (response) {
          return response;
        }
        // cache miss, let the request fail so the
        // application falls back to the sandbox
        return fetch(event.request);
      }));
  }
});
A funny thing is that the above code could easily be represented as follows.

router.get('/fake/data', (req, res) => res.send(caches.match(req)));
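To push the analogy, a tiny Express-style route table can live inside the SW. A sketch (this router is hypothetical, not an existing library):

```javascript
// a minimal route table mapping URL substrings to handler functions
function createRouter() {
  const routes = [];
  return {
    get(path, handler) {
      routes.push({ path, handler });
    },
    // find the first route whose path appears in the url and run it
    handle(url) {
      const route = routes.find((r) => url.includes(r.path));
      return route ? route.handler(url) : undefined;
    },
  };
}
```

In a fetch listener, handle(event.request.url) would pick the matching handler, and its result could be passed to event.respondWith().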

Now you have an Express-like server on the client side! SW becomes a very powerful technology when you start thinking of it as another back-end.