Progressive Web App Libraries in Production

Addy Osmani · Published in Dev Channel · Feb 27, 2017 · 10 min read

Two years ago, our team at Google started work on JavaScript libraries to reduce the friction for building Progressive Web Apps.

We started with Service Worker tools like sw-precache and sw-toolbox, now used by thousands of brands to power offline caching and instant loading (on repeat visits) in their production mobile sites:

In 2017, if you aren’t taking advantage of Service Workers, you’re leaving performance wins for returning users on the table.
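
Taking advantage of them starts with a single, feature-detected registration from your page. Here’s a minimal sketch; the '/service-worker.js' path is whatever your build outputs:

// Register a Service Worker if the browser supports it.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/service-worker.js')
      .then(function(registration) {
        console.log('Service Worker registered with scope:', registration.scope);
      })
      .catch(function(err) {
        console.log('Service Worker registration failed:', err);
      });
  });
}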

Let’s compare the before/after Timeline strips for CNET’s Tech Today and Housing.com’s PWAs. First view takes a few seconds over average 3G. Look at the 3–4 second improvement that Service Worker caching of their App Shell and data made to their loading times:

Wooo. They’re almost instant :) Service Worker caching has helped these sites load and become interactive far more quickly. This replicates a desirable performance characteristic of native apps: once the (web) app is installed, the up-front costs of reloading are amortized and no longer come with a variable delay.

Service Workers are about Reliable Performance. Not just “Offline Support” — Alex Russell, Chrome

Large sites like Twitter.com, which recently moved 100% of its mobile web traffic to a PWA built with a Service Worker, an Application Shell architecture and the PRPL pattern, are also seeing similar wins:

This isn’t an optimization that only applies to mobile and PWAs. Service Workers can improve the load performance of your desktop sites too.

For example, Flipkart caches their static assets so that on repeat visits First Meaningful Paint occurs 1.5s faster than on first load:

Flipkart.com on desktop using sw-precache to cache static assets to shave seconds off their repeat visit load times

As covered in JavaScript Start-up Performance, a Service Worker also opts you in to V8’s code caching on first execution of your JavaScript so you’ll get faster start-up times for JS too.

Service Workers can help with more than just caching.

We also shipped a library for Offline Google Analytics, powered by Service Worker and IndexedDB. When a user is offline or has a flaky network connection, we’ll queue up their analytics and post them once they return online. This is used by sites like eBay Classifieds in Mexico to minimize the loss of useful stats when users are on the go:

After successfully dogfooding the idea on the Google I/O 2015 site, we found it useful enough that we wanted to generalize it so anyone could use it.

A nice complement to the offline analytics library is Autotrack, a helper that makes it easier to track the analytics events most people care about. It has plugins for PWA/SPA URL changes, element visibility, user scrolling, media queries, page visibility & more. These plugins help production sites like 1Password easily track important events without the boilerplate overhead:

Next, we started work on a Web Push Notifications library, but an opportunity arose to collaborate with Firebase on a much nicer solution, so we instead helped ship Firebase Cloud Messaging. It’s a cross-platform messaging solution that can send notification or data messages and works great with PWAs.

Alibaba is just one of the production PWAs using FCM today:

We also contributed to the web-push library by Mozilla, an alternative folks can look at in this space.
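
As a rough sketch, sending a push message from a Node server with web-push looks something like the following. The VAPID keys and the subscription object (obtained on the client via PushManager.subscribe() and sent up to your server) are assumed to already exist:

const webpush = require('web-push');

// VAPID keys can be generated once with webpush.generateVAPIDKeys().
webpush.setVapidDetails(
  'mailto:you@example.com', // placeholder contact address
  process.env.VAPID_PUBLIC_KEY,
  process.env.VAPID_PRIVATE_KEY
);

// `subscription` is the PushSubscription object sent up from the client.
webpush.sendNotification(subscription, JSON.stringify({
  title: 'Hello from web-push',
  body: 'A test notification payload'
})).then(function() {
  console.log('Push sent');
}).catch(function(err) {
  console.error('Push failed', err);
});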

With Service Worker being a core part of many of our libraries, we also needed some utilities to help unit test them. We created selenium-assistant for end-to-end testing across multiple browsers using Selenium. We also wrote sw-testing-helpers to manage Service Workers in tests.

Getting started with our JavaScript libraries

Google Developer Codelabs for sw-precache, sw-toolbox & offline-analytics are freely available.

Service Worker generation

sw-precache (which also works great with Webpack) generates a Service Worker for you. At its simplest, you can point it at a “dist” directory and it will apply sane defaults for caching your static assets offline, so they load instantly from the Cache Storage API on repeat visits:

$ sw-precache --root=dist

You can verify files are being correctly cached by using the Chrome DevTools Application panel. Look for ‘Cache Storage’ after loading your page and you should see entries corresponding to the directory supplied:

There’s also support for passing complex configurations using --config <file>. Any of the options from the file can be overridden through a command-line flag. We recommend using an external JavaScript file to define configurations using module.exports. For example, assume there's a path/to/sw-precache-config.js file that contains:

module.exports = {
  staticFileGlobs: [
    'app/css/**.css',
    'app/**.html',
    'app/images/**.*',
    'app/js/**.js'
  ],
  stripPrefix: 'app/',
  runtimeCaching: [{
    urlPattern: /this\.is\.a\.regex/,
    handler: 'networkFirst'
  }]
};

We can pass the file to the command-line interface, also setting the verbose option:

sw-precache --config=path/to/sw-precache-config.js --verbose

This provides the most flexibility, for example letting you supply a regular expression for the runtimeCaching.urlPattern option. When sw-precache runs successfully, it also summarizes the estimated size of the assets being precached, to help you stay aware of users’ data-plan usage:

With the Webpack plugin, a typical setup for precaching static assets might look as follows:

const SWPrecacheWebpackPlugin = require('sw-precache-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    // ...
    new SWPrecacheWebpackPlugin({
      cacheId: 'my-cache',
      filename: 'service-worker.js',
      staticFileGlobs: [
        './public/images/**/*.{png,jpg,gif}',
        './public/scripts/**/*.js',
        './public/styles/**/*.css',
        './public/partials/**/*.html'
      ],
      stripPrefix: './public/'
    })
  ]
};

Integrating sw-precache into a gulp build system

To use sw-precache in gulp, we first import the plugin at the top of our gulpfile:

const swPrecache = require('sw-precache');

We then create a gulp task and call write on swPrecache as follows:

swPrecache.write(filePath, options, callback)

filePath is the location of the file to write the service worker to. options is an object that defines the behavior of the generated service worker (see the documentation on GitHub for the full list of options). The callback is always executed so that gulp knows when the async operation has completed: if there is an error, it is passed to the callback; otherwise, null is passed.

Let’s look at an example:

gulp.task('generate-service-worker', function(callback) {
  swPrecache.write('app/service-worker.js', {
    // 1
    staticFileGlobs: [
      'app/index.html',
      'app/js/bundle.js',
      'app/css/bundle.css',
      'app/img/**/*.{svg,png,jpg,gif}'
    ],
    // 2
    importScripts: [
      'app/node_modules/sw-toolbox/sw-toolbox.js',
      'app/js/toolbox-script.js'
    ],
    // 3
    stripPrefix: 'app/'
  }, callback);
});

We call the gulp task 'generate-service-worker' and pass a callback to the function to make it asynchronous.

swPrecache.write generates a service worker with the following options:

  • The resources in staticFileGlobs are precached, meaning the generated service worker will contain an install event handler that caches the resources.
  • The scripts listed in importScripts are pulled into the generated service worker via an importScripts() call. In the example we include the sw-toolbox library and a script containing our routes.
  • The app/ prefix is removed from all file paths in staticFileGlobs so that the paths in the generated service worker are relative.
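
To actually run this as part of a build, the task can be wired into your existing pipeline. Here’s a minimal sketch using gulp 3-style task dependencies; the 'build' task name is hypothetical:

// Regenerate the service worker whenever the main build runs.
gulp.task('default', ['build', 'generate-service-worker']);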

Runtime Caching

sw-toolbox is a complementary library that lets you intercept network requests in the Service Worker and apply a caching strategy to the response. It works through routes, which behave like fetch event listeners.

A route intercepts network requests matching a URL pattern and HTTP request method, then responds based on the rules in the request handler. sw-toolbox ships with five built-in handlers (cacheFirst, cacheOnly, networkFirst, networkOnly and fastest) covering the most common caching strategies:

If you’re familiar with Express, sw-toolbox supports URL patterns using a syntax similar to Express routing:

toolbox.router.get('img/**/*.{png,jpg}', global.toolbox.cacheFirst);

This will intercept GET requests for any PNG/JPG file under the img folder. It handles requests according to the cacheFirst strategy, first checking the cache for a response. If that fails, the request gets sent to the network. If that succeeds, the response gets added to the cache.
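
Under the hood, a cache-first handler behaves roughly like the hand-rolled fetch listener below. This is a simplified sketch rather than sw-toolbox’s actual implementation, and 'runtime-cache' is just a placeholder cache name:

self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request).then(function(cached) {
      if (cached) {
        // Cache hit: respond immediately without touching the network.
        return cached;
      }
      return fetch(event.request).then(function(response) {
        // Cache miss: fetch from the network and store a copy for next time.
        var copy = response.clone();
        caches.open('runtime-cache').then(function(cache) {
          cache.put(event.request, copy);
        });
        return response;
      });
    })
  );
});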

Full domains can also be used in sw-toolbox route patterns, e.g. this will cache your Google Fonts:

toolbox.router.get('https://fonts.googleapis.com/', toolbox.cacheFirst);

We can also intercept GET requests to another domain using Express-style routing. We just define an ‘origin’ property in our options (a string or RegExp) which gets matched against the full origin of the URL.

toolbox.router.get('/(.*)', global.toolbox.cacheFirst, {
  origin: /\.googleapis\.com$/
});

A RegExp object can also be used. Here we’re defining a route for POST requests that start with “https://www.googleapis.com”:

toolbox.router.post(/^https:\/\/www\.googleapis\.com\//, global.toolbox.networkFirst);

Tip: When inspecting Cache Storage, you can tell what sw-toolbox is caching because it manages the $$$toolbox-cache$$$ namespace.

More granular control

sw-toolbox also gives us the ability to granularly control caching characteristics. In addition to specifying an origin, we can also customize the cache as follows:

  • We give it a name (“products”)
  • We give it a maximum size of 12 items (using the maxEntries parameter)
  • We set the content to expire in a day (24 hours = 86400 seconds)

toolbox.router.get('/(.*)', global.toolbox.cacheFirst, {
  cache: {
    name: 'products',
    maxEntries: 12,
    maxAgeSeconds: 86400
  },
  origin: /\.products\.com$/
});

You can find tutorials on sw-precache & sw-toolbox in our Progressive Web Apps Instructor Led training material.

Offline Google Analytics

As mentioned earlier, offline Google Analytics can replay analytics requests a user made while offline once a network connection is available again. Adding it to your Service Worker takes just two lines of code:

// Import offline analytics into the SW global scope:
importScripts('path/to/offline-google-analytics-import.js');
// initialize it
goog.offlineGoogleAnalytics.initialize();

Boom. That’s it!

It’s also possible to supply an object with custom parameters that will be included with each request that is replayed:

goog.offlineGoogleAnalytics.initialize({
  parameterOverrides: {
    cd1: 'Guacamole',
    cd2: 'So much cheese'
  }
});

Note: the main use case for passing in an object of parameter overrides is distinguishing hits sent normally from hits replayed by the Service Worker.

Autotrack.js

Setting up Autotrack is relatively straightforward. In addition to including analytics.js on your page, asynchronously load the Autotrack library. Then update your default tracking code to require whichever Autotrack plugins you need:

<script>
window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', '<!-- your google_analytics_tracking_id -->', 'auto');
// Autotrack plugins available
ga('require', 'urlChangeTracker');
ga('require', 'cleanUrlTracker');
ga('require', 'eventTracker');
ga('require', 'maxScrollTracker');
ga('require', 'outboundLinkTracker');
ga('require', 'pageVisibilityTracker');

ga('send', 'pageview');
</script>
<script async src='https://www.google-analytics.com/analytics.js'></script>
<script async src='/public/js/autotrack.js'></script>

Note: Some of the Autotrack.js plugins work without specifying configuration options (e.g. outboundLinkTracker) while others don’t (e.g. cleanUrlTracker). Be sure to check the docs to see what options the various plugins take :)
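
For example, cleanUrlTracker takes options along these lines; the values shown are illustrative, so check the plugin docs for the authoritative list:

ga('require', 'cleanUrlTracker', {
  stripQuery: true,           // remove query strings from page paths
  queryDimensionIndex: 1,     // store the stripped query in a custom dimension
  indexFilename: 'index.html',
  trailingSlash: 'remove'
});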

Selenium Assistant

As mentioned, Selenium Assistant helps us get a list of the browsers available on our machines, obtain a web driver instance for each of them and then run tests.

You install the WebDriver modules for the browsers you want to test (npm install chromedriver, etc.) and can then iterate through the list of those browsers and control them as needed. The assistant also works great with Sauce Labs, should you need to test browsers that aren’t installed on your local system.
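
A rough sketch of that flow is below. The method names are as I recall them from the selenium-assistant docs, so treat them as an approximation and double-check against the current API:

const seleniumAssistant = require('selenium-assistant');

// Discover browsers installed locally (assumes the matching driver
// modules, e.g. chromedriver, have been npm-installed).
const browsers = seleniumAssistant.getLocalBrowsers();

browsers.forEach(function(browser) {
  console.log('Testing against', browser.getPrettyName());
  browser.getSeleniumDriver().then(function(driver) {
    // ... drive the browser through your test pages here ...
    return seleniumAssistant.killWebDriver(driver);
  });
});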

Firebase Cloud Messaging

After adding Firebase to an existing project, there are a few extra steps involved in adding support for Web Push notifications:

1. Add the FCM gcm_sender_id to your Web Application Manifest (manifest.json) file:

"gcm_sender_id": "103953800507"

2. Create a new firebase-messaging-service-worker.js file. We’re going to give it access to FCM by importing the Firebase Messaging libraries into this file:

importScripts('https://www.gstatic.com/firebasejs/3.6.10/firebase-app.js')
importScripts('https://www.gstatic.com/firebasejs/3.6.10/firebase-messaging.js')

Then initialize the Firebase app in the Service Worker, passing in your messagingSenderId (from your Firebase Project Settings):

firebase.initializeApp({
  'messagingSenderId': '<-- your sender ID goes here -->'
});

Next, retrieve an instance of Firebase Messaging to handle background messages:

const messaging = firebase.messaging();

and request permission to show notifications. You may want to wait until an appropriate moment to do this rather than asking as soon as the page boots up:

messaging.requestPermission()
  .then(function() {
    console.log('Notification permissions granted.');
    // ...
  })
  .catch(function(err) {
    console.log('Permission denied', err);
  });

Now when the user receives a message from FCM, a notification is displayed as long as they have granted permission.
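
To address this user from your own server you’ll also want the registration token for the current instance, which can be fetched once permission has been granted. A minimal sketch (sendTokenToServer is a hypothetical helper in your own code):

messaging.getToken()
  .then(function(currentToken) {
    if (currentToken) {
      // Send the token to your app server so it can target this user.
      sendTokenToServer(currentToken);
    } else {
      console.log('No registration token available. Request permission first.');
    }
  })
  .catch(function(err) {
    console.log('Unable to retrieve token', err);
  });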

What’s next?

We’re currently working on the next big version of our Service Worker libraries, expanding our explorations to also cover Background Sync, Service Worker-based HiDPI image switching and smarter analytics for PWAs. We look forward to sharing more as beta releases of these libraries become available.

We’re also planning a new post over on our Sustainable Loading channel about Service Workers in production on Google.com.

Until then, we hope our libraries prove useful, regardless of whether you’re building a PWA or just trying to improve performance on your site :)

With thanks to the awesome members of our team — Jeff Posnick, Matt Gaunt, Taylor Savage, Joe Medley, Prateek Bhatnagar, Lucas Mullens, Phil Walton, Alex Russell and former member Mat Scales for their contributions to our small family of open-source libraries.
