Run Lighthouse Performance Audits on every Pull Request

Joseph Khan
Frontend Weekly
Apr 10, 2020

Performance is a key aspect of modern web applications, especially if your users are on mobile devices with slow connections. Running performance audits and checks every day is not easy, and we often don’t do it. So why not automate it and add checks at the right places? That is exactly what we will do here: learn how to run Lighthouse performance audits on every pull request.

The plot

So, I am working on a GatsbyJS web application and I care a lot about performance. I want my application to score high on the Lighthouse metrics. I could keep running performance audits every now and then, but I want more control over which commit caused my application’s performance to degrade. Rather than running these tests manually, why not diagnose and run them at pull request time? We can do this using Continuous Integration (CI), and it’s a fantastic way of catching regressions before merging.

This is exactly what we will achieve in this article. With all the modern tooling out there at our disposal, it’s not too difficult actually. Let’s get started.

What is Lighthouse?

Lighthouse is an open-source, automated tool from Google for improving the quality of web pages. It has audits for performance, accessibility, progressive web apps, SEO and more.

You can run Lighthouse in Chrome DevTools, from the command line, or as a Node module. You give Lighthouse a URL to audit, it runs a series of audits against the page, and then it generates a report on how well the page did.
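If you just want a quick one-off audit, the CLI is the fastest route. A minimal example (https://example.com is a placeholder; swap in your own URL):

npx lighthouse https://example.com --only-categories=performance --output html --output-path ./report.html

In this article, though, we will drive Lighthouse programmatically so we can assert on the scores.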

Lighthouse Performance Audits with every Pull Request

We will be using the Lighthouse Node module.

Context

The application in this tutorial is a GatsbyJS web application, so the NPM scripts (to run tests) that you will see here are Gatsby specific. But what you learn from this article is not limited to Gatsby. You can apply the same technique to any app: a React SPA, a server-rendered React app, or a vanilla HTML/CSS/JS app. Anything that runs in the browser.
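For example, a non-Gatsby project only needs a production build and a local server to point Lighthouse at. A minimal sketch, assuming your build outputs to a dist folder and you are happy to use the serve package (both of these are assumptions, adjust to your own setup):

"scripts": {
  "build": "your-build-command",
  "serve": "serve -l 9000 dist"
}

The rest of the setup in this article stays the same; only these two scripts change.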

CI/CD Pipeline

For running Continuous Integration builds I am using CircleCI. CircleCI works out of the box with GitHub. My project is already on GitHub, and every time I push a commit to a pull request, the Lighthouse performance audit will run in a CI container hosted by CircleCI.

So the first thing to do is create a free account on CircleCI. Once you log in to CircleCI, it will list all your GitHub projects (including private repos). Go ahead and set up your project.

How to Run the Lighthouse Performance Test

Getting Started

Create a new branch from the master branch.

git checkout -b feature-branch

We will create a PR from this branch to master.

Next, create a new folder in the root of your project named .circleci and create a new file named config.yml inside it.

add config yaml file

Now, copy the YAML config below and paste it into the config.yml file.

version: 2.1

aliases:
  restore_cache: &restore_cache
    restore_cache:
      name: Restore node_modules cache
      keys:
        - yarn-cache-{{ checksum "yarn.lock" }}

  install_node_modules: &install_node_modules
    run:
      name: Install node modules
      command: yarn --frozen-lockfile

  persist_cache: &persist_cache
    save_cache:
      name: Save node modules cache
      key: yarn-cache-{{ checksum "yarn.lock" }}
      paths:
        - ~/.cache

jobs:
  test-lighthouse:
    docker:
      - image: circleci/node:10-browsers
    steps:
      - checkout
      - <<: *restore_cache
      - <<: *install_node_modules
      - <<: *persist_cache
      - run: yarn clean
      - run: yarn build
      - run: yarn test:lighthouse

workflows:
  version: 2
  performance-audit:
    jobs:
      - test-lighthouse

These are the build instructions for CircleCI. There are three important commands towards the end of the file.

- run: yarn clean
- run: yarn build
- run: yarn test:lighthouse

yarn clean cleans up the production bundle.

yarn build generates the production bundle again.

These are Gatsby-specific commands. Replace them with whatever commands produce the production build for the project you are working on.
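For instance, on a Create React App project the equivalent pair might look like the snippet below. rimraf and react-scripts are assumptions here, not part of this Gatsby setup:

"scripts": {
  "clean": "rimraf build",
  "build": "react-scripts build"
}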

The last of the three commands, yarn test:lighthouse, runs the Lighthouse test in the CI/CD container. Just to reiterate: we run Lighthouse audits on production builds. I will talk more about this command in the sections below.

Let’s move on to the next step and write our test code.

NPM modules we need

We need the following node modules/packages for our setup.

  • lighthouse: to run performance audits for us. This is the main tool.
  • chrome-launcher: to launch Google Chrome with ease from code.
  • jest: the testing framework we will use to run our test.
  • cli-table: to show tabular logs in the terminal.
  • start-server-and-test: starts server, waits for URL, then runs test command; when the tests end, shuts down the server.

Go ahead and install these as dev dependencies inside your project:

yarn add --dev lighthouse jest cli-table chrome-launcher start-server-and-test

Your package.json should look like this after installation,

"devDependencies": {
"chrome-launcher": "^0.13.1",
"cli-table": "^0.3.1",
"jest": "^25.3.0",
"lighthouse": "^5.6.0",
"start-server-and-test": "^1.10.11"
}

The versions of individual packages might differ when you are trying this out.

Writing the Test File

Now let’s write our tests. Create a new file named lighthouse.test.js inside the project root.

touch lighthouse.test.js

Copy the code below and paste it inside the test file.

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');
const fs = require('fs');
const Table = require('cli-table');

const flags = {onlyCategories: ['performance']}; // if you only need performance scores from lighthouse

// instantiate cli table
const table = new Table({
  head: ['Key', 'Score']
});

function printCLITable(scores) {
  Object.keys(scores).forEach((key, index) => {
    table.push([key, scores[key]]);
  });
  return table.toString();
}

function launchChromeAndRunLighthouse(url, opts = {}, config = null) {
  return chromeLauncher.launch({chromeFlags: opts.chromeFlags}).then(chrome => {
    opts.port = chrome.port;
    return lighthouse(url, opts, config).then(results => {
      // use results.lhr for the JS-consumable output
      // https://github.com/GoogleChrome/lighthouse/blob/master/types/lhr.d.ts
      // use results.report for the HTML/JSON/CSV output as a string
      // use results.artifacts for the trace/screenshots/other specific case you need (rarer)
      return chrome.kill().then(() => results);
    });
  });
}

test('Lighthouse Performance Audit', async () => {
  const { lhr, report } = await launchChromeAndRunLighthouse('http://localhost:9000'); // pass flags as the second argument if needed

  // create reports
  // fs.writeFileSync('./report.html', report);

  // lhr.categories is an object
  const scores = {};
  const categories = lhr.categories;
  for (let key in categories) {
    scores[key] = categories[key].score;
  }
  // console.log(scores); // eg. {performance: 0.98, seo: 0.97, accessibility: 0.99..}
  console.log(printCLITable(scores));

  expect(scores.performance).toBeGreaterThanOrEqual(0.95); // 95%
  // expect(scores.accessibility).toBe(1)
  // expect(scores['best-practices']).toBeGreaterThanOrEqual(0.93)
  // expect(scores.seo).toBe(1)
}, 30000);

We will use Jest to run this file.

Let me explain the important bits. We import the lighthouse and chrome-launcher node modules. We start our Lighthouse test by launching a Chrome instance and passing the URL we want to test to Lighthouse. The scores come back inside the lhr.categories property. We loop through it and read the scores for performance, SEO, accessibility and the other categories. We print the scores in a tabular log and finally run our Jest assertion. Lighthouse scores are reported on a scale from 0 to 1, where 1 represents a perfect score of 100. So in the example above I have been a little soft on the benchmark; you can adjust it to match your own performance standards.
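To make the loop above concrete, lhr.categories is an object keyed by category id, and each entry carries a score between 0 and 1. It looks roughly like the sketch below (the values are illustrative, and fields such as auditRefs are omitted):

{
  performance: { id: 'performance', title: 'Performance', score: 0.98 },
  accessibility: { id: 'accessibility', title: 'Accessibility', score: 0.96 },
  'best-practices': { id: 'best-practices', title: 'Best Practices', score: 0.93 },
  seo: { id: 'seo', title: 'SEO', score: 1 }
}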

You can read more in the official Lighthouse GitHub repo.

http://localhost:9000 is the URL where Gatsby serves a production build. Replace it with your own URL as needed.
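If your server runs on a different port, keep the test file and the NPM script (shown below) in sync. With Gatsby, for example, you could serve on another port like this, in which case the test URL becomes http://localhost:8000:

gatsby serve -p 8000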

Now you may ask: who starts a server inside my CI/CD container? The start-server-and-test script we installed takes care of that, and CircleCI runs the whole build inside a Docker container that has everything the test needs, including a browser. And that’s exactly where we are going next.

The test report log will look like this,

lighthouse test report log

Adding necessary NPM scripts

Alright, on to the last part now. Add these NPM scripts to your package.json file.

"scripts": {
"test": "jest",
"test:lighthouse": "start-server-and-test serve http://localhost:9000 test"
},

test:lighthouse is the command that will run our Lighthouse performance audit. If you remember, we added it to the config.yml file. This command has a few important parts, so let me break it down.

  • start-server-and-test: the NPM module we installed earlier. It starts a server, waits for the URL, runs the test command and then exits.
  • serve: the gatsby serve command, which is part of the scripts in package.json. It serves the production bundle at localhost:9000.
  • http://localhost:9000: the URL to test, i.e. our production app.
  • test: runs the test command from scripts, which invokes Jest to run the test written in the lighthouse.test.js file.
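One thing to keep in mind: the test script above runs Jest against the whole project, so any other unit tests you have will also run during the performance audit. If you want to scope it down to just this file, one option (a sketch, assuming the test file sits in the project root) is to pass the file name to Jest inside a quoted test command:

"scripts": {
  "test": "jest",
  "test:lighthouse": "start-server-and-test serve http://localhost:9000 'jest lighthouse.test.js'"
}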

The test and test:lighthouse scripts come in addition to the scripts Gatsby already provides by default. Example below.

"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write \"**/*.{js,jsx,json,md}\"",
"start": "npm run develop",
"serve": "gatsby serve",
"clean": "gatsby clean"
},

Now let’s run our test. Finally 🙂

Run the Lighthouse performance audits

You can run the test on your computer first, just to try things out. Run the command below in a terminal from your project root.

yarn test:lighthouse

It should launch Google Chrome, run the tests and generate a report.

Now, commit and push your changes to GitHub and create a pull request to master.

git commit -m "my lighthouse test"

CircleCI will automatically trigger a build. Meanwhile, your pull request will show the status of the test.

lighthouse test in progress github pr

Once the test runs and passes, you will see the test logs in the CircleCI dashboard.

lighthouse test in progress circleci pr
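If you also want to keep the full report for every build, you can uncomment the fs.writeFileSync line in the test (note that results.report is JSON by default; pass output: 'html' in the Lighthouse options for an HTML report) and upload the file as a build artifact. A sketch of the extra CircleCI step, assuming the report lands in the project root:

    steps:
      # ...existing steps...
      - run: yarn test:lighthouse
      - store_artifacts:
          path: ./report.html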

Your GitHub pull request will also update accordingly.

lighthouse test in progress github pass

Let’s fail the performance audit

To check that our assertions actually catch regressions, I intentionally added some large JS and CSS libraries to my index.js page and degraded the performance of my Gatsby app.

import _ from 'lodash'; 
import 'bulma/css/bulma.css';

I also increased my performance benchmark.

expect(scores.performance).toBe(1);

Commit the changes to GitHub. CircleCI triggers the build again, and Lighthouse comes back with failing results.

lighthouse test in progress circleci fail

The GitHub PR also shows the failing status.

lighthouse test in progress github fail

Conclusion

This is a fantastic way to detect performance regressions and find out exactly which commit or PR caused your app’s performance to degrade.

I hope you find this useful and give it a try.

Cheers!

If you enjoyed this post, you can do two things:

  1. Give me a shoutout and say a Hi on my Twitter handle below.
  2. Check out my Blog or Subscribe to my newsletters for similar articles.

Originally published at https://josephkhan.me on April 10, 2020.
