Measure the performance of a web application with Lighthouse CI in a GitLab CI pipeline
--
Before deploying code to production, it is a best practice to run all the required tests: unit, integration, and E2E. But why not also test the performance and accessibility of your web application?
In TUI Musement, every team implements unit and integration tests, and we have a QA team dedicated to the automation of E2E tests, which run in a pre-production environment. So we decided to add a quality check to maintain an excellent level of performance, accessibility, SEO, and best practices across our products.
This guide will show you how to test the performance, accessibility, SEO, best practices, and PWA rules of your web application. In order to do this, we will add Lighthouse CI to a GitLab CI pipeline. Lighthouse CI uses Lighthouse, the tool integrated into Chrome-based browsers.
We will run the Lighthouse job on Merge Request creation and update and on commits to the master branch, but you can easily change this according to your needs.
✅ You can find the complete repository of this guide here: a sample Nuxt app (but you can use any framework you want) with Lighthouse CI and GitLab CI integrated.
Install Lighthouse CI
The first step is installing Lighthouse CI in your web app. You can install it globally or locally as a dev dependency. I prefer the second way:
npm install --save-dev @lhci/cli@0.7.2
# or
yarn add --dev @lhci/cli@0.7.2
⚠️ I used version 0.7.2 of Lighthouse CI, so I am sure that this version works for all parts described below. But you can try the latest 0.x version, and it should also work.
Configure Lighthouse CI
The second step is configuring Lighthouse CI. To do this, you need to create a new file called lighthouserc.js in the root folder of your project.
👉 You can also configure Lighthouse CI using YAML or JSON, as described here.
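As a reference, the overall structure of lighthouserc.js is an object with a ci key containing the three sections we will configure below:

// lighthouserc.js — skeleton of the configuration file
module.exports = {
  ci: {
    collect: {
      // how Lighthouse runs the tests and collects the data
    },
    assert: {
      // which assertions are checked and how failures are handled
    },
    upload: {
      // where the reports are stored
    },
  },
};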
collect
The first section of the configuration file covers the collect step: how Lighthouse runs the tests and collects the data. You can find detailed documentation here.
The property numberOfRuns indicates the number of runs that Lighthouse CI makes for each URL. Every run is independent of the others. The Lighthouse team suggests running it multiple times to reduce the variability of the results: a good value can be 5.
The property startServerCommand is the command used to start the server before Lighthouse runs. In my repository I used Nuxt and needed to start the Nuxt server first, so I used npm start, which runs nuxt start.
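A sketch of how the collect section could start, with the values described above:

// lighthouserc.js (collect section) — a sketch
module.exports = {
  ci: {
    collect: {
      numberOfRuns: 5,                 // each URL is tested 5 times
      startServerCommand: 'npm start', // starts the Nuxt server (nuxt start) before the runs
    },
  },
};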
The property url is the list of the URLs Lighthouse tests; each URL will be tested numberOfRuns times.
If you don’t specify the property onlyCategories, Lighthouse CI tests all categories (Performance, Accessibility, Best Practices, SEO, and Progressive Web App) of your web app; otherwise, it tests only the categories you list. In this case, we test all categories except the PWA audits, because our web app will not be a Progressive Web Application.
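Continuing the sketch, url and onlyCategories (which lives under settings) could look like this; the two URLs are the pages tested later in this guide:

// lighthouserc.js (collect section, continued) — a sketch
module.exports = {
  ci: {
    collect: {
      // ...numberOfRuns and startServerCommand as above
      url: ['http://localhost:3000/', 'http://localhost:3000/detail/1'],
      settings: {
        // all categories except the PWA audits
        onlyCategories: ['performance', 'accessibility', 'best-practices', 'seo'],
      },
    },
  },
};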
You can use the property chromeFlags to pass flags to the Chromium browser Lighthouse uses: for example, passing "--no-sandbox" to Chrome. You can set extra headers or cookies using extraHeaders: it can be one of the options to run Lighthouse on authenticated pages.
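A sketch of the settings block with chromeFlags and extraHeaders added; the cookie value is purely illustrative, and the exact shape accepted by extraHeaders may depend on the Lighthouse CI version:

// lighthouserc.js (collect settings) — a sketch
module.exports = {
  ci: {
    collect: {
      // ...numberOfRuns, startServerCommand, and url as above
      settings: {
        onlyCategories: ['performance', 'accessibility', 'best-practices', 'seo'],
        chromeFlags: '--no-sandbox', // flag passed to the Chromium browser
        // illustrative cookie, to reach pages behind authentication
        extraHeaders: JSON.stringify({ Cookie: 'token=<your-token>' }),
      },
    },
  },
};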
assert
The section for the assert step describes which assertions Lighthouse checks and how it handles failures. You can find detailed documentation here and here.
The configuration of the assertions is very important because a failure can stop your pipeline: consequently, you cannot merge your branch or deploy your code to production if one or more assertions are not met.
If you don’t need a custom configuration, you can use a preset (e.g., lighthouse:recommended); otherwise, you can add custom assertions starting from a preset.
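Here is a sketch of an assert section matching the thresholds described below (lighthouse:no-pwa is one of the presets shipped with Lighthouse CI; the exact preset and values used in the sample repository may differ):

// lighthouserc.js (assert section) — a sketch
module.exports = {
  ci: {
    assert: {
      preset: 'lighthouse:no-pwa', // start from a preset, then override single assertions
      assertions: {
        // performance: score >= 90 in the best run
        'categories:performance': ['error', { minScore: 0.9, aggregationMethod: 'optimistic' }],
        // accessibility, best practices, SEO: score = 100 in all runs
        'categories:accessibility': ['error', { minScore: 1, aggregationMethod: 'pessimistic' }],
        'categories:best-practices': ['error', { minScore: 1, aggregationMethod: 'pessimistic' }],
        'categories:seo': ['error', { minScore: 1, aggregationMethod: 'pessimistic' }],
      },
    },
  },
};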
You can set an assertion on:
- single audits (e.g., first-contentful-paint);
- entire categories (e.g., categories:performance).
For each assertion, you need to set:
- a level among off (the audit is not checked), warn (the audit is checked but a failure only generates a warning, and the pipeline is not stopped), and error (the audit is checked and a failure returns a non-zero exit code, stopping the pipeline), as described here;
- a minScore under which the assertion fails;
- an aggregationMethod among median, optimistic, pessimistic, and medianRun, as described here.
In this case:
- the performance score must be greater than or equal to 90 in the best run
- the accessibility score must be equal to 100 in all runs
- the best practices score must be equal to 100 in all runs
- the SEO score must be equal to 100 in all runs
👉 You can find:
- a description of how assertions work here;
- a list of performance, accessibility, SEO, best practices, PWA audits;
- and a practical list of audits and categories.
upload
The last section, upload, indicates where Lighthouse saves the report data. You can find detailed documentation here.
You can choose to save them:
- on a temporary store with target=temporary-public-storage. You can access the reports by clicking on the links at the end of the test.
PROS: easiest and fastest way;
CONS: it’s public and temporary; no history;
- on a Lighthouse CI server with target=lhci.
PROS: historical archive of all tests;
CONS: time/resources needed to create a server, as described here and here;
- on the filesystem with target=filesystem.
PROS: reports are saved (and can be downloaded) as artifacts in the CI;
CONS: access to the reports is less immediate than with the other methods; no history.
We proceed with the easiest way, for now, using a temporary store.
⚠️ The temporary store is public, so take care not to share sensitive data.
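With the temporary store, the upload section is a one-liner (a sketch):

// lighthouserc.js (upload section) — a sketch
module.exports = {
  ci: {
    upload: {
      target: 'temporary-public-storage', // reports are uploaded to a public, temporary store
    },
  },
};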
Run Lighthouse locally
At this point, you have configured Lighthouse CI and you can run it locally. You only need to add a new NPM script to your package.json.
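A sketch of the relevant scripts section, assuming lhci autorun behind the new script (autorun runs the collect, assert, and upload steps in sequence, as the output below suggests); build and start are the standard Nuxt commands:

{
  "scripts": {
    "build": "nuxt build",
    "start": "nuxt start",
    "lighthouse": "lhci autorun"
  }
}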
Now you need to create your production bundle and run Lighthouse:
npm run build
npm run lighthouse
# or with Yarn:
yarn build
yarn lighthouse
It prints something like this:
✅ .lighthouseci/ directory writable
✅ Configuration file found
✅ Chrome installation found
Healthcheck passed!

Started a web server with "npm start"...
Running Lighthouse 5 time(s) on http://localhost:3000/
Run #1...done.
Run #2...done.
Run #3...done.
Run #4...done.
Run #5...done.
Running Lighthouse 5 time(s) on http://localhost:3000/detail/1
Run #1...done.
Run #2...done.
Run #3...done.
Run #4...done.
Run #5...done.
Done running Lighthouse!

Checking assertions against 2 URL(s), 10 total run(s)

All results processed!

Uploading median LHR of http://localhost:3000/...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1621867616820-1661.report.html
Uploading median LHR of http://localhost:3000/detail/1...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1621867617506-21870.report.html
No GitHub repository slug found, skipping URL map upload.
No GitHub token set, skipping GitHub status check.
Done running autorun.
As you can see:
- Lighthouse ran npm start, using the configuration property startServerCommand;
- when the server was ready, it ran the tests;
- in this case, all assertions passed; otherwise, it would have returned a non-zero exit code;
- it uploaded the reports to the temporary public store https://storage.googleapis.com.
Run Lighthouse on GitLab CI
It’s time to run Lighthouse on GitLab CI, by creating a .gitlab-ci.yml file in the root folder.
We used cypress/browsers:nodeXX.XX.X-chromeXX-XX as the Docker image, as suggested in the official documentation of Lighthouse CI.
We defined only one stage, called test, and it contains a job called lighthouse:
- with allow_failure: false, if it fails then the pipeline fails;
- using the property script, it installs all dependencies (npm install), creates the build (npm run build), and runs Lighthouse (npm run lighthouse).
We defined a workflow, as suggested here, to control how the GitLab CI pipeline works:
- it runs a merge request pipeline when a merge request is created/updated;
- it runs a branch pipeline when a commit is pushed on any branch and there is no open merge request on that branch.
In this way, using the rules property, the job lighthouse runs on Merge Request creation/update or on commits to the master branch.
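Putting the pieces together, the .gitlab-ci.yml could look like this (a sketch; the Docker image tag is left as a placeholder to pin to concrete Node/Chrome versions, and the actual file in the repository may differ slightly):

# .gitlab-ci.yml — a sketch of the pipeline described above
image: cypress/browsers:nodeXX.XX.X-chromeXX-XX  # pin a concrete Node/Chrome tag here

stages:
  - test

workflow:
  rules:
    # run a merge request pipeline when a merge request is created/updated
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    # avoid duplicate pipelines: skip branch pipelines when an MR is already open
    - if: '$CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS'
      when: never
    # run a branch pipeline for commits pushed to a branch without an open MR
    - if: '$CI_COMMIT_BRANCH'

lighthouse:
  stage: test
  allow_failure: false
  script:
    - npm install
    - npm run build
    - npm run lighthouse
  rules:
    # run on merge request pipelines and on commits to master
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "master"'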
Create a Merge Request
It’s time to test everything by creating a Merge Request.
We have a Nuxt application with 2 pages that fetch a todos API. I created a Merge Request from a branch where I added a fake timeout of 5 seconds before the fetch.
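For illustration, the change on the branch could look something like this (a hypothetical sketch: the function name and the API endpoint are illustrative, not the exact code of the repository):

// hypothetical sketch of the change on the feature branch
async function loadTodos() {
  // fake 5-second timeout before fetching the todos, to simulate a slow page
  await new Promise((resolve) => setTimeout(resolve, 5000));
  const response = await fetch('/api/todos'); // illustrative endpoint
  return response.json();
}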
In this way, Lighthouse will see a blank page for 5 seconds, and we expect the performance score to decrease.
As we configured GitLab CI, the Lighthouse job ran on our Merge Request pipeline… and it failed:
We can find out why it failed by opening the job details:
$ npm run lighthouse
...
Checking assertions against 2 URL(s), 10 total run(s)

1 result(s) for http://localhost:3000/ :

✘ categories.performance failure for minScore assertion
  expected: >=0.9
  found: 0.83
  all values: 0.82, 0.83, 0.83, 0.81, 0.83

1 result(s) for http://localhost:3000/detail/1 :

✘ categories.performance failure for minScore assertion
  expected: >=0.9
  found: 0.86
  all values: 0.85, 0.85, 0.86, 0.85, 0.86

Assertion failed. Exiting with status code 1.
Uploading median LHR of http://localhost:3000/...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1622222737726-80625.report.html
Uploading median LHR of http://localhost:3000/detail/1...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1622222738383-49315.report.html
...
ERROR: Job failed: exit code 1
As you can see, 2 assertions failed: the performance score (0.83 and 0.86) is below the minimum score (0.9) for both pages. The output also attaches the link to the Lighthouse report for each analysed URL.
The job exited with an error code, so the pipeline cannot continue: we cannot deploy our code to production until we have resolved all the failed assertions.
Conclusion
Using Lighthouse CI, we can maintain a good level of quality in our product in terms of performance, accessibility, best practices, SEO, and PWA rules. To do this, we need to integrate Lighthouse CI into our pipeline, so that we avoid deploying a new version of our web application if it doesn’t respect all the quality rules.
Useful links
- the complete repository of this implementation
- the official guide to lighthouse-ci
- keyword reference for gitlab-ci.yml
- a useful guide on GitLab Performance with Lighthouse
Thanks to Michał Czmiel and Żaneta Górska for helping me implement Lighthouse CI on our product, and to Aurélien Lair, przemkow, Fabio Di Peri, and Pasquale Mangialavori for reviewing this article. Without you, writing and publishing this article would not have been possible. 🙏