Speed up your Jest tests with shards
Jest 28 introduces a new `--shard` flag that allows a test suite to be split into arbitrary chunks. Let's take a look at how we can leverage this in an existing CI workflow; I'll be using GitHub Actions.
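As a quick illustration, the flag takes the form `--shard=<shardIndex>/<shardCount>` (the 1/4 here is just an example value):

```shell
# Run the first of four equal chunks of the test suite
npx jest --shard=1/4
```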
Matrix build with shards
Let's say our `yarn test` script simply executes `jest`, and our workflow is structured like so:
```yaml
name: Shard Demo
on: [push]
jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: "16.13.1"
      - run: yarn install
      - run: yarn test
```
To leverage shards we could update our workflow like so:
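The original snippet didn't survive in this copy; a minimal sketch, assuming four shards and the same setup steps as before, might look like this:

```yaml
name: Shard Demo
on: [push]
jobs:
  run-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        shard: [1, 2, 3, 4]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: "16.13.1"
      - run: yarn install
      # yarn forwards extra arguments to the underlying jest command;
      # --shard=n/4 runs the nth of four equal chunks of the suite
      - run: yarn test --shard=${{ matrix.shard }}/4
```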
Here, we've split our test suite into four chunks and have them all running in parallel. If, like me, you have some lengthy test suites, this should drastically reduce runtime.
Reporting coverage
More than likely you have some sort of coverage report being generated. With shards, each job only produces coverage for its own chunk, so you'll need a way of combining the reports into a single file and sending that to whatever service you're using.
First, update your test script to output test coverage:
```json
"test": "jest --coverage"
```
Next, to aggregate the coverage reports we can write to and read from artifacts. Each shard can upload its coverage file as an artifact, and a final step can download the files, merge them, and upload the result.
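The referenced snippet is missing from this copy; a sketch of what those steps could look like (the artifact name, action versions, and the use of `nyc merge` to combine Istanbul-format reports are my assumptions) is:

```yaml
jobs:
  run-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        shard: [1, 2, 3, 4]
    steps:
      # ... checkout, setup-node, and yarn install as before ...
      - run: yarn test --shard=${{ matrix.shard }}/4
      # Rename the report so shards don't clobber each other
      - run: mv coverage/coverage-final.json coverage/${{ matrix.shard }}.json
      - uses: actions/upload-artifact@v3
        with:
          name: coverage-artifacts
          path: coverage/${{ matrix.shard }}.json

  report-coverage:
    runs-on: ubuntu-latest
    needs: [run-tests]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/download-artifact@v3
        with:
          name: coverage-artifacts
          path: coverage
      # nyc can merge a directory of Istanbul JSON reports into one file
      - run: npx nyc merge coverage merged-output/merged-coverage.json
```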
Let’s walk through the important steps from the snippet above:
- Jest's `--coverage` flag will output a coverage report at `coverage/coverage-final.json`
- Each job in our build matrix renames the coverage file to `${{ matrix.shard }}.json` and uploads it as an artifact at `coverage/${{ matrix.shard }}.json`
- We've added a final job which downloads the coverage reports and merges them into a single report in a `merged-output` directory. After this, it's up to you what you do with your final coverage report; a typical next step would be to upload it to a service like Code Climate, for example.
Excited for Jest v28 yet? I sure am. While working on my latest project, Auger, I was able to reduce test execution time from 14 minutes to ~3 minutes. The icing on the cake would be if Jest could resolve its memory management issues with the latest versions of Node 😏