CD Pipelines Part 2: How to Add Performance Testing and Utility Scripts to Your Deployment

Karen White
BigCommerce Developer Blog
7 min read · Sep 13, 2019

In a previous blog post, we discussed how continuous integration can help you work smarter as a developer and maintain quality as your code base scales. In that tutorial, we set up a simple CD pipeline to deploy theme files from a Bitbucket repository to a BigCommerce store.

That’s a good start, but in this post, we’ll take it one step further. Continuous delivery pipelines are about more than just getting your code from point A to point B — you can also run tasks along the way. In this part 2 blog post, we’ll add new capabilities to the build process: we’ll add performance testing to make sure that our latest commits haven’t introduced any performance regressions and we’ll also run a cleanup script to make sure we have room to push the new theme to the store.

If you’re landing on this series for the first time, I recommend circling back and reading part 1. But if you’ve mastered the basics and you’re ready to get more out of your CD pipeline, let’s get started!

Performance Testing

Google Lighthouse is a tool for measuring website performance and how well a website conforms to accessibility standards. Lighthouse is considered an industry standard for performance testing, and it’s a great idea to run frequent tests against the sites you maintain. You can run Lighthouse audits directly from Chrome Developer Tools, but you can also install Lighthouse as a Node module and run it from the command line. We’ll use the latter method to add Lighthouse testing to our CD pipeline. By running a test every time we build the theme and push it to the staging environment, we can be sure that our most recent changes haven’t had a negative effect on performance.

To do this, we’ll add a second step to the pipeline we built in part 1 of this series. In that example, we installed Stencil CLI and used it to build, push, and apply a theme to a staging store when a commit is merged. Now, we’ll run a Lighthouse audit from the command line using headless Chrome and publish the report as an artifact.

To start, you should have something like this in your bitbucket-pipelines.yml file:

image: karenwhite/stencilpipeline
pipelines:
  default:
    - step:
        name: Deploy to staging
        script:
          - npm install
          - stencil push -a Light

Here, we’re referencing a Docker image that already has Stencil CLI installed. For a refresher on building a custom Docker image that’s pre-loaded with Stencil CLI and publishing it to a public registry, see part 1. From there, the script installs theme dependencies and then runs the stencil push command to build, upload, and apply the theme to the storefront.

Recall that each step of the build pipeline generates its own container instance. We can even specify different images for different steps. Our next step requires the Lighthouse npm package and also headless Chrome. We can save a bit of time by referencing an image that already includes these dependencies. In this example, we’ll use the justinribeiro/lighthouse image.

Next, we’ll run the lighthouse command in headless mode and point to the URL for our staging store (the same store that the pipeline deploys the theme to). We’ll also specify that we want to output a report in JSON and save it as reports-{build_number}.json. Lastly, we’ll save the report as an artifact so we can download it. The term artifact refers to a file that’s output from a pipeline process, for example, a build file or a test report. Artifacts are saved to the Artifacts tab in the pipeline dashboard, but you can also reference artifacts from previous build steps in subsequent ones. In this case, we’ll simply save it.

Here’s the YAML for the second build step:

    - step:
        name: Run performance testing
        image: justinribeiro/lighthouse
        script:
          - lighthouse --chrome-flags="--headless --disable-gpu" https://mysandbox.mybigcommerce.com/ --output json --output-path ./reports-${BITBUCKET_BUILD_NUMBER}.json
        artifacts:
          - reports-*.json

Make a commit to your theme repository to run the pipeline. When the build completes, click the Artifacts tab on the right-hand side of the Pipelines dashboard and click the download icon. Expand the compressed file — you should have a reports-{build number}.json file tagged with the build number. To view the report, open the Lighthouse Viewer in Chrome and drag your file onto the screen.
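If you’d rather check results without leaving the terminal, you can pull the category scores straight out of the JSON report with a few lines of Node. This is a sketch, not part of the pipeline above; the trimmed-down sampleReport object stands in for a real report file, and it assumes the report’s categories section, where Lighthouse stores each category’s score as a fraction from 0 to 1:

```javascript
// Extract category scores from a Lighthouse JSON report.
// Lighthouse stores scores as fractions from 0 to 1; multiply by 100
// to get the familiar 0-100 scale shown in the Lighthouse Viewer.
function extractScores(report) {
  const scores = {};
  for (const [id, category] of Object.entries(report.categories)) {
    scores[id] = Math.round(category.score * 100);
  }
  return scores;
}

// A trimmed-down stand-in for a real report's "categories" section:
const sampleReport = {
  categories: {
    performance: { title: 'Performance', score: 0.92 },
    accessibility: { title: 'Accessibility', score: 0.88 }
  }
};

console.log(extractScores(sampleReport)); // { performance: 92, accessibility: 88 }
```

With a real report, you’d replace sampleReport with the parsed contents of the reports-{build number}.json file you downloaded.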

By keeping track of Lighthouse reports with each build, we can be aware of the performance impact of each incremental change, making it easier to stay within a performance budget or roll back problematic updates.

Cleanup Script

Every time we push a theme, we create a new custom theme in our BigCommerce store, and there’s a limit of 20 total custom themes. To make sure we have room to deploy, we should check whether we’re already at that limit and delete a theme if we are. We can add a new step to our build pipeline that runs a cleanup script to take care of this automatically.

First, let’s write the script. This script was adapted from an example shared by scottfwebdev via GitHub. Thanks Scott!

We’ll use Node to call the BigCommerce Themes API and make a delete request if there are already 20 custom themes. A custom theme — also known as a private theme — is a theme that’s been copied from the base version, usually so it can be customized. Each theme that we push through our pipeline is considered a custom theme. A base theme is an original copy, either of a free theme like Cornerstone or a paid theme purchased through the Theme Marketplace. We’ll get a 403 Forbidden error if we try to delete a base theme through the API, so before attempting any deletion, we’ll first filter the data down to only the custom themes. We can also assume that one of the themes in the response is currently active on the store. We’ll want to filter out the active theme too, because the API will return an error if we try to delete it.
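To make the selection logic concrete, here it is on its own, run against a hypothetical Themes API response (the theme names and uuid values are invented for illustration; the is_private, is_active, and uuid fields match what the script below relies on):

```javascript
// A hypothetical excerpt of a V3 Themes API response.
const themes = [
  { uuid: 'aaa-111', name: 'Cornerstone', is_private: false, is_active: false }, // base theme
  { uuid: 'bbb-222', name: 'Custom v1', is_private: true, is_active: false },
  { uuid: 'ccc-333', name: 'Custom v2', is_private: true, is_active: true } // live on the storefront
];

// Keep only themes that are safe to delete: private (not a base theme)
// and not currently applied to the store.
const deletable = themes.filter(theme => theme.is_private === true && theme.is_active === false);

console.log(deletable.map(theme => theme.uuid)); // [ 'bbb-222' ]
```

The base theme and the active theme both drop out, leaving only themes the API will let us delete.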

Let’s take a look at the full code and step through what’s happening:

App.js

const BigCommerce = require('node-bigcommerce');

const bigCommerce = new BigCommerce({
  clientId: 'Your BigCommerce Client ID',
  accessToken: 'Your BigCommerce API Token',
  storeHash: 'Your BigCommerce Store Hash',
  responseType: 'json',
  apiVersion: 'v3'
});

bigCommerce.get('/themes')
  .then(data => {
    let themes = data.data;
    // Keep only custom themes that aren't live on the storefront
    let privateThemes = themes.filter(theme => theme.is_private === true && theme.is_active === false);
    if (privateThemes.length > 18) {
      console.log('There are 20 custom themes');
      let themeToDelete = privateThemes[0].uuid;
      bigCommerce.delete('/themes/' + themeToDelete)
        .then(() => {
          console.log(`Theme UUID ${themeToDelete} has been deleted`);
        })
        .catch(err => {
          console.error(err);
        });
    }
  })
  .catch(err => {
    console.error(err);
  });

At the top of the file, we’re including the Node BigCommerce client as a dependency and instantiating the client with our BigCommerce API credentials. You can use the same API token and Client ID that you used in your .stencil file. Please note though — and this is super important — make sure that your Bitbucket repo is private. Exposing API credentials in a public repo would be very insecure.
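Even in a private repo, a safer habit is to read credentials from environment variables, which you can define as secured repository variables in your Bitbucket Pipelines settings so they’re injected into the build without ever living in source control. Here’s a sketch of that approach — the variable names BC_CLIENT_ID, BC_ACCESS_TOKEN, and BC_STORE_HASH are my own invention, so use whatever names you configure:

```javascript
// Pull API credentials from the environment instead of hardcoding them.
// In Bitbucket Pipelines, define these as secured repository variables
// so they never appear in your source or your build logs.
const config = {
  clientId: process.env.BC_CLIENT_ID,
  accessToken: process.env.BC_ACCESS_TOKEN,
  storeHash: process.env.BC_STORE_HASH,
  responseType: 'json',
  apiVersion: 'v3'
};

// Fail loudly if anything is missing, rather than making doomed API calls.
for (const key of ['clientId', 'accessToken', 'storeHash']) {
  if (!config[key]) {
    console.error(`Missing credential: ${key}`);
  }
}
```

You’d then pass this config object to the BigCommerce constructor in place of the hardcoded literals.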

Next, we’re calling the BigCommerce API and retrieving the array of themes currently uploaded to the store. We want to narrow that down to only custom themes that aren’t currently applied to the storefront, so we’re filtering the array to create a new array of theme objects where is_private is true and is_active is false.

We check the length of the custom theme array to see if there are at least 19 themes (we deduct one from the 20-theme limit to account for the active theme, which we filtered out), and if so, we delete the first item in the custom theme array.
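The arithmetic is easier to see as a standalone check. With a 20-theme limit, and the active theme excluded from the filtered array, the store is effectively full once 19 private, inactive themes remain. This helper is my own illustration of that threshold, not part of the script above:

```javascript
const THEME_LIMIT = 20;

// privateInactiveCount is the length of the filtered array: custom themes
// that aren't live on the storefront. The active theme isn't in that count,
// so the store hits its limit one theme earlier than THEME_LIMIT.
function storeIsFull(privateInactiveCount) {
  return privateInactiveCount >= THEME_LIMIT - 1;
}

console.log(storeIsFull(18)); // false — still room to push another theme
console.log(storeIsFull(19)); // true — delete a theme before pushing
```

This matches the script’s privateThemes.length > 18 condition: deletion only kicks in when a new push would otherwise exceed the limit.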

Now, let’s add the script to our CD pipeline. Make a new folder in the root directory of your Bitbucket theme repository, and name it utility_scripts. Create two files inside of it: app.js and package.json.

Package.json

{
  "name": "theme-script",
  "version": "1.0.0",
  "description": "",
  "main": "app.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "node-bigcommerce": "^4.0.1"
  }
}

Then, add a new first step to your bitbucket-pipelines.yml file:

    - step:
        name: Run cleanup task
        script:
          - cd utility_scripts && npm install && node app.js

Here, we’re navigating into the utility_scripts directory, installing script dependencies, and then running the script. When you run the step, you should see a message in your pipeline console if there are 20 themes, letting you know a theme has been deleted.

Finished Build Script

Here’s the full build script from our bitbucket-pipelines.yml file:

image: karenwhite/stencilpipeline
pipelines:
  default:
    - step:
        name: Run cleanup task
        script:
          - cd utility_scripts && npm install && node app.js
    - step:
        name: Deploy to staging
        script:
          - npm install
          - stencil release
          - stencil push -a Light
    - step:
        name: Run performance test
        image: justinribeiro/lighthouse
        script:
          - lighthouse --chrome-flags="--headless --disable-gpu" https://karensandbox.mybigcommerce.com/ --output json --output-path ./reports-${BITBUCKET_BUILD_NUMBER}.json
        artifacts:
          - reports-*.json

Conclusion

Continuous delivery pipelines take the mental weight off deploying changes to your staging or production environments, and they can also automate useful tasks along the way. Adding utilities like tests or scripts can help you streamline your workflow and make regression testing a routine part of your development process.

Have you set up a continuous delivery pipeline for your theme deployments? We’d love to hear how you’ve optimized your workflow. Comment below to let us know how you configure your CD pipeline or what you think about automation, or tweet us @BigCommerceDevs to continue the conversation!
