Ubuntu Design case study: visual regression testing on Vanilla Framework

Mike Fotinakis · Published in Percy Blog · 3 min read · Mar 10, 2017

Originally posted on the Ubuntu Design blog, written by Barry McGee:

One of the most complex aspects of managing continuous development on a large codebase is ensuring that it remains stable.

This problem is particularly acute when building out front-end architecture using HTML & CSS, due to the inherently global nature of CSS.

How many times have you shipped a CSS change to one small part of a website only to find you’ve inadvertently broken a page element on a different page entirely?

This problem usually arises because all of your CSS is loaded via one external file that is added to every page of your website. If you don't namespace or isolate your styles correctly, changes to your CSS may have unintended consequences.

Structuring your CSS using the BEM (Block, Element, Modifier) convention or something similar (for example, scoping a card heading as .card__title rather than a bare .title) can help prevent such clashes. However, in a fast-moving team where multiple developers are working on a large codebase daily, relying on code convention alone is often not enough to stop visual regression bugs from creeping in.

Ideally, you or a team member should check each page of your site in turn to make sure nothing has broken, right? While that's a solid QA approach, it doesn't scale very well. As your site grows, checking every page becomes far too time-consuming, especially when you consider that you may also need to check each page at multiple breakpoints.

That’s where automated Visual Regression Testing (VRT) tools can seriously lighten your workload. A VRT tool will typically run through your site and capture a baseline snapshot of all your pages to use as a benchmark.

Once you've made some changes, you run the process again; the VRT tool compares the latest capture of your pages with the baseline and highlights the differences. It's at this stage that you'll be alerted to any unintended consequences.
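Conceptually, the comparison step works something like the sketch below. This is not Percy's implementation; it's a minimal illustration using the open-source pixelmatch and pngjs libraries, with hypothetical file names for the baseline and latest captures of a single page.

```ts
// Minimal sketch of a visual diff between two captures of the same page.
// Assumes "baseline.png" and "latest.png" are same-sized screenshots
// (hypothetical file names) and that pixelmatch + pngjs are installed.
import * as fs from "fs";
import { PNG } from "pngjs";
import pixelmatch from "pixelmatch";

const baseline = PNG.sync.read(fs.readFileSync("baseline.png"));
const latest = PNG.sync.read(fs.readFileSync("latest.png"));

const { width, height } = baseline;
const diff = new PNG({ width, height });

// Count differing pixels and write an image with the differences highlighted.
const changedPixels = pixelmatch(
  baseline.data,
  latest.data,
  diff.data,
  width,
  height,
  { threshold: 0.1 } // tolerate minor anti-aliasing noise
);

fs.writeFileSync("diff.png", PNG.sync.write(diff));
console.log(`${changedPixels} pixels changed`);
```

Repeating that for every page, at every breakpoint, on every change is exactly the kind of grunt work you want a tool to take off your hands.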

The concept of VRT has been around for a few years, but until now most solutions have meant setting up the process locally, typically with quite a few moving parts. Whenever we tried to get a project team to integrate VRT into their workflow using one of these solutions, we ran into trouble because it was so difficult to keep individual developer setups consistent; inevitably, I'd spend longer debugging the VRT setup than reviewing visual diffs.

I then stumbled upon Percy.io, which offers VRT software as a service. I was immediately interested in how we might utilise it for Vanilla Framework, our constantly evolving CSS framework.

I immediately signed up for a trial and was quickly impressed with the GitHub integration and ease of use. Percy is unobtrusive: it only comes into play when a feature progresses to the pull request stage. It runs as part of the Travis CI build and reports back if it has found any visual diffs for review. You can also configure Percy to test across defined breakpoints.

Percy's GitHub integration is a big win
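To give a flavour of what that integration looks like in practice, here is a hedged sketch of a snapshot script using Percy's Node SDK for Puppeteer. The docs URL, snapshot name, and widths are illustrative assumptions, and the SDK available when Vanilla adopted Percy in 2017 was different, but the overall flow (capture snapshots during the CI build, let Percy compare them against the baseline) is the same.

```ts
// Sketch of a Percy snapshot script run during a CI build.
// The docs URL, snapshot name, and widths below are assumptions for illustration.
import puppeteer from "puppeteer";
import percySnapshot from "@percy/puppeteer";

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Capture a page of the framework's docs at the breakpoints we care about.
  await page.goto("http://localhost:8000/docs/examples/");
  await percySnapshot(page, "Vanilla docs examples", {
    widths: [375, 768, 1280],
  });

  await browser.close();
})();
```

In CI, a script like this is typically wrapped with npx percy exec -- node snapshots.js and authenticated with a PERCY_TOKEN environment variable, so the captured snapshots are uploaded and the resulting diffs are reported back on the pull request.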

The person reviewing the PR can then click through to the project dashboard on percy.io and review the highlighted diffs. If the changes are expected based on what has been outlined in the PR, they can be approved.

Comparing different pages for visual differences

When the feature merges, these changes then become the new baseline. If unexpected changes are highlighted, the reviewer can flag them to the developer for amendment.

As we make multiple changes a day to our Vanilla codebase while aiming for a weekly release, having VRT as part of our continuous integration has given us extra confidence that missed bugs and regressions don't make it into our releases.
