Ultimate Visual Regression Testing Guide

Dimitri · Lost Pixel · Apr 1, 2024

Visual regression testing is a modern QA automation practice that compares the User Interface (UI) before and after a source code change. It aims to spot regressions in the UI and fix them before end users see and report them.

Visual Regression Testing (VRT for short) allows QA engineers and frontend engineers to detect unwanted changes, as if somebody looked at the UI and said:

Aha, this button should not be red. It has always been blue, and the changes in the source code have not touched the <Button> component.

This guide covers the theory in the first sections and quickly jumps into practice with real-world examples later. Here is a TL;DR of the contents:

  1. Why use visual regression testing
  2. How to automate visual regression testing
  3. Visual regression testing tools and what to look for in them
  4. Lost Pixel open-source visual regression testing

Why use visual regression testing

There are many reasons you would want to ensure your application looks good. We will focus on them in depth in this section, but let’s start with something that is not obvious but very important.

Visual Tests are cheap to write and maintain

Compared to other types of automated tests, visual tests can be added to most applications in minutes, and they don’t carry the maintenance burden of rewriting unit, integration, or e2e test code every time a feature changes.

Even so, they offer a good degree of confidence in your frontend code, as visual problems usually go hand in hand with functional ones.

Now that we have covered this, we can explore the classic motivations for adding VRT to your testing suite.

Visual Tests help eliminate Visual Bugs

VRT, as a practice, aims to eliminate one problem: faulty-looking User Interfaces, also known as Visual Bugs.

Visual Bugs can be caused by literally anything one can touch in web platform code (JavaScript/TypeScript, HTML, CSS). Here are some examples:

  1. Text or other elements overlapping each other
  2. Styles working fine on one viewport size and breaking on another (fine on desktop, broken on mobile devices)
  3. Broken page layout
  4. Parts of the UI cut off due to overflow issues

Visual bugs, while not breaking anything functionally, can negatively affect the company's brand image.

Look at the above example of the Amazon UI (broken on purpose to be more illustrative). How would you feel if you saw this live on amazon.com? Likely, nothing about the experience would feel positive.

Visual Tests bridge the gap with Functional Tests

Visual Regression Testing is an excellent addition to your functional test suite, as it covers the part that no unit or e2e test will.

If we return to our Amazon example above, an interaction test that goes to this card and clicks checkboxes will pass with flying colors. Still, that does not mean the card is fully working: there are clear visual impurities that can’t be caught by interaction testing.

The best practice for writing any interaction test is to keep its steps as close as possible to actual user behavior. However, users perceive any UI with their eyes, making VRT an essential part of the testing suite if you want to cover the full cycle of user behavior:

Visual regression testing is the missing puzzle piece in holistic frontend testing

How to automate visual regression testing

To better understand the automation flow of visual testing, we can explore this diagram:

Visual testing diagram

The process can be put into several steps:

  1. The engineer pushes code to the remote repository
  2. The CI pipeline (e.g., GitHub Actions) starts to run for that commit
  3. A Visual Regression Testing check is executed; if there were previous test runs, baseline images (how we expect our visual snapshots to look) already exist
  4. New visual snapshots are created from the testing material: your marketing pages, components, applications, etc.

Now, there are three possible outcomes:

  • the image(s) from steps #3 and #4 are identical, meaning that our test just passes 🟢
  • the image(s) from steps #3 and #4 are different, and we expected that 🟡
  • the image(s) from steps #3 and #4 are different, and we did not expect that 🔴

In the case of 🟡, we need to update the baseline image(s) to ensure that the next time this test runs, it is 🟢.

Consider an example of modifying a button color in your codebase from blue to red. Our visual test expects the button to be blue (based on a previously approved baseline snapshot), so there is a visual difference between the snapshots. We get alerted and are expected to act by updating the baseline snapshot.

In the case of 🔴, we caught a legitimate visual regression and need to act by providing a fix in the codebase. After we do so, we expect our tests to be 🟢 again.

Consider an example of modifying a dialog component in your application and seeing visual tests fail because a button suddenly became red. Most probably, you touched related code without knowing it. Reverting the change should make your tests pass again.
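The whole decision flow fits in a few lines of schematic TypeScript. This is purely illustrative pseudo-logic, not the API of any particular tool:

type Outcome = 'pass' | 'update-baseline' | 'regression'; // 🟢 / 🟡 / 🔴

// Real tools run a perceptual pixel diff here; raw byte equality stands
// in for it to keep the sketch self-contained.
function imagesIdentical(a: Buffer, b: Buffer): boolean {
  return a.equals(b);
}

function evaluateSnapshot(
  baseline: Buffer | null, // previously approved image, if any
  current: Buffer,         // freshly captured snapshot
  changeExpected: boolean, // did we intend to change this UI?
): Outcome {
  if (baseline === null) return 'update-baseline';       // first run: no baseline yet
  if (imagesIdentical(baseline, current)) return 'pass'; // 🟢
  return changeExpected ? 'update-baseline'              // 🟡
                        : 'regression';                  // 🔴
}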

How visual testing works under the hood

We have covered the theoretical power of automation in CI/CD pipelines but have not discussed the behind-the-scenes work of visual regression testing in detail.

When the human eye looks at two similar but still different pictures, the brain has to analyze all of the details and structure in a complex process before spitting out a verdict, and the result will usually be less than 100% correct.

When a visual testing algorithm is at work, the images are compared pixel by pixel in an efficient way, picking out only the differences that would be visible to the human eye and producing a diff image like the one below. Many tools implement such an algorithm, the most efficient so far being odiff.

The comparison between the two images and the result generation took less than a second.

This allows visual testing tools to compare hundreds of images in no time and produce accurate results used in the test reports. Just imagine putting one person on the task of comparing hundreds of images like the pair above. It would be inefficient, to say the least, and would take hours before the task was completed.
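For intuition, here is a minimal sketch of such a pixel-level comparison using the pixelmatch and pngjs npm packages (odiff implements the same idea as a much faster native binary; the file names are placeholders):

import fs from 'fs';
import { PNG } from 'pngjs';
import pixelmatch from 'pixelmatch';

const baseline = PNG.sync.read(fs.readFileSync('baseline.png'));
const current = PNG.sync.read(fs.readFileSync('current.png'));
const { width, height } = baseline;
const diff = new PNG({ width, height });

// `threshold` tunes how far a pixel's color may drift before it counts
// as different (0 = strictest, 1 = most tolerant).
const mismatched = pixelmatch(
  baseline.data, current.data, diff.data, width, height,
  { threshold: 0.1 },
);

fs.writeFileSync('diff.png', PNG.sync.write(diff));
console.log(`${mismatched} pixels differ`);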

Visual regression testing tools

Comparison algorithms and CI pipelines are the foundation of visual regression testing; modern tooling is the tip of the iceberg that makes the whole process viable. This section explores some common problems and how tooling fixes them.

Comparison sensitivity

When comparing two images, an algorithm can be overly strict and report a difference that is slight and unnoticeable to the human eye. One common solution that many modern visual testing tools provide is sensitivity thresholds.

Thresholds let us define how much difference can be tolerated before it is reported as a regression in our testing suite.
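In lost-pixel, for instance, this is exposed as a single threshold option in the config; as I read the docs, values between 0 and 1 act as a relative share of differing pixels, and values above 1 as an absolute pixel count:

import { CustomProjectConfig } from 'lost-pixel';

export const config: CustomProjectConfig = {
  // ...rest of the config
  // Tolerate up to 5% of pixels differing before reporting a regression.
  threshold: 0.05,
};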

Image stabilisation

Due to how browsers render web pages, visual tests can sometimes be flaky. Flakiness means a test is unstable across multiple runs under identical conditions: flaky visual tests produce different images across runs even when the code has not changed.

Modern visual testing tools stabilize the runs by default, drastically reducing flakiness.
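If a tool does not do this for you, the usual stabilization tricks are freezing CSS animations and waiting for fonts and the network to settle before shooting. A Playwright-style sketch of such a helper, not tied to any specific tool:

import type { Page } from 'playwright';

// Illustrative stabilization helper: freeze animations and wait for the
// page to settle before taking a snapshot.
async function stabilize(page: Page): Promise<void> {
  await page.addStyleTag({
    content: `*, *::before, *::after {
      animation: none !important;
      transition: none !important;
      caret-color: transparent !important;
    }`,
  });
  await page.evaluate(async () => { await document.fonts.ready; }); // web fonts loaded
  await page.waitForLoadState('networkidle');                       // network has settled
}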

Customization

When it comes to visual testing, it is important that the QA expert or engineer who writes the test can customize the run parameters to get the most out of the testing suite. Those usually include:

  • running visual tests across multiple browsers and viewports to achieve wider compatibility testing
  • running custom browser code before the tests execute (for example, to hide the cookie banner on CI if there is no need to test it)
  • excluding areas of the screenshots from visual comparison (for example, a highly dynamic video block on your page)

There are many more customization options; choose a tool that fits the purpose and makes your tests useful.
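As a sketch of what these options can look like in practice, here is a Playwright-style script covering all three (the URL and selectors are hypothetical):

import { chromium } from 'playwright';

const viewports = [
  { width: 1440, height: 900 }, // desktop
  { width: 390, height: 844 },  // mobile
];

async function main() {
  const browser = await chromium.launch();
  for (const viewport of viewports) {
    const page = await browser.newPage({ viewport });
    await page.goto('https://example.com');

    // Custom browser code before the shot: hide the cookie banner on CI.
    await page.addStyleTag({ content: '.cookie-banner { display: none; }' });

    await page.screenshot({
      path: `home-${viewport.width}x${viewport.height}.png`,
      fullPage: true,
      // Exclude a highly dynamic area from the comparison.
      mask: [page.locator('.video-block')],
    });
  }
  await browser.close();
}

main();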

Lost Pixel

When I was doing contract work, we faced daily problems with our internal visual testing tools. Fast, stable, good DX: you could only pick one.

As soon as you start looking at SaaS solutions, most are at a late stage of the company lifecycle, with pricing that targets enterprises.

Together with my friend, we decided to shake up the status quo and create an open-source tool for visual regression testing that offers a great developer experience, can be self-hosted, and does not lose to other modern tools in terms of features.

That’s how we started lost-pixel — an open-source alternative to Percy & Chromatic.

It is fully customizable to your needs and can get you started with visual testing in a couple of minutes. Check out the self-hosted setup in two files:

lostpixel.config.ts — for defining your visual tests

import { CustomProjectConfig } from 'lost-pixel';

export const config: CustomProjectConfig = {
  pageShots: {
    pages: [{ path: '/app', name: 'app' }],
    baseUrl: process.env.DOCKER
      ? 'http://host.docker.internal:3003'
      : 'http://172.17.0.1:3003',
  },
  generateOnly: true,
  failOnDifference: true,
};

visual-testing.yml — GitHub Actions workflow file to execute them on CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup Node
        uses: actions/setup-node@v3
        with:
          node-version: 18.x
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Build Next app
        run: npm run build

      - name: Run Next app
        run: npm run start &

      - name: Lost Pixel
        uses: lost-pixel/lost-pixel@v3.16.0

This setup stores all of your baseline images in the repository. When your tests flag an expected change, update the baselines by running the following:

npx lost-pixel docker update

After the snapshots are updated, push them to the repository, and your tests should pass again.

If you enjoyed the guide — try out Lost Pixel, and provide us with feedback so we can improve. Of course, don’t hesitate to ask me questions on X or via email: dimitri@lost-pixel.com
