Component Visual Test Cases

Like code, prove your Figma components show what you expect and change as you expect

Nathan Curtis
EightShapes
Apr 25, 2022 · 7 min read

--

Gallery of many visual test cases

Testing Figma Components:
Overview | Visual Test Cases | The Review

Testing isn’t just creating an instance, changing a label, swapping some properties, resizing an edge, and declaring “Done!” Seriously, ten seconds and you’re convinced the next step is to release it? Instead, consider a methodical approach that explores many possible outcomes. It’s time for Figma component visual test cases.

A visual test case is a component instance that’s configured and overridden in particular ways so that you can verify the component works as expected. Cases enable us to verify many things — style, properties, layout, content, composition with slots, and more — across many conditions. In addition, you can retain visual test cases so that you can confirm a component changes as expected as you update it over time. Analogous to a developer’s Storybook story, designers are building the habit of making visual test cases in Figma.
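To make the Storybook analogy concrete, here is a minimal sketch of the developer-side equivalent: a set of persisted “stories,” each pinning one configuration of a hypothetical Button component. The Story type is simplified to stay dependency-free; real Storybook stories use its Component Story Format instead.

```typescript
// Each story pins one configured instance, just like a Figma visual test
// case pins one configured, overridden component instance on the canvas.
// Button and Story here are illustrative, not a real library's API.

interface ButtonProps {
  label: string;
  variant: "primary" | "secondary";
  disabled?: boolean;
}

interface Story {
  name: string;
  args: ButtonProps;
}

export const stories: Story[] = [
  { name: "Primary", args: { label: "Save", variant: "primary" } },
  { name: "Secondary", args: { label: "Cancel", variant: "secondary" } },
  { name: "Disabled", args: { label: "Save", variant: "primary", disabled: true } },
  { name: "Long label", args: { label: "Save and continue to the next step", variant: "primary" } },
];
```

Because the stories persist in the codebase, a reviewer can re-render every case after any change, which is exactly the habit this article encourages in Figma.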

This article digs into the why, who, where, and mostly what types of visual test cases are important for validating a Figma component.

About visual test cases

Why bother with visual test cases?

Integrating persisted visual test cases into your testing and maintenance workflow helps to:

  • Exhibit components in diverse conditions instead of just a grid of variants.
  • Tangibly expose and sensitize observers to the role of testing.
  • Think methodically through many scenarios to test.
  • Accumulate test cases used over time by many builders and testers.
  • Verify that edits change what you intend, and don’t change anything else.

Where do test cases go?

Generally, designers make and maintain visual test cases to the right of where they build (the component variant grid). This results in a healthy back and forth of seeing and fixing what’s broken as each successive case is added.

Zoomed out screenshot displaying many visual test cases to the right of a built component

Some teams test components and compositions in other locations too:

  • Intermingling a component in another component’s test cases, whether as a dependency (like an Icon in an Alert) or as a sibling (like Headings, Dropdowns and Buttons in a form layout when testing a Text Input).
  • “Page Types” / “Page Templates” / “Page Patterns”: full pages, hopefully built robustly using Auto Layout, often in a second file or dedicated page.
  • A “Test App” / “Demo App” that includes diverse arrangements and layouts as a reference, often in a separate file that also enables a team to test component publishing and integration.

Who makes visual test cases?

In our practice, the designer or developer assigned to build the feature is also responsible for creating an initial set of test cases. Once reviews begin, tester(s) are encouraged to add cases as they review.

How does a designer know what test cases to create?

Builders benefit greatly from examples, and need guidance when creating visual test cases for the first time. Starting points could include:

  1. Examples of existing visual test cases created for other components, often by the library’s owner or core contributors.
  2. A template with suggestive blanks to fill. In code, this could be a generator script. In Figma, it could be (less often) a plugin or (more likely) a component to place and detach.
  3. System site process documentation.
  4. Task description and/or subtasks in Jira or Asana, often itself a template.
  5. In-tool process docs, such as a code README or supplemental frame documenting the process in a Figma file.
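The “template with suggestive blanks” idea (item 2) can be sketched as a tiny generator script. This is a hedged illustration, not a real tool: given a component name, it emits the checklist of visual test case groups a builder should fill in, using the case types this article describes.

```typescript
// Illustrative generator: emit a named checklist of visual test case
// groups for a component, mirroring the types covered in this article.

const caseTypes = ["Properties", "Content", "Spacing", "Layout", "Composition"] as const;

export function generateTestCaseChecklist(component: string): string[] {
  // One frame (or story) per case type, named consistently so testers
  // can scan for gaps at a glance.
  return caseTypes.map((type) => `${component} / ${type}`);
}
```

For example, `generateTestCaseChecklist("Alert")` yields five named blanks, from "Alert / Properties" through "Alert / Composition", for the builder to fill.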

Visual Test Case Types

There are myriad potential visual test cases, and I’ve found success organizing them into types consistent with the visual outcomes needing to be reviewed by a tester. These include variations based on property, content, spacing, layout, and modular compositions.

Properties

Testing includes validating that every property combination works as expected. This could mean reproducing what’s been configured in the variant grid.

This can feel redundant; the variant grid already shows this. However, methodically placing components and adjusting properties can reveal missing values, unexpected property toggles, and wrongly tagged options. Cases per property combination are also useful for regression testing subsequent changes, including cases for deprecated options.
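To see how quickly “one case per combination” grows, the enumeration can be sketched as a cartesian product over property values. The Alert properties below are hypothetical examples, not a real library’s API.

```typescript
// Sketch: enumerate every property combination for a hypothetical Alert
// component, to count how many test cases full coverage implies.

type PropValues = Record<string, string[]>;

const alertProps: PropValues = {
  status: ["info", "success", "warning", "error"],
  dismissible: ["true", "false"],
  size: ["compact", "default"],
};

export function combinations(props: PropValues): Record<string, string>[] {
  // Fold each property into the accumulated combinations: start with one
  // empty combination, then branch it per value of each property.
  return Object.entries(props).reduce<Record<string, string>[]>(
    (acc, [name, values]) =>
      acc.flatMap((combo) => values.map((v) => ({ ...combo, [name]: v }))),
    [{}]
  );
}
```

Here 4 statuses × 2 dismissible states × 2 sizes yields 16 instances to place and inspect, which is why a methodical sweep catches gaps a quick glance at the variant grid misses.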

Visual test cases for each property combination

Content

For elements that contain text, image fills or swapped component instances, flow in realistic and unrealistic content to test the display’s resilience. Start with text and images whose length or size is just right, then long and short.

Visual test cases for content of varying length

But don’t stop with content length or size. Instead, also stress how the component responds when the content might be wrong, may collide with other elements, or even when it’s missing (and should collapse).

Visual test cases for incorrect, missing and collisions of content
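The content cases above can be captured as a small, reusable fixture set to flow into every text-bearing element. The fixture strings are illustrative placeholders, assuming a notification-style component.

```typescript
// Sketch: reusable content fixtures mirroring the cases above —
// just right, short, long, and missing (which should collapse).

export const contentFixtures = {
  justRight: "Payment received",
  short: "OK",
  long: "A much longer title that should wrap onto a second line and press against adjacent elements such as a dismiss icon",
  missing: "", // the element should collapse, not leave a gap
};

// Name one test case per element-and-fixture pairing so gaps are visible.
export function fixtureCases(element: string): string[] {
  return Object.keys(contentFixtures).map((k) => `${element}: ${k}`);
}
```

Keeping the fixtures in one place means every component’s content cases stress the same extremes, which makes later regressions easier to spot.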

Spacing (within and between elements)

Spacing between elements can be tricky. Be meticulous: inspect all layers to find where spacing could separate any pair of elements, including beyond the default display.

Consider starting with a base layout (a near “kitchen sink”) to expose many elements simultaneously. Overlay space measurements to assess padding and margins. As you do so, ensure measurements align to actual layer boundaries.

Base layout, validating most space within and between elements
Visual test case for basic spacing of most elements

Beyond a base case, create compositions for other element combinations, such as space between a title and button OR title and link OR title and detail paragraph. Most combinations may be stacked pairs, but they could also include horizontally arranged elements that can collide.

Validating space between element pairs, such as title with button
Visual test cases for many combinations of elements

When needed, I’ll adjust component width to a pixel where a word wraps (below, when a title collides with the dismiss icon), duplicate the instance, and reduce the duplicate’s width by 1px. That provides a very clear sense of boundary collisions, and is helpful when regression-testing future typography updates of font weight and letter spacing.

Validating horizontal pairs, such as a heading with and without the presence of a dismiss icon
Visual test cases for when text wraps, which is useful for regression testing later

Layout

Visual test cases for layout can become plentiful, and it’s important to start with the basics. Stress test the layout with varying combinations of displayed elements, then alter the component width (as Fixed Width) to be both narrower and wider than the default. Across cases, inspect layout attributes per element to validate both horizontal and vertical settings.

Visual test cases for layout scenarios

While it’s far more common for components to vary horizontally as Fixed Width and/or Fill Container, they’ll often Hug Contents vertically. Therefore, also inspect changes in height to validate that the visual result is what you expect when the container is shorter or taller than default settings.

Visual test cases for layout scenarios

Components are often made in isolation yet included in larger compositions that set layout attributes to Fill Container. Therefore, include cases that contain components in larger containers. This can be particularly helpful in identifying elements that accidentally span beyond a component boundary or could interact with container attributes like a dark mode’s background color.

Visual test cases for layouts of varying themes containing a component

Composition

Many components take advantage of modular capabilities to include nested components (such as an Alert using an Icon), subcomponents (such as an Alert including a custom Show More / Show Less drawer), or, as described here, slots to swap in custom content.

Slots are often hidden by default and represented by a subcomponent that depicts its default boundary and relationship with other components. Start with a test case that shows the slot.

Visual test case for exposing a slot by default

To test slots, it’s useful to have a custom slot sample to swap into the location. It should include flowed content, needn’t be attractive, and benefits from a background color that clearly reveals its boundaries.

UI component used as slot content

Subsequent visual test cases should swap in the sample slot. Once inserted, vary component layout in a variety of ways — wider or narrower, taller or shorter, different adjacent items — to validate that layout and spacing hold.

Visual test cases for slots

Visual regression testing

Visual test cases offer not just a way to validate “Is this built correctly?”, but also — later, as the component changes — confirm “Did it change how I expected it to, and not change in ways I didn’t expect it to?”

Visual regression testing has long been a technique employed by developers using tools like Percy, BackstopJS, and — the emerging de facto standard — Chromatic. For designers, Figma’s branching helps us evaluate such propagating changes but only goes so far. As a result, EightShapes built the Visual Difference Figma plugin to:

  • Conduct visual regression prior to or independent of Figma branching
  • Run comparisons for single objects independently
  • Detect differences caused by updating library assets
  • Inspect and adjust objects rapidly while comparing
  • Discover subtle differences in page mockups
Example of EightShapes Visual Difference comparison and dialog
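At their core, all of these tools answer the same question: how many pixels differ between a “before” and an “after” render? A bare sketch of that comparison follows; real tools like BackstopJS add thresholds, anti-aliasing tolerance, and diff overlays on top of it.

```typescript
// Sketch of the core of visual regression: compare two same-size RGBA
// renders pixel by pixel and report the fraction of pixels that changed.

export function diffRatio(before: Uint8ClampedArray, after: Uint8ClampedArray): number {
  if (before.length !== after.length) {
    throw new Error("Renders must be the same size to compare");
  }
  let changed = 0;
  // Step through 4 channels (RGBA) per pixel.
  for (let px = 0; px < before.length; px += 4) {
    for (let c = 0; c < 4; c++) {
      if (before[px + c] !== after[px + c]) {
        changed++;
        break; // count each pixel at most once
      }
    }
  }
  return changed / (before.length / 4);
}
```

A ratio of 0 means the update changed nothing visually; anything above 0 flags cases for a human to review, which is exactly the judgment persisted visual test cases make fast.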

Every component — even the simplest, like a Divider or Link — can benefit not just from being robustly built but from visually validating that it works under a range of conditions. Sure, components differ in how many visual tests each needs to validate the concerns that matter to you. Nevertheless, visual test cases provide a foundation to more deeply inspect what you’ve built and sustain quality over time.

--


Nathan Curtis
EightShapes

Founded UX firm @eightshapes, contributing to the design systems field through consulting and workshops. VT & @uchicago grad.