How we built our design system with precise phases and vast validation

Darren Wong
Dec 3, 2018

In the previous article, we talked about how we started our design system through a small, nimble, part-time team. We also went over some ways we sold the vision of a design system to our partners in other departments. This article will go over how we built Connect through phases and validation.

Why Connect?

Early on we did a brand exploration to discuss guiding principles and a vision for the product. The word connect kept appearing in our discussions: how we connect the “why” to a user’s actions, and eventually how we would connect the product experience with one unifying design system.

Enabling Phases

We had a pretty strong vision of how we wanted to release Connect to the masses.

⚒ Structure Phase
🌈 Visual Phase
🎉 Experience Phase

⚒ Structure Phase

This was the first phase and the one with the highest risk. Chris Abad, VP of Product and Design, helped us get buy-in that the structure of the pages should be rethought and validated before applying Connect. The thinking was that if we kept the architecture as it was, the design system would only be a band-aid, not a solution to the real problem (more on that later).

Audit
We created a spreadsheet to document the pages within our app. This spreadsheet had two main columns: design and engineering. The design column tracked whether we had the Sketch counterpart for each component used on the site. The engineering column tracked whether that page had been added to the global style guide we created, called the toolkit.
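To make the shape of this concrete, here is a hypothetical slice of what such an audit sheet might look like (illustrative rows only, not our actual data):

    Page / Element       | Design (in Sketch?) | Engineering (in toolkit?)
    Dashboard button     | yes                 | yes
    Drafts table         | yes                 | no
    Video Player icons   | no                  | no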

We audited our product: buttons, colors, type styles, you name it. We collected each element, filed it, and moved on to the next. This gave us visibility into problem areas and the ability to combine similar patterns. It also gave management hard numbers to track progress. Visibility = trust.

Transcribe
As we onboarded new designers, there was a noticeable delay in delivering mocks to developers. Designers had to create each new view from scratch. In addition, elements were often slightly inaccurate, which slowed reviews and led to accidental one-offs.

In 2016, our app consisted of four major views: Dashboard, Drafts, Highlight Reel, and Video Player. We took screenshots of each view in its existing state and recreated it in Sketch, beginning the consolidation of our product. It wasn’t pretty, but it gave us a great baseline to build from.

Existing → Baseline

As a new designer on the team, I found this a great way to see the entire product screen by screen, button by button, and pixel by pixel. After this tour, I had a better understanding of the product, where we needed to focus, and where we needed to make trade-offs.

After our audit, we began to see that elements and functionality didn’t quite fit together. With Chris’s support, we came to the conclusion that simply applying new visual styles would not fix the underlying problems. This was a challenging call because it would increase the time and lift required to roll out the entire design system.

Here is an example of a structural change we made on the dashboard. The dashboard squad was in the middle of rolling out an updated design, and we paired with their team to ensure the work would transition nicely into Connect. We focused on the structure of the page and kept the visual styles consistent:

Before Structure Phase → After Structure Phase

🌈 Visual Phase

If we did the first phase right, the second phase would be really light. This phase affected elements like color, spacing, and so on. It was paired with the launch of Connect, as it would be the most visible change for every user. During this phase, we focused on ensuring the styles within our code were extracted into variables that could be used consistently throughout components.
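As a rough sketch of what that extraction can look like (a hypothetical TypeScript example; the token names and hex values here are placeholders, not our actual palette):

```ts
// Hypothetical design tokens: hard-coded values pulled out of individual
// components and into one shared module, so every component reads from
// the same source of truth.
export const color = {
  ctaPrimary: "#2A7DE1", // placeholder blue, not our actual value
  textBody: "#333333",
  surface: "#FFFFFF",
} as const;

// A component then references the token instead of repeating a hex value:
//   <button style={{ backgroundColor: color.ctaPrimary }}>Launch test</button>
```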

Here is an example of how we applied the visual phase to the dashboard:

While the visual phase was meant to be the lightest lift, quite a bit went into it, and plenty of problems arose along the way.

Accessible Feedback
When we started Connect, accessibility was part of the early conversations. Emily had been making sure our product adhered to accessibility guidelines long before I joined. Developing the color palette for Connect was my first foray into one tiny facet of accessible design, and I relied on Emily, Homer (engineer), and Mike (engineer), as well as tools like Stark, to ensure our design decisions were accessible for all.

Emily helped keep our focus on making sure our product could be accessed by anyone. It’s easy to get caught up in trying to hit a contrast number and miss the point of why you’re doing it in the first place.

Subjective Feedback
At UserTesting, we think we provide the most value when our customers are testing ideas, prototypes, and products early and often. However, there is one type of feedback that isn’t always constructive: subjective feedback. In the early concept phase of Connect, there was a notion from our executives that our product should pass the squint test.

Squint test: the ability to recognize a product even when squinting while looking at the screen.

This manifested as a “hot” color to contrast with the cool blues in our product. We arrived at a gorgeous rose quartz, #FA55A2. For the first few months, there was peace. But as we started to use the rose color in more internal-facing materials, we began to hear murmurs in the dark of night. Then we got an email with the six words heard round the world: “Why is this button still pink?”

We had a difficult time validating the effect of the pink color on our users. In our usability tests on pages like the dashboard, users were able to accomplish the task no matter the color of the button.

Our use of pink fell apart in three ways:

  1. Context
    We didn’t have a compelling story for the rose other than that it contrasted well with our other colors. A primary color usually has roots in your primary brand color, but that wasn’t the case for our selection. The disconnect became more apparent the more we stress-tested the color in different views and pages.
  2. Family
    The rose didn’t harmonize with our secondary color: blue. Pairing the two resulted in a cotton candy theme, and that juvenile palette didn’t sit well with our stakeholders, who wanted to take our product to the enterprise market.
  3. Accessibility
    We had put in all this work to ensure our colors met or exceeded the contrast ratios set by WCAG, but white text on the rose just would not pass (see the sketch after this list). We attempted a gamut of explorations (darkening, shadows, reversing) to no avail.
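Since the contrast check in point 3 is completely mechanical, here is a minimal sketch of it in TypeScript, using the relative-luminance and contrast-ratio formulas from the WCAG 2.x spec:

```ts
// Linearize one sRGB channel per the WCAG 2.x definition.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#RRGGBB" color.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
function contrast(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// White on the rose quartz comes out to roughly 3:1, short of the 4.5:1
// that WCAG AA requires for normal-size text.
console.log(contrast("#FFFFFF", "#FA55A2").toFixed(2)); // ≈ 3.05
```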

This led us to revisit our primary CTA, and it brought us to something that had been in our faces all along: blue.

Blue had a direct connection to our brand color, it worked well with our family of grays (which are derived from a similar hue), and it was accessible to boot. Sometimes you need to take a good hard look in the mirror, and other times you need a little nudge to see things more clearly. We think our team and our product are better off for it.

UX Dust™
UX Dust™ happens when customers have a strong association with a bad pattern. For example, we give our users the ability to make a highlight reel (a collection of clips) from their user feedback, which allows for easy sharing of key insights. Our current pattern uses two vertical panels to drag clips from one side to the other. Because of its vertical nature, this pattern requires a lot of scrolling to access clips.

For the design system, we decided to implement an 8-pt grid to ensure consistent spacing across the app. This spacing caused our users to see 3 clips in their left pane rather than the usual 3.5 clips. We began to hear about the pain this caused our customers because of the extra scrolling needed to see more clips.
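For reference, an 8-pt grid simply constrains every spacing value to a multiple of an 8px base. A minimal sketch of such a scale (hypothetical token names, not our actual ones):

```ts
// Minimal 8-pt spacing scale: every value is a multiple of the 8px base.
const BASE = 8;

export const spacing = {
  xs: `${BASE * 0.5}px`, // 4px, the half-step some 8-pt systems allow
  sm: `${BASE}px`,       // 8px
  md: `${BASE * 2}px`,   // 16px
  lg: `${BASE * 3}px`,   // 24px
  xl: `${BASE * 4}px`,   // 32px
} as const;
```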

We kicked up some UX Dust™ here.

This visual change highlighted how difficult the feature already was to use, so the change itself became associated with the UX pain. We were able to get a quick fix out to decrease the customer pain, but we know we didn’t completely solve the issue. We put a pin in this experience as an area to revisit.

  • Users can be trained to use a bad experience, and like all bad habits, those habits are hard to break
  • “If it ain’t broke, don’t fix it” is a tired, lazy saying from the ’70s. 2018 gave us a new phrase: “just because something works doesn’t mean it can’t be improved” (Princess Shuri, Black Panther)
  • Change is scary. Most people are change-averse, especially if the change is unexpected. Our products should work harder to help transition our users through these changes.

🎉 Experience Phase

This is a phase we haven’t completed just yet. It is intended to encompass moments of delight, which can manifest in many ways, including copy, animation, and more. This is an area we know we can improve on, and we are still working out the best way to tackle it.

📊 Validating Your Design System

UserTesting is made up of vertical squads responsible for key features and products. The design system squad was the first set up horizontally to support these vertical squads. So while we didn’t have one feature to focus on, we made up for it by supporting all of our core products. This operation was tactical and surgical: we went in to make specific structural changes, all while ensuring everything worked well globally. There were a lot of moving parts, and to ensure we were crafting the best experience for our customers, we put together a validation program.

We worked with our amazing Product Insights team to get really specific feedback early on about granular functionality and styles. We got pretty good at this. The styles that worked well got absorbed into the design system; the ones that didn’t were rethought. It was quick and fast-paced, but it gave squad stakeholders comfort that structural changes were tried and true, and it gave us stronger components to put in our design system.

What our team did:

  • Repurpose previous test plans
  • Conduct and/or launch sessions
  • Make decisions based on findings

What Product Insights helped with:

  • Choose methodology (if not the typical prototype test)
  • Review test plan (if new)
  • Recruitment (if not from our panel)
  • Debrief on session and discussion of findings (when requested)

Our director of research, Marieke McCloskey, wrote about this process in more depth here:

Democratizing CX research: How our Design Team embraces human insights

We were able to rapidly run through this list to validate existing and new components of the design system. We tested everything from changes in content to the removal of icons to a brand new experience for our video player. Nothing was too big or too small. Going through this validation program really drove home the importance of our own product. We were able to rapidly test risky assumptions and move forward with decisions backed by qualitative data. It also meant the patterns and components going into the design system were verified, which gave designers and developers confidence while building new products and features.

We enabled a phased approach to building our design system, which helped us de-risk the process and focus on specific aspects of the build. We validated our system to ensure it passed certain checks, and in turn we strengthened each component in our system.

