Establishing a “Design Quality” metric to build design credibility

by Jen Cardello, Executive Director of Design Research / Head of DesignOps & Stacey Cunningham, Principal Design Researcher

In operationally-minded organizations, numbers reign supreme. How can design play the quantification game?

We’re all familiar with the story: a technology organization, born as an eager startup, grows and scales, leaving behind a trail of inconsistent user experiences created by different teams over time. Forward-focused momentum leaves little room to step back, reassess, and look at the product with fresh eyes.

At athenahealth, our product teams have turned out a full continuum of healthcare experiences over a 20-year period, including revenue cycle, clinical, and patient workflows. In the healthcare space, every moment healthcare professionals spend with the technology may be a moment not spent with the patient, so accumulated UX debt can become a mountain rather than a molehill.

athenahealth is on a heartfelt mission to transform healthcare and is ready to do right by our users. However, in operationally-minded organizations, numbers reign supreme to describe performance and drive decisions. Thus, the boulder standing in the way of improving the quality of our experiences was figuring out how design could play the quantification game. We were up for the challenge, and in 2017 we latched on to a company-wide initiative to reduce work for our users and improve product-market fit.

We stood with the company on the work reduction call. We communicated our perspective that there are two prongs to work reduction: eliminate or redesign. And then we looked into our toolbox.

How we framed the problem

We chose a simple, tried-and-true method — heuristic evaluation — upon which to build a framework and quantify “design quality”. Our hypothesis was that assessing the interfaces in their existing state would give us concrete examples of how to eliminate and reduce work for our users, rally excitement for addressing UX debt, and improve design quality.

We worked with our customer intelligence team, analytics, and product management leadership to determine the top workflows for each user type. What are workflows? The tasks our users conduct on our systems.

In this initial Design Quality evaluation period, we evaluated 55 workflows in 9 two-week sprints.

We created a design quality scoring system to assess these top user workflows against Jakob Nielsen’s 10 usability heuristics. A panel of research experts used our scoring system to assign a 0–4 design quality rating for each heuristic, which we averaged to create an overall design quality score for each workflow.
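The scoring arithmetic described above can be sketched in a few lines. This is an illustrative reconstruction, not athenahealth's actual tooling: the heuristic names come from Nielsen's published list, while the function name and example ratings are hypothetical.

```python
# Sketch of the scoring math: each workflow gets a 0-4 rating per
# heuristic from the panel, averaged into one design quality score.
# Ratings below are illustrative, not real evaluation data.

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def design_quality_score(ratings: dict) -> float:
    """Average the panel's 0-4 ratings across all ten heuristics."""
    if set(ratings) != set(NIELSEN_HEURISTICS):
        raise ValueError("expected a rating for each of the 10 heuristics")
    if any(not 0 <= v <= 4 for v in ratings.values()):
        raise ValueError("ratings must be on the 0-4 scale")
    return sum(ratings.values()) / len(ratings)

# Example: a workflow rated 3 everywhere except two weak heuristics.
example = {h: 3 for h in NIELSEN_HEURISTICS}
example["Error prevention"] = 1
example["Consistency and standards"] = 2
print(design_quality_score(example))  # 2.7
```

Averaging keeps the score on the same 0–4 scale as the individual ratings, so a single number can be read against the rubric the panel used.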

We were able to compare scores across workflows to show design quality for different user roles and product areas. And each workflow received a detailed audit that served as a blueprint for teams to address problems, preserve what worked, or reimagine the workflow entirely.
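Because every workflow lands on the same 0–4 scale, comparing across roles or product areas reduces to grouping and averaging. A minimal sketch, with entirely hypothetical workflow names, areas, and scores:

```python
# Hypothetical comparison of workflow scores by product area.
# All names and numbers are illustrative.
from collections import defaultdict
from statistics import mean

workflow_scores = [
    ("Patient check-in",  "patient",       3.1),
    ("Claim submission",  "revenue cycle", 2.4),
    ("Order entry",       "clinical",      2.8),
    ("Payment posting",   "revenue cycle", 2.2),
]

by_area = defaultdict(list)
for name, area, score in workflow_scores:
    by_area[area].append(score)

# Report areas from lowest mean score (most UX debt) upward.
for area, scores in sorted(by_area.items(), key=lambda kv: mean(kv[1])):
    print(f"{area}: mean {mean(scores):.2f} across {len(scores)} workflows")
```

Sorting by mean score surfaces the product areas carrying the most UX debt, which is the kind of roll-up a leadership readout would lead with.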

The cover page of a Design Quality Audit. This one illustrates before and after design improvements between 2 releases.

9 two-week sprints, 5 months, 3,250 findings later

It’s a lot of work to take on, and we didn’t do it alone. The researchers on our Design Ops team enlisted the broader product org (designers, product managers, engineers, and subject matter experts) to assist with scoring the workflows. We worked in an agile manner, scoring during two-week sprints and presenting the results to the Product organization at bi-weekly readouts. In the end, we had over 3,250 individual findings across the 55 workflows. We conducted an inductive analysis to distill the findings into themes and global recommendations for product teams and leadership.

Outcomes in the organization

We saw far-reaching effects of this program within the company. Project teams began sharing their Design Quality scores at company-wide R&D reviews, along with plans to address major themes found in the evaluations. We trained the entire UX org on how to conduct heuristic evaluations, and we implemented a tracking system, visible to senior leadership, to monitor changes in design quality over time. We are currently re-scoring improved workflows with each release and have turned the heuristic evaluation into a tool that product teams can use on their own to evaluate and discuss their workflows.

In the end, our simple approach of giving our colleagues in product and leadership a quantitative way to speak about UX debt transformed the conversation, creating a sustainable, common language for discussing design at the operational levels of our organization.
