Design Audits: what and why!

Oscar Health · Published in Oscar Tech · 7 min read · May 20, 2024

by: Celina Leong, Dan Ariniello, and Yvonne Wang

Intro

As a team, we’ve been exploring avenues to better define and maintain our design quality. One challenge we’ve encountered is that “quality” and what “good” means are subjective. To address this, we’ve been conducting design audits. These audits use methods with predefined guidelines and rubrics, such as heuristic evaluation for usability, which lets us assess our current state of quality more objectively. Not only have these audits shed light on our design quality, but they’ve also enabled us to articulate and communicate our standards to a broader audience.

The authors — Yvonne, Celina, and Dan — contribute to various product areas at Oscar, working on external- and internal-facing platform products. Our process shows that any team can find value in performing design audits.

What is a design audit?

A design audit is essentially a health check for a product or parts of a product’s design, examining how well the visual and interactive components work together across every platform based on predefined criteria. These criteria can vary — from reviewing the aesthetics, color, and fonts, to error prevention, user control, and language. Performing an audit means taking a step back to ensure consistency in the visual style and usability of an experience.

Why is a design audit important?

Each of us works across separate product areas, representing different user groups with unique challenges. Yvonne’s platform is for external brokers, while Celina and Dan solve for internal users focused, respectively, on prior authorizations and claims. Nevertheless, performing a design audit proved to be universally beneficial.

  • Yvonne: My team builds tooling to help brokers manage the policies they sell. As our understanding of the market and users deepens year over year, we see growth in this area and consistently launch new features to meet evolving user needs. Over time, the contributions from various designers mixed with an MVP-focused build approach have introduced a variety of design patterns. When looking at the product holistically, it became evident that there are a number of inconsistencies across features. While these might be easy to ignore individually, collectively they increase the time required to select the right design components and add to our tech debt, thus slowing us down.
  • Celina: My product area has historically had limited design support, with many one-off “band-aid” solutions built as necessary fixes. As an internal-facing application aimed at operational efficiency, it has continuously accrued noticeable usability issues and visual inconsistencies, alongside design and tech debt. Our audit allowed us to identify and log areas ripe with opportunities for improvement.
  • Dan: The area I support is currently benefiting from renewed investment from the business. This extra attention affords us a great opportunity to take stock of the current experience, address legacy pain points, and take a proactive approach to mitigating risk as we expand. With an expected increase in user adoption, we wanted to capture all of the potentially risky areas and make the appropriate updates while getting ready for new users to onboard. The opportunity came at a great time for me, as I had recently transferred into this area, and the audit proved invaluable to my onboarding.

How did we conduct our design audits?

Holistic tool audit

  • Yvonne & Celina: Our first step in the audit process was to define the goals and guardrails, and align with our partners in product on the areas most ripe for this evaluation. Recognizing that inconsistencies were present across the entire application, we agreed that a comprehensive audit of the full experience would offer the most substantial benefits. Both of us chose to conduct our audits referencing the heuristic evaluation as defined in Jackob Nielsen’s 10 general principles for interface design. Oscar’s design system (our library of reusable components and guidelines) served as an additional source for auditing design patterns.

Narrow UX flow + user interviews

  • Dan: For my product area, I chose to perform a focused audit on a specific experience within an application. Following the heuristic evaluation, I conducted a mix of group and one-on-one user interviews to validate my findings and make room for new ideas to surface. Together with power users, we performed heat-mapping exercises to better understand the relationship between data presentation and business logic in each workflow. Taken together, the heuristic evaluation, the interviews, and the workshops produced a wealth of insights for our team to draw on going forward.

Performing the audit

  • Yvonne: I developed a worksheet based on the heuristics, turning them into specific questions for workflow review. I then rated our adherence on a 1–5 scale, noting both positives and areas for improvement. Alongside my notes, I included screenshots for clarity. Upon completing my audit, I summarized the findings using broader classifications — “major issue”, “minor issue”, and “pass” — and color-coded them to further simplify communication with stakeholders.
  • Celina: I chose to pair each finding with annotated screenshots in Figma, elaborating on why each was identified as a major or minor issue. Next, I organized these findings into a results table, highlighting the areas that needed the most improvement. I applied a similar method when performing a visual evaluation of the application, examining inconsistencies across layout, text hierarchy, grammar, and branding.
  • Dan: My evaluation was purposely limited to select experiences and workflows. As a result, I was able to go into great detail on each heuristic, documenting instances and patterns within scope that both met and failed the defined standards. I then prioritized the notes in order of greatest impact on the user experience. Finally, I provided a macro rating of ‘Pass’, ‘Needs Improvement’, or ‘Fail’, as well as a granular letter grade (A–F) for each heuristic, to better focus our attention; a sketch of how such ratings might roll up follows this list.
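
To make those rollups concrete, here is a minimal sketch of how per-heuristic ratings could be classified and graded. The 1–5 scale and the “pass” / “minor issue” / “major issue” buckets mirror the worksheets described above, but the specific thresholds, grade cutoffs, and sample scores are illustrative assumptions, not the exact rules used in our audits.

```python
# Illustrative rollup of heuristic ratings into summary classifications
# and a letter grade. Thresholds and cutoffs are assumptions, not the
# exact rules from our audits.

# Sample per-heuristic adherence ratings on a 1-5 scale (5 = best).
ratings = {
    "Consistency and standards": 2,
    "Error prevention": 3,
    "Visibility of system status": 5,
}

def classify(score: int) -> str:
    """Map a 1-5 rating to a summary bucket (thresholds assumed)."""
    if score <= 2:
        return "major issue"
    if score == 3:
        return "minor issue"
    return "pass"

def letter_grade(scores) -> str:
    """Translate the average rating into an A-F grade (cutoffs assumed)."""
    avg = sum(scores) / len(scores)
    for cutoff, grade in [(4.5, "A"), (3.5, "B"), (2.5, "C"), (1.5, "D")]:
        if avg >= cutoff:
            return grade
    return "F"

for heuristic, score in ratings.items():
    print(f"{heuristic}: {score}/5 -> {classify(score)}")
print("Overall grade:", letter_grade(ratings.values()))
```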

Lessons learned and best practices

Be the driver of change

  • Dan: A detailed audit document can be far too extensive to digest in a single sitting. Designers have the opportunity to own next steps and negotiate for better UX practices by leveraging the audit outcome.

Visualize the narrative

  • Dan: A high level or macro score sheet paired with screenshots is more digestible and helps others visually learn the material. Sometimes, design, product, and engineering all see the same pain point but articulate it differently. A heuristic evaluation provides a framework for everybody to speak the same language.
  • Yvonne: It’s crucial for the audience to understand and visualize the usability issues, so I put together a slide deck for easier digestion of the findings. Sharing this audit — whether it’s with your immediate team or the wider org — really underlines the importance of design consistency and allows us to advocate for all kinds of usability improvements with proof points.

Guide your team to the optimal solution

  • Yvonne: Performing the audit also allowed me to identify feature gaps, which I included in the findings deck. The activity gave me a chance to examine the end-to-end experience, which made those gaps clear. The findings shaped our focus, both building out our backlog and directly influencing our roadmap priorities.

Inspire partners

  • Yvonne: The share-out also inspired several other teams facing similar challenges to begin their own design audits, influencing how we communicate and tackle these kinds of quality issues.

Observed impact and next steps

  • Yvonne: Following the share-out, I collaborated with my product manager to prioritize and outline action items. Given our team’s recent adoption of a fix-it day process, where we allocate a day per month to addressing our tech debt, many small design inconsistencies are perfect tasks for these sessions. This allows us to incrementally improve the user experience. We’ve also set up a spreadsheet for tracking, making sure the results don’t get overlooked. The audit findings offered our product team valuable insights, helping them evaluate feature gaps and consider how these could be integrated into our roadmap.
  • Dan: There was no shortage of opportunity areas for our team to explore. Using the scoresheet to prioritize the most troublesome components of our experience certainly helped refine our roadmap. One positive outcome is that our engineering team was inspired to dedicate more resources to delivering the ideal user experience. We now have a significant chunk of time on the roadmap to address many of these weaknesses, plus more that they have brought to the table themselves. On a more personal note, this project really pushed me to understand this particular area of my new domain.
  • Celina: To ideate on opportunities, I paired my audit findings with recent user research. Unsurprisingly, there were many parallels between the friction points users mentioned and what the audit highlighted. An audit not only helps keep design opportunities (especially low-lift ones) in mind when working on adjacent but unrelated improvements; sharing the results across the pod also encourages the team to value and incorporate usability and visual improvements in our future roadmap.

As a design team, these audits are essential for maintaining our standard of quality. We’ve set up a working group focused on improving design quality, for which the audits served as a baseline evaluation framework. Our team is also exploring ways to make these audits more effective and accessible for every designer at Oscar. These include centralizing the audit process for consistent access, creating a shared pool of audit templates, and documenting best practices.

Ready to embark on your own design audit? We hope the tips shared above will serve as building blocks and inspire you, especially if you find a design audit helpful for addressing specific challenges in your product and for improving design quality.

Celina Leong is an Associate Product Designer at Oscar, working across the Provider and Clinical platforms. Dan Ariniello is a Senior Product Designer at Oscar, passionate about building products that simplify while delivering full utility. Yvonne Wang is a Product Designer at Oscar who believes that design is a process for creating order out of chaos.
