Enhancing user experience design with UX assessment

Not just another practical guide to heuristic analysis

Sergio Vento
Telepass Digital
4 min read · Aug 3, 2023


In the realm of digital product design, the primary objective is to create a cutting-edge User Experience (UX) that caters to both user needs and business goals. To achieve this, the process often needs to engage stakeholders representing various perspectives and requirements (product managers, legal, developers and more). Yet the most crucial perspective, that of the user, is frequently not adequately represented.

To meet usability standards, it is crucial to identify and resolve frictions, often by adopting user research methodologies (such as user tests). But the real world is not always the fluffy, perfect place we wish for, and limitations in time, budget, and resources are common.

The Panacea

In such cases, how can we effectively address user needs? Experience and benchmarking are invaluable tools in a product designer’s toolkit, but they may not always suffice. The truth is, a groundbreaking, miraculous tool that solves every problem does not exist! Nevertheless, there is an often overlooked good practice: UX Assessment.

A Matter of Methodologies

UX Assessment, or “expert review”, encompasses various methodologies in which trained and qualified experts analyze a product or design from the user’s perspective to identify usability issues. These methodologies include heuristic analysis, UX teardown, and cognitive walkthrough. Each has its pros and cons and suits specific situations; in our case, we use heuristic analysis.

Some of the UX Assessment methodologies

Why Heuristics?

The choice of this method stems from the fact that:

  • It does not necessarily require external resources (auditors); anyone with a good understanding of heuristics and design principles can conduct the inspection, including designers.
  • It does not demand a script but allows auditors to navigate freely through the entire prototype.
  • It produces an output in the form of a list of potential problems associated with violated heuristics (a minimal sketch of such a record is shown below).
Jakob Nielsen’s research indicates that five evaluators can help you discover about 75% of the usability issues. © Jakob Nielsen and Nielsen Norman Group, link
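To make that output concrete, here is a minimal sketch of how a single finding could be recorded. The field names and the 0-4 severity scale (the rating Nielsen Norman Group recommends for usability problems) are assumptions for illustration, not our team’s exact template.

```python
# A minimal, illustrative record for one heuristic-analysis finding.
# Field names and the 0-4 severity scale are assumptions, not the team's actual template.
from dataclasses import dataclass

@dataclass
class HeuristicFinding:
    screen: str            # where in the prototype the friction was observed
    heuristic: str         # the violated heuristic, e.g. "Error prevention"
    description: str       # what the auditor observed, from the user's perspective
    severity: int          # 0 (not a problem) ... 4 (usability catastrophe)
    possible_false_positive: bool = False  # to be confirmed later in user tests

# Example finding (hypothetical screen and issue):
finding = HeuristicFinding(
    screen="Checkout - payment step",
    heuristic="Error prevention",
    description="No confirmation step before charging the saved payment method",
    severity=3,
)
```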

Note: It is essential to keep in mind that some of the issues identified may be false positives, since we are not testing the prototype in a real context with real users. Thus, a UX assessment should always be complemented with user tests; it does not replace them.

Think Big, Start Small

In the initial phase, as with any respectable experiment, we began with small steps but aimed for significant results. We formed a small team of auditors from among our colleagues and conducted a heuristic review, providing practical examples to reinforce the knowledge of those already familiar with the heuristics and to educate novices on the subject. Subsequently, we defined a template for each auditor to conduct their assessment, aiming to speed up and enhance the process with each iteration.

Heuristic Analysis Post-it: a descriptive placeholder used to note frictions in the prototype, based on the heuristic evaluation.
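Building on the record sketched above, here is a rough idea of how findings collected by several auditors could be merged into one prioritized list. The grouping key (screen plus heuristic) and the average-severity ranking are illustrative assumptions, not the actual logic behind our template.

```python
# An illustrative way to merge findings from several auditors into a ranked list.
# Assumes the HeuristicFinding records sketched earlier; grouping and ranking rules are assumptions.
from collections import defaultdict
from statistics import mean

def prioritize(findings):
    """Group duplicate findings (same screen + heuristic) and rank them by average severity."""
    grouped = defaultdict(list)
    for f in findings:
        grouped[(f.screen, f.heuristic)].append(f)

    # Rank by average severity first, then by how many auditors reported the issue.
    ranked = sorted(
        grouped.items(),
        key=lambda item: (mean(f.severity for f in item[1]), len(item[1])),
        reverse=True,
    )
    return [
        {
            "screen": screen,
            "heuristic": heuristic,
            "auditors": len(items),
            "avg_severity": round(mean(f.severity for f in items), 1),
        }
        for (screen, heuristic), items in ranked
    ]
```

Ranking by severity and by how many auditors flagged the same friction makes it easier to decide what to fix, or to verify in a user test, first.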

The Lessons Learned

  1. It is essential to remember that a methodology is a tool and not an absolute solution. In our case, when we couldn’t test with real users, the UX assessment helped evaluate usability and anticipate potential critical problems.
  2. Over time and with each iteration, the process becomes faster and can serve as an excellent tool for pre-testing before user tests, even in projects with more time and budget resources.
  3. Always be prepared, as every project “should be ready for yesterday”. Having a group of skilled colleagues and a pre-designed template is invaluable (we are lucky to have over 20 members on the design team).
  4. Heuristics can be an excellent rationale to present to stakeholders to convey your point of view when mediating between solutions.
  5. With time, everyone learns heuristics and becomes more sensitive to usability issues, almost like a game, where colleagues and teammates begin to identify potential problems (and that’s when the work becomes more manageable).

These insights reflect what we learned from our own experience. However, we acknowledge that different contexts and situations can bring forth new perspectives. Feel free to leave feedback or share the article if you found it interesting.

This article was written by product designer Sergio Vento, and edited by Marta Milasi and Gaetano Matonti, respectively UX Content Lead and Managerial Software Engineer at Telepass. Interested in joining our team? Check out our open roles!
