Published in Smart Insight Communities
Heuristic Evaluation: An introduction

There are lots of different ways to evaluate software, but what I’m going to talk about today is a particular technique called Heuristic Evaluation, created about twenty years ago by Jakob Nielsen and his colleagues.

The basic idea of heuristic evaluation is that you’re going to provide a set of people — often other stakeholders on the design team or outside design experts — with a set of heuristics or principles, and they’re going to use those to look for problems in your design.

Each of them is first going to do this independently, walking through a variety of tasks with your design and looking for violations of the heuristics.

Different evaluators are going to find different problems, and they’ll communicate and compare notes only at the end.

This is a technique you can use either on a working user interface or on sketches of one, and it works really well in conjunction with paper prototypes and other rapid, low-fidelity techniques you may be using to get your design ideas out quickly.

Nielsen’s ten heuristics are a pretty darn good set. They do a good job of covering many of the problems that you’ll see in many user interfaces, but you can add any that you want and drop any that aren’t appropriate for your system.

Nielsen’s 10 Heuristics

1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

Give your evaluators a couple of tasks to perform with your design, and have them step through each task carefully several times. While they’re doing this, they’ll keep the list of usability principles at hand as a reminder of things to pay attention to.

Now which principles will you use?

I think Nielsen’s ten heuristics are a fantastic start, and you can augment those with anything else that’s relevant for your domain.

Obviously, the important part is that you take what you learn from these evaluators and use the violations of the heuristics they find as a guide for fixing problems and redesigning.

In this process you’ll want multiple evaluators rather than just one, because no single evaluator will find all of the problems, and different evaluators tend to find different ones.
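Since evaluators work independently and only compare notes at the end, someone has to merge their reports. Here is a minimal sketch of that merge step in Python; the findings, evaluator names, and problem descriptions are hypothetical, invented for illustration:

```python
from collections import defaultdict

# Hypothetical independent reports: (evaluator, heuristic violated, problem found).
findings = [
    ("eval-1", "Visibility of system status", "No progress bar during upload"),
    ("eval-2", "Visibility of system status", "No progress bar during upload"),
    ("eval-2", "Error prevention", "Delete has no confirmation dialog"),
    ("eval-3", "Consistency and standards", "Two different icons mean 'save'"),
]

# Merge the reports: record which evaluators found each distinct problem.
tally = defaultdict(set)
for evaluator, heuristic, problem in findings:
    tally[(heuristic, problem)].add(evaluator)

# Problems flagged by several evaluators are strong candidates to fix first.
for (heuristic, problem), evals in sorted(tally.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(evals)}x  [{heuristic}] {problem}")
```

Keying the tally on (heuristic, problem) deduplicates overlapping reports while preserving how many evaluators independently hit each issue, which is a useful, if rough, severity signal.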

It’s of course going to depend on the user interface that you’re working with, how much you’re paying people, how much time is involved — all sorts of factors.

Jakob Nielsen’s rule of thumb for heuristic evaluation is that three to five people tends to work pretty well; and that’s been my experience too.
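That rule of thumb has a simple model behind it: Nielsen and Landauer estimated that a single evaluator finds a fraction λ of the problems (about 0.31 on average in their studies), so n independent evaluators find roughly 1 − (1 − λ)^n of them. A quick sketch, treating λ = 0.31 as their reported average rather than a universal constant:

```python
# Expected share of usability problems found by n independent evaluators,
# following the Nielsen-Landauer model: found(n) = 1 - (1 - lam)^n.
def problems_found(n, lam=0.31):
    return 1 - (1 - lam) ** n

for n in range(1, 8):
    print(f"{n} evaluators: ~{problems_found(n):.0%} of problems")
```

With λ = 0.31, three evaluators find about two thirds of the problems and five find about 84%, so the curve flattens quickly and extra evaluators beyond five buy you less and less.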

If we compare heuristic evaluation and user testing, one thing we see is that heuristic evaluation can often be a lot faster: it takes just an hour or two per evaluator, while the mechanics of getting a user test up and running can take much longer, not even counting the fact that you may have to build working software first.

Also, heuristic evaluation results come pre-interpreted: your evaluators directly hand you problems and things to fix, which saves you the time of inferring from a usability test what the problem, or the solution, might be.

Conversely, experts walking through your system can generate false positives: reported problems that wouldn’t actually trip up users in a real environment. This does happen, so user testing is, almost by definition, going to be more accurate.

Personally, I think it’s valuable to alternate methods:

Heuristic evaluation and user testing find different problems, and by running heuristic evaluation early in the design process, you avoid burning the real users you may bring in later on problems you could have caught yourself.

Thank you for reading!

I hope I have provided a useful introduction :)




Angie Sinanaj

Product Designer (UI/UX)
