User Acceptance Testing vs. Usability Testing: What is the Difference?

Reassemble · Aug 19, 2019 · 6 min read

Recently, some of my fellow UX practitioners have had to work with software engineers on some testing. It was interesting to observe the differences in the ways each side saw their tests as they tried to hammer out a testing plan that satisfied both sides.

From the viewpoint of the UXers, what the engineers wanted to do was a little strange. A quote kind of says it all:

If they’re providing such detailed instructions, how do we get any sort of insight from this?!

Now, I didn’t get to talk to the software engineers, but I suppose they would have seen a similar problem from the other perspective. If you don’t provide detailed instructions, how are you supposed to get any actionable information?

Here's the thing, though: both tests are crucially important in understanding how your product (app, software, website, anything) works. They just answer quite different questions about it.

And yes, this means a product can, and often does, ‘work’ in one way without being any good in another. That is in fact a frequent cause of products that fail — they may do something important, but they do it in a mystifyingly difficult way. Let’s break this down a little further.

User Acceptance Testing

User acceptance testing is a pretty established thing in software development. The International Software Testing Qualifications Board (ISTQB) defines acceptance testing as follows:

Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system.

One way to think about this is to think about the product as a machine. Let’s say it’s a car.

Vroom.

Now, by the time you've made a car that actual buyers will see, we're going to assume (and hope) that you've tested it for what we might call 'showstopper' defects. This car isn't going to explode when you turn the ignition key. The wheels can actually turn left and right, and you do actually have brakes installed. You have a way of throttling the engine up and down, so the wheels turn faster or slower.

But just because all of that is already there (brakes, acceleration, steering, ignition) doesn't necessarily mean it is accessible to the driver. That is where a user acceptance test comes in: it is there to prove, to the driver's satisfaction, that their interactions with the machine are going to produce the 'correct' outcome. The right input gives you the right output.

The whole point of a UAT is to confirm things like the following:

  • If you turn the key (or press a button, you fancy fellow), the engine starts.
  • When you turn the steering wheel left, the wheels (and therefore the car) turn to the left.
  • When you put your foot on the accelerator pedal, the car speeds up.
  • Conversely, when you shift your foot over to the brake pedal, the car slows down.

This is why user acceptance tests need to be pretty explicit in telling you to test one thing or another. The whole point is to confirm the function of each part of the car; we're not going to let you just sit in there and figure it out yourself. Once you give the car an input and it produces the correct result, we're good.
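To put that in software terms: a scripted acceptance check typically names the exact input and the single outcome that counts as a pass. Here is a minimal, hypothetical sketch in Python, using pytest-style test functions and a made-up Car class purely for illustration (none of these names come from a real codebase):

```python
# Hypothetical UAT-style checks: each test gives one explicit instruction
# and asserts the single "correct" outcome. The Car class is a stand-in,
# not a real API.

class Car:
    def __init__(self):
        self.engine_on = False
        self.speed = 0

    def turn_key(self):
        self.engine_on = True

    def press_accelerator(self):
        self.speed += 10

    def press_brake(self):
        self.speed = max(0, self.speed - 10)


def test_turning_key_starts_engine():
    car = Car()
    car.turn_key()            # explicit instruction: turn the key
    assert car.engine_on      # expected result: the engine is running


def test_brake_slows_the_car():
    car = Car()
    car.press_accelerator()
    car.press_brake()         # explicit instruction: press the brake
    assert car.speed == 0     # expected result: the car slows to a stop
```

Notice that these checks say nothing about whether pressing a pedal feels natural; they only confirm that a known input yields the expected output.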

So far, so good. These are all important things to confirm. But they are not the questions that usability tests are made to answer.

Usability Testing

Before we talk more about usability tests, here are a couple of questions about cars to ponder.

Why???
  1. Why is the accelerator pedal always on the right, and the brake always on the left? (And the clutch, if you're driving stick, always to the left of that?)
  2. Why do turning indicators make that clicking sound?

I know, these questions seem a bit ‘duh’. It’s just how it is! But that goes to the heart of the difference between usability tests and user acceptance tests.

Remember what we discussed just now — a user acceptance test is simply there to test whether the pedals and the indicators function. It doesn’t matter if the brake pedal is on the left or right — if the driver hits the brake and the car slows down, that’s a pass.

And yet, I can assure you this: if you built a car with the gas pedal on the left and the brake on the right, and took it for usability testing, it would be a disaster. Anyone who's been through driving school, or driven a bunch of other cars, is going to try driving this one with this mental framework established:

  1. The left pedal is for slowing down.
  2. The right pedal is for speeding up.

You can see how this might not go well in your new, reversed car.

In other words, what a usability test really tests is whether, given what your user expects and wants, they can use the product to accomplish what they set out to do. That is quite a different question from whether the product functions correctly given a certain input, because it involves a lot of fuzzy, human factors. These factors include:

  1. What the user is used to doing without even thinking
  2. What sorts of cues the user is looking out for
  3. What the user views as the ‘correct’ process to complete a task

A good example of how cars take these things into account has to do with the way a car sounds. Cars these days, driven mainly by electronics and circuits, are very different from the models of old, which were largely mechanical. Take that clicking sound when you turn on the turn indicators: it was originally produced by a spring clicking back and forth to make the light flash on and off.

Cars don't use springs to control those lights anymore, and probably haven't for about 30 years. So why does that clicking sound stick around? Simple: because drivers expect it as feedback. If you flicked that lever and heard no sound, you might think:

Uh, is the indicator turned on, or are my lights spoiled? What’s going on?

You might have to divert your attention to the green arrows on the dashboard, taking your eyes off the road for a moment. That would be inconvenient, disconcerting — and, if you’re driving at speed, even dangerous.

As such, usability tests tend to be more task-oriented. You don't ask drivers to 'step on the brake pedal on the left'; instead, you ask them to 'slow the car down'. That is how you find out whether your way of slowing the car down matches the driver's idea.

And if he puts the wrong foot down and ends up revving the car engine? Your car might work, but it sure isn’t working for its users. And that’s going to be a serious problem.
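If it helps to see that difference on paper, here is a small, hypothetical sketch of how the two kinds of scripts differ in shape. The wording and structure are illustrative only, not taken from any real test plan or study:

```python
# Illustrative only: contrasting the shape of a UAT script with a
# usability test script. Neither list comes from a real test plan.

# UAT script: explicit step-by-step instructions, each with one expected result.
uat_steps = [
    {"step": "Press the brake pedal on the left", "expect": "The car decelerates"},
    {"step": "Turn the ignition key clockwise",   "expect": "The engine starts"},
]

# Usability script: open-ended tasks, plus what the moderator watches for.
usability_tasks = [
    {
        "task": "Slow the car down",
        "observe": [
            "Which pedal does the driver reach for first?",
            "Do they hesitate, or take their eyes off the road?",
        ],
    },
    {
        "task": "Let other drivers know you are turning left",
        "observe": [
            "Does the driver trust that the indicator is on without looking down?",
        ],
    },
]

if __name__ == "__main__":
    # Print the moderator's task list for a usability session.
    for item in usability_tasks:
        print("Task:", item["task"])
```

The first list simply passes or fails; the second produces observations that someone still has to interpret.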

Working Together

These different aims determine what the different tests look like — and they also mean that neither test can really replace the other entirely.

A usability test that covers all the technical functions a UAT looks for would be very long and unwieldy. It would be inefficient to run, and it would give pretty fuzzy data that isn't much use to engineers.

On the other hand, a UAT by itself won't give you the fuzzy, human data that is crucial to designing a product. It is very possible for something to work perfectly, but in a way that frustrates, confuses and even endangers users. For example, the driver who isn't sure whether his silent turn indicator is on slows down to check, and gets kissed in the rear by another car.

As such, both UATs and usability tests are important for gauging whether a product is suited for use. Their ultimate goals still converge, after all. We do them because we want to be sure that, when we build something (a car, an app, a website), it makes life easy for the most important people to our business: the users.

Evaluating your product with users can be quite the challenge — but we can help. Take a look at some of our work, and get in touch with us — we love to talk UX!

Reassemble

UX Design Consultancy based in Singapore. We convert caffeine into user insights and design. reassemble.io