Basics of Human Factors Engineering for UX Designers

Dano Qualls
10 min read · Jan 20, 2017

Sometimes you learn something that completely changes how you look at the field of design and your job as a designer. One of those experiences for me was the “Foundations in Human Factors” course, which kicks off Bentley University’s master’s degree in human factors. This article provides a quick overview of the course, and reading it might help you see some of the science behind good design.

Human factors focuses on cognitive psychology and perception, and it helped me learn to see beyond “the user” and “the product” to think about:
- maximizing human strengths
- accommodating human limitations

We do this in order to create a product that is highly usable. Designers who understand the basics of human factors get the big picture that:

The signal in the physical world is the foundation of design. We can understand how the human system works in order to design the most effective signals in a world full of distractions.

Users don’t just see and respond to your product. There are several discrete steps that happen along the way, and each of these can hurt or help your product’s usability. These steps are explored in human information processing:

1. Detection and discrimination (can they see it at a biological level?)
2. Grouping and organization (how does the mind group signals at a pre-attentive level?)
3. Meaning and working memory (can they find meaning at a cognitive level?)
4. Learning and long-term memory (can they retain and recall knowledge at a metacognitive level?)
5. Decision making (how do they act on what they know?)

Each of these processes is a distinct opportunity for design success or failure. This article will touch on highlights for each of these steps.

1. Detection and Discrimination

Can they see/hear/feel it at a biological level?

Key takeaway: Humans are great at detecting change and contrast.

Instead of thinking about a phone app as a collection of letters, pictures, and sounds, think of it as a collection of signals. Letters and buttons are colored light that the eye sees. Sounds are waves in the air that the ear hears. Silent buzzes are vibrations that the body feels.

I point this out because a signal can be measured. Scientists have studied how people perceive different signals and we know some useful things, like:
- the visual channel has way more bandwidth than touch, hearing, smell, or taste
- the strongest signal reaches the brain first and dominates attention
- contrast is the biggest determinant of signal strength

Bandwidth of our senses, by Tor Nørretranders

Humans are great at noticing differences right in front of them, especially if the difference is high contrast. Things like borders, edges, color, and size can create powerful contrasts, and powerful contrast dominates attention.
Although we’re great at detecting change, we’re bad at conceptualizing differences and comparing things from memory. The mind is not like a camera that records what it sees. It’s more like a change detector that takes the 10% that’s unique about what it sees and overlays it on the general form of what it already knows. If you ask a person to compare something on the screen to something in their memory, they’ll have a hard time. But if you ask a person to compare two things on the screen, they can easily tell the two apart. And we can take advantage of the mind’s ability to notice contrast.

To understand the importance of signal strength, think about our ancestors, the cavemen. In order to survive, they had to quickly notice anything that moved at the edge of their vision, anything brightly colored, anything big. We aren't as worried about jaguars, poisonous plants, or mammoths today, but our perception is still highly attuned to these high-contrast signals. This explains why you keep looking at the TV in the bar even though you want to keep eye contact with the people you're with — you're hardwired to pay attention to high-contrast signals, like the movement and bright light coming out of the TV. The mind can learn to ignore unchanging signals that don't go away, but constant change is hard to ignore. By the way, the mind's ability to ignore unchanging signals is known as "habituation" and is the reason why you shouldn't overuse high-contrast signals — the brain says, "Not changing? I'll ignore it because it's not a threat."

2. Grouping and Organization

How does the mind group signals at a pre-attentive level?

Key takeaway: The brain is a cognitive miser and finds patterns to reduce its workload — even if there wasn’t supposed to be a pattern there.

There is a step between seeing something and making sense of it, known as "pre-attentive processing." It's called pre-attentive because it occurs before the conscious mind can focus attention on it. Your brain tries to make things easier for the attentive mind by grouping things at this stage. Signals that share similar qualities are grouped together and sent along as a single chunk. Similar qualities include:
- proximity (closer together = closer relationship)
- alignment
- symmetry
- similarity in size, color, or shape

This list may sound familiar if you’ve studied Gestalt principles. Be aware that the mind is so good at finding patterns that it can find them even if they aren’t supposed to be there. If you want to know how your design is grouped at a pre-attentive level, squint until you can’t read the words and you will see the pre-attentive groups.
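To make the proximity principle concrete, here is a toy sketch (my own illustration, not part of the course material) that clusters items into groups whenever the gap between neighbors is small — roughly what the eye does for free:

```python
def group_by_proximity(positions, max_gap):
    """Cluster 1-D positions: items within max_gap of a neighbor share a group."""
    if not positions:
        return []
    positions = sorted(positions)
    groups = [[positions[0]]]
    for prev, curr in zip(positions, positions[1:]):
        if curr - prev <= max_gap:
            groups[-1].append(curr)  # close enough: same perceptual group
        else:
            groups.append([curr])    # big gap: the eye starts a new group
    return groups

# Two tight clusters separated by a large gap read as two groups.
print(group_by_proximity([1, 2, 3, 10, 11, 12], max_gap=2))
# [[1, 2, 3], [10, 11, 12]]
```

The same idea applies in two dimensions: items separated by less white space than their neighbors get bundled into one pre-attentive chunk.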

One implication of understanding pre-attentive processing is that you should use the least intrusive ways to group things:
- white space is least intrusive
- common background color is next
- use borders/frames as a last resort

3. Meaning and Working Memory

Can they find meaning at a cognitive level?

Key takeaway: Working memory has a limited capacity, limited duration, is highly volatile, and is affected by motivation.

Since we’re getting technical, let’s talk about the two kinds of memory: long-term memory and working memory. You’ve probably heard of long-term memory and short-term memory, but cognitive psychologists refer to short-term memory as “working memory.” There are differences between the concepts of short-term memory and working memory, but the two are interchangeable for laymen. Working memory acts as a “scratch pad” where information is processed. Working memory is where “thinking” happens. Working memory gets information, either by sensing it from the outside world or retrieving it from long-term memory, and then thinks about it. There are three important things to know about working memory: it has a limited duration, limited capacity, and is highly volatile.

Limited Capacity

Imagine the volume of what we can process as an hourglass shape. Our senses can take in an enormous amount of information. Working memory has a tiny capacity. Long-term memory has an enormous capacity.

How much can working memory hold? One famous study says the “magic number” of objects working memory can hold is 7 +/- 2 (although this number is smaller if the objects are complicated). Another famous study says the number is four chunks. So whether it’s four or seven chunks, it’s not much.

Why am I calling the objects in memory "chunks"? Because the mind can combine several pieces of information into a single chunk and hold several of these. For example, can you remember DWHVXTFQI? Try reading that a few times and see how much of it you can remember.

Now read BMWIBMUSA and see how much you can remember. Probably the whole thing. Both are nine letters, but the first is nine chunks and the second is three chunks (BMW, IBM, USA).
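The chunking idea can be sketched in code. This is a hypothetical greedy matcher, purely for illustration: letters that match something you already know collapse into one chunk, while unfamiliar letters each cost their own memory slot:

```python
def count_chunks(letters, known):
    """Greedily count memory 'chunks': known substrings collapse into one chunk."""
    i, chunks = 0, 0
    while i < len(letters):
        for item in known:
            if letters.startswith(item, i):
                i += len(item)  # a familiar pattern becomes a single chunk
                break
        else:
            i += 1              # an unfamiliar letter takes a slot of its own
        chunks += 1
    return chunks

known = {"BMW", "IBM", "USA"}
print(count_chunks("DWHVXTFQI", known))  # 9 chunks -- hard to hold
print(count_chunks("BMWIBMUSA", known))  # 3 chunks -- easy to hold
```

Both strings contain nine letters, but the second fits comfortably inside working memory's four-to-seven-chunk budget while the first does not.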

One more note — the capacity of working memory gets even smaller with old age, learning disabilities, anxiety, or exhaustion.

Limited Duration

Working memory is commonly known as "short-term memory" because it can only hold information for 20–30 seconds. The mind can keep information there longer with effort, such as repeating it over and over.

Highly Volatile

The information held in working memory can also evolve or be corrupted as it sits there for its 30-second life. Or disappear entirely, never to be seen again. Imagine you and I are in a room together. I show you a small string of numbers on a whiteboard, like 529306. I then erase it and ask you to hold it in working memory for 30 seconds. After 20 seconds, I pop a balloon.


Where did that string of numbers go? It’s gone, and it’s gone forever. Working memory has limited capacity, limited duration, and is highly volatile.

Motivation and Anxiety

Something else to know about working memory is that it's affected by emotions. The emotions that most affect human performance are motivation and anxiety — working memory capacity is increased by motivation and decreased by anxiety. Humans are motivated by reward, gratification, pleasure, and efficiency, and when motivated they can push their biological and cognitive abilities to their full capacity. Anxiety in small doses can also help us stay focused and fully engaged, but too much of it overwhelms working memory. Anxiety can come from the environment, fatigue, frustration, anger, threat, or loss. If you're designing something that deals with a sensitive topic, like money or health, your users may be carrying higher levels of anxiety and be less capable than the average person when using your product.

Designing for Cognitive Load

With this understanding of working memory, you can see the importance of minimizing cognitive load. When the task at hand requires more cognitive load than the person has to give, you will get mistakes and abandonment. Asking people to perform fine discrimination between similar sounds, colors, or shapes draws on working memory, which you now know is very limited. It's even possible for a person to detect two different signals but fail to discriminate between them, like a busy nurse who confuses two similar-sounding medications.

A key takeaway from this section is to learn to see humans as cognitive misers. We are engineered to give only enough to get by. We're good at efficiency, including auto-responses and unitizing. We sample bits and pieces; we don't devote our full attention to everything. Besides being misers, we're also foragers. Information foraging theory says that we seek, gather, and consume the flux of information in our environment. We scan for a scent; if it's cold, we move on, and if it's hot, we dig in.

4. Learning and Long-Term Memory

Can they retain and recall knowledge at a metacognitive level?

Key takeaway: People expect new systems to mirror the ones they already know, but they can learn new ones more easily with the help of cognitive scaffolding.

Working memory is very limited, but long-term memory can hold an enormous amount of information for a lifetime. We arrange knowledge into semantic networks, also known as schemas. When a schema stores information about a system, it is known as a mental model. Mental models help people anticipate events, reason about them, and explain them. When a person interacts with a system for which they have no existing categorization, they "thrash about" randomly. Our industry loves innovation, but our users love familiarity. When you create an interface, strive for intuitiveness by mapping to what people already know (even if the underlying technology is groundbreaking).

Information is moved from working memory to long-term memory and retained through depth of processing. This can include:
- rehearsal (rote memorization where the muscle memory “wears a groove” into the mind)
- elaboration (where you build on understanding and self-generate new information)
- duration (longer time spent learning helps it stick; for example, 10 half-day sessions are better than 5 full-day sessions)
- distribution of presentation (you need time to absorb, assimilate, and accommodate information, and much of this happens during sleep)

Learning can be aided with "cognitive scaffolding," where you set up helpful guides at the beginning to serve as a framework, then take them down when the session or the learning is complete. You may need to leave the scaffolding up permanently if the environment is constantly changing or there is a long gap between uses. TurboTax is one example of a highly scaffolded product; you don't need to learn the tax rules, you just enter your information and move on. And even if you did learn the tax rules, they could change next year and you would have to learn them all over again. Games are another example of learning done well: they don't shift the burden of learning onto the user; rather, they embed learning into the experience.

5. Decision Making

How do they act on what they know?

Key takeaway: Humans are bad at conceptualizing decisions they can’t see, so they choose an option that’s easy to pick to reduce the anxiety of not knowing.

Let’s start with an uncomfortable proposition: humans are bad at making decisions. The decision making task often exceeds our capacity. There is a gap between the initial state and intended goal that we have trouble bridging. Standing at this decision gap causes anxiety, and we hate anxiety. We learn to cope with this anxiety, but not always in healthy or helpful ways, including
- looking for confirmation
- avoiding conflicting information
- overweighting certain factors
- not optimizing the outcome

The result is that we often "satisfice." This means we set an acceptability threshold, search the available alternatives, and accept the first option that clears it as good enough. This helped our ancestors survive in a dangerous world that required quick action, but it is not helpful for problems we can't see right in front of our face, like health and money questions. This is why it's helpful to give people concrete scenarios to conceptualize a choice, like "What if I could reduce your workload by one hour a day? Would you pay $100 for that?"
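Satisficing can be sketched as a tiny search procedure. This is a hypothetical illustration, not a cognitive model: the satisficer stops at the first option that clears the acceptability threshold instead of scanning everything for the best one:

```python
def satisfice(options, score, threshold):
    """Return the first option whose score clears the acceptability threshold."""
    for option in options:
        if score(option) >= threshold:
            return option  # "good enough" -- stop searching here
    # Nothing cleared the bar, so fall back to the best available option.
    return max(options, key=score, default=None)

# A satisficer accepts 7, the first option that clears the threshold of 6,
# even though 10 is the optimal choice further down the list.
choice = satisfice([3, 7, 9, 10], score=lambda x: x, threshold=6)
print(choice)  # 7
```

Note the design implication: the order in which you present options matters, because many users will take the first one that looks acceptable.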

Wrapping it Up

To summarize this article: the signals we design are competing with a noisy world, and the humans we are designing for are:
- change detectors
- pattern seekers
- bandwidth-limited

If you want to learn more about human factors, consider taking a course in human factors or cognitive psychology at a local university. This article only scratches the surface; I have dozens more pages of notes that I didn't cover.