For most projects, designers are at a minimum trying to reach table-stakes goals — Intuitive, Consistent, Useful, Usable. After meeting those, they try to meet desirability-based goals such as Delightful and/or Efficient. And then there are product-specific goals — the goals that differentiate their product from its competitors.
While some design goals or principles are expressed as phrases, single adjectives are often how a user describes a product (Delightful, Efficient). In this article, we will revisit some familiar adjectives (Accessible, Consistent) and extend the set to find additional or more nuanced ways of identifying and addressing issues.
The paper What Makes a Good UX Questionnaire? User-centred Scorecard Development by Tina Lee and Melody Ivory inspired this article’s list of attributes (adjectives). It suggests the following: Useful, Learnable, Clear, Discoverable, Communicative, Universal, Credible, Valuable. Also, in her presentation Does it have legs?, Abby Covert offers heuristics (adjectives for evaluation) including Findable, Accessible, Clear, Communicative, Useful, Credible, Controllable, Valuable.
These adjectives are used to:
- Evaluate the strength and quality of what is currently offered to users.
- Facilitate critique during planning, design and development.
- Predict the effectiveness of a potential solution.
Some adjectives speak more to product-market fit than to user satisfaction, but because they are expressed through the user experience, they would be addressed through changes to the experience. Some of the adjectives below are table-stakes or desirability-based goals restated or clarified, and some overlap in their coverage. The overlaps are important, though, because they let you home in on issues from different directions.
Use adjectives appropriate to the scope of your experience. The adjectives listed below are best used for a whole experience. Some adjectives, though, are more appropriate to features than to experiences. While a feature can be Findable or Discoverable, asking whether an experience meets these adjectives may not make sense. Finding and discovering an experience may be an OS issue.
Each adjective’s explanation below is phrased as it could be presented to a user rating the product.
Be aware that the user may interpret the adjective differently than we intended. The consideration bullet provides further context for evaluating responses.
When we discussed these adjectives at Ellie Mae, Michelle Snyder, a senior researcher, asked an apt question: “What would you do if these were tested and measured?” The remediation bullet provides direction when a low score is received.
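As a rough sketch of what “tested and measured” could look like in practice, the snippet below aggregates per-adjective ratings and flags low scores for remediation. It is purely illustrative: the ratings, the `scorecard` helper, and the 3.0 threshold are all invented, not part of any published scorecard method.

```python
from statistics import mean

# Hypothetical 1-5 ratings from five users for three of the adjectives below.
# The data and the remediation threshold are invented for illustration.
ratings = {
    "Approachable": [4, 5, 4, 3, 4],
    "Clear":        [2, 3, 2, 2, 3],
    "Consistent":   [4, 4, 5, 4, 4],
}

REMEDIATION_THRESHOLD = 3.0  # below this, the adjective's remediation guidance applies

def scorecard(ratings, threshold=REMEDIATION_THRESHOLD):
    """Return (adjective, mean score, needs_remediation) rows, lowest score first."""
    rows = [(adj, mean(scores), mean(scores) < threshold)
            for adj, scores in ratings.items()]
    return sorted(rows, key=lambda row: row[1])

for adj, score, flag in scorecard(ratings):
    print(f"{adj:12s} {score:.1f}  {'remediate' if flag else 'ok'}")
```

Sorting the lowest scores to the top surfaces which remediation bullets to act on first; a real study would of course also weigh sample size and variance, not just the mean.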
Accessible — you can reach the functionality of the experience with minimal effort.
- Considerations — Users will likely interpret this broadly: not just whether the experience meets Section 508 guidelines, but also whether they can access the experience on their device, whether the functionality can be found in the information architecture, and so on.
- Remediation — If the user feels that the visual style or interactions are inaccessible, you will need further research to understand the extent and the specific issues. If the issue is related to screen readers, make sure the product meets Section 508 guidelines.
Approachable — the experience is usable with minimal prior knowledge.
- Considerations — This is related to learnability but leans more toward assessing whether the user finds the experience daunting on first encounter, without really judging whether they can learn it once they use it.
- Remediation — A low score indicates that the user feels the solution is complex. Look at whether the experience appears visually complex or the activities feel too complex. Remediation may require changing the disclosure strategy (what information and capabilities get exposed, and when). Also look at the Clear, Coherent, Comprehensible, Consistent, and Understandable scores to see if and how they contribute to the issue.
Appropriate — the tasks and visuals are relevant to your work.
- Considerations — Users may be judging features as well as experience appropriateness. Speaks to whether the product’s style feels professional for enterprise products, fun or engaging for consumer products, trustworthy for financial products, etc.
- Remediation — Scores on related adjectives can help troubleshoot. Verify whether the product’s features and its visual and interaction style match user expectations.
Clear — it is obvious what the experience is about and what it is asking you to do.
- Considerations — Evaluates whether the product feels unfocused, too complex, or too technical for the user.
- Remediation — Assess whether information architecture, content, layout, or interaction needs to be simplified and whether content language is vague, technical or jargony.
Coherent — the parts and paths through the experience fit together.
- Considerations — Another adjective evaluating the organization of the product. This speaks to the level of consistency, whether the user thinks the parts look like they go together (as in a design system), and whether the tasks relate meaningfully. Can the user understand how they proceed point-to-point? Are the application’s responses what is expected?
- Remediation — For issues with visual coherence, evaluate the visual hierarchy and consistency across components. Also compare activities and make sure that similar interactions use consistent tools. Additional testing may be needed to see how the user expects the experience to proceed and whether there is misalignment in the task flow.
Complete — the experience has all of the capabilities that you need.
- Considerations — This is a measure of whether the user feels the product has the capabilities to fulfill the goals they expect it to. A low rating means the user feels the product is missing features.
- Remediation — If the user feels it is not complete because they cannot find features, the disclosure strategy needs to be addressed. If features are missing, then product management needs to decide how important it is for the user to feel the product is a full solution.
Comprehensive — the experience covers the requirements needed to complete the tasks.
- Considerations — Another way of looking at completeness, Comprehensive extends the notion to cover additional possibly not-required capabilities.
- Remediation — This mostly speaks to whether the functionality the user wants or needs is there. If not, go back to PMs…
Comprehensible — the experience makes the tasks understandable.
- Considerations — Part of the set of adjectives that evaluate how understandable the product is. Focused on whether the product can be grokked.
- Remediation — Used together with Approachable, Clear, Coherent, Consistent and Understandable to diagnose the nature of failings in presentation and flow.
Consistent — things work the same and the same cues are used across all of the experience.
- Considerations — Yes. That consistent. It’s important to ask users, not assume, and find out from them whether they consider the product to be consistent.
- Remediation — Design out inconsistencies, from micro-interactions to visual cues, up to the information architecture. For parts the user finds inconsistent but the team believes need to be different, consider redesigning the capability so that the difference is meaningful.
Delightful — the experience is enjoyable to use.
- Considerations — If being delightful is important to your product, specifically find out whether you are meeting that goal. While delight is not usually a priority for enterprise applications, giving those experiences even an aspect of delight can make them more palatable.
- Remediation — Check the Approachable, Efficient, Engaging, Functional, and Meaningful scores to determine the nature of the issue.
Efficient — you are able to accomplish the task with minimal effort.
- Considerations — The priority for enterprise, productivity, and professionally focused applications. In these cases, keyboard accessibility is often essential.
- Remediation — Indicates a need to rearrange the order of tasks/functionality, or provide access or completion via fewer clicks.
Engaging — the experience helps maintain your attention through task completion.
- Considerations — A priority for consumer-focused experiences, engaging can be of interest in enterprise when thinking about task completion and satisfaction.
- Remediation — For consumer experiences, revisit storytelling aspects. For enterprise applications, look at redesigning whatever might be distracting, anything that takes the user out of the task or experience. For either type of experience, be sure to address any interaction equity friction.
Error-resistant — the experience minimizes the possibility of making errors.
- Considerations — While quantitative testing can provide a score for this adjective, it can be useful to find out whether the user “feels” the product is error-resistant. This is important in experiences where there is a subjective component — such as interpreting visualizations.
- Remediation — Use qualitative testing and interviews to find the specific interactions the user feels are error-prone.
Functional — the experience lets you carry tasks through to the desired completion.
- Considerations — This measures whether the user considers the experience to have the capabilities they need to complete tasks. Low scores here indicate that the user doesn’t think the experience even does what is needed, let alone desired.
- Remediation — Go back to requirements and see what additional capabilities are needed.
Meaningful — the experience provides value to your work.
- Considerations — Another functionality-focused adjective, meaningful focuses on whether an experience meets the user’s internal goals and desires. Use in conjunction with the Valuable adjective from the studies mentioned above.
- Remediation — Look at whether language and messaging match the user’s mental model. If Functional scores are low, also look at the choice of capabilities.
Organized — the relationships of tasks (activities) and components (layout) are logical.
- Considerations — Can be used to break down issues where the experience is not understandable. If the experience seems disorganized, it may also seem unapproachable, inconsistent, or incoherent.
- Remediation — Mostly speaks to the information architecture. Go back to users with research activities, such as card sorting, that draw out the user’s mental model.
Secure — the experience protects information from misuse.
- Considerations — You need to understand whether the user is thinking about their own or other people’s information. Especially important for enterprise applications, but also to users personally, the experience’s handling of any information, especially personally identifiable information (PII), is an important consideration.
- Remediation — From the design side, minimize the display of information, or hide information when specific individuals can be extracted from aggregates. Design changes may help but will likely also require technical and functional changes.
Truthful — the experience portrays information transparently and without bias.
- Considerations — Use in combination with the Credible adjective from the studies noted at the beginning of the article to determine the user’s willingness to use the experience to make decisions. If acceptance of the content’s veracity matters, this measure matters. Often an issue with statistics and visualizations, this adjective indicates the user’s perception of the portrayal of data, whether textual, numeric, or graphical.
- Remediation — Mostly speaks to content (text) changes. For visual tools may require adjusting labelling or providing an explanation of how the data was gathered and processed.
Understandable — the experience is logical and consistent.
- Considerations — Like Functional, Understandable is used to see if the experience meets table-stakes goals — in this case, Usable. Specifically, it looks at clarity, consistency, and organization.
- Remediation — Check the Clear, Coherent, and Consistent scores, and find out whether the user considers the experience disorganized or incoherent. Redesign pages and flows to be logical and consistent.
Asking the user to evaluate all of these, and possibly more, would be too taxing. Carefully choose the adjectives that represent what is important for your product to achieve. To choose, your team’s first activity should be to create a list of adjectives specific to your product, decide whether they are goals or principles, and prioritize them.
There is a near-infinite set of adjectives that could be used to assess a product. These, and possibly more, can be the adjectives that lead your team to the best experience your users have ever encountered.
How many of these do you design to? How many do you test for? Which do you find would most change how your team designs experiences? What adjectives would you add?
Thanks to Andrew Avedian for proof-reading and suggestions.
If you are interested in more about applying design practices to help your company innovate check out Designing for Innovation.
What Makes a Good UX Questionnaire? User-centred Scorecard Development. Seunghyun “Tina” Lee, Melody Ivory. 2016.
Does it have legs? Abby Covert. 2011.