Evidence based design for digital applications is an approach we use to create effective user interfaces. It is an empirical process that integrates the scientific mindset into the design of user interfaces.
It is a comprehensive decision-making process that is different from how most design departments work.
What’s the point?
When stakes are high, decisions have consequences and making the right design decisions matters. For example, we work for the automotive industry, where interface design decisions affect the likelihood of accidents.
Measuring the consequences of a product design after its launch is smart, but it is not enough: designers have to be active in preventing negative consequences even before the product hits the market.
Looking out the window during the design process is a small step in that direction, and some designers do that by performing some user research.
Prophecy vs prediction in design
Design is a process of making decisions about the user experience and the visual details of an interface, based on what we believe will happen once users interact with it.
Evidence based design takes this process from prophecy to prediction:
Prophecies come about through “divine” revelation, esoteric inspiration and elusive interpretation. Prophecies are highly personal and they only come close to reality through sheer luck or when they are so ambiguous that they can be reinterpreted after the fact.
Predictions on the other hand, come about through a transparent process of systematic information gathering and a critical evaluation of replicable data. They are expressed in terms of probabilities, which are open to be refined and changed as the state of the data evolves.
Evidence based design is the approach that does away with design prophets by introducing a robust system where the quality of design decisions evolves gradually according to principles and rules that are continuously improved.
Evidence based design changes how we think
Evidence based practice first appeared in medicine and has since spread to a handful of other domains. Still, its progress is slow because it requires a paradigm shift: practitioners must learn to work comfortably with cognitive complexity.
Fundamentally, evidence based practice acknowledges the limitations and biases of human thinking and decision making.
As cognitive scientists we are familiar with the quirks of the human mind, and we know that no single thought or intuition-based pattern of thinking can be trusted on its own.
Instead, a more robust way to assess thoughts is required to make sound design decisions. In evidence based design for digital applications, a robust process works along these lines:
- Uncertainties are translated into answerable questions by a critical process
- The best available information is retrieved in a systematic manner
- The evidence is assessed for internal and external validity
- The implications of the evidence are assessed based on predictions of likely outcomes
- Predictions are tested and adjusted based on an evaluation of performance
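As a rough illustration, the steps above can be sketched as a small pipeline. Every function name, canned finding, and confidence value below is invented for illustration; this is a sketch of the loop, not a real toolkit:

```python
# A minimal, hypothetical sketch of the five-step loop as a data pipeline.
# Every name, canned finding, and confidence value is invented for illustration.

def frame_question(uncertainty):
    # Step 1: translate an uncertainty into an answerable question.
    return f"Does {uncertainty} affect task completion and error rates?"

def retrieve_evidence(question):
    # Step 2: systematic retrieval (stubbed here with canned findings).
    return [
        {"finding": "larger touch targets reduce errors", "valid": True},
        {"finding": "a colleague prefers smaller buttons", "valid": False},
    ]

def assess_validity(sources):
    # Step 3: keep only evidence that passes validity checks.
    return [s for s in sources if s["valid"]]

def predict(evidence):
    # Step 4: express the implications as a testable prediction.
    return {"prediction": "fewer errors with larger targets",
            "confidence": 0.7 if evidence else 0.1}

def run_cycle(uncertainty):
    # Step 5 (testing and adjusting) would happen with real users,
    # feeding measured performance back into retrieval and prediction.
    question = frame_question(uncertainty)
    evidence = assess_validity(retrieve_evidence(question))
    return question, evidence, predict(evidence)
```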
So what counts as evidence?
Even before the fake news era (remember those times?), people found it challenging to distinguish between genuine information and delusion.
Hearsay, opinion, conjecture, extrapolation, belief, intuition, rhetoric (how something is presented), bias, guesswork, customs, trends, fallacy, advocacy, advice, anecdotes, common practice and convention are all the opposite of evidence.
They are not grounded in a systematic process of uncovering reality; they are haphazard products of the human mind. When decisions are based on any of these, the results are random. Even when luck strikes and such flimsy decisions lead somewhere, it is very difficult to keep improving the results.
The sources of evidence in the Evidence Based Design approach for digital applications are:
Scientific research
Fields such as human-computer interaction, cognitive science, and sometimes fields directly related to the subject matter of the app have built up extensive research results. This is a convenient way to find information that is tested and validated. More importantly, it offers empirical evidence.
Unfortunately, sometimes it is difficult to find and understand relevant articles due to the complexity of scientific approaches.
User research & User testing
This is what commonly falls under the label of user research. It provides relevant feedback from people and helps accumulate rich information about their natural behaviour in relevant contexts.
However, it is only useful in as much as the research is performed using accurate and meticulous methodology. Unfortunately, cargo cult user research is widespread, because mastering research methodology is something even academics struggle with.
Design guidelines and technical specifications
A significant amount of information about design has accumulated, and tools and guidelines exist to simplify the work. However, this type of information may not fit a specific use case, and it is sometimes embraced as a trend without any real scrutiny from the design community.
This is not a courtroom
Most designers are used to finding arguments to favour their decisions and to use them to fight product owners. Sometimes this is critical to prevent the UX design process from being hijacked by a completely naive approach.
However, evidence based design is not about taking sides. The point is to weigh the pros and cons of the implications before the decision is even taken, not to find arguments that support one side.
These pros and cons follow from the evidence, and to get there we must assess the evidence critically: both when it is contradictory and when it seems to point in the same direction.
These factors affect the quality of evidence in digital evidence based design:
- Risk of bias: evaluating evidence on the basis of the chance that bias affected the information gathering process and estimating the effect.
- Imprecision: an evaluation based on the chance that the observed subject matter is prone to change.
- Indirectness: considering the differences between the conditions under which the information was gathered and the conditions under which the results will be used.
- Inconsistency: the variability of results across the included information sources.
- Effect size: considering if the effect observed is sufficiently large to make it unlikely that it would change completely.
- Confounding: assessing whether other factors may be responsible for the observed effect.
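One lightweight way to operationalise these factors is a simple checklist, with one flag per concern. The class below and the idea of counting concerns are my own illustration, not part of any established grading scheme:

```python
from dataclasses import dataclass, fields

# Hypothetical checklist: one boolean flag per quality concern from the
# list above. True means the concern applies and lowers evidence quality.

@dataclass
class QualityConcerns:
    risk_of_bias: bool = False
    imprecision: bool = False
    indirectness: bool = False
    inconsistency: bool = False
    small_effect_size: bool = False
    confounding: bool = False

    def count(self) -> int:
        # Number of concerns that apply to a piece of evidence.
        return sum(getattr(self, f.name) for f in fields(self))
```

Keeping the checklist next to each piece of evidence makes the evaluation explicit instead of leaving it to gut feeling.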
Grading information quality in the design process
To assess the evidence that has been gathered one must understand the processes of information gathering and the general logic of the scientific mind. Otherwise one cannot evaluate information correctly.
By critically evaluating the evidence, it can be classified into three broad categories:
High Quality Evidence
Several different reliable information sources point towards a congruent picture. The interpretation is that “there is a very low probability of further research completely changing the presented conclusions.”
Moderate Quality Evidence
The information sources that point toward the conclusion are less reliable or there is only one reliable source of information that supports the conclusion. This means that designers are confident about what the evidence suggests, but the truth could also be substantially different. In this situation, further research may change the conclusions.
Low Quality Evidence
The evidence supporting the conclusion is weak and further research is likely to completely change the conclusions.
Despite there being a gut feeling or even unanimous agreement, there is no solid evidence to support those beliefs. Research would probably change the conclusions. There is no shame in admitting there is no evidence for a particular point.
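A toy heuristic can make the three categories concrete. The thresholds below are invented for illustration and simply follow the descriptions above (several congruent reliable sources, a single reliable source, or neither):

```python
# Hypothetical grading heuristic; the thresholds are illustrative only.

def grade_evidence(reliable_sources: int, congruent: bool) -> str:
    """Map a rough evidence summary to one of the three broad categories."""
    if reliable_sources >= 2 and congruent:
        # Several reliable sources point towards a congruent picture.
        return "high"
    if reliable_sources == 1:
        # Only one reliable source supports the conclusion.
        return "moderate"
    # Weak or no reliable support.
    return "low"
```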
Weighing evidence in the design process
The ideal situation where designers have perfect evidence does not exist. In reality, designers have to weigh design options and to decide one way or another. Still, they benefit greatly from deciding based on the best available evidence they can summon and by critically understanding it. It is a conscientious, explicit and judicious process that is carried out in a systematic fashion.
Common principles used in this process include:
- The more important the outcome of an interaction, the higher the required standard for evidence.
- The more frequent an interaction, the more high quality evidence should be available to back up design decisions.
- Favouring a design decision that has no evidence over one that has moderate or high quality evidence is unlikely to be sound.
- When design decisions have to be made based on no evidence, they should be documented and ways to measure their impact should be considered.
- Contradictory evidence can be a sign of faulty data gathering or that the reality is more complex than assumed at first.
- Quantitative evidence may paint a more accurate picture of reality, but qualitative evidence helps put it in context.
- Oftentimes, a clash between different streams of qualitative evidence can only be resolved through a quantitative research approach.
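The principle of documenting decisions made without evidence can be sketched as a minimal decision log. The field names and the `log_decision` helper are hypothetical, not an established format:

```python
import json
from datetime import date

# Hypothetical decision log entry: record the decision, the grade of the
# evidence behind it, and how its impact will be measured after release.

def log_decision(decision, grade, metric, log=None):
    """Append a design decision to a log. `grade` is the evidence quality
    ('high', 'moderate', 'low', or 'none'); `metric` states how the
    decision's impact will be measured."""
    entry = {
        "date": date.today().isoformat(),
        "decision": decision,
        "evidence_grade": grade,
        "impact_metric": metric,
    }
    if log is not None:
        log.append(entry)
    return json.dumps(entry)
```

Even a log this simple makes low-evidence decisions visible and gives the team a way to revisit them once real usage data arrives.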
The evidence based design organization
Evidence based design needs a certain climate to operate in. Organizations that are evidence based employ a systematic process to accumulate data. Information is collected and documented, including how it has been collected.
Ultimately this creates a culture of people who understand human reasoning and its limitations.