The Science of Design
Have you ever spent all day looking for something, only to find it in the wrong place? Or perhaps you’ve spent months trying to solve a problem, only to find you were looking at it the wrong way to begin with? The way we think about creating value is largely a guessing game, but it shouldn’t be.
Blame it on evolution
People like to think fast, not slow.¹ We enjoy entertaining our intuitions and are acutely unaware of what we don’t know we don’t know. And don’t forget we also suffer from overconfidence and self-serving biases. Did you know 93% of Americans think they’re better drivers than the average person?² Or how about the fundamental attribution error, our pervasive tendency to blame others’ failures on their character while excusing our own shortcomings as the product of unforeseen and uncontrollable circumstances? Oh, and here’s another one: the false consensus effect, in which people assume their personal qualities, beliefs, and actions are relatively widespread throughout the general population.
If we have data, let’s go with the data. If all we have are opinions, let’s go with mine.
The point is, we’re all fighting faulty shortcuts in cognition, and that’s what’s underpinning the problem.
Everything we’re doing is actually an experiment, yet most of us don’t think of it that way. When tasked with creating something, humans innately prefer to follow intuition.¹ This isn’t a problem for art, but for business and design, it can be disastrous.³
How we think about the task at hand
If you wanted to become an expert in something, one of the best things you could do would be to form a strong “mental representation” of the task at hand.² A mental representation is the internal picture you construct of what you’re actually doing.
If you were to ask a software developer in the 2000s what they were doing, they might have said “shipping code.” Ask them today and they’re “building great user experiences.” Their mental representation has changed. Ask a software developer in the future what they’re doing and hopefully they’ll say:
“I’m running an experiment to see which of our hypotheses is strongest.”
If we want to create things that people love, sustain profitability, and avoid being replaced, we had better start thinking of our work as structured hypotheses and experiments.
The science of design
Design thinking has been heralded as perhaps the best way to create new value in the economy today, and it’s truly valuable. However, design is merely a means to an end.
We design things to achieve a goal (more customers, happier users). How we achieve those goals is a set of hypotheses, yet we often don’t think of them this way and certainly don’t approach them with the rigor of structured experimentation:
Figure 1. How we should think about our work.
Ask a designer what they’re doing and they might say “re-designing the landing page.” Hopefully in the future they’ll say:
“I’m designing a handful of experiments to see which experience drives the biggest increase in conversion.”
Imagine thinking about your work this way. It requires recognizing that we don’t know all the answers. It requires recognizing that we’re constantly battling the false-consensus effect: imagining that more people see the world as we do. It requires humility.
I work with hundreds of “software designers.” They’re amazing people.
I wish we thought of ourselves as “user experience scientists.”
This is because our goal is not to do design for design’s sake, it’s to discover what is most valuable to our customers. To do this, we must design experiments. Every product we launch should be thought of as an experiment that has been designed to test a hypothesis. This does not devalue design, it just places it in service of something larger: the scientific method.
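What “designing an experiment” can look like in practice is sketched below: a minimal two-proportion z-test comparing conversion rates between two landing-page variants. The function name and the data are hypothetical illustrations, not a prescribed method; a real team would also plan sample sizes and significance thresholds before launch.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of two variants (hypothetical helper).

    Returns the z statistic and two-sided p-value for the null
    hypothesis that both variants convert at the same rate.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: variant B converted 58/1000 visitors vs. A's 40/1000
z, p = two_proportion_z_test(40, 1000, 58, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that in this made-up example the lift looks large (5.8% vs. 4.0%) yet the p-value hovers above the conventional 0.05 cutoff, which is exactly why intuition alone isn’t enough: the experiment, not the opinion, decides.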
Examples in industry
It’s no secret that the fastest-growing companies have this embedded in their workflow and culture.³ ⁴
Read about how Netflix creates incrementally better user experiences every day and you quickly realize it’s not because their designers are more tasteful at creating interfaces or because they hold more rigorous critique sessions in the office. They simply have a culture of experimentation at scale.
Google runs more than 100,000 experiments each year to test a variety of data-driven improvements to its services.³ Microsoft has a whole team of ~90 people supporting their internal experimentation infrastructure.⁸
Where to go from here
I’ll be honest: I’m sad about this. I much preferred thinking of design as a creative process where taste and intuition led to great results. Where deep human understanding and empathy enabled teams to win. This feels like a colder way to look at professional work.
The good news is: this is still about creating better user experiences, it’s just a new way of listening to our users. It also democratizes ideas. Rather than going with the highest-paid person’s opinion, we can run experiments that test those hypotheses.
If design is our way of communicating to users, data is our way of listening.
Finally, this can be seen as a positive because it also helps reinforce another critical habit for product teams: measuring outcomes over output. Historically, teams would be compensated for merely shipping product, whether or not it was actually driving results for the users or the business.
Today, product and design teams are being tasked with articulating the business value of our work. When we think of our work as experiments to test hypotheses to achieve a goal (increase in revenue, conversion, etc), our work becomes more focused and our value to any organization more clear.
- Thinking, Fast and Slow. Daniel Kahneman, 2013.
- Illusory Superiority. Driving example.
- Experimentation Works: The Surprising Power of Business Experiments. Stefan Thomke, 2020.
- Competing in the Age of AI. Marco Iansiti and Karim Lakhani, HBR, 2020.
- Peak: Secrets from the New Science of Expertise. Anders Ericsson, 2016.
- Outcomes over Output: Why Customer Behavior Is the Key Metric for Business Success. Joshua Seiden, 2019.
- Designing with Data: Improving the User Experience with A/B Testing. Rochelle King, Elizabeth Churchill, Caitlin Tan. O’Reilly, 2017.