A common mistake made by novice researchers is to ask users what they want from a new product or service. Although this seems like the correct way to do user research, in most cases users don’t know, don’t care or can’t articulate what they need. It is the design team’s job to establish the underlying problem, identify the best solution and then validate that their solution works. Design ethnography is the first step on that journey.
Predicting what will work best for users requires a deep understanding of their needs. Research methods like focus groups and surveys have obvious face validity but they continually fail to provide the insights that design teams need in product discovery. The reason is that these techniques require users to predict their future behaviour, something that people are poor at doing.
An alternative method is to examine what people do, rather than what they say they do. This approach is based on a simple premise: the best predictor of future behaviour is past behaviour. What people do is a better indicator of the underlying user need than what people say.
To avoid simply asking users what they want, user researchers have appropriated methods from ethnography and applied them to user research. This technique is broadly known as ‘design ethnography’ but it differs in important ways from traditional ethnography.
What is ethnography?
Ethnography is the study of culture. Bronislaw Malinowski, who studied gift giving amongst natives in Papua, wrote:
‘The final goal is to grasp the native’s point of view, his relation to life, to realise his vision of his world’.
Replace the word ‘native’ with the word ‘user’ or extend the metaphor and think of your users as a ‘tribe’ and you can see why this approach could offer value in product and service design.
Some of the defining characteristics of ethnography are that:
- Research takes place in the participants’ context.
- Participant sample sizes are small.
- Researchers aim to understand the big picture: participants’ needs, language, concepts and beliefs.
- Artefacts are analysed to understand how people live their lives and what they value.
- Data is ‘thick’, comprising written notes, photographs, audio and video recordings.
To some degree or another, design ethnographers appropriate each of these characteristics in the work that they do.
In addition to Bronislaw Malinowski, other examples of ethnography include:
- Margaret Mead, who studied ‘coming of age’ rituals in Samoa.
- Sudhir Venkatesh, who embedded himself with Chicago drug gangs to understand drug culture.
- Matthew Hughey, who spent over a year attending the meetings of a white nationalist group and a white antiracist group.
So how does design ethnography differ from traditional ethnography?
It’s a struggle to use a traditional ethnographic approach in modern product development, mainly because of the timescales. That’s not to say it’s impossible: Jan Chipchase (who specialises in international field research) says he spends half the year travelling around exotic destinations. But most people who practise design ethnography in business would agree with these distinctions:
- The purpose of traditional ethnography is to understand culture. The purpose of design ethnography is to gain design insights.
- The timescale of traditional ethnography is months and years. The timescale of design ethnography is days and weeks.
- Traditional ethnographers live with participants and try to become part of the culture. Design ethnographers are visitors who observe and interview.
- With traditional ethnography, data are analysed in great detail over many months. With design ethnography, there is ‘just enough’ analysis to test the risky assumptions.
- The findings of traditional ethnography are shared in books and academic journals. The findings from design ethnography are restricted to a team or an organisation.
How should you approach design ethnography?
Instead of asking people what they want, with a design ethnography approach the user researcher tries to discover why people want those things. Through observation and interview, they answer questions like these:
- What goals are users trying to achieve?
- How do they currently do it?
- What parts do they love or hate?
- What difficulties do they experience along the way?
- What workarounds do they use?
You answer these questions by observing users and interviewing them.
I don’t know what Malinowski, who spent several years on a single research project in the Trobriand Islands, would think of the compromises made in design ethnography. My view is that, if we liken traditional ethnography to a prize heavyweight boxer, then design ethnography is more akin to a street fighter. It doesn’t follow all of the rules but it gets the job done. That’s usually acceptable for most design projects but be aware that too much compromise can jeopardise the quality of your results. Let’s look at some of the ways I’ve seen that happen.
Avoiding some common mistakes
When I work with companies and I suggest a design ethnography exercise, I often hear, “But we already do that.”
It’s true that most companies carry out some up-front, customer-focused field research activities (distinct from their traditional market research). They often dub this “insights research”, carried out by their Insights Team or their Innovation Team.
But these activities frequently amount to nothing more than going to a customer site to carry out the same interviews or surveys the team would normally do out of context, with little to no observation of behaviour taking place. I’ve even seen it done with ‘concept testing’, where researchers write a descriptive paragraph about their idea and ask respondents to read it and say what they think of it — which has to be the worst kind of customer research imaginable.
The consequence of this is that development teams often set out to create the wrong product or system. The team continues blindly on until the UX team gets involved and usability tests it. Now the development team gets to see real users at work, at which point they begin to suspect they have built the wrong concept. But by then the team is too far along in development, and too wedded to its idea, to pivot.
The mistakes I see most often are:
- Doing research in the field — but doing the wrong kind of research.
- Not knowing what is and what is not data (because there is no focus) so user opinions and comments are prioritised over user behaviour.
- Not sending experienced field researchers — instead sending people familiar with interviewing only.
- Doing it after the company has already decided what the design solution is going to be — therefore looking only for confirmatory evidence and missing other opportunities.
Thanks to Philip Hodgson for contributing to this article.
Originally published at www.userfocus.co.uk.