Two heads are better than one.
Why co-analysing user research data is a golden trick for agile teams.
Analysing research means looking at how everything fits together. What is actually going on here? Are we sure about this? Do we need to dig deeper? What can we do about it?
It’s important to do some kind of digest as you go along so that you are getting the most out of each session. This means sometimes adjusting the next session a little if you’re not learning anything new.
The juicy part, however, is at the end. That's when you get to really pull together research findings and recommendations in a way that allows the team to take action and stay focused on real people and real user needs. This is where co-analysis can do wonders.
Co-analysis is all about perspectives. Qualitative data is prone to bias, no matter how seasoned a researcher you are. Researchers are only human, and our minds play the same tricks on us as yours do on you. And that's ok: we have our ways of being 'bias aware' and some tips up our sleeves to keep this to a minimum. Co-research and co-analysis are one way to deal with it.
Co-analysis is gold for a user-centred mindset. Researchers should not be the gatekeepers of data and the experiences of real people. We should step away from the computer and involve others. This is where co-research and co-analysis are so valuable: they will help you get richer, more reliable insight quickly and infect others with a user-centred mindset in an open and collaborative way.
Here are some tips for running successful co-analysis sessions:
- Bring others along to observe research sessions and take notes. Ideally bring those who will take part in co-analysis and own actions from research. This will help them empathise with real people behind the data and speed things up during the session.
- Make sure your observer knows a bit about bias and how to take notes for agile analysis. I shared some tips on good note-taking for agile teams on dxw’s blog.
- Prepare materials in advance. This means thinking about how your notes will be captured and laid out from the start. Post-its are best, and hand-written index cards are great too. Printed and chopped-up interview notes or transcripts also work well.
- Don’t use a whiteboard if you have lots of data and you’re not using post-its. Use the floor, a huge table or a row of desks instead. This tip may seem silly, but I’d like to see you move around 600 gazillion bits of paper along with the magnet or bit of blu-tack that’s keeping each one up there! I learned this the hard way, of course!
- Think through how you’re mapping and how you’ll explain this to others. Do you already have themes, stages in the service, assumptions or user needs to validate? Be prepared to explain this to people and give some examples.
- Beware of confirmation bias, bandwagon bias and superiority bias. Think about the balance of the loudest voices in the room and team dynamics. You need to make sure some people won’t hijack the perspectives of others or nit-pick data to reinforce their own opinions and assumptions. Act as a facilitator: stay in the shadows and keep these things in balance.
- Point out that we’re not here to reach a consensus on everything, but to review and discuss. If people get stuck on something specific, move on and fall back on complementary data sources such as audio recordings after the session.
- Keep it small. The biggest group I’ve done this with so far is four. I think it’d be ok with up to six people, as you can get people to work in pairs. Any more than that will be hard: you’ll need everyone to be properly involved, and you’ll probably need additional facilitators. It will also be harder to do extra activities, like writing user needs together, in larger groups.
- Include a short ‘outputs’ activity at the end. This could be writing user needs or job stories together. This will have the most value for the team, as you can actually design or build a thing directly from these outputs.
- Tell people what happens next and how much they’ve helped you. Make sure to invite them to show and tells, and thank them there. This will strengthen the fuzzy feeling of ‘co’ and one team. It will help others share the ownership and the responsibility to do the right thing, and not forget the real people behind the findings.
Ten is a nice number, methinks, so I’ll stop now. Feel free to take, rework and discard as you please.
I would love to hear how others have done this!
Aaaaand… just to balance this text-y post, here’s another picture of co-analysis I facilitated recently (without bloody magnets):
Some thanks are also in order! I have to thank my researcher colleague Jess who often reminds me that we’re only human when I get all itchy-scratchy about bias. Also Stephanie Wilson, who did an excellent lightning talk about being ‘bias aware’ at User Research London 2016.
Need an interesting read about bias? If you’re quite new to the topic or simply want a witty read, Steve Cantwell’s book is apparently excellent (I’ve just started it, stay tuned!)