Balancing Research and Action

Tom Kerwin
Oct 12, 2016 · 5 min read

Or: “Use Tools, Don’t Be One” – a discussion sparked by a Google Ventures Design Sprint

We just had a good conversation. On the Internet.

Yes, we know! We can’t believe it either. We’ve captured it here for posterity.

It happened in Designer Hangout when Tom shared that he’d just begun facilitating his first complete 5-day Google Ventures Design Sprint. (If you don’t know what one of those is, then 1) where have you been for the past year? and 2) here’s a link:

Dan (dan turner): Hey Tom, what’s the role of Getting Out Of the Building (or somehow observing any users) in the Google Design Sprint methodology?

Tom (Tom Kerwin): Day 5 is when you test a prototype with users.

Dan: So only after you’ve decided what the users’ problem is and how to address it? So no observation of “who are the users, how do they think?”

Tom: It’s not built in — there’s not really time for that.

But the process does include getting that knowledge from the people who know the users best, like customer services, UX researchers, etc.

And you can always bring that knowledge into the sprint. For the sprint problem we’re tackling this week, we’ve already interviewed more than 40 potential customers and watched them go about using our website in several different settings.

Dan: Cool. I’m just trying to get a sense of how it works. So that means your user research is really something you brought into the sprint process, not baked in?

Tom: That’s right. Because the design sprint is really focused on one thing: build and test a bold prototype in 5 days.

Dan: Hm. Yeah, so another process/tool that requires a lot of caveats and contextualizing, rather than just letting people think “this is all you need to do”.

It was kind of presented in Design Disruptors as “look at this awesome all-you-need-to-do to make products!”

I worry “bold” frames it as “screw data and user expectations” in many minds. (Not a criticism of you!)

Tom: Yes, that’s a valid worry. If you didn’t know anything about your users on the way in, you could easily pick a daft target and end up making a pointless prototype. I’ve certainly seen design sprints come up with solutions for problems nobody’s having.

Similar problem to all of Lean Startup, if you ask me.

Dan: YES.

I worry that so many of these things get seen as “well, Google does it, so we just need to do this”.

I’ve talked w/ Laura about that — her book is great, but I’ve seen startups just do “let’s make a landing page — we got signups, let’s move on” and “let’s do concierge, done”. WITHOUT LEARNING.

Like it’s all just checkboxes and that’s enough.

So that, plus no observation of real users, real problems, real experiences, worries me about this. Is that just me worrying?

Tom: No, that’s valid. But there’s a flip-side danger, too: too much learning.

Dan: Well, if you’re experienced, you refine insights, not chase “user X wants feature Y” and so on. Come on, we’re not Microsoft Office team.

Tom: Thank fuck. But it is still possible to read too much into patterns we see. Narrative bias is strong.

What’s good about the Design Sprint is that rather than spending a few months on a Lean Startup MVP, you’re time-boxing and focusing so you can test your risky assumptions in just one week.

I think we need Erika Hall’s approach of Just Enough Research, and then fast, meaningful action.

Dan: That might be a more useful and powerful combo, but it’s kind of being sold as “just do this!”

Yet Google has not been known for observing users — especially users who do not work at Google — OH HI BUZZ

Tom: Truth. There’s a problem when any methodology is sold as “just do this”

It’s like “one weird trick” articles but for business. As with anything, nobody has all the pieces of the puzzle!

That’s one thing I really like about day one of the sprint. It draws on multiple experts at all times. We learnt a ridiculous amount today. And that’s on a system I’ve been learning about for 8 months!

Dan: Yeah. I guess we all (and esp. Google) could/should do better at explaining that so many of these things are finer-grained tools, and explain what they can and can’t do. Just as with data!

Tom: Truth!

Tool literacy (That sounds rude)

Dan: “Use tools, don’t be one.” < — ship it

Dan: All this is part of why I push back against the “tell me your process (singular)” question in job interviews. That reifies each tool, frames your work as Do X, Then Do Y, Then Done.

Tom: YES!

Dan: Got a lot of pushback on that thought in the jobs thread. Think it has to do with how people think of the word.

Tom: This is great

I had someone I’m mentoring ask about the key tools of UX and I had to really think. It might just be: thinking, writing, and drawing on Post-its with Sharpies.

Dan: Tools are fungible. And processes need to fit the needs of the context/situation/constraints. “Then I do contextual inquiry” is not always the right answer.

Wait. I think we came to some sort of… agreement? Not sure how to handle that.

Tom: No — is this still the Internet?

Nothing seems to be on fire



Tom Kerwin is apparently something called a “swing DJ”. He also helps companies and projects improve their user experience through UX-led CRO and writes about it. His professional site is

Dan Turner likes bikes more than dancing. He writes and teaches about UX, product design, behavior, and some of those aren’t even rants. Currently working on designing products and systems to help timebanks discover untapped sharing resources in local communities. More on him at

Photo via Visual hunt.

Designer Hangout

Insightful articles by UX designers and researchers of the UX Slack Community.

Thanks to dan turner.

