User Research in practice
I was recently researching how to incorporate User Research best practices into Fueled’s process. I noticed that there was a lot of theory available, but very few practical examples.
In this post, I’ll go into detail on each step of the process and share what I’ve learned through an example, covering problem and assumption definition, running user interviews, analyzing results, and drawing conclusions.

Problem and Jobs definition
In order to know what you will be researching, you need to do your homework first. That is, defining for your product/feature:
- The problem(s) your product is solving or will be solving in the future.
- The job(s) your users will hire your product for.
Let’s take a recent Product Design exercise at Fueled as an example.
1. The problem(s):
Short-term problem to solve: there is no simple and easy way for photographers to manage their photoshoots on mobile; coordinating a shoot with multiple talents is a painful process.
Long-term problem to solve: current marketplaces make it hard for photographers and talents (models, designers, etc.) to discover, connect, and collaborate with trusted industry professionals.
Here, our approach to solving this long-term problem is to first build an efficient tool for photographers, then get users to enjoy the product, and eventually spin it into a professional, niche network once we reach critical mass.
2. The job(s):
We like to formulate jobs as “Job Stories.” Here are some examples:
When I have to organize a photoshoot, I want to coordinate centrally with stakeholders, so that I don’t waste time texting and emailing everyone about reminders and updates.
When I am done with a photoshoot, I want to pay and sign contracts with my talents electronically, so that we trust each other, and both parties’ tax reporting is facilitated.
Etc.
Alright, we have a rough idea of what the problem and main job stories are. Now comes the fun part that will guide your research: listing the hypotheses you made in formulating (1) the problems and (2) the jobs.
Hypothesis definition
You don’t know what problems your users are facing. Without first-hand user research, you can only assume what they are.
To cover most assumptions your team is making, the best way is to think in detail about the activities people currently go through to solve their problem. As you list these activities out (as a User Journey Map, for example), you’ll naturally come across some assumptions.
Let’s go back to our photography example, and list out the different activities our photographers go through before, during, and after a photoshoot.

There are a few pages of these, so I won’t list them all, but here are a couple of assumptions I came up with:
- Photographers are highly frustrated with existing talent networks.
- Photographers currently coordinate their shoots via a combination of multiple text messages and emails.
- Talents are frustrated with the financial aspects of their contracts, which the product could take care of.
- Talents decide to trust a photographer based on their portfolio and social profiles.
These are the kinds of assumptions we will want to test directly with potential users. In this scenario, I chose to run one-on-one user interviews.
User Interviews
Talking to potential users
Michael Margolis does an outstanding job at describing his user research process in this workshop video. In practice, here is how I applied his concepts to prepare and run our interviews.
A) Preparing for interviews

- Planning everything you want to learn or test during these interviews: what assumptions you have, the clarifications you need, or what products you’d like participants to try and how (usability test scenarios).
- Writing all interview questions into a full script, outlining what you will say and do. You will need it to keep on track during the interviews themselves. This is also the point to consider legal aspects (NDAs) and incentives (rewards).
- Broadcasting to participants. Define who you want to talk to, and more importantly, who you don’t want to talk to. (There’s nothing worse than wasting your time running an interview with someone you know will not provide you value.) Turn this into a form to screen participants. In this screener, do ask participants for their contact details and availability. Publish the form where your target audience is most likely to see it.
In practice, I used Typeform to create the screening form, and published it under a “job” on Craigslist NYC, promising a $70 Amazon gift card to selected respondents.

After just 24 hours, more than 80 people responded.
Their answers automatically filled in a Google Sheet, and I used the filtering feature to hand-pick the 12 people who fit my criteria.
For each of the participants (based on their availability) I generated a unique calendar invite link with Sunrise Meet. Using the awesome “YetAnotherMailMerge” Google Sheet add-on, I emailed all of them at once with a custom greeting, introduction, and link to schedule their interview.
80% responded within a day; the rest never did.
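The hand-picking step above (filtering a sheet of screener responses down to the people who fit the criteria) can be sketched in code. This is just an illustration, assuming a hypothetical CSV export with made-up column names and criteria; the actual screener questions and thresholds were different.

```python
import csv
import io

# Hypothetical screener export. Column names and values are invented
# for illustration; a real export would come from Google Sheets.
raw = """name,email,role,shoots_per_month,city
Alice,alice@example.com,photographer,6,New York
Bob,bob@example.com,model,2,New York
Carol,carol@example.com,photographer,1,Boston
Dave,dave@example.com,photographer,4,New York
"""

def screen(rows, needed=12):
    """Keep respondents matching the target profile, capped at `needed`."""
    picked = []
    for row in rows:
        # Example criteria: active photographers based in New York.
        if (row["role"] == "photographer"
                and int(row["shoots_per_month"]) >= 3
                and row["city"] == "New York"):
            picked.append(row)
        if len(picked) == needed:
            break
    return picked

selected = screen(csv.DictReader(io.StringIO(raw)))
print([r["name"] for r in selected])  # → ['Alice', 'Dave']
```

In a spreadsheet this is just the filter feature, but encoding the criteria explicitly like this forces you to write down exactly who qualifies, which is the point of the screener.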
Here’s what I’ve learned since then:
- Always confirm ~30% more interviews than you need: some people will confirm but not show up. Humans are flaky.
- Do ask for people’s phone numbers: email is not enough if you need to contact participants last minute or if they don’t show up. You’ll need instant answers.
- Don’t use Typeform to build screeners. Their free tier doesn’t include conditional logic, so you can’t show specific questions based on a respondent’s previous answers. Use Google Forms instead; it was recently redesigned and is much better than it used to be.
- Check out interview scheduling tools such as UserInterviews.co: they can make this process much, much faster.
- Follow up with the respondents you did not select: their input can still be valuable at a later stage. Suggest that they join your beta users when the product is released.
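The ~30% overbooking rule of thumb from the first bullet is easy to turn into a quick calculation. A minimal sketch, assuming the post’s ~30% flake rate (a rule of thumb, not a measured figure):

```python
import math

def confirmations_needed(target_interviews, flake_rate=0.3):
    """Overbook by the expected no-show rate, rounding up
    so you never end up short of your target."""
    return math.ceil(target_interviews * (1 + flake_rate))

# For the 12 participants selected above:
print(confirmations_needed(12))  # → 16
```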
B) Running interviews

- Introduction: greet participants, offer them a drink, and make them feel at ease. Establish trust, and set the context and rules for what’s next. Have them sign your NDA, and ask for permission before recording.
- Discovery: I structure interviews by first asking participants who they are and how they work. Then I go over questions related to my assumptions, focusing on their usual journey and figuring out what problems they experience with the tools they use.
- Usability: while they’re in the office, I like to have interviewees follow a product test scenario with a prototype of our product or with a competing product (preferably on their own devices, which they are much more familiar with). The goal is either to find the flaws in a competitor’s product, or to get some reactions to our own.
Learnings include:
- This is key: have everyone on your product team listen in (muted, obviously) via a Google Hangout, and have them take notes (here is a structured way to do so). This will save you a ton of time, as you will not have to transcribe and analyze interviews.
- After each interview, plan 15 minutes for the team to regroup and go over key takeaways. There is no reporting to do: once all interviews are over, your team already has a list of learnings and insights to bake into the product.
Moving forward
All of the above can easily be achieved in a week. This is a repeatable process you can apply to every new product or feature you are planning to build or improve.
You will only know whether you are on the right track once your MVP ships, but at the end of this process, you’ll at least have a better idea of which of your assumptions are correct, and which ones aren’t.
I’m curious: what have you learned from involving users in your product design process? Please share in the comments.