Asking the Right Questions in User Interviews

ITHAKA Tech Staff
Published in ITHAKA Tech · 10 min read · Mar 7, 2023

With Diba Kaya

Diba Kaya is a senior researcher on the Insights team working on Search and Discovery (e.g. the search engine and recommendation models) at ITHAKA, an ed-tech organization known for JSTOR, a digital library of primary and secondary sources used across universities and museums. ITHAKA aims to improve access to knowledge and education for people worldwide.

Learning from users and asking the right research questions are essential to building the right products for Search and Discovery and to scoping product development. Diba runs us through the process she uses to ensure that ITHAKA gets meaningful qualitative data for product strategy.

The research question and product development phase

Product teams are often interested in user interviews because they provide quick, tangible feedback on a product, and user quotes and reactions are a compelling way to understand the user experience. However, interviews are just one method for answering the research questions driving a project. Research methods are simply tools in a researcher’s toolkit, and which tool or method you choose depends on the research question guiding the whole project.

“Getting to the primary research question driving the project is the most important thing to get clear on. We can often over-index on running interviews because it’s a method most people have heard of when it comes to UXR,” Diba says.

The first step in choosing a method like an interview is to understand where you are in the product development lifecycle and to immerse yourself in the domain (e.g. product business goals) and the functional knowledge (e.g. design patterns, concepts) the team has about the space. You’re not trying to be an expert, but you are trying to understand the framework and context. Diba says, “Research questions are different early in the product development cycle, like in the product definition stage versus the product delivery stage, so it’s important to orient yourself before developing anything.”

A snapshot of the product development lifecycle:

Once you’ve worked with your product development team (e.g. your PM, engineers, and designer) to collectively understand where in the product lifecycle you are and where you want to be, you can work together to develop a set of questions and an assumptions list to get a sense of what answers are needed to guide next steps.

“These are crucial conversations, and getting as much practice at this stage as possible is my best piece of advice. In addition, having these conversations with openness and humility is essential; having intellectual humility at this stage is the bedrock for developing the proper focus.”

In these conversations, the research questions should start to become clear: what are the repeating patterns in the conversations? Where does the conversation circle back to? What needs clarity? By finding these patterns and creating an assumptions list alongside the emerging research questions, you can set yourself up for success. Maya Elise Joseph-Goteiner wrote an excellent article about this stage and the importance of assumptions in the discovery/product definition phase; check it out here. A great reference for workshops to run with teams during this phase is 101 Design Methods: A Structured Approach to Driving Innovation in Your Organization.

Diba says, “The crucial conversations at ITHAKA always hold the organization’s values at heart: belonging, trust, teamwork, evidence, and speed. You can see this in the Slack interactions with emojis sprinkled everywhere, reflecting engagement and excitement about insights; we place a lot of trust in one another and in the value of research. A culture like this makes it easier to have intellectual humility together and get on the same page quickly about what we know, don’t know, and need to understand.”

Once you’ve established the product landscape (where you are, what you want to learn, and the research questions and assumptions that may block you), you can pick your method. Choosing a method depends on a few factors, including where the product is in the lifecycle, the research question, what kind of data would best answer your questions, access to participants, and the time constraints on decision making (i.e. do you need to make a quick product decision or do you have a runway?). The Nielsen Norman Group has a lot of great resources on methods and UXR in general; check them out here.

User Interviews: Developing the interview questions

As we mentioned previously, interviews are a popular method for product teams. They provide bite-sized quotes that are easily circulated to show impact and affect, and they work well for more complex research questions that require probing. The hallmark of an interview is the ability to observe nonverbal responses, ask open-ended questions, and probe into answers to get rich qualitative information. As such, the research questions appropriate for interviews typically center on understanding how someone thinks and what about their experience brought them to their conclusions; interviews are ideal for understanding mental models, which can be used to develop knowledge of the user and their landscape and to help narrow product and design decisions (e.g. pairing an interview with a concept test).

“For the interview method, you want to create a script, almost like a movie, where the research questions are the ‘themes’ of the movie or the main plot. The different sections in the interview can be thought of as ‘scenes.’ Each scene needs to touch on or deepen a theme in the movie.” As in a movie, much thought goes into transitions and segues, and this must be reflected in the interview script. When you’re done writing the script, practice it! Look out for awkward transitions or ambiguous language and edit until it feels easy to walk through.

There are typically five steps to an interview:

  1. Setting the stage: introductions, how long the session will be, the topic you’ll be discussing together, etc.
  2. Broad questions/warm-up: questions about their job, location, etc.
  3. Topic-specific questions: questions covering the different areas of the research question.
  4. Conclusion questions: summarizing questions about the topic you’re interested in (e.g., “Overall, how would you describe your experience with x?”).
  5. Wrap-up: thanking the user, providing information about remuneration, and answering any questions they might have.

The five steps above sound simple and straightforward, but conducting a good interview is a skill that takes practice. A great article by Mia Northrop on developing your interview skills says, “rather than relying on participants being good talkers…, realize that everybody has something to say, but you have to know what buttons to push to find out what it is they care about and what they know that can inform your design.” She goes on to say that when an interview goes poorly, “we blame the person we interviewed, e.g. the participant was unengaged, inarticulate, etc., but the constant in all of these is: you, the interviewer.” Essentially, conducting a good interview takes skill and experience, and some people will be better at it than others, but it can be learned with some pointers.

For example, some don’ts:

  • asking closed questions
  • posing questions in a random sequence
  • offering little follow-up
  • failing to build rapport

For example, some do’s:

  • set the stage at the beginning of the interview
  • follow up on sentiment statements
  • have a structure to the interview
  • ask clarifying questions

A good interview combines building good rapport, having a “storyline” or interview structure based on the primary research questions, and phrasing your questions carefully to avoid bias and leading questions.

As Mia mentions in their article, you’ll get various responses from any interview question you ask, including facts, anecdotes, opinions, attitudes, feelings, and values. However, a hallmark of a good question is that it will also contribute in two ways:

  1. It’ll support the knowledge you’re gathering.
  2. It’ll foster positive interaction with the participant by encouraging a flow in the conversation.

For crafting interview questions in a script, Portigal has a great article on the 17 types of interviewing questions. These include questions to:

  1. gather context and collect details (e.g. questions about sequence, relationships, and organizational structure, like “describe a typical workday”).
  2. probe what’s unsaid (e.g. questions for clarification, code words/native language, emotional cues, asking “why,” and delicate probing like, “You mentioned a difficult situation that changed your usage. Can you tell us what that situation was?”).
  3. create contrasts to uncover mental models and frameworks (e.g. questions that compare processes, compare to others, or compare across time, like “Do others also do it this way?”).

Sometimes the interviewer or participant can get stuck articulating a question or answer. To avoid asking a leading question and to help the user express their experience, Diba encourages the use of qualifiers and action verbs to prompt users to verbalize their thoughts when open-ended questions feel too open-ended. Bloom’s taxonomy of the cognitive domain is an excellent resource for these qualifying verbs.

Conducting the interview

Building rapport and maintaining flow are top of mind for Diba when conducting interviews with users. “It’s an interview, but you want to break the ice; you want it to feel like a natural conversation, not a hard interview where someone feels on the spot, so giving them an opening to expand their ability to share is super important,” she says. For example, emphasizing gratitude and using probing questions like, “How does that make you feel? Why were you expecting that? What about the information here makes you think x?” can help. Essentially, you want to avoid yes-or-no responses and leading questions so that you can prompt meaningful thoughts from a participant and understand their perspective, and building rapport breaks the ice so someone can speak more fluidly.

Diba stresses the importance of not asking leading questions. “The first step in making sure you don’t ask a leading question is to be aware of what assumptions you have before you walk in and interview anyone. Write them out, discuss them as a team, and let this be part of your planning process. In general, there are way too many failures out there that are based on faulty assumptions.”

As in life, things don’t always stay on script, and your goal is to create the feeling of a conversation, so it’s essential to learn the balance between staying on script and going with the moment. Here’s how Diba recommends keeping a user interview on track: “In your interview protocol, keep your primary research questions in front of you. For example, what were the two or three research questions that started this study? Put them on a Post-it note or somewhere visible, so you know why you’re having this conversation, almost like agenda points.” Diba adds, “You only have a certain amount of time with a user, and you have an objective, so it’s important to timebox your topics before you go in. You know you’ll spend at least five to 10 minutes setting the stage and closing the interview, but in between you might spend 70% of the time on topic A and 30% of the time on topic B.”

Another way to make sure you’re using your time well and understanding what you need to is by employing reflective listening. “I call this crystallizing,” Diba says. “You hear this a lot in therapy scenes in sitcoms or TV shows, where the therapist will repeat something you said and reflect it back to you so you can think about what you said and decide if it aligns,” Diba explains. “When I do this, I’m trying to use summary statements to ladder up what the user is saying and use phrases like, ‘I want to understand what you’re saying. I think I hear XYZ. Is that what you mean, or is there a different way I should think about it?’” This kind of crystallization helps researchers highlight how a user thinks, but developing rapport is key to this tactic. With these best practices, Diba says the conversation can be more organic without sacrificing quality.

Finally, as you wrap up the interview, it’s important to give the user the stage for closing statements. Ask them, “What are some thoughts about your experience with the product that you want to leave us with?” Probe to understand what was easy vs. hard, enjoyable vs. challenging. Diba says, “It’s always good to get a glimpse into the peak-end effect for the end user, where the memory of an experience is usually shaped by the ‘peak’ and the ‘end’ of that experience. Having this last bit is a great way to help inform user mental models and share them with the product teams.” After that, do what’s standard for most interviews: thank them, provide information about the incentive and when they can expect to receive it, answer any last questions they may have, and open the door to future questions via email.

By employing impactful user research and interview skills, we have made significant improvements to the Search and Discovery product suite on our research and learning platform, JSTOR. We have also used our interviews as fireside moments that bring central and partner teams together, like data science, marketing, and metadata librarians, which has helped spur cross-team work to benefit the user. We’ve effectively leveraged this research method to encourage user empathy and ground product strategy, and we hope that our experiences with it will help others find the same level of collaboration and user impact that we have.

Resources

Developing Your Interviewing Skills, Part 1: Preparing for an Interview

InterViews: An Introduction to Qualitative Research Interviewing

Understanding Your Users: A Practical Guide to User Research Methods

Interested in exploring careers with ITHAKA in Ann Arbor, New York, or New Jersey? Check out our ITHAKA jobs page to learn more and speak with recruiting.
