The Solution Interview Kit

Including real-world tactics and experiments you can use!

Joe Pelletier
7 min read · Apr 13, 2018

When I started building my solution interview strategy, I had a million questions. I wanted to answer every possible question, so I immediately went out and started building a massive PowerPoint deck. It turned out I was jumping the gun. To better prepare for a wave of solution interviews, I decided to step back and figure out my solution interview kit.

What’s in my solution interview kit?

  1. A lightweight user journey
  2. Mapping of hypotheses/assumptions → experiments
  3. Solution interview script
  4. Solution interview experiments
  5. Solution interview tracker

Be prepared with two types of solution interviews:

  • Quick: Something you can run in 5 mins on your own. Should focus on your riskiest assumptions. This works well if you’re at a conference.
  • Full: Something you can run in 30 mins. Use the “buddy system” so someone else can take notes for you. This works well in-person, or over a virtual meeting.

The user journey

You want to have a rough idea of what your early adopters are trying to do. Therefore, spend some time building a lightweight user journey. At the time, I found this guide on appcues.com very helpful: https://www.appcues.com/blog/user-journey-map

The user journey is important because you need this context for delivering a story your interviewee will understand. For software products, it needs to be relevant to whatever workflows they are accustomed to, and the job they are trying to perform. Your user journey will change and get updated over time. Expect your solution interviews to affect this. After five or so solution interviews, update your user journey again and add more context to it.

Note: The user journey requires you to have an early adopter persona. This also matters during the solution interview, where you are trying to determine whether the interviewee is someone who would use your product as soon as it’s available, or part of a future target customer segment.

The hypothesis/assumption map

This is a simple spreadsheet I created for planning purposes only. I used it to help me understand what assumptions I had about my solution, and which of those assumptions were risky. Here are the columns I made in the spreadsheet:

  • Hypothesis/Assumption
  • Is it risky?
  • Experiments
  • Expected result

Depending on the assumption (or hypothesis), I brainstormed different experiments I could run to remove bias and try to “falsify” each assumption. This takes some work. Seek feedback from others as to whether the experiments make sense.

For the expected result column, I wanted to understand the criteria that would determine whether my assumption was true or false. Remember, a well-designed experiment should give you a decisive outcome.

Example:

  • Assumption: People don’t like to get wet when it rains out
  • Is it risky: No
  • Experiment: Ask the interviewee if they agree with this statement: It’s raining outside and you see an umbrella that you can freely take and use. You decide to exit the building without the umbrella.
  • Expected result: For the assumption to be true, the interviewee must disagree with the statement.
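
If you prefer keeping the map in code rather than a spreadsheet, here’s a minimal sketch of what one row could look like. The field names simply mirror the columns above, and the example values restate the umbrella assumption; none of this is prescriptive.

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    """One row of the hypothesis/assumption map."""
    hypothesis: str         # the assumption or hypothesis being tested
    risky: bool             # is it risky enough to prioritize?
    experiments: list[str]  # experiments designed to falsify it
    expected_result: str    # criteria that decide true vs. false

# The umbrella example above, expressed as a row
umbrella = Assumption(
    hypothesis="People don't like to get wet when it rains",
    risky=False,
    experiments=[
        "Ask if they agree: it's raining, a free umbrella is available, "
        "and you exit the building without it."
    ],
    expected_result="Interviewee must disagree for the assumption to hold",
)
```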

The “quick” interview

Finally, I also made a “quick” version of my solution interview. This was narrowed to the assumptions I thought were the most risky. I modified the experiments slightly so I could deliver it on my own, in person, without supporting artifacts. Always be ready to conduct a solution interview!

The solution interview script and experiments

Here are some ideas for your solution interview script. I’ve tried to include sample experiments/things to ask so you can hit the ground running.

Re-validate pains. In my opinion, solution interviews give you much higher quality feedback than problem interviews. That said, problem and solution interviews should be a continuous process.

Sample Experiment: A simple tactic is to list all the pains you discovered in your original research, then ask the interviewee which one resonates the most.
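
Tallying those answers across interviews is straightforward. Here’s a small sketch, assuming you record each interviewee’s top-resonating pain as a string (the pain names below are made up):

```python
from collections import Counter

# Hypothetical answers: each interviewee's top-resonating pain
top_pains = [
    "manual reporting",
    "slow onboarding",
    "manual reporting",
    "noisy alerts",
    "manual reporting",
]

# Rank pains by how often they were chosen as the most resonant
for pain, count in Counter(top_pains).most_common():
    print(f"{pain}: {count}")
```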

Get context up-front. Use demographics to tweak the questions you ask in real-time — for example, a people manager will likely have different jobs and pains than an individual contributor. Seek to understand their specific context so you can come across as more authentic, and ultimately have a better conversation that yields better results.

Remember, you should be trying to speak to your early adopter persona in solution interviews — so if you end up speaking to a wide variety of personas using a single solution interview format, you may end up with unusable results at the end of the process.

It’s OK if you’re testing problem/solution fit with a few sets of early adopter personas. If they are each materially different, consider making different solution interview kits so you can better compare results at the end of this process.

Sample Experiment: There really aren’t many “experiments” here. This is usually just a conversation and you asking a few questions, like title, size of team, etc. What you should ask will come from your previous problem interviews.

Set up hypothetical examples and tell a story. This is the essence of a solution interview. The examples should be scenarios your early adopter has experienced before, so lean on your problem interviews for this.

Sample Experiment: In our most recent solution interviews, we organized around three use cases. Each use case was accompanied by a 3–4 sentence “story” that set up the situation and positioned our feature. After telling the story, we asked the interviewee what felt wrong, and what felt odd, about what we just shared. This helped us determine what was considered “false” in our use case, so we could easily falsify our hypotheses/assumptions.

Another example is setting up an intentionally incorrect user scenario. You may want to try this to elicit a direct response from your interviewee, as well as “test” whether or not they are paying attention :-). For example, in our recent solution interviews, we were trying to figure out whether email notifications or Slack notifications made more sense. To do this, we set up an example that said “…and finally, at the end of the workflow the user was NOT alerted.” This allowed our interviewee to say “hold on, that sounds stupid, I’d expect X to have happened”.

Test for differentiation. In my opinion, this is the most exciting part of solution interviews! My post on making data-driven decisions with the Kano model is very useful for this exercise, since it gives you a quantifiable way to determine if a feature is a basic expectation, delighter, performance need, or simply indifferent/not worth building.

Sample Experiment: Simply put, the experiment here is a Kano exercise. After describing your solution, be sure to ask two questions: one that describes a functional version of the feature, and another that describes a dysfunctional version of the feature — i.e., a situation where the feature is absent. Finally, map your responses against the Kano model and you’ll have your answer.
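
To make the mapping step concrete, here’s the commonly used Kano evaluation table expressed as a lookup. This is a generic sketch, not the exact survey or wording we used; the answer labels are abbreviated, and the categories are renamed to match the terms above (delighter, basic expectation, performance need, indifferent).

```python
# Commonly used Kano evaluation table:
# (functional answer, dysfunctional answer) -> category.
# Answer options: like, expect, neutral, tolerate, dislike.
KANO_TABLE = {
    "like":     {"like": "Questionable", "expect": "Delighter",
                 "neutral": "Delighter", "tolerate": "Delighter",
                 "dislike": "Performance"},
    "expect":   {"like": "Reverse", "expect": "Indifferent",
                 "neutral": "Indifferent", "tolerate": "Indifferent",
                 "dislike": "Basic expectation"},
    "neutral":  {"like": "Reverse", "expect": "Indifferent",
                 "neutral": "Indifferent", "tolerate": "Indifferent",
                 "dislike": "Basic expectation"},
    "tolerate": {"like": "Reverse", "expect": "Indifferent",
                 "neutral": "Indifferent", "tolerate": "Indifferent",
                 "dislike": "Basic expectation"},
    "dislike":  {"like": "Reverse", "expect": "Reverse",
                 "neutral": "Reverse", "tolerate": "Reverse",
                 "dislike": "Questionable"},
}

def classify(functional: str, dysfunctional: str) -> str:
    """Classify one interviewee's pair of answers for one feature."""
    return KANO_TABLE[functional][dysfunctional]

# Likes having the feature, dislikes being without it -> performance need
print(classify("like", "dislike"))  # Performance
```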

This is important because you want to figure out what will excite your customers and early adopters. You want something that they will find remarkable; you don’t want to build “me-too” features or a bunch of basic expectations that bore your user. Learn more about a “Minimum Lovable Product” here: https://medium.com/the-happy-startup-school/beyond-mvp-10-steps-to-make-your-product-minimum-loveable-51800164ae0c

Remember, delighters don’t have to be major features that no one else offers. They can also be “little” things that deliver positive, unexpected reactions.

Determine if they will pay. This is very important. A business is only viable when the solution you present is something your early adopter is willing to pay for. For B2B products, there are a few questions you want to answer:

  • Does this persona know how to acquire budget?

Sample Experiment: Recently, we asked this question immediately after collecting demographics and before telling our story. Try asking something like: “When was the last time you had to buy something to do your job or help your team?”

  • After describing the solution, does this solve their need?

Sample Experiment: This is where re-validating the pain up-front in your solution interview helps you. At the end of the interview, say something like: “You said your primary pain was X. After seeing what we just showed you, how well does this solve your pain, on a 1–5 scale?”

  • After describing the solution, how much would they pay?

Sample Experiment: There are a few different ways you can do this. For the purpose of assessing problem/solution fit, you only need an approximate range. Here are some other great strategies you can peek at to get started: https://leanvalidation.hanno.co/pay.html

A future blog post will go into a specific tactic we used :-).

Do I need a real prototype or demo to run solution experiments?

Nope! But if you have one, that’s even better. All of the experiments and tips I provided above can be done with mockups in PowerPoint. Prototypes add an even greater degree of fidelity to your interview, but if it’s going to take you months to build one, then it’s probably something you want to re-evaluate.

The goal of this process is to focus on something quick, lean, and easy to iterate on. As I mentioned above, my original solution interview was too big and too long to run. Since I used PowerPoint, I had a very malleable form factor that allowed me to trim things down and make changes quickly, so we could focus on what’s important.

Tracking solution interviews

For each solution interview, you should capture as much detail from that interviewee as possible. You can record a virtual meeting, meet F2F, have a “buddy” record notes for you, or offer a follow-up survey. Save these notes/artifacts because you may re-visit them later when you want to double-check your assumptions.

I also maintained a summary of solution interview results so I could chart/graph the data later. This is important for educating your board, as well as other members of your team, on what you’ve learned. (Note: for early-stage companies, try to include your team as much as possible. Everyone should feel responsible for user research and market validation in an early-stage business.)

The solution interview tracker is modeled almost entirely after the Hypothesis/Assumption Map I detailed above, but includes the results of those interviews. The additional columns are things like (a sketch follows the list):

  • Actual result (so you can compare it against the expected result)
  • Demographics of interviewee (so you can map it to a persona)
  • Quotable moments
  • Has budget / willing to pay
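
If you keep the tracker alongside the assumption map sketch above, the extra columns could be modeled like this. The field names mirror the bullets; the values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InterviewResult:
    """One interviewee's outcome for one assumption in the tracker."""
    assumption: str        # which hypothesis/assumption was tested
    expected_result: str   # copied from the assumption map
    actual_result: str     # what actually happened in the interview
    persona: str           # demographics mapped to an early adopter persona
    quotes: list[str]      # quotable moments worth saving
    willing_to_pay: bool   # has budget / willing to pay

row = InterviewResult(
    assumption="People don't like to get wet when it rains",
    expected_result="Interviewee must disagree with the statement",
    actual_result="Disagreed strongly",
    persona="Engineering manager, team of 8",
    quotes=["I'd never walk past a free umbrella."],
    willing_to_pay=True,
)
```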

Your thoughts?

I’m interested in your thoughts on this. If you have feedback for me, or have run solution interviews using different tactics, I’d love to hear about it! Send me a tweet at @joepelletier.


Joe Pelletier

Boston-based product management professional. Passionate about technology and entrepreneurship. Currently @Fairwinds, previously @Veracode.