50 Questions I Ask PMs About Data and Their Teams

In my coaching work, I end up talking to lots of product development teams about data, KPIs, running experiments, and “measuring the team”. People are looking for the silver bullet:

I’m also wondering if you’ve seen any good solutions to the product KPI problem in general

My response is a typical, befuddled “it depends”. Why does it depend? I’ve yet to find a one-size-fits-all solution. Goals vary. Products vary. Cultures vary. Skills vary. The scope for possible action varies. And at the end of the day, cultivating a learning culture (a prerequisite, in my mind, for the responsible use of data) isn’t something you install. That’s the blocker … not some tool, technique, or having access to a data scientist.

Here is my backup list of questions I use when doing interviews and gathering “data” on an organization before making any recommendations.

Use ’em if they’re helpful. Look before you leap, and gauge the organization’s DNA. Then start the experimentation…

  1. How open is your culture to disconfirming information? Is it welcomed or fought? What would be the path from that disconfirming insight to an actual change in direction?
  2. Trying new ways of using evidence and data to drive decisions can be risky. Mistakes are common. How safe is your environment for teams that might make the occasional mistake in their quest to make better decisions?
  3. When it comes to using evidence to drive better product decisions, what’s working? Why is it working?
  4. Describe something new you have learned about your customers in the last two months. How did that learning come about?
  5. Describe a current blind spot. What do you know you don’t know? What might you accomplish if you could learn more in this area? What would you ask a crystal ball?
  6. Could you describe a recent ad-hoc report or analysis you generated? What questions were you hoping to answer, and what decisions were you hoping to inform? Can you tell me more about how you actually gathered, cleaned, and analyzed the data?
  7. Are you required to deliver regular reports to other parts of the organization? What do they look like? How frequently do you prepare these reports?
  8. Moving beyond the pros and cons of individual decisions, how does your team reflect on the quality of its product-development decision-making process?
  9. Describe some recurring decisions you must make, and how you leverage data to make those decisions.
  10. What does being “data-driven” actually mean to you? What is the opposite of being data-driven? Can an organization be customer-driven AND data-driven?
  11. If your teams could get faster feedback, what might that let them do? What type of feedback would drive these actions?
  12. Who is involved in your business intelligence efforts? Who do they report to? Who thinks about the big picture and handles cross-functional / cross-departmental needs?
  13. Words like data, business intelligence, data science, analytics, insights, experimentation, and learning are thrown around often. Can you describe your personal mental model?
  14. How do your teams handle and explore ambiguity?
  15. Can you describe insights that exist elsewhere in your organization that are logistically difficult to incorporate/integrate with your product data? Why does this silo exist?
  16. “Fail fast” is one of those buzzwords we hear often. It inspires many reactions. What’s your reaction?
  17. What was your last flop? How did you figure out it was a flop? How long did you know it was heading in that direction?
  18. People often speak about “wrangling” data. Can you describe some recent struggles with getting data into a state suitable for analysis?
  19. How is decision making distributed across your organization? When the Why for a particular decision involves data and evidence, how is that communicated?
  20. An engineer once asked me if there were “passing tests” for product managers. From the PM’s perspective, how would you describe the “passing tests” for your work?
  21. What are some common adoption patterns for new features in your product? What dictates the uptake?
  22. How do you go about understanding the impact of newly shipped features?
  23. What can’t data tell you? What are some of the intangibles that are driving your product decisions?
  24. Who has access to your product data? How do they access that product data? If they produce some sort of insight — a graph, a dashboard, a report, an analysis, etc. — how do they share that internally?
  25. Some common themes associated with data include: control, learning, efficiency, resourcing, performance management (measuring people), and innovation. What’s at the heart of your efforts? What’s your motivation? What will become possible?
  26. Do you have monitors in the office with product metrics? Or a dashboard that is widely and passionately consumed? What’s on it? What’s the story on how that dashboard came to be?
  27. What is your current weak spot in terms of using data and evidence to drive better outcomes? Where are the friction points and hiccups? What’s stopping progress?
  28. Describe some of the longer term trends in your product. What have you been tracking for some length of time? When will it reach an inflection point?
  29. How do your teams leverage existing product data during the design process? Can you describe a recent design decision that was informed by benchmark data and insights?
  30. How might someone internally get a quick snapshot of your product’s health? How would you know if something was going wrong, especially if it was a longer-term trend that might not be immediately apparent?
  31. Does your team trust the data it has? When is trust lacking? Where is there confidence?
  32. How have data and measurement been inadvertently abused in your organization?
  33. How do teams share their insights? Where would I go to catch up on recent learnings?
  34. If presented with data showing poor adoption or that the work was not driving the desired outcome, what action might your team take? Who would make that decision?
  35. Who is the most data-savvy person you work with? What can they do that others cannot do?
  36. The word “experiment” can mean different things to different people — especially in a business context. What does it mean to you? Can you give some examples from your organization?
  37. Sometimes it is OK to guess. Other times, not so much. Describe a recent situation where it was decidedly NOT ok to guess, and explain how you went about gathering the facts.
  38. Describe a metric you refer to on a daily or weekly basis. What does it mean? Why is it important?
  39. Can you give a recent example of a situation where your team’s intuition was proven wrong? How did that unfold?
  40. The “intuition vs. data” debate is a frequent one. Have you observed this in your organization? How did it play out?
  41. How is your personal performance measured?
  42. In your opinion, what makes a product successful?
  43. In your opinion, what are the traits of a high-performing product development team?
  44. What are your thoughts on the phrase “learning organization”? How do you personally balance learning and execution?
  45. What leeway do your teams have for shipping, analyzing the data, and acting on that information before taking on new work? How quickly are they pulled on to the next task?
  46. How do you segment your customers and users? Who uses these segments internally? How does your segmentation scheme manifest in how you work, prioritize work, make decisions, etc.?
  47. What traits do your most valuable customers have? How do you prioritize your customer success and customer support efforts when in a pinch?
  48. What are the signs that a customer is embracing your product? What are some signs that a customer is having trouble embracing your product, and may be at risk of churning? How does the information get funneled to your product development teams?
  49. How does your team know it is doing a good job?
  50. Teams take a lot of satisfaction in seeing the impact of their work. How do you show your teams that their work made a difference?
  51. What are some of the key things you consider when prioritizing items in your backlog/roadmap? How does data figure into your prioritization efforts?