Focus on the Right Metrics to Measure Success

Melissa Kerr
Facebook Design Program Management
4 min read · Nov 15, 2018

Design Program Management is an emerging discipline that involves embedding operations experts within product design teams to add focus, solve problems, and ship exceptional products.

DPMs typically juggle many projects and programs simultaneously, with an aim to help the design team function seamlessly. But when your mandate is so broad, it can sometimes be difficult to know whether you're focused on the right problems. How can you be sure? Well, as they say, data doesn't lie. The only real way to assess success is to measure it.

Taking measure
For better or worse, most of the programs I work on are typically thought of as hard to measure. For example, the product team I contribute to at Facebook is building an internal tool for Facebook employees, and as a Design Program Manager on the team, I’m responsible for increasing adoption through awareness and education. Such programs are related to the product we are building, of course, but indirectly. The tasks in my purview include creating and running educational sessions so users can learn the tool, gathering feedback during focus groups or office hours, and making sure that the team is communicating the work we are doing to our stakeholders.

At first blush, it may seem like this type of work can't be measured easily. But if you know which metrics to capture, and how to measure them, it's quite a bit easier to find out what's working and what's not. I recommend approaching initiatives like these with typical key performance indicators (such as app downloads, daily active users, or revenue) as well as metrics that clearly reflect your value-add to the product team.

To make sure I gather high-quality data, I usually ask myself the following questions when I’m kicking off a new program:

  1. Is my metric easy to measure?
    Whether the data you're gathering is qualitative or quantitative, it is imperative that it be easy to measure. If it is, your chances of consistently and accurately gathering data around it will be much improved.
  2. Does my metric align to my team’s goals?
    This helps focus the data you’re collecting. You could measure many things, but if you’re able to align your metrics to what’s important for your team’s mission, it will help you understand whether your work is helping the team.
  3. Is my metric comparative?
    To appreciate whether you’re heading in the right direction, you need to understand changes over time or contrast your data with someone else’s from a similar program.
  4. Is my metric easy to understand?
    You must be able to clearly communicate your metrics to others. If you can't quickly and easily explain a metric, communicate how you're measuring it, and detail how it aligns to your team's mission, then it's too complicated.

Quality over quantity
Though it’s typically easier to use qualitative rather than quantitative measures for the work DPMs do, I try hard to include both. It’s easy to send out a survey, but surveys don’t give a full picture of the value I add to the team. Two success measurements I’ve used to determine the strength of an educational session are:

  1. The percentage of users who say it’s easy to learn about our tool (qualitative metric measured through a survey).
  2. The percentage of education session attendees who become an active tool user in the following month (quantitative metric measured through usage data).

Both of these metrics are simple to measure and align to my team’s goal of adoption of our tool within Facebook. One note of caution: Be careful not to rely too heavily on a single data point. For example, it would have been easy enough for me to track attendance of the session, record high turnout, and report that the program was successful. But if that high attendance doesn’t actually translate into increased tool usage, then it’s not really successful, is it? The second measurement — whether the educational session is driving actual increased usage — is a much stronger indicator of the program being on track.
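As a rough sketch, the two measurements above could be computed from raw survey and usage data along these lines. (All function names and data shapes here are my own illustrative assumptions, not an actual tool or dataset.)

```python
def pct_easy_to_learn(survey_responses):
    """Qualitative metric: share of survey respondents who said the
    tool is easy to learn. Responses are assumed to be simple labels."""
    if not survey_responses:
        return 0.0
    easy = sum(1 for response in survey_responses if response == "easy")
    return 100.0 * easy / len(survey_responses)

def pct_attendees_active(attendees, active_next_month):
    """Quantitative metric: share of education-session attendees who
    appear in the following month's active-usage data."""
    if not attendees:
        return 0.0
    converted = len(set(attendees) & set(active_next_month))
    return 100.0 * converted / len(attendees)

# Hypothetical example: 3 of 4 respondents found the tool easy to learn,
# and 2 of 3 session attendees showed up in next month's usage logs.
survey = ["easy", "easy", "hard", "easy"]
print(pct_easy_to_learn(survey))  # 75.0
print(pct_attendees_active(["ana", "ben", "cam"], ["ben", "cam", "dee"]))
```

Note how the second metric joins two data sources (attendance and usage), which is exactly what makes it a stronger signal than attendance alone.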

Going back to the other factors, I can compare these numbers well from month to month. Even better, I can compare them from one tool to another. The data has been easy for me to explain and others to comprehend. I’ve been able to have robust discussions with people on my team and on other teams — a good indication that these measurements are bringing clarity to the evolution of the program.

Simply put, having clear metrics that align to your team’s mission will help you prioritize your areas of focus and allow you to adapt the program to achieve the outcomes your team desires. Happy measuring!

Note: This isn’t a comprehensive guideline about how to create metrics and I’m not a data scientist. This is my approach and interpretation of how to create metrics for the programs I’ve been working on as a Design Program Manager.


Design Program Manager at Facebook. Skier. Gardener. Lifelong outdoor lover.