Pillars of Customer Success

Alaina Talboy, PhD
UXR @ Microsoft
9 min read · Aug 16, 2021




The user research spectrum stretches from foundational to tactical work. In general, researchers are highly skilled at collecting and analyzing different types of data at one or both ends of this spectrum. They turn research findings into actionable insights and help partners across the company experience the customer’s journey in the customers’ own words. More importantly, though, they are champions for customer success.

In a sense, researchers act as the customer success strategist: the person who knows the customer and speaks up on their behalf. This strategist dives into users’ thoughts, feelings, and behaviors to understand the goals they want to accomplish. Those findings are turned into actionable recommendations that partners can use to improve products and features.

Good researchers are well versed in business metrics. Impactful researchers, however, are also well versed in the very same metrics that customers use to measure their success. Success for researchers therefore aligns closely with customer success. Sticking to this outside-in view ensures we evaluate our products through the lens of our customers’ experience rather than traditional business goals alone.

In this article, we’ll walk through an example foundational research program using the jobs-to-be-done (JTBD) framework, which is grounded in understanding what users want to accomplish, otherwise known as customer success. Within each phase, we discuss how the customers’ experience and success are kept at the center of the innovation process from abstract conception through product evaluation in market. Finally, the partnership between research and design is highlighted as a primary route for prioritizing customer success across disciplines.

The JTBD Framework

A job-to-be-done (JTBD) is a series of activities that users complete to accomplish a specific goal. If customers are struggling to meet their goals, product developers need to know. Otherwise, they run the risk of losing that (potential) market segment to competitors who do focus on their users’ needs. Researchers are invaluable in this space because we are trained to identify those user needs, uncover pain points and opportunities, and convey that data in ways our partners can use to innovate.

Understanding user needs can be a data-intensive process. Each job a user wants to accomplish typically involves 12 to 15 individual steps, with approximately 5 to 10 user needs embedded within each step. When we take a global view and look at the end-to-end journey for accomplishing each job, we can easily uncover over 100 unique user needs to catalog and prioritize.
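The arithmetic behind that estimate is simple enough to sketch; this back-of-envelope calculation uses only the ranges quoted above, not data from any real study:

```python
# Back-of-envelope range of user needs per job, using the quoted ranges.
steps_per_job = (12, 15)    # individual steps in a typical job (low, high)
needs_per_step = (5, 10)    # user needs embedded in each step (low, high)

# Multiplying the low ends and the high ends bounds the total.
low = steps_per_job[0] * needs_per_step[0]
high = steps_per_job[1] * needs_per_step[1]

print(f"A single job can surface roughly {low} to {high} user needs")
```

Even at the low end of both ranges, one job yields dozens of needs, which is why the end-to-end view so quickly crosses the 100-need mark.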

The great thing about this approach is that these user needs are stable over time and completely solution agnostic (among other characteristics). We all know that technology is ever-changing and fundamental shifts are coming. But user needs are constant, no matter what the technology landscape looks like.

Although there are many frameworks for uncovering unmet and underserved user needs, we wanted to create a holistic picture of our users based on multiple touch points. Therefore, we combined several approaches which are highlighted through four major research phases:

Foundational research program with four phases: desk research, qualitative, quantitative, and expansion.

Desk Research: Identifying JTBD

To kick off the research process, we started with desk research to identify common customer activities and jobs. This review drew on several months (and in some cases, years) of internal and external data about what our customers are thinking, feeling, and doing.

The goals identified in this qualitative meta-synthesis are framed in customer-focused language, ensuring our customers are kept at the center of the entire research program. Had we focused on our existing product portfolio instead, the research would not have had the same long-lasting impact or driven novel solution ideation.

Jobs are customer driven, not assumption driven: customer-driven language is preferred over assumption-driven framing.

As we evaluated the existing data, we found that many of the users’ activities could feed into more than one job. For example, a user may search through email, message colleagues for information, or check their to-do list either as a way of organizing their workflow or of staying informed on projects. These two jobs share a similar set of activities, but the motivation driving each is completely different.

Example set of activities feeding into different jobs based on user motivation.

The foundation of customer success is firmly rooted in the customers’ experience and evaluated through the lens of what they want to accomplish. By looking at these activities and jobs via motivations, we can untangle the user’s behaviors more directly and identify those stable user needs.

Qualitative Phase: Uncovering the Why

Once the primary jobs of interest were identified, we moved on to the qualitative phase of data collection: uncovering the why behind users’ actions. This phase is the most time intensive but is designed with a twofold goal in mind:

  • Create detailed and nuanced contextual journey maps to help our partners understand our customers’ experiences
  • Identify customer success metrics to evaluate our existing and novel product innovation suites

Both require extensive information about the tools used, pain points experienced, workarounds, and potentially ideal solutions. Although the level of detail is more granular than we would normally collect, the context is incredibly valuable when trying to understand how users move through their end-to-end experience. Therefore, this portion of the research was split into two halves.

In the first half, participants engaged in a 5-day diary study to provide initial information about the core user jobs and activities identified during desk research. This information helped us prioritize the second half of this phase, which involved one-hour individual interviews. The interviews followed structured discussion guides so researchers could dive into the nuance of all the details provided in the first half.

Throughout both studies in this phase, we focused on the stepwise progression of how users accomplish a job. We looked for where our tools come in and out of the user experience, rather than how our users show up in our tools.

Journeys should be customer focused, showing how tools come in and out of the experience, rather than product focused.

Pain points were discovered around more than just entry barriers, revealing experiential flow issues alongside traditional usability traps. Customers talked about what they liked and didn’t like, what they would do differently if not constrained by current technologies, and the experience gaps that might be closed with new solutions.

Quantitative Phase: Creating Success Metrics

That rich, nuanced information collected in the qualitative phase is what we use to identify customer success metrics. Each metric is based on a singular user need, and this need is documented as an outcome statement. When needs are met, customers feel successful. But when those needs are not fulfilled, they create workarounds and can become frustrated. We measure customer success across experiences by turning these outcome statements into user experience metrics.

Each statement begins with a verb, typically minimize or reduce. The metric itself is established within that statement and focuses specifically on time, likelihood, frequency, or numbers. Finally, each statement is completely solution agnostic, meaning it does not refer to an existing product (see example below). This is another way we ensure that the customer remains at the center of our investigations.

Outcome statements are customer focused instead of product focused.
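The anatomy described above (a verb, a metric dimension, and a solution-agnostic need) can be represented as a small record; this is a minimal sketch, and the field names and example statement are illustrative, not the team’s actual schema:

```python
from dataclasses import dataclass

# The four metric dimensions the article names.
METRIC_TYPES = {"time", "likelihood", "frequency", "number"}

@dataclass
class OutcomeStatement:
    verb: str          # typically "minimize" or "reduce"
    metric_type: str   # one of METRIC_TYPES
    need: str          # the user need as a verb phrase, naming no product

    def text(self) -> str:
        """Render the statement in the standard verb-first form."""
        assert self.metric_type in METRIC_TYPES
        return f"{self.verb.capitalize()} the {self.metric_type} required to {self.need}"

# Hypothetical example -- note that it refers to no existing product.
stmt = OutcomeStatement("minimize", "time",
                        "find the latest version of a shared file")
print(stmt.text())
```

Keeping the need field free of product names is what makes the statement, and the metric built from it, stable as the product portfolio changes.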

We generated over 100 outcome statements from the qualitative portion of this research, each one addressing a specific user need. A large-scale, global quantitative survey was distributed to determine which of these outcomes were unmet or underserved user needs in different segments.

The top opportunities are identified through a combination of two core aspects: importance and satisfaction. Statements rated by customers as having the highest level of importance but lowest level of satisfaction are considered unmet user needs, the area we really want to focus on for innovation. (See this article for more about how to calculate opportunity scores.)
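One common way to combine importance and satisfaction, drawn from the outcome-driven innovation literature rather than from this study, is the formula opportunity = importance + max(importance - satisfaction, 0). A sketch with made-up ratings (the statements and scores below are hypothetical, not survey results):

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Ulwick-style opportunity score: importance plus the
    importance-satisfaction gap, floored at zero so oversatisfied
    outcomes are not penalized below their importance."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical mean ratings on a 0-10 scale: (importance, satisfaction).
outcomes = {
    "minimize time spent locating project status": (9.1, 3.2),
    "reduce the number of tools needed to share an update": (7.4, 6.8),
    "minimize the likelihood of missing a deadline": (8.6, 8.9),
}

# Rank outcomes from biggest to smallest opportunity.
for statement, (imp, sat) in sorted(
        outcomes.items(), key=lambda kv: -opportunity_score(*kv[1])):
    print(f"{opportunity_score(imp, sat):5.1f}  {statement}")
```

The first outcome scores highest because it pairs high importance with low satisfaction, exactly the unmet-need profile the article targets for innovation.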

We used several segmentations to uncover unique sets of opportunities based on aspects such as:

  • Tool preferences
  • App usage
  • Market
  • Customer characteristics (e.g., workplace industry, job title)

In total, the initial quantitative data segments revealed over 30 unique opportunities to address! As we continue to learn more about our customers and innovate on segmentation, more opportunities will be identified.

Design Partnership: Visualizing Research

As you might imagine, a lot of data was gathered in this research program. To make the information easier and quicker for partners to absorb, research and design worked closely together to organize the details inside experiential journey maps. At a high level, these journey maps illuminate and amplify pain points while drawing attention to important gaps in solutions and opportunities.

We chose to create a novel, interactive journey map designed specifically to ease partners from a wide array of disciplines and leadership into the data through progressive exposure (see example journey map below). The findings are broken down into digestible pieces, in a versatile interactive prototype that’s friendly to diverse audiences.

A modern, accessible presentation allowed our partners to view findings at both a high and granular level. This was achieved through a minimalist design using ample white space, and careful attention to detail and accessibility requirements (e.g., contrast, legibility, conciseness). On-trend UI techniques were implemented to create a modern feel, while also enabling clear visual interactions that help partners navigate through the various journeys without friction.

All findings were compiled with relational mapping, which aided clarity and understanding through UX principles such as Gestalt grouping and Hick’s law, streamlining interactions for phase-by-phase synthesis. This flexible, design-forward presentation of the research findings enabled us to reach larger audiences with varying levels of knowledge and awareness.

Example journey map.
The journey maps are hierarchically organized into phases, steps, and interactive information panels, which define critical milestones in a customer’s workflow where motivations, decisions, and actions contribute to an outcome or goal. Each step can be clicked to reveal related pains, workarounds, and ideal solutions.
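A hierarchy like the one just described could be modeled as nested records; this is an illustrative sketch, not the team’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    """One clickable milestone in a customer's workflow."""
    name: str
    pains: list[str] = field(default_factory=list)
    workarounds: list[str] = field(default_factory=list)
    ideal_solutions: list[str] = field(default_factory=list)

@dataclass
class Phase:
    """A group of related steps within the end-to-end journey."""
    name: str
    steps: list[Step] = field(default_factory=list)

@dataclass
class JourneyMap:
    """Top level: one map per job-to-be-done."""
    job: str
    phases: list[Phase] = field(default_factory=list)

# Hypothetical usage: drilling from job down to a step's pains.
jm = JourneyMap(
    job="Stay informed on projects",
    phases=[Phase("Gather updates",
                  steps=[Step("Search email", pains=["version confusion"])])],
)
print(jm.phases[0].steps[0].pains)
```

Progressive exposure falls out of the structure: a viewer starts at the phase level and clicks down only when they want the pains, workarounds, and ideal solutions attached to a step.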

These artifacts, paired with the opportunity analysis completed in the quantitative phase, were the backbone of our customer success activation plan. Achieving customer success is more than just identifying needs through research. The findings won’t make a difference if no one else in the company applies them to their product. This is where being a customer success strategist is vital. Activation of the work requires a plan that incorporates effort, engagement, and strong partnerships through which the customers’ needs are championed.

Leading with the Customers’ Experience

Activating this kind of research requires going further than sharing the results in a traditional write-up. Anyone can read a report. But getting people to genuinely connect with and use the findings is where researcher success truly happens.

Research and design worked together to engage a multitude of disciplines through a series of interactive, design thinking workshops. During these workshops, we asked partners to critically evaluate their own assumptions and biases regarding how they defined customer success. We then introduced the customer success metrics that were identified throughout this work. When our customers are successful, we are successful as disciplines and as a business — and that’s a novel way to think about how we evaluate our products.

Armed with this new understanding, we helped our partners walk through our customers’ experiences, together uncovering the gaps and opportunities to explore. The findings were applied to our current product suite, in addition to generating imaginative solutions that could exist if technology were freed from reality.

At the end, we evaluated these concepts using those same outcome statements, converted into customer success metrics. That way, the entire process from ideation through market evaluation is grounded in the largest opportunities identified around customer needs and their measures of success.

Lasting Impact

We extended the reach of this work by partnering with other researchers and designers. They took our artifacts and pitched them to their own partners and leadership, creating a cascading effect of activation and solution innovation across the company. Novel products are now being designed around stable customer needs rather than ever-changing surface-layer behaviors.

This shift in priorities makes customer success the means of product evaluation. In turn, it creates success for research and design, measured through overall:

  • Reach: how many stakeholders or partners are using customer needs in their work
  • Impact: where the research findings show up in planning and product
  • Innovation: when the needs backlog is used to create novel experiences

And because this research is grounded in user needs that are stable over time, we can keep coming back to the data again and again.