Introduction to Customer Analytics
Quantifying Customer Experience
A tech company’s relationship with its customers has become an important lever of growth (companies such as Nubank and Zappos come to mind). Yet the effects of providing an above-average customer experience are still seen as hard to track, so customer experience (along with customer success, customer support, and other highly customer-oriented teams) commonly ends up treated as a soft discipline.
But digital businesses are uniquely positioned (at least when compared to their away-from-keyboard counterparts) to better understand how customer experience affects their bottom line by measuring results and analyzing customer-related data.
Measuring key customer-related results
The Net Promoter Score (NPS) is a long-term customer relationship indicator that surveys customers with one specific question: “On a scale of 0 to 10, how likely are you to recommend us to a friend or colleague?” Customers who score 9 or 10 are considered promoters of the product or service, those who score 7 or 8 are considered passives, and those who score 6 or below are classified as detractors.
It is often referred to as an indicator of the brand or company-customer relationship as a whole, since it aims to capture the sum of each person’s experience with the product or service.
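The scoring rule above translates directly into code. Here is a minimal sketch, assuming a plain list of 0–10 survey responses and the conventional formula of percentage of promoters minus percentage of detractors:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 survey ratings.

    Promoters score 9-10, passives 7-8, detractors 0-6.
    NPS = % promoters - % detractors (ranges from -100 to +100).
    """
    if not ratings:
        raise ValueError("at least one rating is required")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 7, 3, 6]))  # → 30.0
```

Note that passives drop out of the formula entirely: they dilute the score but count toward neither side.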
The Customer Satisfaction Score (CSAT) is a short-term customer relationship indicator which surveys how satisfied a customer is with a recent interaction (often a purchase or a customer support issue).
An indicator related to CSAT is the Product Satisfaction Score (PSAT), an adaptation of the former focused on measuring short-term satisfaction and gathering contextual feedback on a specific product or feature.
The Customer Effort Score (CES) measures how much effort customers spent on a specific interaction or process (often customer support and/or product onboarding, which are important drivers of product usage).
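A short sketch of how these two survey scores might be computed. The conventions assumed here are hypothetical and vary between teams: CSAT is often reported as the percentage of “top-two box” responses (4 or 5 on a 1–5 scale), while CES is often reported as a plain average:

```python
def csat(ratings, scale_max=5):
    """CSAT as the percentage of satisfied responses.

    Assumes the common "top-two box" convention: the two highest
    points on the scale (e.g., 4 and 5 on a 1-5 scale) count as satisfied.
    """
    satisfied = sum(1 for r in ratings if r >= scale_max - 1)
    return 100 * satisfied / len(ratings)

def ces(ratings):
    """CES as the plain average of effort ratings.

    Scale direction varies by survey; this sketch assumes
    higher numbers mean less effort for the customer.
    """
    return sum(ratings) / len(ratings)

print(csat([5, 4, 3, 5, 2]))  # → 60.0
print(ces([5, 6, 7, 4]))      # → 5.5
```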
Analyzing customer feedback
Customer feedback analysis may be defined by executing three main tasks:
- Processing unstructured data (e.g., tagging customer support tickets);
- Analyzing the processed data (e.g., measuring the frequency of customer support tags);
- Translating customer data insights into product and/or service improvements.
Even though those tasks are equally important, translating insights into improvements is what prevents a team from being ‘data rich, but insight poor’. Non-insightful data merely restates what is already known, whereas insightful data disproves current assumptions, confirms a hypothesis, or quantifies the importance of specific issues.
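The first two tasks can be illustrated in a few lines. This sketch assumes the processing step has already produced tagged ticket records (the field names are hypothetical) and uses `collections.Counter` for the analysis step:

```python
from collections import Counter

# Hypothetical output of the processing step: tagged support tickets
tickets = [
    {"id": 1, "tags": ["shipping", "delay"]},
    {"id": 2, "tags": ["billing"]},
    {"id": 3, "tags": ["shipping"]},
    {"id": 4, "tags": ["login", "bug"]},
    {"id": 5, "tags": ["shipping", "damage"]},
]

# Analysis step: how frequently each tag appears across tickets
tag_counts = Counter(tag for t in tickets for tag in t["tags"])
print(tag_counts.most_common(1))  # → [('shipping', 3)]
```

Surfacing that shipping dominates the tag distribution is the kind of quantified finding the third task then turns into a product or service improvement.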
Customer Operations metrics and indicators
The Tickets by Assignee metric measures how evenly (or unevenly) workload is distributed across customer support agents over a given period of time.
Tickets by Assignee = absolute number of filed tickets by assignee
The Number of Tickets Filed metric measures how workload fluctuates across a certain period of time.
Number of Tickets Filed = sum of filed tickets within a timeframe
The Tickets by Channel metric measures how workload is distributed across communication channels considering a certain period of time.
Tickets by Channel = absolute number of filed tickets by channel
The Tickets by Tag metric measures how frequently each tag is attributed to tickets filed by customers to a customer support team.
Tickets by Tag = absolute number of tickets by attributed tag(s)
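All four workload metrics above are simple counts or group-by counts, so a single sketch covers them. The ticket records and their field names are hypothetical:

```python
from collections import Counter

# Hypothetical tickets filed within one timeframe
tickets = [
    {"assignee": "ana",   "channel": "chat",   "tags": ["billing"]},
    {"assignee": "ana",   "channel": "email",  "tags": ["shipping"]},
    {"assignee": "bruno", "channel": "chat",   "tags": ["shipping", "delay"]},
    {"assignee": "bruno", "channel": "social", "tags": ["login"]},
]

tickets_filed = len(tickets)                                 # Number of Tickets Filed
by_assignee = Counter(t["assignee"] for t in tickets)        # Tickets by Assignee
by_channel = Counter(t["channel"] for t in tickets)          # Tickets by Channel
by_tag = Counter(tag for t in tickets for tag in t["tags"])  # Tickets by Tag

print(tickets_filed)       # → 4
print(by_assignee["ana"])  # → 2
print(by_channel["chat"])  # → 2
print(by_tag["shipping"])  # → 2
```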
The Average First Response Time (AFRT) is an indicator which measures how long a customer needs to wait before receiving an initial response to their support request. Acceptable times before a first response vary across industries and communication channels: while social media questions are expected to receive an answer within 24 hours, a chat interaction is expected to be answered within minutes or even seconds.
Average First Response Time (AFRT) = sum of elapsed time until a first response to each customer support ticket is given divided by the absolute number of filed tickets within a timeframe
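The AFRT formula can be sketched directly from ticket timestamps. The field names `created_at` and `first_response_at` are hypothetical:

```python
from datetime import datetime

def average_first_response_time(tickets):
    """AFRT = total elapsed time until first response / number of tickets.

    Each ticket is assumed to carry 'created_at' and
    'first_response_at' datetimes. Returns seconds.
    """
    total = sum(
        (t["first_response_at"] - t["created_at"]).total_seconds()
        for t in tickets
    )
    return total / len(tickets)

tickets = [
    {"created_at": datetime(2024, 1, 1, 9, 0),
     "first_response_at": datetime(2024, 1, 1, 9, 10)},   # 10 min wait
    {"created_at": datetime(2024, 1, 1, 10, 0),
     "first_response_at": datetime(2024, 1, 1, 10, 30)},  # 30 min wait
]
print(average_first_response_time(tickets) / 60)  # → 20.0 (minutes)
```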
The Average Reply Time (ART) is an indicator which measures how long it takes to follow up with customers on all interactions, including first responses. However, this indicator is susceptible to outliers (an unexpected outage on a key partner API, or application programming interface, for example). An alternative that minimizes the effect of outliers on the ART is a Response Time Bands (RTB) chart, which shows the percentage of tickets answered within specific time bands.
Average Reply Time (ART) = sum of elapsed time until each customer message is answered divided by the absolute number of customer messages received within a timeframe
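The outlier problem and the banded alternative can be seen side by side in a short sketch. The band edges below (5 minutes, 1 hour, 24 hours) are hypothetical choices, not a standard:

```python
def response_time_bands(reply_minutes, bands=(5, 60, 24 * 60)):
    """Percentage of replies falling within each time band.

    Unlike the plain ART average, a banded view is barely moved
    by a single extreme reply time.
    """
    labels = [f"<= {b} min" for b in bands] + [f"> {bands[-1]} min"]
    counts = dict.fromkeys(labels, 0)
    for m in reply_minutes:
        for b, label in zip(bands, labels):
            if m <= b:
                counts[label] += 1
                break
        else:  # slower than the largest band
            counts[labels[-1]] += 1
    total = len(reply_minutes)
    return {label: 100 * c / total for label, c in counts.items()}

# Five ordinary replies plus one 3-day reply caused by an outage
replies = [2, 4, 12, 45, 90, 4320]
print(sum(replies) / len(replies))     # ART → 745.5 min, dominated by the outlier
print(response_time_bands(replies))    # the outlier occupies a single band
```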
The First Contact Resolution Rate (FCRR) is an indicator which measures what percentage of support tickets are solved in a single interaction, an important proxy for process efficacy and customer retention.
First Contact Resolution Rate (FCRR) = sum of all customer support tickets solved in a single interaction divided by the absolute number of tickets filed within a timeframe, multiplied by 100
The Average Number of Replies per Request (ANRR) measures how many touchpoints are required to solve a customer request, an indicator of how autonomous a support team is and how efficiently their issue routing process operates.
Average Number of Replies per Request (ANRR) = sum of all customer touchpoints (i.e., received messages) divided by the absolute number of filed tickets
The Average Resolution Rate (ARR) measures what percentage of support requests are solved relative to the number of tickets received. This indicator may help in diagnosing bottlenecks regarding the number of tickets received, the number of agents on the team, and the complexity of the issues involved.
Average Resolution Rate (ARR) = sum of all solved tickets divided by the absolute number of tickets filed within a timeframe, multiplied by 100
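FCRR, ANRR, and ARR all derive from the same per-ticket facts, so one sketch can compute the three together. The ticket fields (`replies`, `solved`) are hypothetical:

```python
def support_efficiency(tickets):
    """Compute FCRR, ANRR, and ARR from per-ticket records.

    Each ticket is assumed to carry the number of replies
    exchanged and whether it was solved within the timeframe.
    """
    filed = len(tickets)
    solved = [t for t in tickets if t["solved"]]
    solved_first_contact = [t for t in solved if t["replies"] == 1]
    fcrr = 100 * len(solved_first_contact) / filed  # % solved in one interaction
    anrr = sum(t["replies"] for t in tickets) / filed  # avg touchpoints per request
    arr = 100 * len(solved) / filed  # % of filed tickets solved
    return fcrr, anrr, arr

tickets = [
    {"replies": 1, "solved": True},
    {"replies": 3, "solved": True},
    {"replies": 2, "solved": False},
    {"replies": 1, "solved": True},
]
print(support_efficiency(tickets))  # → (50.0, 1.75, 75.0)
```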
Disaggregating customer indicators for additional insights
When dealing with indicators associated with unstructured data (e.g., tagged customer support tickets or NPS comments), it is possible to use a simple technique to understand how much certain issues affect the overall score.
For example, given a database of tagged customer support tickets, you may calculate both the overall AFRT and the impact of delayed-shipping tickets by excluding all shipping-related issues from the latter calculation and then comparing the two results.
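The exclude-and-compare technique is a one-line filter in practice. This sketch assumes hypothetical tagged tickets whose first response times are already expressed in minutes:

```python
def afrt_minutes(tickets):
    """Average first response time in minutes."""
    return sum(t["first_response_min"] for t in tickets) / len(tickets)

# Hypothetical tagged tickets with first-response times in minutes
tickets = [
    {"tags": ["shipping", "delay"], "first_response_min": 120},
    {"tags": ["billing"],           "first_response_min": 10},
    {"tags": ["shipping"],          "first_response_min": 90},
    {"tags": ["login"],             "first_response_min": 20},
]

overall = afrt_minutes(tickets)
# Same indicator, recalculated without shipping-related issues
without_shipping = afrt_minutes(
    [t for t in tickets if "shipping" not in t["tags"]]
)
print(overall)           # → 60.0
print(without_shipping)  # → 15.0
# In this sample, shipping issues account for 45 minutes of the overall AFRT
```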
While customer experience will always involve a considerable degree of subjectivity, measuring and analyzing data are elementary practices when aiming to establish a data-informed culture within a company. This remains true whether referring to tech or non-tech companies. I hope this content helps you take your first steps into customer analytics practices.