NPS for data-driven teams: feedback tagging — Tactics from the front lines

Ask Inline helps teams build great customer feedback campaigns using NPS® and CSAT surveys. Not familiar with NPS or CSAT? Read our introductions to NPS & CSAT surveying.

Bucketing NPS Surveys — Categorization

Quick refresher: a customer’s answer to the second question of a Net Promoter Survey®, the open-ended question, is called the verbatim.

Read every verbatim. Every single one. You can automate eventually, but at the start someone on the team should really dig in.

Create tags for the common types of feedback and tag each feedback item. Use software to track these tags so that you can keep aggregate counts and generate a per-tag NPS.
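If you’re tracking tags in a spreadsheet or your own tooling, the per-tag NPS is straightforward to compute: the percentage of promoters (scores of 9–10) minus the percentage of detractors (0–6), restricted to the responses carrying each tag. A minimal Python sketch, where the `(score, tags)` record shape is illustrative rather than any particular tool’s format:

```python
from collections import defaultdict

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        return 0.0
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def per_tag_nps(responses):
    """responses: list of (score, tags) pairs.

    Returns {tag: (count, nps)} so you can see both the volume of
    feedback per tag and the sentiment within it.
    """
    by_tag = defaultdict(list)
    for score, tags in responses:
        for tag in tags:
            by_tag[tag].append(score)
    return {tag: (len(scores), nps(scores)) for tag, scores in by_tag.items()}
```

A response can carry several tags, so it contributes to each tag’s count and score independently.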

Here are some common tags for SaaS companies:

  • Performance
  • Reliability
  • Feature request
  • Support request or bug report
  • Usability or complexity complaint

This is a continual process and takes time each week. It’s worth it: you want to gain insight into your customers’ experiences with the product.

Baked into Ask Inline

Ask Inline automates feedback tagging for you; we call it bucketing. Automated bucketing captures feedback items that contain associated keywords. You’ll still have to do some reading at the start, but once you’ve identified trends, tools like this make the job much easier to scale.
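A simple keyword matcher is enough to prototype this style of automated bucketing yourself. The bucket names and keyword lists below are made up for illustration; they aren’t Ask Inline’s built-in definitions:

```python
import re

# Hypothetical bucket definitions: each bucket lists the keywords
# that route a verbatim into it.
BUCKETS = {
    "reliability": {"crash", "downtime", "outage", "error"},
    "design": {"confusing", "ui", "ux", "layout"},
}

def bucket_verbatim(verbatim):
    """Return the set of bucket names whose keywords appear in the verbatim."""
    words = set(re.findall(r"[a-z]+", verbatim.lower()))
    return {name for name, keywords in BUCKETS.items() if words & keywords}
```

One verbatim can land in several buckets, which is usually what you want: a comment like “the UI is confusing after the last outage” speaks to both design and reliability.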

Once feedback is bucketed, we want to derive meaning from the number of items in each category and each category’s Net Promoter Score.

Below are two example buckets that we use for our own Ask Inline account. Notice that the keywords that route feedback into each bucket are shown next to the bucket name.

Here we’re tracking two common SaaS categories: reliability and design. Already, we can see that more people are commenting on design and UX issues than on reliability. From an ops perspective we’re doing well, but it’s probably time to double down on design and user experience. Digging into the feedback, we identify the areas that customers find confusing.

Segment users by other data sources

We’ve got our feedback bucketed into meaningful groups. Now what? How can we draw more insight from the data? How about connecting this feedback to outside data sources? Again, you can do this manually with a spreadsheet or database. Naturally, we make it easy with Ask Inline. Below is the Ask Inline dashboard, showing customer metadata right alongside feedback.

One benefit of connecting customer data to feedback is that when we identify a feedback item that needs a follow-up from support or customer success, it’s easy to send a link to the correct team member. All the data they need to get in touch with the customer is right there.

Some other interesting use cases:

  • Cohort analysis: What is the sentiment of customers who joined in January compared to June? If there was a change, what onboarding improvements did we make in that time?
  • Split tests: Does version A or version B of the new profile page result in an overall sentiment change? Is one mentioned in verbatims more often? Positive or negative?
  • Customer types: Are some types of customers more satisfied with the product than others? If enterprise customers are neutral or promoters and your free customers are detractors, perhaps free is attracting the wrong type of customers.
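The cohort comparison above can be sketched by joining each response to the customer’s signup date and filtering to a given month. The record shape here is an assumption for illustration, standing in for whatever your feedback tool or spreadsheet exports:

```python
from datetime import date

def cohort_nps(responses, cohort_month):
    """NPS for customers whose signup falls in cohort_month, a (year, month) pair.

    responses: list of dicts with 'score' (0-10) and 'signed_up' (a date),
    i.e. survey answers already joined to customer metadata.
    Returns None when the cohort has no responses.
    """
    scores = [r["score"] for r in responses
              if (r["signed_up"].year, r["signed_up"].month) == cohort_month]
    if not scores:
        return None
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)
```

Run it once for the January cohort and once for June, and the difference between the two numbers is your starting point for the onboarding question.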

Bringing it together — Release analysis

The holy grail for many teams is correlating product changes with a change in customer sentiment. With our customer feedback bucketed and connected to outside data, let’s see if we can get there.

Determine a target bucket for each feature that’s released — a bucket that you hope will be affected.

Hopefully, the feature itself is based on ideas generated by reading customer feedback. While you can do a release analysis purely after the fact, it’s better to develop a goal prior to starting development. Science!

A quick example

Problem: After digging into the customer feedback in our ‘Design’ bucket, we’ve found that a large number of detractors are administrators. These customers often complain that it’s confusing to set up the application correctly.

Action: We hope that a redesign of the administrative panel, one that clearly outlines best practices, will positively influence our Design bucket.

Result: One month after releasing our best-practices iteration, we take a look at the Design bucket. What does the data say? Both a downward trend in the number of people talking about design and an upward trend in the NPS for surveys that mention design-related terms.
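A before/after comparison like this can be sketched in a few lines once feedback is bucketed and timestamped. The field names below are hypothetical, standing in for your own export format:

```python
from datetime import date

def release_impact(responses, release_date, target_bucket):
    """Compare a target bucket's volume and NPS before vs. after a release.

    responses: list of dicts with 'score' (0-10), 'buckets' (a set of
    bucket names), and 'answered_on' (a date).
    """
    def summarize(items):
        scores = [r["score"] for r in items if target_bucket in r["buckets"]]
        if not scores:
            return {"count": 0, "nps": None}
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return {"count": len(scores),
                "nps": 100.0 * (promoters - detractors) / len(scores)}

    return {
        "before": summarize(r for r in responses if r["answered_on"] < release_date),
        "after": summarize(r for r in responses if r["answered_on"] >= release_date),
    }
```

A falling count paired with a rising NPS in the target bucket, as in the example above, is the signal you’re hoping for.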


I’m Wes, a co-founder at Ask Inline where I help teams listen at scale by creating great customer feedback campaigns. We keep things on Medium fairly short and sweet. If you’re after something more in-depth, check out training.askinline.com where we publish new product iteration and customer insight resources every week.