How User Feedback Tools Fail 

Publishing a user feedback tool is no longer the challenge; delivering value is, and many vendors fail at this.


“I was not sure if I should write down my thoughts on this topic or not… let alone publish them online.”

Ok, I guess I made up my mind on that conundrum. Let me point out though that this is not an attack on anyone, any company, or any tool. It’s a mere observation.


Let’s break it down first

While working on my online course about user feedback tools, I discovered a common error that many vendors are making.

You see, there is a simple process when it comes to user feedback research and putting it to use for online optimisation.

  1. Collect Feedback
  2. Analyse Feedback
  3. Report Feedback

‘Feedback Collection’ Galore!

All the tools on the market today do a great job of collecting feedback. Every vendor offers passive collection methods; the more premium vendors also let you actively approach users, or even target them based on behaviour.

I applaud these techniques, I really do. I have been able to solve some real issues using user-submitted feedback, and I have even considered replacing regular NPS surveys with page-level feedback, since it is more contextual and more effective for solving user experience issues.

Analysis Fail One

I have accounts with almost all the vendors out there, and I have used almost all of their tools. I can safely say that, with only a few exceptions (few as in one or two), the analysis capabilities in user feedback tools lack the necessary punch.

[Chart: the tools I reviewed, plotted by pricing (X-axis) against analysis functionality (Y-axis)]

In the chart above, I tried to plot the tools I reviewed on two axes.

  1. Pricing (X-axis)
  2. Analysis functionalities (Y-axis)

I defined Level 1 as being able to view quantitative data: the number of feedback items received, the number of feedback items per pre-defined category (if available), and the average scores given.

In Level 1 you would also have access to the qualitative data, in this case the answers to the open-ended questions.
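
To make Level 1 concrete, here is a minimal sketch in Python. The record shape (page, category, score, comment, date) and the sample rows are my own assumption of a typical feedback payload, not any vendor's actual schema.

```python
from collections import Counter

# Hypothetical feedback records; this shape is an assumption,
# not any vendor's actual schema.
feedback = [
    {"page": "/checkout", "category": "bug", "score": 3,
     "comment": "The coupon field rejects valid codes", "date": "2016-05-02"},
    {"page": "/pricing", "category": "praise", "score": 9,
     "comment": "Clear plans, easy to compare", "date": "2016-05-03"},
]

# Level 1 quantitative view: totals, counts per category, average score.
total = len(feedback)
per_category = Counter(item["category"] for item in feedback)
average_score = sum(item["score"] for item in feedback) / total if total else 0.0

# Level 1 qualitative view: the raw answers to the open-ended question.
comments = [item["comment"] for item in feedback]

print(total, dict(per_category), average_score)
```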

For Level 2 I added analysis functionalities such as filters for text, categories, scores, pages, and date ranges, plus more than one export option.
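
A sketch of what those Level 2 filters amount to, assuming the same hypothetical record shape as above; the function and parameter names are mine, not a real vendor API.

```python
import csv
from datetime import date

def filter_feedback(items, text=None, category=None, min_score=None,
                    page=None, start=None, end=None):
    """Each argument narrows the result set; None means 'no filter'."""
    result = []
    for item in items:
        when = date.fromisoformat(item["date"])
        if text and text.lower() not in item["comment"].lower():
            continue
        if category and item["category"] != category:
            continue
        if min_score is not None and item["score"] < min_score:
            continue
        if page and item["page"] != page:
            continue
        if (start and when < start) or (end and when > end):
            continue
        result.append(item)
    return result

def export_csv(items, path):
    """One of the 'more than one export option': plain CSV."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["page", "category", "score", "comment", "date"])
        writer.writeheader()
        writer.writerows(items)
```

With that in place, "all bug reports on the checkout page from last week" becomes a one-line query instead of a manual trawl through raw submissions.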

Level 3 would also contain, just to name a few, features like dynamic word clouds and the ability to cross-reference quantitative answers (radio-button answers and the like) with the answers to open-ended questions.
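
And a sketch of those two Level 3 features, again on the assumed record shape: the word counts a dynamic word cloud would render, and a crude cross-reference that splits open-ended answers by score band (the stopword list and the threshold of 7 are arbitrary choices of mine).

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "to", "is", "it", "of", "in", "on"}

def word_frequencies(items):
    """The counts behind a word cloud: how often each non-trivial
    word appears across all open-ended answers."""
    counts = Counter()
    for item in items:
        for word in item["comment"].lower().split():
            word = word.strip(".,!?")
            if word and word not in STOPWORDS:
                counts[word] += 1
    return counts

def comments_by_score_band(items, threshold=7):
    """Cross-reference a quantitative answer (the score) with the
    qualitative one: what do low scorers say versus high scorers?"""
    bands = {"low": [], "high": []}
    for item in items:
        band = "high" if item["score"] >= threshold else "low"
        bands[band].append(item["comment"])
    return bands
```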

Clustered based on functionality

As you can see, many tools cluster together simply based on the analysis functionalities they offer, or should I say, don't offer.

This is where the big opportunities lie. Gone are the barriers to collect, gone are the barriers to target users, and gone are the barriers to collect on mobile devices.

It is time to separate the men from the boys, so to speak.

While vendors continue to focus on collection methods, analysis functionalities are neglected, a decision that, from personal experience and talks with insiders, is costing vendors customers.

Churn is massive.

Opportunities for user feedback tool vendors

Vendors, in my opinion, can truly distinguish themselves simply by starting to investigate, and invest in, functionalities for the analysts and the marketers: the people who actually need to use the tools, and who eventually recommend renewing the subscription.

Stop focussing on collecting; we've crossed that bridge. Start making your tools more than eye candy for management: make them leverage the knowledge and motivation of intrapreneurs, the foot soldiers of the online realm, the people who give your company meaning and value in return.

The value of user feedback tools does not rest solely in their collection capabilities; it never should have.

Analysis and reporting are the two ignored selling points: they are your power to give us insight into our users' motivations and fears.

Harness it, please. Today.
