The risks of “insight trackers” and how to avoid them

Researchers shouldn’t be judged on the impact of each and every insight they generate

Jake Burghardt
Integrating Research
3 min read · Jan 15, 2025



Sharing some advice I gave in an office hours session:

It’s overreach to frame researcher success as impact across everything they learn.

Just as not every item in a product team’s backlog will get implemented, not every insight that researchers generate will drive impact. That’s a given.

During research “report out” phases, it’s great to ask decision makers to provide specific responses for each insight. It can point the conversation toward action.

But after a study’s timeframe, when researchers’ insights live on in repositories, it’s wrong to set up an expectation that insight generators will be judged by the ratio of impact across all of their past outputs.

Insights aren’t bugs. Insights can inform action at any scale, from product tweaks to new offerings.

I’ve had a few conversations lately with teams building “insight trackers.”

It feels like the spirit of the times, given recent layoffs. Many researchers (UX, Market, etc.) feel the need to prove their value. In some cases, research teams are being directly asked to show the impact of their work.

On the positive side, methods for tracking progress against insights can start new kinds of conversations about accountability. By making requests to owning teams more consistent and enduring, tracking can increase expectations for research-informed planning.

Some suggestions across recent “tracker” chats:

Count any wins

While there are many leading indicators of research impact (e.g. a link to an owning team’s backlog item), the wins that land best with senior leaders are KPI lifts. Looking at research-informed launches and adding up the numbers that partners claimed as their own successes can be super compelling over time (e.g. research insights informed features in X teams that led to X amount of customer retention this year).

Many wins are stories, not just KPIs

Internal and external anecdotes are important. And not every impact has a clear KPI to count. For example, your research community might help make the case for building out a new product team. That’s a great story (and I’ve seen it happen).

During a study’s timeframe, ask for broad accountability

Drive productive conversation by asking for responses to each insight in a report (in whatever format makes sense in your context). A simple table can enhance discussion. Promote any compelling wins to a larger, cross-study impact list.

Don’t report on ratios of impactful insights

Reporting a ratio of impactful insights puts a spotlight on the inverse: researchers’ implied responsibility for insights that haven’t gotten traction. If you start communicating that researchers will be judged based on all of the insights in a “tracker,” it can negatively influence what they decide to put in there. Hard problems that could be big wins? Maybe not so much.

Beyond a study’s timeframe, track top insights

Here’s where a shared insight hub tool can come in. Keep pushing the insights that matter most. Don’t expect everything to land; researchers usually learn more than product development and delivery teams can ever address.

“New initiatives” over “new tools”

Instead of talking about an “insight tracker” that researchers are responsible for, consider launching an “insight to planning” initiative that asks for contributions from multiple teams.

Plenty more on impact tracking and related topics in my book Stop Wasting Research, coming in 2025 from Rosenfeld Media.

Sign up for the monthly newsletter of fresh ideas about how to maximize product value from your organization’s customer insights.
