Pushing for citations of research repositories in product practice — to get more done with insights
Where are insights being used? How can we grow adoption of existing research if we don’t track its application? Product teams may be consuming insights, but the ‘through line’ to resulting action is easily lost.
Researchers can push new standards for how and where their research repositories should be referenced. Product people can then apply these standards to cite the insights they’re working against, communicating evidence-based rationale and unlocking other downstream value.
Great researchers periodically follow up to understand where their insights have been used and which insights have not yet been applied. As organizations scale, this type of awareness becomes increasingly difficult. Researchers may rely on a handful of impact stories, while keeping track of a few crucial insights that have not yet landed in product plans.
Even when research repositories are well established and building up amazing content, if no one has a clear understanding of where those insights are flowing, they may be viewed as dead ends.
By consolidating content into known locations, research repositories make insights more ‘portable’ and adaptable, ready to be referenced within planning, design, and engineering workspaces. The best repositories provide persistent identifiers that can be deep-linked into the context of any work product. These direct citations become an obvious way of seeing where research is being integrated into product people’s efforts, clarifying the ‘impact radius’ (B1) of where insights have landed — or have yet to land.
It takes some serious effort to understand — in detail — how product teams are metabolizing research. I see value in using subjective surveys where product people self-report on their experience using research. And I’m all for using engagement analytics that allow us to see how repositories are being accessed. But I tend to lean on tracking actual citations of research as a more concrete indicator. I want to actually see where research is being applied. Insights as cited inspiration and justification — no longer on the cutting room floor. It’s much easier said than done, and requires some serious buy-in, though tools and automation can also play an essential role.
Buy-in for citations starts with insight creators. Creators can use citations when tackling big meta-analysis projects (C4) or folding existing insights into new research projects (C2). After some experimenting, researchers can come together to define shared standards for how they want their work to be extracted from repositories and linked into others’ outputs. Once these standards are in place, operations can explore different avenues for getting the word out about citation as a best practice. Although the research community may see this as a collective outcome, broad use of citations is the result of individual insight consumers finding value for their own situations. We can make expectations clearer by getting product leadership and sponsors on board, while finding ways to reduce barriers to doing the ‘right thing.’
Citation in product development isn’t an academic problem; it’s about research making more of a difference for the people we’re striving to serve. It’s about visible connective tissue between insights and outcomes — maintaining a line from spark to feature.
As dry as standardized citations may seem, making research more visible in more places can become a launch pad for even greater impacts. Once folks have seen references in use, the improved transparency can lead to a growth in research literacy and an increased appetite for starting projects with research. Common use of citations can move research from a ‘momentary, optional inspiration’ toward ‘durable fuel for product planning.’ Researchers can see how their outputs are being consumed, and they can choose to help shape and amplify those next steps, correcting any customer misconceptions and adding context. And deep-linking repositories into insight consumers’ own workspaces builds a shared ‘laboratory notebook’ of evidence and ideas that can fuel product-led growth.
Improving your insight operations
Get more done with your research community’s insights by:
- Building the muscle of researchers citing themselves
Request that your research contributor community cite existing insights from their own repositories — but don’t give too many specifics as to how. Let the experimentation run free at first, then harvest the different formats that contributors generate.
- Defining standard patterns for citing repository content
Bring together research leaders and repository lead users to review ideas for citation formats. Collect a range of product teams’ deliverables and discuss which product development locations would be ideal for research citations to have impact. Create citation patterns that could fit within these locations, ranging from minimal, named links to more descriptive formats — such as an ‘embedded insight’ that includes anonymized customer quotes, related links, and other content. Explore language for cases where insights are being ‘solved’ versus cases where insights are only ‘informing.’ Develop strategies to ensure that citation links will not turn into dead ends over time. Consider how your new patterns can reflect a larger brand for existing research.
- Getting leadership buy-in for a new citation approach within targeted planning and design deliverables
Shop around your new standards and assemble some early examples. Choose specific processes and templates to pilot your new citation patterns in planning and design work. These targets could be backlog item input forms, appendices in planning documents, introductions for design presentations, A/B experiment reports, or any other pivotal location. Mock up what ‘good’ would look like, and bring your proposals to various leaders. Start your presentations with a basic demo of your research repositories, sharing notable wins to date that leveraged existing research. Ask leaders to support the changes needed to enable citations — including updates to tooling and workflows.
- Marketing the expectation of research citations and where they fit
As part of communication campaigns from your repository initiatives (B2), share launched changes to templates and processes. Frame new expectations to cite research in terms of the value on offer, with documented insights inspiring and justifying product proposals. Ask your leadership and sponsors to broadly communicate about these new expectations. Offer pathways for product people to get their hands on existing insights for their areas, easily add insights to projects, participate in research processes, or even submit questions, if applicable. Ask researchers for help pushing out new citation standards, requesting payment ‘in kind’ for insights delivered in the form of citations within product deliverables.
- Inspecting, iterating, amplifying, and expanding use of repository citations
Search internal tools for citations, sharing new references with your research contributor community. Explore automation for these monitoring and reporting efforts. Evaluate where any citation blockers are occurring and iterate your standards and approaches (e.g., could other deliverables or processes be more impactful targets?). Provide social proof of citation best practices by including related wins in your research marketing campaigns (B2) and other high-visibility communications. Update your leadership on progress and challenges, reiterating the need for them to regularly voice their expectation to see research references in the deliverables that they review.
- Your idea here…
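The automation mentioned above — monitoring internal deliverables for repository citations — can start very small. Here is a minimal sketch in Python, assuming a hypothetical repository URL scheme (`repo.example.com/insights/INS-…`) with persistent insight identifiers; you would swap in your own repository’s link pattern and whatever document sources your tooling can export.

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical persistent-identifier pattern — replace with your
# repository's actual deep-link URL scheme.
INSIGHT_LINK = re.compile(r"https://repo\.example\.com/insights/(INS-\d+)")

def count_citations(docs):
    """Tally how often each insight ID is cited across document texts."""
    counts = Counter()
    for text in docs:
        counts.update(INSIGHT_LINK.findall(text))
    return counts

def scan_directory(root):
    """Scan exported markdown deliverables under `root` for citations."""
    docs = (p.read_text(encoding="utf-8") for p in Path(root).rglob("*.md"))
    return count_citations(docs)

if __name__ == "__main__":
    demo = [
        "Backlog item: evidence at https://repo.example.com/insights/INS-1234",
        "Design brief cites https://repo.example.com/insights/INS-1234 "
        "and https://repo.example.com/insights/INS-9",
    ]
    print(count_citations(demo))
    # → Counter({'INS-1234': 2, 'INS-9': 1})
```

A report like this also surfaces the inverse signal: insight IDs that exist in the repository but never appear in the tally are the insights that have yet to land anywhere.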
On the path from insight to product impact
Growing standards for referencing research repositories in product practice helps product teams build awareness of possible planning targets from their research. It’s also related to having envisioned solution ideas and prioritized plans that address those targets.
If you’ve read this far, please don’t be a stranger. I’m curious to hear about your challenges and successes building a culture of referencing research in your organization. Thank you!
- B1. Growing research ‘impact radius’ by connecting learning to more internal product audiences — to get more done with insights
- B2. Improving internal marketing about upcoming and new research — to get more done with insights
- D1. Enabling collaboration with standards for describing research across teams and disciplines — to get more done with insights
- C2. Strengthening insights by evolving each research output into a meta analysis of compiled learning — to get more done with insights
- C4. Meta analyzing across existing research to inform strategic product uncertainties — and get more done with insights
- G1. Mapping existing research into design briefs and workflows — to get more done with insights
- View list of all ‘Integrating Research’ posts (and upcoming topics)
- “When talking about measuring the impact, we have pretty good analytics so we can see whether people are reading stuff or linking to pages… The last part I’d add is — it helps if there is a leader in the organization who’s creating some pressure for people to either be consuming or linking to the research. If there’s a leader in the organization who says something around not wanting to see any proposals for new features without links to good research. Something as simple as that can create a habit where non-researchers are incentivized to show the research behind, or justify, why they’re making a decision. The impact this can have is, all of a sudden your research knowledge management tool is an asset to them and solving a problem that they have.” Matt Duignan, Lisa Nguyen
- “The good news is that people are very willing to help, I find, even if they don’t know you, even if they outrank you on the org chart. If you succinctly explain to them what you’re trying to do, I find that people almost always reply and help me. Demonstrate that you have an agenda, clearly communicate the outcome that you want, and you’ll start to build trust. I think a lot of this is simply about building trusting relationships both externally and internally.” Hana Nagel, Sofia Quintero
- “When measuring outcomes, moving beyond measuring whether research knowledge is used in decision making to measuring how research knowledge is used becomes important (Pelz 1978; Weiss 1979). Research knowledge may be used in instrumental, conceptual, or symbolic ways. Instrumental use is defined as acting on research in specific and direct ways, such as to solve a particular problem at hand (e.g., developing the first iteration of Medicare’s “Resource-Based Relative Value Scale” physician fee schedule). Conceptual use involves a more general and indirect form of enlightenment (e.g., resisting a move toward more for-profit hospitals because of a general sense that not-for-profit hospitals offer a survival advantage for patients compared with for-profit hospitals, but without knowing about the particular studies or their strengths and limitations). Conversely, symbolic use pertains to a use of research knowledge, but not to inform decision making; here research knowledge is used to justify a position or action that has already been taken for other reasons (sometimes called a “political use of research”) or the fact that research is being done is used to justify inaction on other fronts (called a “tactical use of research”).”
John N Lavis, Dave Robertson, Jennifer M Woodside, Christopher B McLeod, and Julia Abelson