Frameworks for Design Research in Diverse Scenarios

Julian Jordan
Published in 3rd corner
Oct 27, 2022

A reference not to dictate decisions, but to inspire and guide them

Frameworks and templates should not simply be looked to for pre-baked, ready-to-go answers.

Their value lies in their ability to inspire and/or stimulate us:

  • to remember what we have forgotten
  • to make new connections and apply what we know towards a vision
  • to seek out ways to fill gaps in our knowledge

I was recently speaking with a designer and a researcher who were struggling to convince their team to engage in extensive, in-depth qualitative interviews. The cross-functional team was hesitant because of the time investment. Part of our chat went something like the following.

I asked — “Why do you think you should dive into a lot of qualitative interviews?”

They responded — “Because we need to do some needfinding to understand the users, it’s the beginning of the project…”

I asked — “You believe you don’t understand the users enough? Do you have a sense of what problems you might be solving? Are they the right problems? Are there no explicit problems to solve?”

Silence. For a bit. Then we had a great conversation about what the team knew and didn’t know, blind spots, the reason the new project had been created, what success meant, etc.

The duo was not completely wrong in their instinct to jump into interviews. At the same time, our back-and-forth served as a good reminder of something important. Good research methods (and their correct execution) are essential to problem solving, but trying to implement them without first thinking critically about the challenge at hand and how best to build towards relevant findings can result in a reluctant team or, worse, a waste of time.

It is important to think critically about why we research at all and the phase of the product we are researching.

Sometimes thinking critically is not as easy as it sounds when dealing with the pressures of a fast-paced environment where “we don’t have time for deep research” and “we need to ship yesterday” mindsets prevail. I have several thoughts on the deep research vs. delivery speed debate, but I will save that for another post. Here, I share some frameworks that can help support the conversations and critical thinking that guide decisions about research approaches.

The frameworks present a variety of approaches (sometimes we forget what is out there). More importantly, they nudge one to consider a research approach within the context of product phases, needs, goals and timeframes. At a basic level, the frameworks can simply serve as reminders of when a given approach might be useful. Better usage is to leverage them to jump-start critical questions like “What do I want to know?”, “How will I know when I’ve learned it?”, and “What decisions or actions will this knowledge enable?”. With these questions in hand, one can return to the frameworks to reflect on which approaches, or which elements of them, will best lead to achieving the goal at hand. I created these frameworks some time ago and have used different versions of them over time.

This first framework presents research approaches across a fidelity spectrum while also matching these approaches to a given product’s maturity. Product maturity (vertical axis, left) is relatively straightforward — as a product/service spends more time in the market and collects more users, its goals, and thus its research needs, often change as well. Fidelity (horizontal axis, top) refers to the technical implementation complexity of a given research approach. More than just a synonym for “execution time”, implementation complexity conveys the point that some research approaches require more advanced artifacts, sophisticated coding or sensing tools, or quantitative modeling, all of which add complexity. I don’t mean to imply that “Card-sorting, Co-creation” always take less time than “Resonance tests with Max-diff, Conjoint analysis”, but while the former can be implemented with tools in hand, the latter more likely requires coding or purchasing dynamic surveys.

The resulting framework offers a matrix that stimulates reflection on where one is in the product cycle, one’s technical capacity, and, above all, what one’s research goals are and how they might be reached.

This second framework puts a twist on the first by explicitly introducing the element of time constraints. A former design professor of mine used to assign projects with an initial timeframe, but before the end of the class he would cut this timeframe in half. It was a forcing mechanism that pushed me and other students to think about what was essential to the project and how to obtain optimal impact and learning. It was a great exercise. I used it as inspiration for this second framework, which presents a number of approaches to address two research goals while considering time constraints and product maturity. While it can serve as a “quick reference menu” in a time crunch, I mainly appreciate how it encourages reflection on how different approaches can expand or condense into others to reach a given goal when time is a factor.

The above content is not earth-shattering, but I have found it extremely useful when leading teams or in my own work. In my experience, it engages multidisciplinary teams and enhances the conceptualization, planning, and communication of design research. The reality is that in different contexts, for different reasons, a clean line from goals/questions to approach does not always exist. Instead, something of an iterative dance takes hold — from goals to approach and from approach to implied goals — and it moves the researchers (and their teams) towards crystallizing what it is they really want to accomplish and how. In the case of the duo mentioned above, this kind of reflection stimulated rich thinking around their main questions and goals and reminded them of a simple but helpful point: deep thinking about key research goals and questions should come first and shape one’s chosen research approach, but it never hurts to have some approaches in mind or on the table.

Julian Jordan
3rd corner

design research + data science; staff researcher @Spotify, prev. design @McKinsey, advisor @echos, product @empregoligado, @AKQA, @dschool, @GSB, Merrill Lynch