Responding to the Routine ‘Research Process’ Interview Question

Building your response based upon an experience-backed, five-step process framework

Connor Joyce
Bootcamp


Even after almost a decade of conducting user research, I was still caught off guard the first time I was asked to describe my research process in an interview. In my experience, researchers’ go-to response to this type of question is “Well, it depends,” but that is not an acceptable answer in an interview, so I ultimately rehearsed one based on what an ideal setting would look like. After sharing it in interviews to positive feedback, I noticed that few resources exist to assist with this type of query. So I set out to construct a basic framework for an ideal research process, both for those new to the field and for those who need help organizing their thoughts into a concise response.

Conducting research in an applied setting rarely occurs in a straightforward environment. It involves operating within constraints from the organization, stakeholders, and customers while juggling multiple requests and changing priorities. Given the dynamic role that user researchers play, it can feel like any established process is meaningless. Yet I have learned to challenge that view, not just to sound good in interviews but because processes ensure consistency during chaotic times. While it is unlikely that you will precisely replicate the five-step framework detailed throughout this article, it can serve as a checklist to ensure quality outputs. Repeatedly applying it will build an intuitive response to project setup, freeing up mental resources to focus on choosing the right elements within each step.

The Framework

Embarking on research in an applied setting is like setting sail on a voyage of discovery, guided by the compass of business needs. The journey begins with a fundamental question: How can we deepen our understanding of customers, their context, or other critical factors influencing the bottom line? This vital query forms the bedrock of our research expedition.

The inaugural step is crafting the questions that will steer our entire research voyage. It’s akin to charting a map for an unexplored territory. Next, under the stewardship of a seasoned research captain, we draft a meticulous plan detailing our route to uncovering the answers. This plan is our north star, guiding us through the uncharted waters of data collection.

Once we’ve gathered our treasure trove of data, the fourth phase is akin to navigating through a storm, transforming raw data into gleaming insights. Sharing these insights is the final and perhaps most pivotal stage. Like a lighthouse guiding ships to safe harbor, effectively communicated research illuminates the path for stakeholders, ensuring the actual value of our research is recognized and utilized.

Five-step process for creating research

To encapsulate this journey, I propose a five-step framework: Generate Research Questions, Create a Research Plan, Collect Data, Analyze Data, and Share Insights. Each step, rich with its intricate subcomponents, will be unfurled in the pages ahead. A product team can confidently sail towards a horizon brimming with valuable insights by diligently following these steps.

Generate Research Questions

A valuable research journey begins with pinpointing the precise questions that need answers. Often, the initial queries posed by product teams merely skim the surface, anchored in assumptions that require validation rather than acceptance. Consider a seemingly straightforward question: “Will customers be satisfied with the new feature?” While this may yield some insights, it makes several logical leaps, presuming existing product value and customer comprehension of how the new feature integrates into the broader product ecosystem. More critically, it may not align with what truly piques stakeholders’ curiosity.

To probe deeper, we might reframe the areas of focus. Questions like “Will customers find value in the new feature?” or “Does the feature effectively address users’ problems?” elevate our understanding from mere sentiment to utility and problem-solving efficacy. Yet even these questions remain somewhat nebulous, offering only a bird’s-eye view of the feature’s impact. Going deeper, I like to reframe the question based on the type of data the stakeholder seeks. To do this, I first need to understand what kind of decision they are trying to make. If they are asking about what happened, behavioral insight is most beneficial, whereas if they are trying to understand why, attitudinal data is the way to go. From there, I try to understand what will make them most likely to act; that is the data I am trying to create.

The crux of powerful research questions lies in specificity and context, which together produce actionable insight. Strong questions focus on the mental models users bring to a feature, what changes while they use it, and what outcomes it creates for them afterward. Questions such as “What expectations do users harbor regarding their interaction with the feature?” or “Does incorporating a summary page enhance the likelihood of users revisiting the feature?” concentrate on distinct aspects, revealing more nuanced insights.

With this rich tapestry of context, I can devise questions whose answers will enlighten and drive positive change. Mapping out how the answers to these questions will influence subsequent decisions is a powerful strategy to ensure that the insights gleaned lead to tangible action. In essence, robust research questions are the bedrock of practical research. They illuminate the path forward, dictating the course of all subsequent stages. Therefore, it’s paramount that a team ventures further only when it has actionable, well-defined research questions in its arsenal that promise to unlock meaningful insights.

Create a Research Plan

With the foundations of research questions firmly set, the path to uncovering answers requires a roadmap: the research plan. Consider the earlier question, “What expectations do users harbor regarding their interaction with the feature?” To delve into this, we must ask: what kind of evidence will confidently address it? Would an intricate understanding from a handful of users suffice? Or do we need the broader, surface-level viewpoints of many users to truly capture the essence of value? The former is qualitative, offering deep explanations of why, whereas the latter is quantitative, providing breadth and generalizability. As the question involves people’s expectations, the team would likely want to gather attitudinal, qualitative data.

Understanding the nuanced nature of the data you seek is only the beginning. The crux lies in determining the methodology to unearth this goldmine of information, informed by what stakeholders actually need in order to decide. Choosing a method is akin to choosing between a miner’s pickaxe and a bulldozer: each tool has unique strengths for particular situations. You wouldn’t use a bulldozer to extract a gem, much like you wouldn’t rely solely on broad surveys to gain deep user insights.

However, no matter how clear the direction, constraints can be the unseen boulders in our path. Just as we considered the underlying assumptions in our questions, it’s imperative to recognize potential limitations when drafting our research blueprint. Perhaps it’s a ticking clock or tight purse strings — every research initiative has its own set of constraints, and a well-thought-out plan accommodates them, ensuring the journey is feasible and fruitful.

Articulating a detailed research plan is much like crafting a story for our stakeholders. Start by setting the scene — what decisions or actions will this tale of discovery influence? With the context vividly painted, walk your audience through your chosen methodology, elucidating how it ties back to the original question. Employing a RACI or DACI framework in your narrative can be a masterstroke, demystifying roles and responsibilities and ensuring a synchronized execution dance.

However, our story is only complete once it resonates with its intended audience: our key stakeholders. Share your draft research plan with them, just as a playwright seeks feedback before the grand performance. Their feedback, shaped by their unique perspectives and incentives, will not only fine-tune the plan but also ensure the research results in insights that are both actionable and aligned with business objectives.

In essence, while understanding the research question sets the stage, crafting a meticulous research plan brings the performance to life.

Conduct the Data Collection

With a plan in place, the data collection phase begins, and it’s much more than just executing tasks — it’s about performing a well-choreographed dance grounded in the chosen methodology but with the agility to pirouette through constraints and unexpected turns. If the research plan is our script, the data collection phase is the live performance, and it requires every ounce of precision, attention to detail, and adaptability.

While on the surface, this stage might seem like a linear journey — follow the plan, gather the data — it’s rarely that simple. Methods such as surveys, interviews, and observations might be time-tested tools in our repertoire, but every audience (or participant) is unique. As researchers, we don’t merely go through the motions; we dance to the rhythm set by our participants, ensuring their experience is smooth and unobtrusive. This involves leveraging techniques that don’t disrupt their day-to-day, providing crystal clear communication, and enveloping them in a cocoon of support.

Ethics isn’t just a chapter in research textbooks; it’s the bedrock of data collection. Before capturing a single data point, we prioritize informed consent, emphasizing participant privacy and ensuring no one is harmed or discomforted during the study. This commitment to ethics isn’t just about meeting standards; it fosters trust, ensuring participants are more open, honest, and engaged — a win-win for all.

Agility, often discussed in the context of software development or gymnastics, is equally, if not more, crucial here. Even with the most meticulous plan, surprises can emerge — a key demographic might be more challenging to reach, a survey question might be consistently misinterpreted, or technology might choose the most inopportune moment to act up. Being agile means we don’t just react; we adapt, tweaking our strategies to ensure the quality and validity of our data remain unassailable. Additionally, keeping stakeholders in the loop is paramount. Sharing progress, early insights, and occasional hiccups ensures alignment and fosters collaborative problem-solving. If a new step or technique is introduced during this dance, documenting it ensures that we remember it for the current performance and pass it on for future endeavors.

In sum, data collection is more art than science. While the steps might be predetermined, the grace, ethics, and adaptability we infuse into the process ensure our performance results in genuine, actionable insights.

Turn raw data into evidence

Data analysis is the crescendo in our research symphony, where notes on a sheet — disparate data points — fuse into a harmonious understanding that sings to our initial research questions. Yet this transformation isn’t an impromptu act. It’s a meticulously curated experience contingent on our data’s nature. Just as a conductor interprets the composer’s intention, our analytical tools and methods must align with the type of data collected.

Qualitative data is storytelling examined under a scrutinizing lens. Thematic analysis helps us work through the text to uncover recurrent motifs or narratives. Sentiment analysis goes further, diving into the nuances of emotions expressed in conversations or written feedback. The aim is not to quantify emotions but to explore them, understanding their hues, shades, and origins.
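
To make the mechanics tangible, here is a minimal sketch, assuming a hypothetical codebook and invented transcript snippets, of how theme counting might be bootstrapped in Python. Real thematic analysis relies on repeated human readings; a script like this only accelerates the tallying once codes exist.

```python
from collections import Counter

# Hypothetical codebook: each theme maps to keywords a researcher might
# tag while reading transcripts. In practice, codes emerge from the data,
# not from a fixed dictionary like this one.
CODEBOOK = {
    "expectations": ["expected", "assumed", "thought it would"],
    "friction": ["confusing", "stuck", "couldn't find"],
    "value": ["useful", "saved time", "helped me"],
}

def tag_themes(transcript: str) -> Counter:
    """Count how often each theme's keywords appear in one transcript."""
    text = transcript.lower()
    return Counter(
        {theme: sum(text.count(kw) for kw in kws) for theme, kws in CODEBOOK.items()}
    )

# Invented example transcripts, aggregated across interviews.
transcripts = [
    "I thought it would sync automatically, but I got stuck on setup.",
    "Once configured it was useful and saved time every week.",
]
totals = Counter()
for t in transcripts:
    totals.update(tag_themes(t))
print(totals)
```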

Conversely, quantitative data lend themselves more naturally to descriptive, evaluative, and predictive analytics. These aren’t just buzzwords; they are prisms through which we look at numerical data. Descriptive analytics help us understand the ‘what,’ evaluative analytics delve into the ‘why,’ and predictive analytics venture into forecasting the ‘what’s next.’ Each plays a critical role in piecing together a multidimensional understanding of our research terrain.
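
As a hedged illustration of the descriptive end of that spectrum, the sketch below summarizes a hypothetical feature-usage log with pandas; the column names and values are invented for the example, not drawn from any real product.

```python
import pandas as pd

# Hypothetical usage log: one row per session (illustrative values only).
sessions = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 3],
    "used_summary_page": [True, False, True, True, True, False],
    "minutes_in_feature": [4.2, 1.1, 7.5, 3.0, 5.8, 2.2],
})

# Descriptive analytics: the 'what' -- central tendency and spread.
print(sessions["minutes_in_feature"].describe())

# A first step toward the 'why': compare engagement between sessions
# that visited the summary page and those that did not.
print(sessions.groupby("used_summary_page")["minutes_in_feature"].mean())
```

A predictive extension would work from the same table: for example, modeling whether a user returns next week as a function of these session measures.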

Statistical significance is often touted as the gold standard, yet it’s not the end-all-be-all. It’s like aiming for a standing ovation at a concert; great if achieved, but not having one doesn’t make the performance a failure. The objective here is not merely to attain statistical significance but to collect a wealth of evidence robust enough to answer our initial research questions convincingly.
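
To make that concrete, here is a minimal sketch, assuming a simulated two-variant satisfaction study (the scores and group names are invented), that reports an effect size alongside the p-value so the evidence is judged on more than significance alone.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated satisfaction scores on a 1-7 scale for two variants.
control = rng.normal(loc=4.8, scale=1.0, size=40)
with_summary = rng.normal(loc=5.3, scale=1.0, size=40)

# Significance: is the observed difference unlikely under chance alone?
t_stat, p_value = stats.ttest_ind(with_summary, control)

# Effect size (Cohen's d): is the difference large enough to matter?
pooled_sd = np.sqrt((control.var(ddof=1) + with_summary.var(ddof=1)) / 2)
cohens_d = (with_summary.mean() - control.mean()) / pooled_sd

print(f"p = {p_value:.3f}, d = {cohens_d:.2f}")
# A tiny p-value paired with a negligible d is still weak grounds for
# action; report both, alongside qualitative evidence, when answering
# the original research question.
```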

Remember, we started this journey with questions that were not merely academic but tightly woven into crucial business decisions and stakeholder needs. As we sift through the data, clean it, organize it, and apply a mix of statistical and thematic lenses, we are building a narrative. This narrative should be well-evidenced, deeply insightful, compelling, and unequivocally relevant to the questions we set out to answer.

To summarize, the analytical phase isn’t a separate chapter but the climactic sequence in our ongoing research story. It’s where the plot comes together, revelations are made, and we, along with our stakeholders, gain new wisdom to inform our next steps.

Create action by sharing insights

I operate by the philosophy that “applied research insights are only as valuable as the actions they cause stakeholders to take.” With this in mind, the final step is the most important for creating impact from the work completed. No matter how groundbreaking, the knowledge unearthed remains latent unless it propels action.

Often, the key to driving action is not just what is presented but how it is presented. Tailoring is not just for fashion; it applies to information dissemination too. Senior leaders, with their panoramic view of the organization and tight time constraints, might resonate more with succinct executive summaries; they need the essence, the distilled wisdom. In contrast, fellow researchers, with an insatiable curiosity for details and intricacies, will want a deep dive into the methodology, data points, and nuances of the analysis.

While presenting, clarity is the north star. Key points should be mentioned, accentuated, underlined, and reiterated. If there is any doubt about a point’s importance, err on the side of explicitness; what seems evident to us as researchers might be a revelation for the audience.

Beginning with the end in mind is always a good strategy. Reminding the audience why the project was initiated serves two purposes: it provides context and anchors the insights to larger organizational goals or challenges. This narrative arc helps create resonance and ensure that the research’s implications are not just understood but felt.

Drawing from communication theory, I’ve always preferred two formats that enhance the absorption of insights. The first, a classic triad, involves priming the audience with “tell them what you are going to share,” diving deep as you “share it,” and then reinforcing the knowledge with a “recap of what you shared.” It ensures repetition and reinforcement. The second, a narrative arc borrowed from storytelling, is structured as “what,” “so what,” and “now what.” Here, you present the facts, provide their implications, and then paint a picture of the future propelled by those insights.

To wrap it up, sharing insights is akin to passing the baton in a relay race. It’s a defining moment: where research transitions from a mere academic exercise to an actionable blueprint, where insights ignite innovation, and where the quest for knowledge culminates in tangible change.

Research Process Recap

As suggested at the beginning, there is no single formal process for conducting research. That is what makes this common interview question both challenging and an excellent window into the researcher’s mind. I hope this simple five-step framework serves as grounding for junior researchers as they craft their approach to creating and executing research plans.
