Making User Interviews Easy: A Case Study on Designing a Moderated Testing Platform
In this case study, I'll share what I learned while designing a complex video conferencing tool.
Hi there! 👋 I’m a Product Designer based in Bangalore — I’ve worked on some really exciting projects at GetCurious (formerly UserStudy). Today, I’m sharing one of my favorite projects — Moderated — where I worked closely with researchers and product teams to design a seamless moderated testing platform.
About GetCurious & the Project
GetCurious is a user testing platform built for UX Researchers, Designers, and Product Managers. It helps teams conduct research, capture screen sharing, record videos, and gain AI-powered insights.
In this case study, I'll focus on helping users conduct moderated interview sessions on our platform.
But what is moderated testing?
Moderated testing is when an Interviewer (Moderator/Facilitator) guides a participant through a task, observes their actions, asks questions, and gathers insights about what works and what needs improvement.
Here’s a standard research project workflow:
Why Build a Moderated Testing Tool?
We wanted to make researchers’ lives easier. Our internal team was struggling with conducting interviews, so we decided to dig deeper into the challenges they faced.
While researching, we found a few tools for moderated testing. But they didn’t actually solve the core problems that researchers deal with.
We’re already helping users conduct unmoderated testing (surveys) on our platform. But we thought — why stop there? Why not build a tool that makes moderated research easier too? By doing this, we could help more users while making research smoother for everyone. Here’s what we aimed for:
- We planned to serve new and existing companies that rely on user feedback by making it easier for their researchers to conduct effective moderated tests.
- With a moderated tool, we could more easily offer large enterprises a complete package of research services.
- By making moderated testing simpler, designers could conduct research themselves without relying solely on researchers.
Understanding the Problem Space
To better understand the current process and pain points associated with moderated testing, I interviewed stakeholders both inside and outside the company to gather different perspectives, insights, and opportunities for improvement.
User Groups
Who are we solving for? And what problems are they facing exactly?
Shoutout to @pablostanley who made these illustrations!
The interviews uncovered several challenges:
1. Moderator (Interviewer)
- Moderators found it tough to follow scripts or discussion guides while conducting interviews, and they also had a hard time sharing updates to the script with their team.
- A big challenge for Moderators or Clients is testing new prototypes or designs with the target user without sharing direct links. This is to keep the new updates confidential.
2. Observer (Note-taker)
- Observers had difficulty cross-referencing their notes with the recordings. They often ended up taking notes by hand or in tools like spreadsheets or Miro, which don't allow saving timestamped notes directly alongside the recordings. This increases synthesis time.
- Observers and moderators need a real-time chat to share questions or instructions without the participant knowing.
3. Participant (Tester)
- Sometimes, participants were uneasy and tense knowing that several people were observing and listening in on the interviews, which made it hard for them to freely express their thoughts.
- Many participants find tools like Microsoft Teams or Cisco WebEx complicated, especially when using them for the first time, even though these tools are commonly used internally by companies.
Disclaimer: These problems are not faults of tools like Google Meet, Zoom, or Teams; those tools serve their purpose well. The problems are specific to researchers, designers, and participants during moderated testing.
Objectives of the project
Based on the insights from our user research, we set out to design a moderated video conferencing tool that would address the following objectives:
- Seamless Experience: Provide moderators and their teams with a platform that has all the necessary features and minimizes manual efforts.
- User-Friendly: Create a user-friendly interface that is easy to use for people of all ages, genders, and regions with different levels of digital literacy.
- Building Trust and Confidentiality: Protect the confidentiality of design links and educate participants about confidentiality to build trust.
- Streamlining Processes: Reduce researchers’ time analyzing and synthesizing data by providing AI-powered insights or summaries.
Scope
The scope of this project was to design the interview experience for moderators, observers, and participants.
Participant recruitment is a whole different project, which we haven't planned to build into our platform yet; for now, researchers can recruit participants either through the GetCurious panel or manually.
Design process and challenges
Before doing anything, I began by observing and talking to researchers about their day-to-day work. I looked at how they used different tools for research and meetings, and I did a competitive analysis to learn about existing design patterns. The tools are:
Although these platforms share similar functionality, they shape their experiences differently for different use cases. While exploring solutions to the user challenges, I created multiple versions of each flow and prototyped them to evaluate which performed best.
The Solution: A Unified, User-Centered Platform
🚀 Phase 1 of this project is already live! In the first phase, we focused on building the core video conferencing functionalities, ensuring a stable and seamless experience for moderated research sessions.
📌 Now, in Phase 2, we’re taking it a step further — solving key challenges for interviewers. This phase is all about improving the workflow.
Here’s how our new platform brings it all together:
I. Role-Based Experience
In most research sessions, many team members join the call, but only 1 or 2 people actively participate, while others mainly come to observe or take notes.
Having too many people present caused discomfort for participants, making them less inclined to express their thoughts and opinions openly.
Here’s what participants had to say:
To improve this experience, we introduced two distinct roles for team members:
- Moderators: Engage directly with participants without distraction.
- Observers: Join via a separate channel, where they can take notes and chat privately with the other team members.
Participants interact only with Moderators, while Observers remain unseen, reducing discomfort for participants.
II. Integrated Script Management
During my research, I observed that many moderators struggled to manage their scripts during interviews. They frequently switched between tabs/windows or relied on handwritten notes, making the process inefficient.
To solve this, we introduced a dedicated script tab within the interview window. Now, moderators can:
- Follow their discussion guides like a teleprompter, keeping conversations smooth and natural.
- Edit scripts in real time, with changes automatically saved for future sessions.
III. Dual Chat Functionality
During my conversations with researchers, they highlighted a key challenge — the chat feature didn’t allow private team discussions, as participants could see all messages.
To fix this, we introduced two separate chat sections:
- Participant Chat — A dedicated space where moderators can communicate directly with participants.
- Team Chat — A private channel for internal discussions, ensuring a distraction-free experience for participants.
The main goal was to make these sections visually distinct and easy to switch between without confusion. After multiple iterations, I landed on a solution that keeps navigation simple while clearly differentiating both chats.
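The two chat sections boil down to a simple visibility rule: team chat is hidden from participants, while participant chat is visible to everyone in the session. Here's a minimal sketch of that rule in Python; the role names and `ChatMessage` structure are assumptions for illustration, not the actual GetCurious implementation.

```python
from dataclasses import dataclass

# Assumed role names for this sketch; the real platform may differ.
TEAM_ROLES = {"moderator", "observer"}

@dataclass
class ChatMessage:
    sender: str
    channel: str  # "participant" or "team"
    text: str

def visible_to(message: ChatMessage, viewer_role: str) -> bool:
    """Team-channel messages are visible only to moderators and observers;
    participant-channel messages are visible to everyone in the session."""
    if message.channel == "team":
        return viewer_role in TEAM_ROLES
    return True
```

For example, an observer's note in the team channel never reaches the participant's chat view, which is what keeps the participant's experience distraction-free.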
IV. Enhanced Note-Taking Experience
While researching, I noticed that observers play a key role in interviews by taking notes alongside the moderator. This helps moderators avoid rewatching interview recordings just to capture insights. So, making note-taking quick and efficient for observers became a priority.
Currently, observers rely on tools like Google Sheets and Miro. Through my research, I identified three key needs:
- Fast note-taking — Observers need a seamless way to jot down notes quickly.
- Easy editing — They prefer minimal clicks when updating notes.
- Timestamping insights — When multiple insights are shared at once, observers want to save timestamps for easy reference.
To find the best solution, I explored various note-taking tools and analyzed how they could integrate into our product. Some of them are:
After working closely with observers, I finalized a note-taking system that meets their needs while blending smoothly into the platform.
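The core of the note-taking system is stamping each note with the elapsed session time, so it can later be matched to the same moment in the recording. A minimal sketch of that idea, assuming a simple in-memory store (class and field names here are hypothetical, not the production data model):

```python
import time
from dataclasses import dataclass, field

@dataclass
class SessionNotes:
    """Collects observer notes stamped with time elapsed since the session began."""
    started_at: float = field(default_factory=time.monotonic)
    notes: list = field(default_factory=list)

    def add(self, text: str) -> dict:
        elapsed = int(time.monotonic() - self.started_at)
        # Store mm:ss so a note can be jumped to in the recording later.
        stamp = f"{elapsed // 60:02d}:{elapsed % 60:02d}"
        note = {"timestamp": stamp, "text": text}
        self.notes.append(note)
        return note
```

Saving the timestamp at the moment the observer starts typing (rather than when they finish) would match insights even more precisely, which is one of the design choices such a system has to make.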
V. Sharing Masked Link
During my conversations with stakeholders, I discovered that many companies — especially larger ones — want to conduct usability tests with multiple participants but are hesitant to share prototype links. Their main concern? Potential leaks or competitors gaining access.
To address this, I collaborated with the tech team, and we found a solution: a one-time link. This means the link works only once — after it’s opened, it becomes inaccessible and shows a “Page not found” error.
However, directly sharing the link in chat wouldn’t work, as masking it wasn’t possible. So, I designed a ‘Send Link’ feature. Here’s how it works:
- The moderator pastes the prototype link into a text field and sends it.
- The participant receives a ‘Screen-share & Open Link’ button.
- Once they click the button and share their screen, the link opens in a new tab.
This ensures secure access while keeping the usability testing process smooth and controlled.
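The one-time link mechanism described above can be sketched as a token store that maps an unguessable token to the real prototype URL and deletes the mapping on first access. This is a simplified illustration, not the actual GetCurious implementation; the masked URL format is a made-up example.

```python
import secrets

class OneTimeLinkStore:
    """Maps opaque tokens to prototype URLs; each token resolves exactly once."""

    def __init__(self):
        self._links = {}

    def create(self, prototype_url: str) -> str:
        # An unguessable token ensures the real URL is never exposed in chat.
        token = secrets.token_urlsafe(16)
        self._links[token] = prototype_url
        return f"https://example.com/p/{token}"  # hypothetical masked URL

    def resolve(self, token: str):
        # pop() removes the entry, so a second visit finds nothing
        # and the participant sees "Page not found".
        return self._links.pop(token, None)
```

Tying `resolve` to the participant's "Screen-share & Open Link" click is what lets the moderator confirm the screen is being shared before the prototype ever loads.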
What’s Next?
Since the Moderated project is too big to launch all at once, we’re rolling it out in three stages.
Right now, we’re preparing to launch Phase 2, which includes:
✅ A public invitation option for observers
✅ A script feature
✅ Separate chat sections
✅ Shared team notes
✅ Masked link sharing
This case study covers all features except the Transcript feature, which is a bit more complex but highly valuable. Once implemented, it will help researchers gain deeper insights and save time.
Time to Reflect…
Business Impact
The Moderated module has made a positive impact on the way user interviews are conducted.
📈 Phase 1 of the Moderated tool is already making an impact:
- 2 enterprise-level clients and 1 well-established startup have adopted Phase 1 of the Moderated tool and conducted multiple studies.
- Our in-house research team actively uses the platform for user interviews across all service projects.
- We’ve also seen a rise in sign-ups from researchers and designers eager to try the moderated testing tool.
With Phase 2 on the way, we expect even more clients to join our platform.
What I Learned
This was a huge project that pushed me to explore and refine various design patterns — including navigation, hierarchy, layouts, and content structuring.
- Usability testing with stakeholders helped me validate prototypes, assess performance, and identify areas for improvement.
- Collaboration played a vital role. Working closely with the tech team and researchers gave me valuable insights and helped me shape better solutions.
And… That’s a Wrap! 🎉
Thanks for reading and making it this far! I hope this case study was insightful.
A huge shoutout to my team for their continuous support and valuable feedback throughout this project. 🙌
🤝 If you’d love to have me on your team, feel free to reach out via LinkedIn or Email — I’d love to chat! 😊