Screener Questions: Advice From 9 UX Experts

Tap into the collective wisdom of 9 seasoned UX experts, who share specific strategies for developing screener questions that go beyond mere demographic sorting.

Daria Krasovskaya
researchops-community
7 min read · May 2, 2024


An illustration showing how a screener question filters participants for a study, with a bee character guiding participants to two paths — one for those who fit the study criteria, and another for those who don’t.

Screener questions are the gateway to insightful user research, shaping the success of every study we launch. A good screener filters out people who are not qualified for the study, ensuring you only gather insights from your target user base.

Crafting an effective screening questionnaire requires both a precise understanding of your research's target audience and the empathy to ensure the right participants are chosen for your studies.

This article taps into the collective wisdom of 9 seasoned UX experts, who share specific strategies for developing screener questions that go beyond mere demographic sorting.

The expert advice that you’ll find below was collected as part of the Research Recruities project by UXtweak, where we wanted to round up the funny, relatable, and strange stories from recruiting. We also collected a bunch of tips and strategies for overcoming common recruiting struggles, links to which you can find at the end of this article.

An illustration of a person inspecting a website or online profile using a magnifying glass, with a bee character observing, representing the concept of closely examining online content or user data.

1st Step: Understand your participants’ contexts

Before jumping into writing your screener, make sure you know exactly who you need for the study. If the participants are not the right people to give you feedback, because they are not the ones who will eventually use your product, it doesn’t matter how well-designed your research is.

Respondent 42 from the Research Recruities survey, ResearchOps Specialist, 5–10 years of experience, emphasizes the power of teamwork in getting your participant criteria just right.

“Collaborate with customers to refine the participant criteria and set clear expectations regarding the timeline and potential challenges in identifying participants.”

This is an essential step that will help you figure out who you really need to talk to. Collaborate with stakeholders and other teams and create a clear description of who is the right participant for the study. Try not to rely just on demographics. Think about what people are doing with your product and what they know, not just who they are.

Larry Marine, veteran UX researcher and author, recommends shifting your focus to the users’ tasks and their knowledge and experience levels for those tasks:

“Avoid the demographics and focus on the user tasks and the knowledge and experience levels for those tasks. Avoid recruiting existing customers (unless they are your target market) because they have already drunk the Kool-Aid and are too familiar with your current product.

The best users are people who have never seen your product and people who have seen it, but didn’t like it. Unfortunately, those are also the hardest to find or recruit.”

Don’t be afraid to make your screener a bit longer. Think through every detail and ask yourself: “What kind of information should the participant be able to provide?”

Nikki Anderson, Founder @ User Research Academy, elaborates on this idea, stressing the necessity of a detailed screener:

“Think through every piece of information you need from the participant and craft that into a screener question — it might mean the screener is a bit longer (try not to go more than seven questions) but it will enable you to get the right people for your study!”

2nd Step: Don’t make it obvious

The goal of a screener is to filter out unqualified respondents. For it to be effective, your screening questions and their answer options should not give the smallest hint of what kind of participants you’re looking for.

Here are a couple of tips from Parker Sorensen, Associate Director of Conversion Optimization, on crafting screening questions that avoid leading participants to ‘correct’ answers.

“For example, don’t ask “Are you planning to buy a car in the next year?” Participants can guess that you are most likely filtering out people who are not interested in buying a car in the next year. I have had tests where I could tell people faked their way through a screener just to get into the test, and it muddied the test results.

Instead, ask something like “Which of the following are you planning to do in the next year?” with around 10 possible choices (buy a house, buy a car, start school, go on a vacation, start a business, and so on), and include a “none of the above” option. Make “none of the above” filter the person out, and make all options except the one you care about (buy a car) optional, with the “buy a car” option required.

This way, there is no way to guess what you are asking for, so participants have to be genuine.”
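The qualification rule Parker describes can be sketched as a tiny piece of screening logic. This is a minimal illustration, not a real screener tool: the question options, variable names, and the exact qualifying rule are assumptions based on his example.

```python
# A minimal sketch of the "masked" multi-select screener logic described above.
# Options and rules are illustrative assumptions, not a real screening system.

QUALIFYING = "Buy a car"            # the only answer the study actually cares about
DISQUALIFYING = "None of the above"

OPTIONS = [
    "Buy a house", "Buy a car", "Start school", "Go on a vacation",
    "Start a business", "Change jobs", "Move to a new city",
    "Adopt a pet", "Renovate my home", "None of the above",
]

def qualifies(selected: list[str]) -> bool:
    """Return True if the respondent passes the screener.

    The distractor options neither qualify nor disqualify a respondent;
    they only mask which answer the study is screening for.
    """
    if DISQUALIFYING in selected:
        return False                 # "none of the above" always screens out
    return QUALIFYING in selected    # only the target answer screens in

# Distractor picks alone do not qualify:
assert not qualifies(["Buy a house", "Go on a vacation"])
# The target answer qualifies, even mixed with distractors:
assert qualifies(["Buy a car", "Start school"])
# "None of the above" disqualifies regardless of other picks:
assert not qualifies(["Buy a car", "None of the above"])
```

Because every distractor is inert, a respondent gaming the screener has no signal to latch onto: selecting more options doesn't help, and selecting none of them doesn't hurt.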

Another UX researcher, Respondent 75 from the Research Recruities survey, recommends adding a layer of verification by including decoy options in the screener. This approach helps prevent dishonest participants from entering the study and skewing the results.

“For example, add fake brands, company names, and stores to a list of places recently shopped at when screening for specific shopping behaviors. Or add fake names of tools or services when asking whether users have the experience you need. If they select one of these, you know right away that they don’t know what they’re talking about.

Another is to call the participant after the initial screener and ask 2–3 questions to verify their most critical responses and ensure they weren’t misrepresenting themselves.”
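The decoy check above is simple to automate. Here is a small sketch, with the caveat that every brand name below is invented for illustration:

```python
# A small sketch of the decoy-option check suggested above.
# All brand names here are invented for illustration purposes.

REAL_BRANDS = {"Target", "Walmart", "IKEA", "Costco"}
DECOY_BRANDS = {"ShopHaven", "Marketly", "Cartopia"}  # fake brands mixed into the list

def is_suspect(selected_brands: set[str]) -> bool:
    """Flag a respondent who claims to have shopped at a brand that doesn't exist."""
    return bool(selected_brands & DECOY_BRANDS)

# Picking any fake brand flags the respondent:
assert is_suspect({"Target", "Cartopia"})
# Only real brands pass the check:
assert not is_suspect({"IKEA", "Costco"})
```

A flagged respondent isn't necessarily malicious (they may have misread the list), so this check works best as one signal among several, alongside the follow-up call mentioned above.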

An illustration of a data analysis team presenting insights, with characters representing researchers, analysts, and a presenter showcasing visualizations and findings to the team.

3rd Step: Add an extra level of protection against fraud and dishonest participants

Over the years, the growing number of scammers, bots, and fake participants has become a serious problem in research studies. With the rise of AI, the issue is becoming even more critical.

These people attempt to manipulate the screening process to qualify for studies, even though they typically do not reflect the characteristics of the target user base.

Michele Ronsen, author, researcher, educator, founder, and UX coach, elaborates on this problem and suggests implementing a multi-level screening process:

“Teams of scammers will sometimes re-take screeners until they’ve “passed,” then use those answers to qualify under other additional identities, or share the answers with other scammers. Inviting participants to schedule their sessions takes longer, yet it offers additional benefits.

It enables a secondary screening process, provides a chance to confirm the accuracy of the contact information they’ve provided, and allows you to cross-reference responses to identify potential duplicates among the screeners.”

Another UX research leader, Ki Aguero, introduces a useful tactic to counter dishonesty: the “fear of god” question. This question serves to remind participants of the gravity of their involvement, effectively filtering out those who may be inclined to misrepresent themselves.

“If you can’t pre-screen participants to confirm they’re being honest, I sometimes include a “fear of god” question. It’s a little reminder of the consequences of misrepresenting themselves.

So at the end of all the other screeners, I’ve got something like, “For this study, we are seeking [whatever attributes]. If you do not meet these attributes and proceed with the test, you may [be expelled from the panel, have the session cancelled, not be paid for participating]. Are you sure you meet these criteria?”

It’s a great way to let anyone who’s being less than truthful out of your test before they get in there and muck up your data by pretending to be something they’re not.”

4th Step: Be mindful about sensitive topics

Screening isn’t just about logistical details, it’s also about handling sensitive topics with care.

Stéphanie Walter, UX Researcher, Inclusive Product Designer in Enterprise UX, recommends approaching those with a good amount of preparation and empathy, acknowledging that some topics may be triggering and it’s important to know how to work with those when crafting your screeners.

“Most of the time, screener questions are trivial, to be honest. But in certain cases, depending on the study, they can bring sadness, bring back bad memories, or become triggering for participants or potential participants.

I learned that it’s important to acknowledge that with the team when working on sensitive topics, to get a little bit more prepared for what might happen. You will never write a screener you can follow 100%. But we can try to bring more empathy in there.”

An illustration depicting a team collaborating on data analysis, with characters representing different roles like data scientists, analysts, and developers working together with visual charts, coding interface, and target icon.

5th Step: Test and refine your screening questionnaire

The last, but no less important, piece of advice comes from Andreea Dalia Lazar, PhD, Senior Researcher: test your screeners just like you test any other research study.

This is especially important in the early stages of research, as it gives you confidence that everything works as it should and that you’re actually getting the right people for the study.

“If possible, test before launching to a wider audience, peer-review and opt for manual reviewing of the participants before getting the confidence that the screening process works as it should.”

So, what makes good screener questions?

A good screener:

  • Avoids obvious answers, preventing dishonest responses
  • Is specific and crafted with attention to the details you’ll need participants to provide
  • Includes diverse response options
  • Is sensitive to the emotional impact of questions, especially on sensitive topics
  • Verifies participant honesty with follow-up questions and checks
  • And most importantly: doesn’t give the smallest hint of what kind of participants you’re looking for

Taking care of all that may seem like a lot of work, but with the right screening strategy, it’s well within reach.

🔥Discover a collection of participant recruiting stories: Hilarious & Strange: UXR Participant Recruiting Stories that You Need to Hear

➡️ Find more expert recruiting tips: UX Research Recruiting Tips from 19 UX Experts [+ Checklist]
