A Field Guide for Remote Testing with Lookback
Practical advice to improve user testing in this new normal.
User testing has moved completely online since most markets went into COVID-lockdown several months ago.
While this change is disruptive, we still need user feedback to develop product and service ideas properly. Life goes on, and so does the need to test. At first glance, several upsides and downsides to remote user testing present themselves.
The most obvious downside is less familiarity and less control of the environment. With in-person testing, you have significantly more control of environmental factors — noise, interruptions, technology. We find it easier to recover when something goes wrong in-person because we’ve often already experienced and accounted for common issues or disruptions. This isn’t true yet for online remote testing.
The biggest upside is that you now have access to a wider, more diverse audience base. You can screen for more diverse and inclusive audiences without increasing recruitment costs. As participants become more familiar with remote conversation norms and video conferencing platforms continue to improve their offerings, we can expect the quality of remote user testing to increase greatly over time.
Before we reach this remote-testing nirvana, we need to examine present-day requirements and best practices. Here are some of the lessons we have learned over the last several months from getting some things right and getting others wrong.
WHAT IS COVERED IN THE ARTICLE?
Here’s how a team, with a minimum of two people, can set up and moderate remote user testing sessions for a day of testing. It includes simple guiding principles as well as a field guide to inform preparation, execution and synthesis of remote testing.
Most of the learnings apply to remote testing generally, while some content applies to a more specific use case: testing with Lookback while using another third-party video conferencing platform to host your observation room.
For the purpose of this article, I have excluded other common tasks associated with user testing like script development, participant screening and recruitment, prototype development or experiment registers and observation frameworks.
WHO IS THIS FOR?
It’s for people taking their first crack at remote user testing who want to understand what to expect. It will benefit those using Lookback most, but the principles can be applied to other common platforms, including Zoom or even Google Hangouts.
WHY DO YOU NEED FIELD GUIDES AND CHECKLISTS?
Because things fall apart. Entropy is as powerful a force in user testing as it is in physics. Over time, disorder and unpredictability will inevitably show their faces during your sessions.
Here’s a sample of things that have happened in the last several weeks of user testing sessions:
- Participant doesn’t have headphones, so the mobile test in Lookback becomes inaudible to the moderator and observation room because of deafening feedback.
- Participant hits the end of their data plan, so their video drops out midway through the session.
- Participant is home with their kids, whose repeated interruptions leave them unsettled and unable to focus on the test.
PRINCIPLES TO KEEP IN MIND
Keep these as first principles as you go about your remote testing practice (or even mantras during a stressful testing session) to guide and focus your efforts throughout any remote testing project.
The more prepared you are, the better your life will be on the day of testing. The majority of remote testing issues can be addressed in the build-up.
Things will break, no matter what you do. The fallibility of individual machines is never a personal reflection of you or your competence. Accept that things will go wrong and embrace it; otherwise the day will be more stressful than necessary. With a little preparation, perseverance and positivity, you’ll come out the hero.
Work as a team, not an individual
Who wants to die on that hill alone? No one. Work with a teammate before, during and after. This type of work is too much to manage solo. Plan, communicate and collaborate well together, and you’ll be surprised how well you can navigate stressful situations that arise during testing.
REMOTE TESTING FIELD GUIDE
This guide is broken down into three simple stages:
- Before testing
- During testing
- After testing
I have detailed the desired outcomes for each stage, as well as the associated steps and considerations you’ll need to achieve them. Each section also ends with a quick, practical checklist to increase the confidence and accuracy of your work.
BEFORE TESTING
Reference this the week before and the day before testing to set yourself up for success. The effort upfront will be rewarded on the day of testing. It may seem like an optional step, but if you don’t de-risk ahead of time, the cost to the project (fewer insights, less data, team stress and reputational damage) far outweighs the upfront effort.
OUTCOME ONE — Participant is ‘test’ ready
Ensuring this outcome will eliminate 90% of your headaches on testing day. It means that each participant’s tech setup has been vetted before they sign on to the actual testing session, where there will be a room full of observers (product teams, stakeholders) and little time to troubleshoot issues. If you do one thing from this guide and nothing else, focus on this outcome.
Schedule tech setup with the participant the day before
The day before testing, schedule time with every participant to set up and validate the tech you’ll be using. This pre-test might include:
- Setting up Lookback using the browser plug-in or desktop app.
- Making sure participants have access to non-Bluetooth headphones.
- Testing that the participant’s internet speed is acceptable, e.g. above 20 Mbps.
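For the bandwidth check, a throwaway helper can make the pass/fail rule explicit when you log each participant’s result. This is a minimal sketch under stated assumptions: the 20 Mbps threshold and the function name are illustrative (not a Lookback requirement), and you would feed in the number the participant reads off any speed-test website.

```python
# Sketch: check a reported download speed against the article's
# suggested minimum. Threshold and helper name are illustrative.

MIN_DOWNLOAD_MBPS = 20.0

def is_bandwidth_acceptable(measured_mbps: float,
                            minimum_mbps: float = MIN_DOWNLOAD_MBPS) -> bool:
    """Return True if the reported download speed meets the minimum
    we assume is needed for a stable remote video session."""
    return measured_mbps >= minimum_mbps

# Example: speeds a participant might report during the tech-setup call.
print(is_bandwidth_acceptable(47.3))  # fast home connection
print(is_bandwidth_acceptable(8.5))   # mobile hotspot near a data cap
```

Recording the raw number (not just pass/fail) during the setup call also helps later if a participant’s video degrades mid-session.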
Block out an hour for tech setup broken into 10-minute blocks
We suggest scheduling these calls in 10-minute blocks on the morning before testing. If your designers or clients are still working on the prototype being tested, you can use a dummy prototype during your tech setup.
Update recruitment based on tech setup
If anyone does not pass the tech validation, make sure you have recruited a standby participant. We typically run five sessions in a day, with a standby on hold. When we start the fifth session, we decide whether to keep or release the standby. This ensures that we are prepared when (not if) testing goes pear-shaped.
We pay full recruitment costs but only 50% of the incentive fee for our standbys. Standbys are compensated regardless of whether they are used.
OUTCOME TWO — You are ‘test’ ready
Rehearsal is key to testing preparedness. If it can happen in testing, it will most likely happen when you practice, too. It’s much easier to discuss how the team will react to issues during practice than during actual testing, so use this to your advantage.
Review script and prototypes with the product team
Do a final internal review of the script and prototype with the product team. Everyone should be aligned on the learning goals and on the “golden path(s)” for each task (for usability testing) or the success criteria (for concept testing).
‘Hallway’ test the script and prototype
Afterwards, dry-run the script and prototype with a practice “participant” in Lookback, or whatever platform you’ll be using on the day. Treat it like a final dress rehearsal and do everything you would do on the actual test day, including rehearsing different worst-case scenarios to increase your preparedness.
OUTCOME THREE — Everyone is aligned and excited
It’s important to consider that your team may have done in-person user testing many times in the past, but the context of remote online testing may be unfamiliar. Get everyone comfortable with the online platform that you’ll be using. Practice beforehand. Do not hesitate to over-communicate if you sense lack of clarity or comfort from the team. Remind everyone that they are learning new skills and they should enjoy the opportunity.
Connect with your product team and understand their objectives
Use this time to get intimate with the prototype, learning goals and experiment register. Ask lots of questions and push the product team to supply these details.
Share these remote testing practices upfront
It’s better to over-communicate expectations and build shared understanding at this stage. We share these key points in this field guide with our product team and our recruitment agency.
Here’s a simple checklist to run through before you start (so you can have an idea of the tasks in front of you) and a way to close out each phase of work.
DURING TESTING
Showtime! This section covers outcomes and considerations for the actual day of testing.
OUTCOME ONE — Team is prepared for turbulence
As stated previously, something will inevitably go wrong. Your best policy is to establish what you will do and how you will communicate to the participant and the observation room when something goes wrong.
Define clear protocols when things go wrong
Agree upfront on a ‘termination’ protocol with the observation room (e.g. a three-strikes policy: after the third failed attempt, terminate the session).
Clarify team roles
The moderator manages all participant issues, while the other teammate manages the observation room (product team, stakeholders).
Establish your lines of communication
Create a one-to-one channel with your teammate (Slack DM, SMS). If you are the moderator, make sure you have a list of all participants’ phone numbers.
OUTCOME TWO — Team is capable of providing support
Having the tools and the capability to overcome setbacks on testing day is empowering and will give you momentum (who doesn’t like saving the day?).
More importantly, it will relieve the stress and anxiety that your participant and the observation room feel when issues arise. The participant will often be frustrated and lack the technical familiarity to resolve common testing issues, while your product team is anxious because they are sidelined and unable to offer assistance.
Removing this stress allows everyone to focus on the task at hand and make better observations.
Have easy access to the Lookback troubleshooting guides
These are handy when you need to call the participant directly and resolve issues.
Be familiar with the Lookback onboarding flow
Practice as the participant so you are familiar with the onboarding screens they go through. If you have to call them directly, you’ll be able to recognise quickly, from their description, where they are in the setup or onboarding process and advise them. This can be necessary even after a tech setup the previous day.
OUTCOME THREE — Setup for success
Run through a simple checklist as soon as you start your session. It’s like the pre-flight check that pilots make before taking off. Once you’re in flight, you want to avoid interruptions to address hygiene issues like lighting, framing or noise.
Begin your moderator script with a quick review of hygiene issues:
- Quiet, well-lit environment.
- Battery life is high, or devices are plugged in.
- Software updated/downloaded (Lookback Chrome extension, mobile app).
- Wired, non-Bluetooth headphones.
- All testing stimuli received beforehand and easily accessible during the session. This matters if you will be using additional props or stimuli with your prototype.
OUTCOME FOUR — Keep learning
Avoiding repeated mistakes requires documenting what went wrong, then discussing possible causes and potential solutions. Share and review this conversation with your team so they can learn as well.
When something goes wrong, make sure you get the following from the participant — device, operating system, screengrabs of the issue. You’ll need this documentation later to troubleshoot with Lookback and coach future participants/teammates/recruiters about common issues that can be avoided.
OUTCOME FIVE — Have a backup
If you have to terminate a session because of technology or screening issues (e.g. the participant isn’t suitable to continue), use the standby participant scheduled at the end of the day.
Include a ‘standby’ in your recruitment
Just as with normal qualitative user testing, recruit six people for the day: five as regular participants and a sixth as a standby whom you can call on.
Notify the standby or recruiter whether you need the standby
If all previous sessions have been successful, it’s best practice to notify the recruiter and standby at the beginning of your last session that you will not need them.
AFTER TESTING
OUTCOME ONE — Playback immediately
As soon as the participant has left the session, quickly run a retrospective with the group. Capture feedback and observations from everyone, and build consensus around the most important observations. Your insights lose clarity and sharpness over the course of the day, which is why you must retro after each session. It also allows people to jump in and out of different sessions throughout the day.
Plan for retrospectives between each testing session
As a rule of thumb, you’ll need a minimum of 20–25 minutes to review and discuss observations, plus a few additional minutes to prep for the next session. Inform all observers that they will need to stay after each session to discuss and hear observations from the room. If they can’t stay, communicate that the retro is just as valuable as the actual testing session. If they still leave early, encourage them to leave their notes with someone on the product team.
Retro after each session to ensure fidelity of observations
It’s easy to confuse your observations and stories after several back-to-back sessions. Capture any feedback to improve the script or testing-session format at the end of each retro, particularly during the first few sessions.
OUTCOME TWO — Clarity on how you will share
It’s critical to have a clear, articulated approach to how feedback will be played back to the group. Start coaching people on how observations will be shared before the session starts. Use simple observation frameworks (categories of observations, learning goals) to focus what is captured. Timebox and limit the number of items shared based on the size of the group and the time between sessions. Encourage people to play back stories, because stories are what people remember.
Align on types of observations
Coach observers on the types of observations the team is looking to capture and how this feedback will be shared at the end of the session. Make sure they capture observations in a text document during the session, allowing for easy copy-and-paste later.
Leverage collaborative tooling
Jump out of the testing platform and use collaborative whiteboard tools like Miro so the whole group can see and hear people’s observations in real time. Have observers copy and paste their observations into a shared document or board as they present.
Timebox and limit the number of observations people can make
This forces people to share what’s most relevant, avoids repeating what others have already mentioned, and allows enough time for everyone to speak.
Use voting to prioritise and align
If there is time, encourage voting around the observation areas that need the most alignment or prioritisation.
AFTER TESTING CHECKLIST
Mentally Friendly is a design and innovation studio located in Sydney and Canberra, trusted by Australia’s biggest and most complex organisations. We help teams and decision-makers build confidence and momentum together.
Our quickly evolving guide to making remote teamwork productive is also available to download on our website — acceleration.team
We have also made our toolkit for product development available here — Mentally Friendly Project Framework | Engineer