I’ve spent many years working in the industry as a designer in the UI/UX space. Over that time I’ve run many user tests, each unique and tailored to fit the client’s needs. Through this experience, I’ve learned that these sessions offer the most benefit to the client when you can follow up with a clear and concise User Testing Report to communicate your findings and key action items.
User tests are a great way to find out how people behave when interacting with a website or app.
I prefer to run remote user testing sessions because, in my experience, users feel less pressure and give more honest, less biased answers. Sometimes I conduct the sessions with user testers on my own, and sometimes with the client’s UX team. Guerrilla User Testing is another testing option; this method often leads to interesting feedback. Both options benefit from a document you can share with your client and internal team.
Whilst running a user testing session is important, you also need to record the results. Not only is it impossible to keep all of your findings in your head, but your recollection of events may also change after the fact. Without capturing the results, you risk introducing unintended bias or misrepresenting what actually happened.
You need to be careful about how you prepare a User Report. Create a big report, and you risk producing a document that no-one reads. Create one that’s too minimal, and it risks not being taken seriously, with the client never taking your recommendations on board and the necessary changes never happening.
“I’ve taken a brief look at the report — can you give me a quick run down on the results?” — Client
It’s important to keep your report lightweight and follow lean design principles so that it doesn’t slow down your process. After all, the tests are just a means to an end: improving your site’s experience.
Based on my experience, I try to keep my report under 10 pages, with clearly defined sections and a bullet-point structure for key findings. Keeping the document clean and concise helps the reader easily digest the findings.
Here’s a guide to making a comprehensive but light user testing report, with a rough indication of how big each section should be.
Summary of key findings
Whilst aiming to keep the report short, it’s a good idea to make room for an executive summary. Including this slide right at the start shows your client the two or three most pressing issues that came from the round of testing. Even if the client or stakeholders read nothing else, they still learn this key information.
Aim to identify the most pressing issues that came from the testing. If you were to advise only one change to improve life for your users, what would it be? Distilling this into the summary will help focus your findings.
Who you tested with
Summarise the demographics of the users you tested with and how many there were. Usually, I try to capture as much information as possible that I feel the client and stakeholders may find useful.
Initially, I ask many introductory questions to the user testers — many of which don’t make it into the report. I do this to help calm the user and to build a friendly rapport with them, before delving into the user flow.
The data that does make it into my report usually covers the number of user testing participants, their age demographics, and whether they’re everyday iPhone or Android users. If you wish to include additional screener questions in your report, feel free to do so.
Conduct five test sessions
The rule of testing with only five users is a solid one that has always stood me in good stead. Testing five users will help you find up to 85% of the core usability problems in your product. You learn a lot from the first person you talk to, a little less from the next, and so forth. After the fifth user, you’ll observe the same findings repeatedly, but won’t necessarily learn anything new.
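The 85% figure above comes from a model popularised by Nielsen and Landauer: the proportion of usability problems found after n sessions is 1 − (1 − L)^n, where L is the share of problems a single user uncovers (roughly 31% in their studies). A quick sketch of the curve shows why returns diminish so sharply after the fifth user:

```python
# Nielsen & Landauer model behind the "five users" rule.
# problems_found(n) = 1 - (1 - L)^n, where L is the proportion of
# usability problems a single test participant uncovers (~0.31 in
# their studies; your own value of L will vary by product).

def problems_found(n_users, l=0.31):
    """Expected fraction of usability problems found after n sessions."""
    return 1 - (1 - l) ** n_users

for n in range(1, 8):
    print(f"{n} users: {problems_found(n):.0%}")
```

With L = 0.31, five users surface roughly 84–85% of problems, while each additional session adds only a few percentage points, which is why further rounds of testing (after fixing what you found) tend to beat longer single rounds.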
User Story Tasks
Once you’ve identified who you tested with, list the tasks the users worked through. If working with a client, it’s good practice to run these by them beforehand — I usually share a Google Doc with my client early on to ensure they are kept abreast of the evolving user flow script.
Key Issues and Takeaways
The main focus of this report is to identify key usability issues. Items that make this list are usually tasks that caused the most frustration to the user — you may find these items are reported more than once.
Aim to present the issues in a bullet-point-style list, with a clear statement explaining each problem, possible solutions, and client feedback on the particular usability issue.
Ideally, bugs should be found during QA and not during user testing, but it’s hard to catch everything: things can break over time and users can do things you just didn’t plan for. I capture any bugs in this Key Issues and Takeaways section too.
Top user likes vs dislikes
Whilst conducting a user testing session, a variety of odd bits, bugs and usability problems can arise that hadn’t been accounted for. It’s good practice to additionally capture this feedback — especially positive feedback as it’s easy to get lost in the problems and negatives!
Ending with some positives is a nice touch, and this slide can also be used to include suggestions from your users that the design team, client or stakeholders perhaps hadn’t considered.
Pre-test questions (optional)
Additionally, a lot of remote user testing platforms offer the opportunity to survey participants before (or even after) the test. These questions can be of varying usefulness, but any chance to gather a bit of extra detail from real-world users should be taken. Use it as an opportunity to find out a bit more about them.
As I mentioned earlier I usually try to establish a friendly dialogue pre-test to help calm the user. This especially helps when the user may be of a unique age demographic or unfamiliar with a different testing device.
Capture each user session
Recording each session is a great way to capture accurate results. I usually set up a small tripod with a mobile phone or camera. This way I can refer to the footage at a later date to double- and triple-check each user’s findings. As I mentioned earlier, it’s almost impossible to keep all of your findings in your head.
Explain the purpose of recording the test, and mention you’re recording the testing of the product, not them.
“I’m recording the testing of the product, not you. No need to worry about making any mistakes. And please don’t worry about our feelings. We want to improve our product and we need to hear your honest reactions.”
Follow the “Think Aloud Protocol”
When moderating the testing, always ask the participant to think out loud. The think-aloud method is critical for getting inside the user’s head. It means asking the user to speak out loud everything they are thinking, so you can gain insight into the thought process behind the user’s actions.
As the participant uses the product, you should encourage them to think out loud and share their thoughts and ideas with you. Since talking while doing isn’t typical for people, you should ask them to do it:
“While you’re using the product, I would like you to think out loud. Just say what you’re thinking, what you’re trying to accomplish, what you expect to happen after an interaction, and so on.”
- Don’t hesitate to ask test participants questions like, “What are you currently thinking?” “What do you think will happen next?” or “Is that what you expected to happen?” during the testing session to stimulate them to verbalise their thoughts and feelings.
- Even when people are thinking aloud, sometimes they experience problems with verbalising their thoughts. That’s why you should ask clarifying questions if something seems unclear or you think there’s more information a test participant can share.
Get the template
If you are interested in seeing a real client User Test Report of mine, I’ve included the PDF report and the Sketch working file, which you can get a hold of here and customise as much as you’d like.
Thank you for reading! 😊