The principal concept of weekly user testing is making it easier to test the patterns that need testing today. Instead of finding a pattern that needs to be tested, trying to schedule users for the test, realizing it's about three weeks away, and then fixating on that three-week deadline, you create a weekly testing schedule that allows for immediate testing and validation.
Test like we build — in small, iterative tests. Schedule the same half-day block each week, and make it recurring. On the project I'm working on, we currently test on Tuesday afternoons. We spend most of Monday figuring out what needs to be tested and how to test it. We'll create a simple script for the test, then test 3 to 4 users in 20- to 30-minute sessions. The script helps keep the test baselines consistent, and ensures we don't deviate for any one tester and lead them toward particular results.
We have a ~20-minute Wednesday recap meeting with the whole team, where we discuss the findings and the patterns we've seen emerge. We compare what users told us against what we saw on screen, and talk through possible solutions. Some weeks the testing provides a clear win. Other weeks the results are a bit more vague, and we realize we need to rework bits of our UI and testing tasks.
When we started testing, we worked with the company employees who interact with customers on a daily basis, and asked them to refer customers who would be interested in testing. After the first few weeks, we now have a strong list of interested customers scheduled for the next 5 weeks, each with their own experience levels in different parts of the application.
Customer feedback has been overwhelmingly positive; they're happy to join us for 20 minutes to chat and walk us through their pain points. You often can't act on a single suggestion from one customer, but when several customers all raise the same issue, you discover a pattern pointing to gaps in the product and workflow. These are issues that are costing the company money, and they would never have been addressed had user testing not been initiated.
Since most of the customers are remote and sit deep within large enterprise companies, we've had to figure out a testing workflow that would actually succeed. Despite all of the great user-testing video products out there, we deviated from them all and use WebEx for our testing, for a variety of reasons.
Our testers were already comfortable with using WebEx, and didn’t have to install any new software on their machines. This helps remove any barrier to entry for them, and eases any anxiety about testing or setup.
WebEx offers the ability to screen share and video chat (if the computer has a camera). We're able to view testers' screens as they walk through the test, and watch their facial expressions if they've turned on their camera. While this isn't as good as being in person, it's still very effective. We've had great results with it, and it's been fascinating to compare what a user says, where their eyes are looking, and what they're actually doing in order to discover their intentions and roadblocks.
WebEx allows screen recording. Before we record, we always ask the tester for permission, and so far, after testing over 20 people, only one customer has declined. Screen recordings are great for documentation, and for post-test review if we need to take additional notes or have specific questions about a user's interaction with the testing tasks. Recordings are also a great resource for sharing a test with other members of the team.
Speaking of sharing, WebEx allows multiple people on the call, which is great for testing. As consultants, we want to show our clients the benefit and value of our work. With these tests, we've had our main stakeholder join so she can witness the user interactions firsthand, along with several developers who have been on the team for years. It's really valuable (and I highly encourage it) to have developers join testing sessions. You don't need every developer on every test, but having them join 1 or 2 sessions a week helps instill empathy as they watch users interact with the product they've been building. This creates a tighter bond in the project team, and you can visibly see more people start caring about the product.
We're consultants on this project, so our time here has an end date. We want to make sure the testing continues after we leave, so we've thought about how to make it as easy as possible. We've created testing script templates for the team, and each week we walk different team members through our scripts and how we created them. We've shown how to test a variety of materials, from low-fidelity mockups in an InVision prototype to more detailed HTML/CSS prototypes for specific interactions. We have a long list of customers willing to test again or for the first time, we've created a weekly testing schedule, and we've demonstrated its benefits.
Reducing all of these barriers to entry helps ensure that the testing will continue when we move on, helping this organization shift to an iterative, data-informed process.