Experiments in User Testing: the Pop-up Lab
In my previous job as a UX designer for a large library system, I enjoyed a significant advantage when it came to user testing: a captive audience. Every day hundreds of people visited the main branch, where my office was — not to mention the thousands who used our website. Anytime I wanted, I could sit in the lobby with a laptop or an iPad and run some moderated tests on new features. Or I could put a link at the top of our home page that said: “Got five minutes to help the library?” and invite users to take a short survey or click test. We always got a good response to either method (endearingly, patrons were almost always happy to take some time to help us).
Testing Infor.com is a bit more challenging. Our users don’t walk in the door of our offices by the hundreds every day, so I can’t just go downstairs and grab a few. As for putting a link to a test at the top of the home page, I was wary of distracting users from what is usually a very targeted, focused hunt for information — I wouldn’t like to annoy a potential customer away from the site altogether. And anyway, any test deployed on the site would be unmoderated — where could I go to find our site visitors in person?
The answer, of course, is Inforum — Infor’s annual convention for customers, partners, and users across the globe.
I joined Hook & Loop in June of 2015, so I had never been to Inforum. But when we started planning this year’s event (the biggest yet: in NYC for the first time, hosting 7,000 attendees), I realized it was a great opportunity to run live user testing at the single biggest gathering of Infor’s customers in the world. I proposed the idea — I was picturing a desk and an iPad, like before. But our creative director started sketching individual rooms, computers, a waiting area; in short, a real testing lab.
The first order of business was to round up my fellow information architects and start exploring ways to execute the idea. We wanted not only to test website designs, but also to collect data on product-related features that were being developed. We talked about what we were all working on, and what might be ready to test by Inforum (we started planning in early March, so we had a little over 4 months before the event). Eventually the plan coalesced into the following menu of tests:
· Ming.le collaboration tool
· HCM Candidate Space
· XM mobile design
· LN Automotive Workbench
· Mongoose PTO phone app
· SoHo menu design
· Infor.com home page design
Then we needed to decide what we required in order to run the tests. We were all going to have browser-based prototypes, so nothing more complicated than a standard modern browser would be needed to present them to the user. We all use InVision or UXPin to build our prototypes (although for my test I was using an HTML/CSS code prototype), so it would be easy to create a page of bookmarks from which all the tests could be run.
Some prototypes were specifically for mobile, so we would need phones and tablets in order to properly replicate the experience. Initially we discussed figuring out how users could run the tests on their own devices, thus allowing us to watch them in their own context, but that ended up being more trouble than it was worth. However, that consideration led me to investigate the feasibility of a mobile recording rig, which our excellent video team put together for me with a GoPro and an adjustable arm. We’ll keep this plan in our pockets for future testing!
We proposed the following to our colleagues in charge of planning the physical space:
3 enclosed testing rooms, each with:
· 1 MacBook Pro with webcam
· 1 Thunderbolt display
· 1 mouse w/ scroll button
Additional mobile testing devices:
· 2 tablets, 2 phones
(We specifically asked for mice with scroll buttons because most of our users have slightly older equipment — no use testing with a Magic Mouse if that’s not what they use!)
Once the physical space was approved, we were able to finalize our plan, including what kind of software we would need. To record the sessions, we decided to use Lookback. While Lookback is currently Mac-only, we knew we’d be getting MacBooks for the rooms and iPads/iPod Touches for the mobile testing, so it would work in all our contexts. (I had not used Lookback before, having used Silverback in the past, and was happy to discover how easy it is to use.)
The most important part of this plan was that we designed each test to take no more than 10 minutes. Those of us who had done similar kinds of testing before knew from experience that sign-up sheets don’t work that well; therefore, we were planning to grab attendees as they came into the space and invite them to do a test. That meant the tests could not be dauntingly long — no conference attendee wants to spend half their lunch hour in a 6x6 UX lab!
We shared the space with our colleagues who were giving product demos at “Apple store”-style desks. We planned that the dual purposes of the space could feed into each other: an attendee could watch a demo, and then a staff member could direct them to the lab to critique actual products not yet ready for demo. Or once a test was over, we could send a participant to see some finished products — the result of user input!
We didn’t get to see the space itself before the event, but it was very impressive — due in no small part to the fantastic wall design by Josh Davis, Art Director for the brand team.
Josh’s design of the lab walls was inspired by the idea that pieces of an interface make up a whole user experience. Josh drew the little wireframe sketches by hand on paper with a marker, and then scanned them. The shapes became a sort of background texture, like a print fabric, allowing the copy to visually come forward and be the focus.
The copy that was chosen (some stats about the effectiveness of a user testing process, and a quote from our UX Manager for Product, Parisa Bazl) was intended to educate attendees about the importance and impact of user testing. Josh wanted this mixture of words and visual cues to underline the value of what was happening inside the lab.
The only thing that was a little challenging about the lab design was that the doors to the testing rooms (which we rented, and didn’t see until Day 1) were on the “back” of the lab unit. That turned out to be OK, though; we quickly learned to make sure a few of us were hanging out on that side to intercept curious attendees and invite them to test!
Over 4 days (2 full days, 2 half days), we ran 96 tests total. Some tests were run more often than others (the site home page test and our collaboration application, Ming.le, were particularly popular), but all but one test drew enough participants to produce usable data.
We stationed some staff in front of the lab to receive and chat with attendees and to invite them to the tests, as well as behind the lab (which turned out to be a more trafficked area than we had assumed it would be). We quickly got into a routine of: 1) intercept/invite 2) run test 3) and the most important part, give “Beta Tester” sticker! There was a steady stream of participants throughout the day; we even had to turn people away a few times.
There was one more aspect to this experiment. Once Inforum was over, I didn’t want the testing to be over too. So I came up with the idea of a “Beta Tester Community.” We asked everyone who ran a test if they were willing to receive a handful of emails a year inviting them to run similarly short, browser-based tests. All they had to do was give their email. (We set out a bowl to collect business cards from anyone who didn’t have time to run a test with us.) Almost all our test participants opted in to this pilot program to continue testing prototypes and designs.
The results/next time
We ran 96 tests during Inforum, and 57 participants signed up for the Beta Tester Community. We all got valuable data, and we all reported that many of the participants were intrigued by the concept of user testing and excited to take part in it. One of my fellow researchers (analyst Elan Nahman-Stouffer) put it like this:
“Thinking back, I realize most of the conversations I had were more about informing people what Hook & Loop is, what our purpose is and showing some of our work.”
Afterwards, when we all got together to talk it over, we agreed that the experiment had been successful, but we did have thoughts about what to do differently next time.
First, we agreed we should cross-train each other on the different tests, so that we were not limited in terms of testing availability. We also agreed that we should have developed some self-guided surveys or tests — something a participant could complete by themselves on an iPad — so that during the moments all the testing rooms were full, interested parties could still participate.
Elan’s comment about showing off what we do is illustrative of the continuing importance of education and communication around user-centered design. So many users still don’t understand how important their input is to our work, and after this experience, we all wanted more than ever to demonstrate their vital contribution. Elan also commented that the design of the physical space left us on the outskirts of the central product demo area, and therefore more in need of contextualization. We’ll plan thoughtfully next time about how the physical space dictates the way we communicate our purpose, relate to the other areas, and engage participants in a way that describes the customer/designer relationship more fully.
(We also realized that we should have dressed up in lab coats. If we’re going to blind them with science, we should really look the part.)
You can make your own!
Obviously our pop-up lab had some resources behind it to help it look awesome, but yours doesn’t have to be fancy. All you really need is:
• A solid testing script: what questions you want to ask, what you hope to learn from the tests.
• A time limit: setting the participant’s expectation that the test will only take 10 minutes goes a long way toward gaining their goodwill.
• Reliable equipment: again, doesn’t have to be fancy, just needs to serve your purpose.
• A backup plan in case something goes wrong: for example, if you’re planning to use any kind of online testing environment such as InVision, make sure you have a “hard copy” of the test (could be the prototype downloaded onto your laptop, could be a paper printout).
• A designated location: this could be as simple as a folding table with 2 chairs and a sign. Just something that announces what you’re doing!
Here’s what you don’t need:
• Sign up sheets: they don’t work. Instead, invite people to take a test right away with a cheery smile and a simple “Hi! Would you like to take 5–10 minutes to help critique some designs?” (If you’re not good at this part, find an outgoing/extroverted colleague to help you! Full disclosure: I am terrible at this part.)
• A ton of participants: this kind of testing tends to show patterns of behavior after about 5 tests. If you can get 10 participants to take your tests, chances are you’ll uncover about 80% of the issues with your design.
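That “5 tests surface the patterns, 10 uncover ~80% of issues” rule of thumb lines up with a commonly cited problem-discovery model from usability research: the fraction of issues found by n participants is 1 − (1 − L)^n, where L is the average per-participant discovery rate (≈31% is the figure usually quoted, though it varies by product and test). As a rough sketch — the rate is an assumption, not a guarantee — you can see how quickly the curve flattens:

```python
def issues_found(n: int, discovery_rate: float = 0.31) -> float:
    """Estimated fraction of usability issues uncovered by n participants,
    per the 1 - (1 - L)^n discovery model. The default rate of 0.31 is the
    commonly quoted average; treat it as an assumption for your own tests."""
    return 1 - (1 - discovery_rate) ** n

if __name__ == "__main__":
    for n in (1, 5, 10):
        # Diminishing returns: each added participant re-finds mostly known issues.
        print(f"{n:2d} participants -> ~{issues_found(n):.0%} of issues")
```

With the 0.31 rate, 5 participants land in the mid-80% range and 10 push past 95% — which is why a steady trickle of conference attendees, rather than a huge recruited panel, was enough to produce usable data.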
Building your own pop-up lab is easy! Remember — people like seeing under the hood of design processes. Invite them behind the scenes and not only will you collect some great feedback, but also your participants will learn something too.
(And if you would like to join our Tester Community, you can sign up online!)
Update: want to see how user testing went in 2017?
Big thanks to my fellow IAs and researchers:
The H&L booth “barkers”:
Karen Van Houten