
The sooner we share, the better things are

Kicking off product discovery with a panel of users.

Chris Clarke
7 min read · Aug 15, 2014


The product team I work in strongly believes in user research, but we were starting to feel we had become too reliant on a lab environment for all our research.

We wanted to try something different: an approach that was quicker and allowed more iterations before we launched. We wanted to create a user panel.

There are, of course, several ways to gather research feedback: online surveys, lab testing, hallway and guerrilla testing. All beneficial, but not what we were looking for. We had rapid iteration of ideas and scale in mind. Hallway testing was viable in terms of cost, but because the participants would be in-house, the results could be biased and skew the outcome. We wanted to test our assumptions with real users.

These real users needed to be happy to weigh in on half-finished designs, to test our assumptions and confirm we were heading in the right direction. I proposed creating a panel of users whom we could contact directly for feedback on our prototypes.

Quickly showing new and unproven designs to large, unmoderated groups of users was risky. When was it right to put work in front of users? How would they react? Would they engage like they would in a lab? Could it be beneficial? Would it hinder those users' experience with us in the future?

It was untested within the Guardian, so there was concern about showing extremely rough design thinking to users. We felt the risk/reward ratio was worth it.

Gathering users

We added a banner at the top of our homepage, linking to a survey tool with five questions — specific questions aimed at our first round of prototype testing: mobile navigation. The most important part was capturing email addresses, so we could easily contact users for testing. We made sure their details were protected through access-restricted storage and a strict privacy policy.

The initial panel filled up fast — 350 users in under 12 hours! — with a good mix of ages, locations and devices. Everything was collected in a spreadsheet, with contact made through email and links to surveys which all fed back into the spreadsheet. (Spreadsheets are your FRIEND: good for cross-referencing user habits against the type of prototype you want to test, handling interaction with a wide user group and letting you focus on an individual's responses, all at once.)
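
To give a flavour of the kind of cross-referencing a spreadsheet makes easy, here's a minimal Python sketch. The file name and column names are assumptions for illustration, not our actual setup:

```python
import csv

# Hypothetical export of the panel spreadsheet; the file name and
# column names (email, age_range, location, device) are assumed.
def load_panel(path="panel.csv"):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def select_participants(panel, device=None, limit=10):
    """Pick panel users matching a device profile for one round of testing."""
    matched = [u for u in panel if device is None or u["device"] == device]
    return matched[:limit]

panel = load_panel()
mobile_testers = select_participants(panel, device="mobile")
print([u["email"] for u in mobile_testers])
```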

Users viewing prototypes

We created most prototypes using a prototyping tool called Marvel, or as basic HTML prototypes running off our own servers. Any flat designs we tested focused on visual layout and users' grasp of the content on the page. With functional prototypes, the test asked the user to complete a series of tasks and feed back on the success or failure of each one. Nothing was too polished. Everything was up for debate. That was the point: quick to build and throwaway, so we wouldn't fall in love with an idea before we knew it worked.
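
As a rough sketch of how that task feedback might be tallied once it comes back (the tasks and response format here are invented for illustration, not our actual tooling):

```python
from collections import Counter

# Invented example responses from a functional prototype test; each
# entry records whether a panellist completed a given task.
responses = [
    {"task": "open the navigation menu", "completed": True},
    {"task": "open the navigation menu", "completed": False},
    {"task": "find the Sport section", "completed": True},
    {"task": "find the Sport section", "completed": True},
]

# Tally attempts and passes per task to spot where users struggle.
attempts, passes = Counter(), Counter()
for r in responses:
    attempts[r["task"]] += 1
    passes[r["task"]] += r["completed"]

for task, total in attempts.items():
    print(f"{task}: {passes[task]}/{total} completed ({passes[task] / total:.0%})")
```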

Visual prototypes drew the most detailed responses, and asking questions through a link to an online form, rather than simply writing them in the email, proved much more effective: 25 users out of a possible 110 responded to the first survey, while the second brought back over 70.

Testing our new navigation with the panel

When we started to test our new responsive navigation, users were placed into blocks of ten, based on device, interests and general habits on the site. We began by showing prototypes to two groups of five users. Every few days after that initial contact, we would update the prototypes and ramp up how many users participated.
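
Sketched in code, the ramp-up might look something like this. The round sizes follow the numbers in this article; the placeholder addresses and the assumption that each round draws fresh users are mine:

```python
from itertools import islice

# Placeholder panel; in reality users were grouped by device, interest
# and general habits before being split into blocks of ten.
panel = iter(f"user{i}@example.com" for i in range(40))

# Two groups of five to start, then larger rounds every few days
# (round sizes follow the article; some users may overlap in practice).
for round_number, size in enumerate((5, 5, 10, 20), start=1):
    batch = list(islice(panel, size))
    print(f"Round {round_number}: contacting {len(batch)} users")
```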

Here’s an example of a first prototype iteration:

The response was immediate and informative, resulting in small tweaks to the original prototype, which was then sent to another ten users.

There wasn't a huge difference between the first and second iterations, but crucially, user feedback was already having an impact, even if only a minor one at first.

Once we were happy with the direction the prototypes were heading, we worked hard to retouch the design and rebuild it in production code, ready for deeper testing, and sent it to 20 users on the panel.

Design by Katrina Stubbings

This whole process took two weeks. Overall, we showed three prototypes to 30 users. We also tested in our UX lab the following week to get more depth on the initial feedback.

Things we’ve learned: the good

Panels (so far) are responsive

Users responded much better than anticipated. They took to the panel environment quickly and were detailed in their responses, as well as eager to help, timely, friendly and constructive. Not a single user said their experience of the Guardian had been hindered by seeing early prototypes or concepts. We had some concerns about sending too many requests, but we actually found that most users wanted to be contacted even more, and to be exposed to many more concepts in the process.

Panels are fast

After the initial setup of the panel, contacting users was relatively straightforward and, provided the tasks, questions and expectations were clear, we could expect responses in a matter of hours. Add scale to the mix and you're looking at a lot of responses over a very short period.

Panels are free (if you want them to be)

The incentive for many of our users was simply knowing that they were helping to improve the site for themselves and others. They didn't need a monetary incentive. That said, we are still exploring how we can incentivise our panel without skewing the results.

The not so good

We don’t always get the outcome we expect

  • Questions cannot be re-explained or delved into over email, so panels can miss the point or misinterpret the question.
  • Panels won't always perform tasks the way you intended. Being explicit over email can be a challenge when you want to get to the point quickly.
  • Anything the user doesn't understand is on you for not being detailed enough from the start.

Panels take effort

Users will engage given the right amount of contact. Don't ask too many questions or demand too much of them, and don't email them too often. We kept a rule of contacting each user no more than three times a month, and stated that explicitly in the first email.

Unless there is a monetary incentive, contacting users too often can put them off. To keep users interested in tests, vary your tasks and questions.

Most of the time, the effort paid off on both sides. The team shared their ideas with our real users, and the panel got to see how their feedback helped shape iterations of prototypes and, in a lot of cases, features that went live. The final step was to show participants the results and, of course, thank them for their input.

Final thoughts

Make a point of gathering a set of users very early in — or even before — the discovery phase of a project. That way they will be somewhat prepared to be contacted, and you will be prepared when testing is needed. When you approach discovery again, those users might still be willing to help.

Users don't all respond when you want them to, so be prepared to wait after you contact them.

Creating a large panel of users is of course much easier in a large product organisation, where you have hundreds of people ready and willing to offer their help. However, even if you work in a smaller organisation, I would recommend trying to create as large a panel as you can, especially if you don't have access to a lab or a face-to-face testing environment.

Any confusion on the user's part is on you, not them. If you get confused feedback, make a note to be more explicit in your next contact about what you're asking them to do. If there's pushback in your team when you suggest panels, set up a test panel with a small number of users first and try that. If you're lucky and the turnaround is quick, it might be just the example you need to prove to other team members that it's worth doing.

Next steps

Just over a month later, we're still using the panel, and it's getting better and better. We're testing in week-long stints, showing prototypes every day, refining and always moving forward. The panel has improved our lab sessions as well, making sure the tests we perform in the lab use the most viable product.

From here, we're looking into one-to-one engagement to explore remote international testing without the lengthy trips and setup between sessions. If it's successful, there's no doubt it's something we'll recommend whenever user research and testing come up for discussion.

Thanks to Penny Allen, Nick Haley and GiGi Demming for helping out with this.

If you liked this post, I'd appreciate it if you hit recommend or shared it with a friend! You can also follow me on Twitter.
