Measure Twice, Cut Once: Introducing Usability Testing Into Our Design Process
A few months ago our UX team made the decision to begin implementing usability studies and user interviews into our design process. Even after 5 years in the game this isn’t a huge deal considering Unbounce already has a pretty close connection with its customers (our Customer Success team rocks). To ensure we are building what our users want, we take feature requests into account when planning our product roadmap. However, adding customer interviews and usability testing to the mix will allow us to improve the product experience well before we actually start building anything. This in turn gives not only our customers, but our developers and the rest of the team, more confidence in what we are building.
So, User Testing. Let’s do it. I’ll grab my laptop, some petty cash and head down to Steamworks and buy beer for any willing participants. Weelllll not so fast. Although this guerrilla-style user testing may work for some products, Unbounce’s user base is pretty specific and a lot of the features work best if you have some knowledge of online marketing (i.e. knowing what the hell a landing page is). Not knowing how to conduct a proper test myself, I set out to research, compile, and create a process so that any Unbouncer could be well equipped to run a test themselves.
Armed with the internet and the network of the Vancouver tech community, I began setting up coffees, beers, and lunches with people who had some experience conducting usability tests themselves. I knew that in order to have an educated discussion, it is best to do your homework beforehand so it is less of an interview and more of a conversation. So I scheduled my meetings and started googling.
The awesome thing about this industry is that because it is growing so rapidly, a lot of ideas are really fresh. UX professionals are experimenting with new concepts and then immediately sharing their findings on Medium and/or their own company blog *wink wink*. The open-source web is not limited to just code; case in point: Google Ventures. By way of some articles on Medium I found myself on the Google Ventures site which is loooaaaddeeddd with resources. Michael Margolis and the GV team have put together an excellent package for anyone who happened to be in my shoes. Which was no one. Because I work with my shoes off… Heh heh heh.
Feeling more comfortable with the idea of user testing, I reached out to Bradley Smith, Senior Interaction Designer at Habanero and my mentor from my time at VFS. Pen ready and Moleskine in hand, I picked his brain about things like recruitment, prototyping tools, in-house testing vs. remote, and how he sets up his usability labs. Brad shed some light on a number of things that I had not considered, one in particular being the importance of having a device for each type of user.
Mac users, remember the first time you made the switch from PC? Or the last time you had to switch back to a PC? It can be pretty frustrating. Is it Ctrl or Command? Which side do I close the window on? Where the hell is my right click?
Now, this isn’t the end of the world, but if you are conducting user tests, you do not want your participants to have that immediate feeling of discomfort when they sit down at the computer (or other device if you are testing for mobile). In a perfect-usability-tested-world, you would have a device to accommodate all of your users’ technological preferences.
With this in mind, I dug through Intercom to get an idea of what OS our customers are using. Among active users in the last 30 days (at the time), 8,046 were using Windows; 7,644 were on Mac; 179 on Linux; 55 on Ubuntu; and about 1,679 were unknown. With roughly half of our users on Mac and the other half on Windows, this immediately suggests that we need at least two testing stations: one Mac and one PC. This data is valuable beyond testing, too: it gives us a good idea of what behaviour our users may expect based on their OS.
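For context, the split works out like this. This is just a quick sketch doing the arithmetic on the counts quoted above; the dictionary and names are mine, not anything pulled from Intercom’s API:

```python
# Active-user OS breakdown from Intercom (last 30 days, at the time of writing).
# The raw counts are the ones quoted in the post; the percentages are derived.
os_counts = {
    "Windows": 8046,
    "Mac": 7644,
    "Linux": 179,
    "Ubuntu": 55,
    "Unknown": 1679,
}

total = sum(os_counts.values())  # 17,603 active users

# Share of active users per OS, rounded to one decimal place.
shares = {name: round(100 * count / total, 1) for name, count in os_counts.items()}

for name, pct in shares.items():
    print(f"{name}: {pct}%")

# Windows (~45.7%) and Mac (~43.4%) together cover roughly 89% of active users,
# which is why two testing stations (one Mac, one PC) cover almost everyone.
```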
My next meeting was with my product-partner-in-crime Vivi and her mentor Kristy Streefkerk, Senior UX Designer at Quickmobile here in Vancouver. Kristy taught Vivi and me Information Architecture and Usability Testing at VFS, so she seemed like the best candidate to teach us more about setting up our own tests at Unbounce. We went for beers to catch up and then pretty quickly got down to business. Before I knew it, Kristy’s insights were scribbled in my notebook and Vivi and I had a bunch of things to go on.
She outlined two reasons for testing: testing to identify existing pain points and testing a new feature. The first is intended for new users with fresh experiences and would require some form of recruitment from outside our existing user base. The latter depends on previous experience with the product so that the user is not preoccupied with learning an entirely new system. Instead they can focus on testing the proposed addition to what they already know. This seemed like a better fit for our current situation and I will explain why in a second.
The common element between my research and the meetings with Brad and Kristy was the importance of creating an environment that puts participants at ease. The whole idea of user testing is not a trial of your users’ ability, but a trial of your design’s functionality. In order to get the best results out of your test, you want to provide a setting that gives the same level of comfort as if your user was at home or in their workspace. Remote tests can obviously be done in these locations, but in-house testing is different.
So, are they used to working in a quiet space? A busy office? Do they work at home with their cat? Again, what kind of computer do they use? Most people are not comfortable in a foreign office being recorded by a stranger who is asking them to perform tasks on a computer they have never used before. It can be a little “Buffalo Bill-ish”. Now I’m not saying that if a person is used to wrestling with their cat for keyboard space that you should have a kitten on hand for tests. Just try to remember that your participant’s level of comfort will affect the success of your tests based on the feedback that they give. The more comfortable they are, the more honest their feedback will be.
At the moment we don’t have a super fancy usability testing lab with 2-way mirrors, designated computers, and comfort-inducing kittens. We also happen to be growing like wildfire and are running out of space (okay, *have* run out of space). Having a customer come in for a chat, let alone to do some testing, could be a bit of an intimidating experience. This immediately spoils our testing environment.
So ok, who do we know that has definitely used our product before and will also feel comfortable enough to give no-bullshit, honest feedback? What about our team?
Well that is exactly what we are doing. We have been encroaching on our Customer Success team’s webinar space and using a combination of different software. MarvelApp has allowed us to set up clickable prototypes with high-fidelity mockups, and we have been using GoToMeeting for screen sharing as well as recording sessions for later review. It took some time to settle on what to use, as there are a lot of options and a number of tradeoffs between them. Nothing has given us exactly what we want (Mac and PC compatibility, screen sharing, audio and video recording, picture-in-picture display, and click tracking), but for the time being we are using GoToMeeting. It allows any member of our team to watch the live tests from their desk and also lets us screen share and record. Please comment below if you have any recommendations for a better solution.
Now, a lot of people hold strongly to the opinion that you should be testing fresh users for fresh experiences, but stick with me and I will explain why what we’ve done is ok.
Since I started at Unbounce, I have been working on a feature that would allow users to add, edit and delete script tags on multiple pages from one central location.[2] Conveniently, it is a pretty hefty feature that affects the entirety of our product (not just the app side or just the page builder, but both). This is great for testing because it allows the user to spend a lot more time with it and really get a feel for things. By giving the user tasks that require them to go from one end of your product to the other, you get to test learnability, efficiency, memorability, errors and satisfaction.[3]
As opposed to testing the effectiveness of one street sign, we are testing the navigation of an entire city, allowing us to pick up on behavioural trends and common “bumps” in the proverbial road that is our design.
Now since this is a *NEW FEATURE*, and therefore a ‘fresh experience’ for anyone, we decided to go ahead and use our immediate user base of co-workers. We ran 8 tests with people from our Customer Success and Marketing teams, each with different levels of experience between the App and Page Builder. Again, what is great about this is that every participant was guaranteed to have used our product before, but none of them had ever seen this new feature. Also, because everyone is so comfortable with each other here, it allowed for candid, unfiltered, unintimidated feedback.
If you’re still reading, congratulations, you get a prize. And if you’re still, still reading I’m going to do you a favour and wrap things up.
Regardless of the situation, we knew the value we were missing out on by not testing, so we decided to go ahead with what we had. Out of the 8 tests we ran, we uncovered a number of issues that until that point had gone unnoticed and most likely would have made it into production. I used to work construction and my boss always said, “measure twice, cut once”. This is exactly the idea behind testing. By doing the added work of interviews and testing before going to production, you end up saving time and resources that likely would have been spent on support and feature improvements further down the road. I am extremely happy with our first round of testing and cannot wait to see these tests running on a regular basis.
If you want to play around with the prototype we used for testing you can do so below. Keep in mind that it is a clickable prototype and, although it does look real, a lot of the functionality is not there. Input and form fields will not accept typing; just click on them and they will auto-fill with the inputs that we have predetermined. I am also attaching the script and interview guide we used so you have some more context (I used the Google Ventures guide and made changes to suit our needs).
Also, as we are looking to start testing way more regularly, we’d love to have YOU come get tested. If you are interested, you can sign up here (new users and power users alike).
Lastly, I just want to say a big thank you to Bradley Smith and Kristy Streefkerk, as well as Michael Margolis and the GV team.
[2] This description was actually derived from a combination of the feedback I received from user testing. At the end of each test, I would ask, “How would you describe this feature to the rest of the company?” I used the common elements of each description and this is what I came up with.
[3] Shout out to the Nielsen Norman Group.
Originally published at inside.unbounce.com on June 17, 2014.