DIY Usability Testing from the Trenches

Or how FiscalNote jumped into usability testing on a shoestring budget — and you can too.

John Zoshak
FiscalNoteworthy
May 26, 2015

--

DIY usability testing is cheap, easy — and incredibly valuable.

When people mention usability testing it can conjure images of one-way mirrors, expensive software, nightmarish recruitment, and a gigantic investment of time. However, usability testing doesn’t have to be arduous, and it’s one of the best investments your company can make in improving the product. In fact, with your laptop, a screen-sharing service, and some patience, you can get a powerful but lightweight usability testing operation started at your company.

Why We Conduct Usability Testing

When we sell FiscalNote Prophecy into an organization we take a lot of time to onboard customers — making sure they’ve received appropriate training on the platform, detailing best practices, and getting them comfortable doing their jobs on FiscalNote. You may wonder, if we spend all this time with the customer as they’re being onboarded, why we conduct usability testing at all. Shouldn’t every customer be well-versed in the platform after the onboarding process? Ideally, yes, but since we’re dealing with human beings, things almost never work out that way. Employees come and go, people miss onboardings, and people forget or don’t pay attention. All of this is to say that usability still matters greatly, even when you have a robust customer training process. Customers need to be able to log onto your platform and just “get it.”

The best way to identify your usability flaws is to conduct usability tests on your product, with tasks focused on specific workflows. From there you can make simple tweaks and greatly improve the usability of your product. Even better: DIY methodologies make it inexpensive and easy.

How FiscalNote Does Usability Testing

At FiscalNote we subscribe largely to the methodology outlined by Steve Krug in his book Rocket Surgery Made Easy. We conduct usability testing once a month on three users — giving each user the same set of tasks that focus on a certain aspect of our application. The person who conducts our testing (me) is also one of our product managers (generally larger companies have a dedicated usability tester), but anyone in your company can perform testing as long as they are committed to the methodology and aren’t using it to further their own views about the product. Our methodology can be broken down into five simple steps: tester recruitment, task creation, setting up, testing, and the debrief.

Usability Tester Recruitment

The maxim in Steve Krug’s book is “Recruit loosely and grade on a curve.” We take that to heart in our usability testing recruitment. The core users of FiscalNote Prophecy — our legislative tracking product — are state legislative affairs professionals who track bills in multiple states. Luckily, we’re based in Washington, DC so it’s fairly easy to engage these people. However, we don’t limit our tester pipeline to just legislative affairs professionals. Policy professionals, regulatory affairs workers, lawyers, and even our new employees have been included in our testing pipeline. The key point is to grade on a curve — for example if an employee struggles with something we know a state legislative affairs person would get (e.g. typing a bill number into search), we discount that in terms of a potential fix. We also get our whole company involved in recruitment by having a shared document of our testing candidates so anyone can type in a potential recruit.

How do we get people to agree to come do testing for us? Just by asking (see sample e-mail below). We’ve found that people are more than willing to give an hour to a startup that is attacking a problem they face every day. It’s a good way for them to get out of the office, stretch their legs, and try something new. We haven’t run into any issues in getting people to agree to come test the product. To make a good impression, our testers leave with a handwritten thank-you note and a gift card to Starbucks; it’s the least we can do.

It was great seeing you in Nashville. Hope everything is going swimmingly at [redacted].

I was wondering if you wouldn’t mind helping me out with something. I’ve recently transitioned into a product manager role at FN and part of those duties include conducting usability tests of the product to identify areas where we are weak/confusing. It’d be a 30m thing and hugely helpful to me as we continue to improve what we’re offering.

Do you think this is something you might be interested in doing? Let me know if so, I was thinking end of the month depending on your schedule.

Do’s and Don’ts of Recruitment

  • DO leave a thank-you note and some kind of gift (everyone loves coffee)
  • DO keep a running list of potential testers and ask the whole company to contribute
  • DON’T hesitate to ask people to come test just because they’re not your target customer

The Tasks

For our most recent usability testing we wanted to test the basic workflow of our platform. So we wrote out five simple tasks for users to complete on our site. Each task built on the last one, so we could test if the workflow, as we imagined it, made sense. For example, we created tasks to direct the users to search for a bill, add it to their bookmarks, and create an alert for the introduction of new bills on that topic.

We stray a little bit from the Krug methodology for creating our tasks. Instead of writing up stories for the users, we just ask them how they would complete a task on our application, and then ask them to attempt to do it. For example, we wanted to see if performing a search was as easy and intuitive as we thought it was. So we asked “How would you search for a specific bill on our site?” This type of question prompts the user to start speaking (which is where you get all the valuable information; more on that below) and then they attempt to do the task.

We recommend keeping the tasks around a set theme for the usability testing for that month. This will help keep the testing debrief focused, and the tasks will be easier to write. Since you’ll be conducting this type of testing monthly, you’ll have plenty of testers to test every aspect of your site at some point. We also recommend writing the tasks well in advance; this way you can confirm with others that the tasks are well written and clear.

Do’s and Don’ts of Task Writing

  • DO keep your tasks around a theme
  • DO get a co-worker to review the tasks before the testing
  • DO write tasks that encourage users to speak their thoughts
  • DON’T write your tasks last minute (there will be clarity problems)

The Set Up

When we conduct our usability tests, we have our observers in one room, and myself and the tester in another room. We connect the two rooms by a conference phone line and a screen sharing service (we use Join.Me but any will work). My computer also has Apple’s native screen recording software running to record the entirety of the user’s experience. We also use Mouseposé to highlight clicks during the testing — our observers find this helpful, but it’s not strictly necessary.

Amy, our marketing director, joins Michael, Rishi, Peter, Grant, and Kris in the observer room. Together they represent policy, business development, engineering, and QA. We try to recruit from across the company.

So the basic set-up is a computer, a screen sharing service, some kind of voice connection (either VOIP or phone), and a screen recorder. Camtasia is a very popular choice for screen recording, but given that Mac OS X comes with a free one, we haven’t taken the plunge on anything advanced yet.

Do’s and Don’ts of Set Up

  • DO make sure everything is working 30 minutes before the first tester arrives
  • DON’T forget to turn on your screen recorder
  • DON’T spend too much on any unnecessary software — the basics work for your first several tests

The Test

Once you have your tester present and everything set up and ready to go, the test is the next (and trickiest) part of the process. Before you jump into the tasks, I highly recommend that you run through Steve Krug’s script (adjusting it slightly for your own needs). This sets expectations, gets the tester comfortable, and reminds them to think aloud. Getting the tester to think aloud is the most important part of usability testing and where you will get the most actionable feedback. For example, they’ll say “When I clicked on this, I expected xyz to happen.” This is very helpful, because those types of issues tend to hide in plain sight and are usually easy to fix.

However, managing the tester is often the most difficult part of usability testing. First, it’s incredibly unnatural to be narrating your activities. Getting the user to do this is fairly simple: just ask them, “What are you thinking right now?” Asking them over and over can be annoying to both the user and yourself, but you owe it to your product to keep asking that question. The second the user starts speaking, insights begin to pour out. This is why setting expectations by reading the script first is so helpful: the tester will expect to be annoyed just a little.

Our tester (right) navigates through the first task, while I observe, trying to look as professional as possible.

The next issue with testers is that usability testing can make them nervous, particularly since there’s a room of people watching every action they take. To mitigate this, I introduce them to the observer room beforehand and make sure to emphasize that we’re testing the product, not them.

The last issue is watching out for too much frustration. The rule in Rocket Surgery Made Easy is to leave participants no worse off than when they came in. I have yet to run into a situation where a participant gets so frustrated they want to quit, but setting expectations is key. This is where Krug’s script comes in handy, though I find it helps to emphasize this point before the script as well. Explain that you’re looking to improve the product and, in order to do that, you need to see where users struggle. Emphasize that your goal is to make a product that is intuitive enough for anyone to log in and understand, regardless of any customer training.

Basically, it’s going to be a little awkward, and that’s the point.

Finally, when you’ve completed all the tasks, hand them your handwritten thank-you note, ask them if they have any questions, and profusely thank them for their time.

Do’s and Don’ts of The Test

  • DO make use of Steve Krug’s script
  • DO put the user at ease and set expectations
  • DO keep the user talking during the test
  • DON’T forget to ask them “What are you thinking right now?”
  • DON’T let your user get frustrated with a task

The Debrief

Once all the testing is complete, I gather my observers in one room and we discuss what surfaced as major usability problems. To facilitate this we open up a shared Google doc and we each write down the top three usability issues we perceived for each user. There is generally broad agreement as to the biggest problems. We save a separate document for each month of usability testing.

Once that document is complete, it goes over to the product team (composed of our product managers, our UI/UX folks, and our policy managers), who then discuss improvements and scheduling for tweaks. If anyone from the product team wasn’t able to make the usability testing, they are required to watch the testing videos before that discussion.

One final note on observers — we try to pull from everywhere in the company, but we make sure at least one front-end engineer is present and our product team is required to be there to the best of their ability. Other than those requirements, we pull in folks from marketing, business, data science — anyone who has an interest in usability testing. However, try to keep the room from becoming too big. During our first test the entire company showed up and it was chaotic. Now we aim for six to eight observers.

Do’s and Don’ts for The Debrief

  • DO focus on the biggest usability problems
  • DO recruit observers from across your company
  • DO have at least one front-end engineer and your product team present
  • DON’T invite the whole company — it will be unfocused and chaotic

This is not the end

The above are all of the insights we’ve gleaned from our usability testing thus far. We’ve made some mistakes, done some things right, and overall have learned a lot about our product. We absolutely see value in conducting small monthly testing — and we firmly believe that you can obtain actionable insights even with a DIY methodology.

That said, we’re going to continue to iterate and improve upon our process. For example, we’d love to edit our usability test videos down to bite-size chunks for the rest of FiscalNote to digest, but we’re hampered a little bit by our freeware tools. We’d also like to tweak our debrief process — three usability tests plus a debrief meeting in one day is a lot of time to ask of anyone in the company.

We hope you found this view from the trenches useful. If you have any comments, questions, or tips, feel free to send them to john@fiscalnote.com.


Chronic disease fighter, healthcare nerd, biologist/product manager. Currently at Noyo, where we’re helping make healthcare simple.