Case Study: Design Testing and Validation

C. Kyle Jacobsen
Jul 19, 2015


Allow me to set the stage. I’ve been building product for a long time. I’ve built product in waterfall, Agile, Lean, and you-name-it methodologies, and the result was always the same. At some point I would synthesize customer feedback (mostly gut feel), dump it into the laps of the development teams, and the mistakes kept piling up. About six months ago I started a new job with yet another methodology (insert sarcastic smile); little did I know the effect it would have.

We had talked to a few customers and understood the problem — getting people into the application was not easy. In fact, it was bad. Here we were… a software company offering group plans and getting people into the application was impossible. So, we talked to clients. We analyzed market offerings. We checked out competitors. Then we wireframed some ideas and started to test.

The following describes my experience with Kenji Bankhead, the best designer I’ve ever worked with, testing our wireframes from start to finish.

Round One: Wireframes and Wasted Time

So somehow we thought the most logical place to start was by asking the user to describe their organization structure. This seems logical, right? They are buying licenses for their people and have teams all across the world. So let them describe how their organization is structured and they can build teams and people at once. Within minutes of the first call we knew there was a problem — our design was wrong. Really, really wrong.

Our first tested wireframe. We tested with seven clients and immediately knew it was wrong.

Going into this I was confident that we had created a great design. In fact, so confident that I had scheduled seven (yep, that’s 7) design tests for the day. Wow, what a mistake. By the end of the day… we were smoked. Worse, we were frustrated because we had to listen to seven different people tell us our design was stupid and broken. Result — sadness.

The next iteration was very different. We killed the whole “tell us about your organization structure” concept and went directly to creating a single team. At this point you may be asking yourself, “But wait, isn’t the problem about getting people into the application and not just creating a team?” Well, yeah. We made another mistake here (don’t worry, we will make many more). Clearly, we needed more Voice of the Customer calls before we started to design.

Version 2: adding users at this point but still not focused on solving the pain.

Next we added some purchasing to the mix. We needed to give our users some context so from this point forward we tested purchasing and inviting at the same time. We also started to issue objectives to our testing partners so they knew what to do. This really improved the discussion because asking someone to click around on a screen and provide feedback without objectives is like asking someone to kick field goals in the dark — frustrating. (Note: the experience we designed included ~12 screens which the user had to click through)

Version 3: We added purchasing for contextual purposes. People were going to buy before they added new people and we needed to set the stage for our users.

So, there we were, Round 1 of testing in the bag. Our designs were making progress, but not without major mistakes and wasted time. We learned that providing context improved feedback. We learned to space the product testing sessions out over time. We learned that we needed more Voice of the Customer calls before we started to design. And we learned that having objectives made the test users comfortable.

Round Two: Door Number 1 or Door Number 2?

At this point we were ready to move into high-fidelity mocks and decided to make a major shift in the UI. Following our first round of testing we settled on two solutions to test:

Door Number 1: Users can invite people by adding emails and let Pluralsight invite the user, OR

Door Number 2: Users can invite people by sharing a URL which let anyone (on their domain) create an account

These solutions were both fast and easy for the users. What we didn’t know was how to present these options. So we started by presenting both options on the same page and immediately learned it was a mistake.

Version 4: We offered both options and it failed miserably.

Having both options in the same screen really confused our users. In test after test our users thought the two options were connected (they were trying to add emails of their teammates and then tried to get the link to share with them) — apparently they didn’t see the “OR” part of the screen.

Kenji and I were frustrated. But they were right. Our design was flawed. Too many CTAs (Call to Action) on a single screen. So… we broke out the options and let users toggle between the two.

Version 5 / Option 1: Users can simply paste emails into the screen and we will send the invites out.
Version 5 / Option 2: Users can copy a link and send it to anyone.
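Under the hood, Option 1 boils down to one chore: turning whatever blob of addresses a user pastes into a clean invite list. This is a hypothetical sketch, not Pluralsight’s actual code; the function name and validation rules are my own assumptions about what “simply paste emails into the screen” implies.

```python
import re

# Loose email shape: something@something.something, no whitespace.
# (Hypothetical; real products often accept anything and let the
# mail server bounce bad addresses.)
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def parse_pasted_emails(blob: str) -> list[str]:
    """Split a free-form pasted blob (commas, semicolons, newlines,
    spaces) into de-duplicated, plausibly valid email addresses,
    preserving first-seen order."""
    candidates = re.split(r"[,;\s]+", blob.strip())
    seen: set[str] = set()
    valid: list[str] = []
    for addr in candidates:
        addr = addr.strip().lower()
        if addr and EMAIL_RE.match(addr) and addr not in seen:
            seen.add(addr)
            valid.append(addr)
    return valid
```

Accepting messy input like this is part of why the option tested well: the user never has to format anything, they just paste and go.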

Success! Breaking the options out made it easy for users to understand what they had to do. However, now we were seeing users struggle with Door Number 2 (more on that later). So our next question was… which option should be presented first?

Turns out it was 50/50. After testing ~12 times there was no clear winner. We hoped the users would select the default option, but no dice. So, we weighed the options and selected for them. Door Number 2 was chosen for two reasons.

First, this is a much easier path for the user to get people in the application.

Second, we believe this option will drive higher product usage (this is just an assumption but we plan to test this assumption in production).

Round 2 lessons learned… too many options is a bad idea. People get confused and can mix them up. Also, testing doesn’t always reveal the correct design — sometimes you have to pick a path and test it after you roll to production.

Another note worth mentioning… we really struggled in this phase because I was speaking too much in our testing sessions. I wasn’t explicitly teaching them how to complete the tasks, but my questions were leading them to the solution, which was a terrible mistake. Luckily, this was pointed out to me, so I was able to shift my approach, and as a result we immediately got honest responses.

Round Three: They just don’t get it?

It seemed so simple. So. Very. Simple. But explaining the “invite people by sharing a URL” option (Door Number 2) wasn’t easy. On the surface, this solution is simple, but not for our users. They are accustomed to more traditional methods, so how could we make it clear for everyone? We started by creating several versions and asking people to explain each.

Here we tried to put the CTA first and let them add domains. Domains control who can get an account.
When people showed concern over sharing a link, we flipped the design to make sure they understood they had control.
We then stripped out adding / editing domains.

We exhausted just about every possible scenario. We let them edit their domains (which is the gateway to users signing up). We changed the order by swapping the link and the domain. We stripped everything down.
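The domain list is what keeps a shareable link from being a security hole: anyone can open the URL, but only emails on the admin’s allowlist get an account. A minimal sketch of that gatekeeping, under my own assumptions (the function name and matching rules are hypothetical, not the product’s actual logic):

```python
def can_sign_up(email: str, allowed_domains: set[str]) -> bool:
    """Door Number 2's gate (hypothetical): a shared invite link lets
    anyone reach the signup page, but the account is only created if
    the email's domain is on the admin-edited allowlist."""
    # rpartition returns ("", "", whole_string) when there is no "@",
    # so malformed input falls through to a failed lookup.
    _, _, domain = email.strip().lower().rpartition("@")
    return domain in allowed_domains
```

One screen's worth of logic, but as the testing showed, explaining the concept to users was far harder than implementing it.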

The final version. Well, not really; we pulled out the Twitter reference.

After testing this extensively, Kenji and I decided to provide loose directions, and this seemed to do the trick. People understood it.

Done! We got it. The users love it. Getting people into the application will be so easy now. Which path will they take to invite people? Will usage go up for those that have to go and get a license versus those that require little action? The design is about to hit the store shelves and I can’t wait to see what happens.

Lessons learned:

  1. Providing context to the user improved feedback.
  2. Spacing testing sessions out made it easier to digest the information people were sharing.
  3. We needed more Voice of the Customer calls before we started designing.
  4. Testing objectives made the test users comfortable.
  5. One page equals one Call to Action.
  6. Finally, not all testing reveals the correct design — sometimes you have to pick a path and test it after you roll to production.

--


C. Kyle Jacobsen

CPO and Co-Founder at Everee / Podcast: We Need Another Meeting! / Personal Website: ckylejacobsen.com