Case Study #3 : Feature request development and conducting usability testing

Mike Ivanchyshyn
7 min read · Dec 3, 2018


Product: NetApp Cloud Volumes

Task description: In this case study I showcase the design process of working on a new feature request, applying the new design system, and testing and verifying several hypotheses for the best button placement.

Tools: Abstract, Sketch, Atomic.io, UserTesting.com

Feature request development

In this case study, I started by exploring different options for developing a feature request, and later focused on testing one of the resulting hypotheses. After updating all the UI elements, patterns, and paradigms to the new design system guidelines, I explored possible solutions to problems that already existed, as well as ones that surfaced during the process. I finished by running usability testing of one hypothesis for a CTA button placement.

An initial design of the form

Feature request

In this case, the feature request consisted of adding a new protocol called “X” to the Export policy section. It is available only as an addition to an NFSv3 rule and requires the user to enter an Access Key and a Secret Key to enable it. There was also an existing problem with the table flow: the “Active Directory” settings of the “SMB” rule sat outside the rule section.

In Option 1, I developed a hot fix for the problem, which consisted of adding the X protocol setting below the table. A slightly modified toggle component was added, which required further user testing to validate that its functionality was clear.

In Option 2, I tucked those settings into the Settings button, which solved one of the contextual problems of this case, and explored the possibility of using a modified label component.

Label element that allows enabling/disabling added tags
Exploration of possible options of dealing with the needed functionality of protocols — Option 1, Option 2 (left to right)

Having developed those options, I focused on fixing the table flow with the following options:

Test A couldn’t work, mainly because of the lack of distinction between added rules and rules yet to be added.

Test C solved the previous problem and offered potentially faster completion, but I wasn’t happy with the placement of the Add new rule button.

Exploration of possible protocol options — Test A, Test C (left to right)
Popover with protocol’s configuration settings

In Test D, placing the CTA button in the context it belonged to solved the user’s workflow problem with expected and clear behavior.

Exploration of possible options of dealing with the needed functionality of protocols — Test D1
Exploration of possible options of dealing with the needed functionality of protocols — Test D2

The Test D option solved all of our main problems in a clear manner and fulfilled all of our technical requirements.

The developed version D consisted of:

  • improved protocol toggles that can enable/disable protocols that depend on others
  • a configuration element for adding/editing protocol settings in a popover that doesn’t obstruct the view or workflow and can be displayed when needed
  • an improved version of the data table
  • preserved clarity and consistency of forms according to the new design system.

These options needed to be validated by running user testing sessions. I’ll highlight the different methodologies I used depending on our needs, context, and constraints.

User testing

I will highlight the method I chose for user testing my hypothesis about the CTA button in the “Export policy” section, described above.

Option A: app.atomic.io/d/EcQaPaf8y56Q (CTA inlined the first row)

Option B: app.atomic.io/d/xXa9OY7e6z05 (CTA at the bottom right of the form)

Option D: app.atomic.io/d/SN4mIIDpUjLK (CTA centered at the bottom of the form)

Method 1. Quantified feedback for quick UX/UI design decisions.

I chose the “Quantified feedback” user testing method for this case. It requires little time and effort, and is great for optimizing user experiences and interactions by getting the details right. The recommended number of participants is 25 to 100, using tools such as UserTesting or Usability Hub.

Tasks for the user:

  1. You’re in the Volumes tab. Find the “Export policy” section.
  2. Create rules:
  • to allow clients from IPs 10.10.1.10/255, NFSv4 protocol enabled, with “Read only” access
  • to allow clients from IPs 10.20.1.10/255, SMB protocol enabled, with “Read & Write” access
  • to allow clients from IPs 10.20.1.10/245, X protocol enabled, with “Read & Write” access
  3. Delete the rule with the IP 10.20.1.10/255.
  4. Add a rule with the IP 10.10.5.50/200, NFSv3 and X protocols enabled, with “Read only” access.
  5. Make sure that the access of rules with the SMB protocol enabled is “Read only”.

Questions asked after the tasks are performed:

  1. Was it clear what the “Add new rule” button did? Options: Yes/No/Not really
  2. Did the “Add new rule” button behave the way you thought it would? Options: Yes/No/Not really
  3. Did the form behave the way you expected? Options: Yes/No/Not really
  4. Was it clear which rule was added to the list? Options: Yes/No/Not really

Results:

Measured success of each tested hypothesis.
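To compare how each hypothesis performed, the closed-answer responses can be tallied per question into answer-share percentages. A minimal sketch in Python; the response data below is entirely hypothetical and only illustrates the aggregation, not real test results:

```python
from collections import Counter

# Hypothetical post-task answers for one design variant.
# Question wording mirrors the survey above; the answers are invented.
responses = {
    "Was it clear what the 'Add new rule' button did?":
        ["Yes", "Yes", "Not really", "Yes", "No"],
    "Did the 'Add new rule' button behave the way you thought it would?":
        ["Yes", "No", "Yes", "Yes", "Yes"],
}

def summarize(responses):
    """Return the share of each answer option per question."""
    summary = {}
    for question, answers in responses.items():
        counts = Counter(answers)
        total = len(answers)
        summary[question] = {opt: counts[opt] / total for opt in counts}
    return summary

for question, shares in summarize(responses).items():
    print(question)
    for option, share in sorted(shares.items()):
        print(f"  {option}: {share:.0%}")
```

Running the same tally for each tested variant (A, B, D) gives directly comparable “Yes” rates per question.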

Method 2. Experience poll with a quantitative test.

If I had to test the whole experience of completing the form and the overall user experience, I’d choose the “Experience poll with a quantitative test” method. The effort and time required vary, usually from low to medium. It’s great for testing the whole user experience at a large scale and requires tools like HotJar or other polling tools.

Tasks for the user:

  1. You’re in the Volumes tab. Find the “Export policy” section.
  2. Create rules:
  • to allow clients from IPs 10.10.1.10/255, NFSv4 protocol enabled, with “Read only” access
  • to allow clients from IPs 10.20.1.10/255, SMB protocol enabled, with “Read & Write” access
  • to allow clients from IPs 10.20.1.10/245, X protocol enabled, with “Read & Write” access
  3. Delete the rule with the IP 10.20.1.10/255.
  4. Add a rule with the IP 10.10.5.50/200, NFSv3 and X protocols enabled, with “Read only” access.
  5. Make sure that the access of rules with the SMB protocol enabled is “Read only”.

Measure:

Task completion time.

Questions for polling:

  1. How would you rate your experience today from 1 to 5?
  2. Do you have any feedback or comments on using the Export Policy section?

Results:

Task completion time, qualitative and quantitative feedback based on poll questions.
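The quantitative part of this method boils down to simple descriptive statistics over the measured times and poll ratings. A minimal sketch, using invented numbers purely for illustration:

```python
from statistics import mean, median

# Hypothetical measurements for one tested variant:
# task completion times in seconds and 1-5 experience ratings from the poll.
completion_times = [95.0, 120.5, 88.0, 143.2, 101.7]
ratings = [4, 5, 3, 4, 4]

mean_time = mean(completion_times)      # average time across participants
median_time = median(completion_times)  # robust to one very slow session
avg_rating = mean(ratings)

print(f"Mean completion time:   {mean_time:.1f} s")
print(f"Median completion time: {median_time:.1f} s")
print(f"Average rating:         {avg_rating:.1f} / 5")
```

Reporting the median alongside the mean helps when a single participant gets stuck and skews the average.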

Method 3. Quantified user testing feedback.

If I had to test the whole experience of completing the form or just a new feature, and identify pain points and usability problems, I’d choose the “Quantified user testing feedback” method. The effort and time this method requires are usually medium to high. The number of participants depends on timing and budget; in this case, I’d go for small groups of 5–10 participants.

Questions asked after the user performs the tasks:

  1. Describe your experience of removing a rule.
  2. Describe your experience of adding a new rule.
  3. Describe your experience of removing an added rule.
  4. Did you have problems with the placement of the “Add new rule” button?
  5. What did you expect when clicking the “Add new rule” button?
  6. Do you have any additional comments?

Tasks for the user:

  1. Create rules:
  • to allow clients from IPs 10.10.1.10/255, NFSv4 protocol enabled, with “Read only” access
  • to allow clients from IPs 10.20.1.10/255, SMB protocol enabled, with “Read & Write” access
  • to allow clients from IPs 10.20.1.10/245, X protocol enabled, with “Read & Write” access
  2. Delete the rule with the IP 10.20.1.10/255.
  3. Add a rule with the IP 10.10.5.50/200, NFSv3 and X protocols enabled, with “Read only” access.
  4. Make sure that the access of rules with the SMB protocol enabled is “Read only”.

Users score every data point with a difficulty score from 1 to 7 (1 = most difficult, 7 = easiest) and a success score from 0 to 3 (0 = fail, 1 = indirect fail, 3 = success).

Data points:

  1. Difficulty of adding a new rule
  2. Difficulty of removing a rule
  3. Difficulty of removing an added rule
  4. Difficulty of editing when adding a new rule
  5. Difficulty of editing a rule that has been already added
  6. Difficulty of changing IPs of added rules
  7. Difficulty of changing Access of added rules
  8. Difficulty of changing Protocols of added rules

Results:

Qualitative: Comments and feedback from qualitative questions.

Quantitative: Overall score = Success score + Difficulty score.
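The overall score formula can be computed per data point by averaging success + difficulty across participants. A minimal sketch; the participant scores below are invented for illustration and the task names are a subset of the data points listed above:

```python
# Hypothetical per-participant (difficulty, success) pairs for a few
# of the data points above. Difficulty: 1 (most difficult) to 7 (easiest);
# success: 0 (fail), 1 (indirect fail), or 3 (success).
scores = {
    "Adding a new rule":     [(7, 3), (6, 3), (5, 1)],
    "Removing a rule":       [(4, 3), (6, 3), (7, 3)],
    "Editing an added rule": [(3, 1), (5, 3), (4, 0)],
}

def overall_scores(scores):
    """Overall score per data point = mean of (success + difficulty)."""
    result = {}
    for task, pairs in scores.items():
        totals = [difficulty + success for difficulty, success in pairs]
        result[task] = sum(totals) / len(totals)
    return result

for task, score in overall_scores(scores).items():
    print(f"{task}: {score:.2f} / 10")
```

Lower averages flag the data points (tasks) where users struggled most, which is where the qualitative comments are worth reading first.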
