I joined Nutanix as a UX designer 1.5 years ago, and one of my first projects was redesigning an application called X-Ray. In this article, we will explore why we redesigned X-Ray, explain how we approached the redesign, and look at the features and benefits that this redesign enabled.
What does X-Ray do?
X-Ray gets its name because it is a tool for examining and understanding IT infrastructure, specifically hyper-converged infrastructure, much like a medical X-ray is a tool for examining and understanding the human body. It simulates typical datacenter scenarios to evaluate characteristics of the infrastructure such as performance, scalability, and resiliency. This helps an IT admin make an informed decision about which infrastructure to purchase when creating or expanding a datacenter.
User research helped the team get onto the same page
When I started working on X-Ray, I learned that it had been designed four years earlier and hadn't changed much since. After working on several smaller bug fixes, I realized that there were structural and fundamental interaction issues that bug fixing alone couldn't address. The idea of redesigning X-Ray started germinating.
Even though I felt a redesign was warranted, the development team wasn't entirely convinced, because we had little knowledge of what users actually thought about X-Ray. So I decided to do research to learn how users use the application, understand their mental model, identify their pain points, and validate the idea of redesigning X-Ray. To do this, I performed usability testing with 15 users, ranging from people completely new to X-Ray to people who had been using it since the tool was first released, so that I could get a holistic understanding of a wide range of user perspectives. Using a conventional think-aloud protocol, each user was asked to perform key task flows such as running a test, finding the results, and comparing results.
The following are the main findings from the research, along with the proposed solutions that helped convince the team a redesign was needed.
1. Users' business needs were not met. Users come to X-Ray wanting to understand different aspects of the infrastructure, such as performance, scalability, and resiliency. Even though the tests in X-Ray could answer those questions, the design didn't communicate this in the users' own language. The screen below shows the tests page, where users struggled to know which test to run because the tests all had very technical names and were listed in no particular order. When users scanned through the tests, they didn't understand the purpose of each test or why they should run it. They felt overwhelmed and got lost in the large number of items they had difficulty understanding.
The solution was to classify the tests in a meaningful way. To reach that goal, I first ran a few brainstorming workshops with the engineers and the PM to try out different ways of grouping the tests. Engaging the engineers in the design process turned out to be very helpful: they helped me digest the system so that I could contribute more from a design perspective, and I encouraged them to think from the user's angle and see the problem through a different lens.
Later, I tested the classification that came out of the workshops with 5 users using a card-sorting method. In the final design, we classified the tests into groups that emphasize the performance of the infrastructure at different stages of its lifecycle. The benefit of grouping the tests has been huge. For instance, users of the old system might not know what a Database Colocation test would tell them. In the new design, since the Database Colocation test belongs to the Infrastructure Scalability category, users understand that this test shows how the infrastructure performs as they scale their datacenter by adding more application workloads.
The grouping of tests was well received by users, who found it easier to determine which tests to run. In the redesigned X-Ray UI, this concept is applied on the home page, where the inventory of tests is presented to users.
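As a minimal sketch of the idea, the grouped inventory can be thought of as a list of tests tagged with a lifecycle-stage category, rendered one section per category. The test and category names below (other than Database Colocation and Infrastructure Scalability, which come from the article) are hypothetical placeholders, not X-Ray's actual catalog.

```python
from collections import defaultdict

# Each test carries the lifecycle-stage category it belongs to.
# Names other than "Database Colocation" / "Infrastructure Scalability"
# are illustrative placeholders.
TESTS = [
    {"name": "Peak Throughput",         "category": "Peak Performance"},
    {"name": "Database Colocation",     "category": "Infrastructure Scalability"},
    {"name": "Rolling Upgrade",         "category": "Infrastructure Resiliency"},
    {"name": "Sequential Node Failure", "category": "Infrastructure Resiliency"},
]

def group_by_category(tests):
    """Return {category: [test names]} for rendering grouped sections."""
    groups = defaultdict(list)
    for test in tests:
        groups[test["category"]].append(test["name"])
    return dict(groups)
```

With this structure, the home page simply iterates over the categories and renders each group as its own section, so a user scanning for "scalability" lands directly on the relevant tests.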
2. Users had a hard time understanding the tests and their results. X-Ray is often used during the proof-of-concept phase, when a company is making purchase decisions. Numerous factors influence these decision-makers, one of which is the live performance of a functioning system, as opposed to information from a salesperson, peers, or a website. That's where X-Ray shines, because it provides objective raw data.
To understand the raw data, the user first has to read the description of each test, which explains what the test will tell them, and then interpret the result, which demonstrates how well the infrastructure performed in the test.
However, X-Ray wasn't doing a good job of making the tests easy to understand or helping the user interpret the results effectively.
To communicate more effectively what each test does, we added a diagram to the test description that illustrates the test process as well as its goal.
To make the data produced by X-Ray easier to interpret, we created a performance baseline for the user to compare the datacenter's performance against, and added a tooltip on each chart guiding the user through interpreting the data.
3. The application was not intuitive to use due to navigation issues. Users had trouble locating content because of the unclear naming of the main UI pages. After running one of X-Ray's tests, the user could go to the Tests page to see the resulting performance data. As the screenshot below shows, the menu item was labeled "Tests," but the page showed results. As a consequence, users had trouble finding the test results as well as the tests themselves, and were generally confused about the purpose of the page.
The information architecture was also confusing because the most important content, the list of tests, was not represented in the main menu at all. Instead, it could only be accessed by clicking a button labeled "Run Tests" at the bottom of the results page, as shown below. During usability tests, this meant users had difficulty discovering and initiating tests.
After analyzing the testing data and brainstorming new design alternatives, we decided to keep three major top-level navigation items: Tests, Results, and Targets. These correspond to the three major types of workflows that X-Ray users perform:
1. Find and initiate the desired tests
2. Check and analyze the results of their tests
3. Configure infrastructure targets on which to run the tests
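The three workflows above map one-to-one onto the navigation. A hypothetical sketch of that mapping as simple route metadata (the route paths are assumptions for illustration, not X-Ray's actual URLs):

```python
# Hypothetical route metadata tying each top-level navigation item
# to the workflow it serves; paths are illustrative assumptions.
NAVIGATION = [
    {"label": "Tests",   "route": "/tests",   "workflow": "find and initiate the desired tests"},
    {"label": "Results", "route": "/results", "workflow": "check and analyze the results of tests"},
    {"label": "Targets", "route": "/targets", "workflow": "configure infrastructure targets"},
]

def find_route(label):
    """Look up the route for a navigation label (case-insensitive)."""
    for item in NAVIGATION:
        if item["label"].lower() == label.lower():
            return item["route"]
    return None
```

Keeping the menu a flat, literal reflection of the user's three jobs is what makes the naming problem from the old design (a "Tests" page that showed results) disappear.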
4. It was not easy for users to compare test results. In X-Ray, comparing results means overlaying performance results from different types of infrastructure to see which one performs better, or comparing results from the same infrastructure under different configurations.
The screenshot below shows a pop-up called Analysis, where users were presented with results as tiles. They had to choose at least two results and hit the "Create" button to generate a comparison. There were two main problems with this design. First, according to user feedback, the pop-up was hard to discover. Second, users were unable to finish the task during usability testing because the tiles lacked the information needed to identify which results they represented.
The new design allows users to compare results directly on the results page and provides tooltips explaining how to compare when hovering over the Compare button.
Feedback from the field
We ran an Early Access (EA) program to get feedback from people using the new version of X-Ray before it became widely available. Below is some of the feedback from the EA program.
Overall I enjoyed X-Ray 3.4. I found the interface to be very intuitive and clean | Nick (Senior Systems Engineer)
wow X-Ray 3.4 is unreal! Huge shout out to the team for making this so clean, more predictable and easy from a deployment perspective | Ryan (Senior Systems Engineer)
New UI looks ace and needs a few tweaks | Grant (Senior Systems Engineer)