Case Study #1: Optimizing data tables for enterprise-level platforms
Product: Qstack, NetApp Cloud Volumes
Task description: In this case study, I’ll explore different ways of optimizing how data is viewed and managed, using the example of a cloud computing platform redesign.
Tools: Abstract, Sketch, Principle for Mac, InVision
Today, NetApp is a provider of data management solutions that simplify the complexities of storing, managing, protecting, and archiving enterprise data.
NetApp acquired Qstack in August 2017 as part of its Data Fabric strategy and its integration with NetApp’s management software. Qstack was built by a private company as an orchestration and management platform for hybrid cloud and multi-cloud environments, with partnerships with Microsoft, VMware, Hewlett Packard Enterprise, NetApp, and Hitachi.
NetApp launched Cloud Volumes Service in May 2017, a SaaS offering targeting born-in-the-cloud enterprises and users who demand high-performance, fully managed file storage with multi-protocol file services support. The service is available for use with AWS, Azure, and GCP.
We’re a data management company, not a storage company. Data is the raw material of today’s global economy, and the success of future data-driven companies depends on two main things: advanced data collection and the ability to act on that data fast. Without both, the data itself is useless.
In my opinion, the biggest challenge in designing enterprise-level platforms is the lack of established patterns known to work (or not work) in specific scenarios. Good data tables allow users to scan, analyze, compare, filter, sort, and manipulate data to extract insights and act on them.
Considering the user’s context and delivering solutions in a clear, transparent manner are just as vital. Let’s identify the main objectives and pain points for our target audience:
- Browse lots of data at once (show multiple items and their statuses)
- Determine and commit actions quickly (bulk actions, combining elements)
- Compare information (completed items vs in progress)
- Allow sorting, multi-selection, batch actions, and the ability to group data depending on the context
- Give users control over when the change affects them
Our primary personas:
- Intern: new to the interface, needs assistance when using the platform
- SDE: priority is to browse through data quickly and easily in day-to-day tasks
- DevOps Engineer: priority is to create and manage data resources at scale
Pain points of the current tables:
- the capacity of data displayed in a single view
- handling different types of data
- filtering and sorting data
- lack of additional layers for working with data (popovers, modals)
- missing deeper levels for drilling into data (expansions, modals, etc.)
- customization of views
- inconsistency of UI elements across different views
- poor accessibility
- a wide variety of elements
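Since filtering and sorting come up as core pain points, here is a minimal sketch of generic client-side sort/filter helpers for table rows. This is purely illustrative; the names (`Row`, `sortRows`, `filterRows`) are my own assumptions and not part of the actual product code.

```typescript
// Illustrative sketch only: generic sort/filter helpers for table rows.
type Row = Record<string, string | number>;

// Return a new array sorted by one column; dir flips the comparison sign.
function sortRows(rows: Row[], key: string, dir: "asc" | "desc" = "asc"): Row[] {
  const sign = dir === "asc" ? 1 : -1;
  return [...rows].sort((a, b) =>
    a[key] < b[key] ? -sign : a[key] > b[key] ? sign : 0
  );
}

// Keep rows whose column value contains the query, case-insensitively.
function filterRows(rows: Row[], key: string, query: string): Row[] {
  const q = query.toLowerCase();
  return rows.filter((r) => String(r[key]).toLowerCase().includes(q));
}
```

Keeping these operations pure (returning new arrays rather than mutating) makes them easy to combine with multi-selection and batch actions later.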
Our design goals:
- Provide a personalized experience across a variety of contexts
- Offer a unique way of managing data
- Prevent crisis situations
- Increase productivity and performance
Taking these factors into consideration, and with the main goals and motives determined, I went through a 5-week process in which I broke the main components down into smaller parts:
- expanded view
- edit mode
- treatment of actions
- table customization
- cell truncation
- unread/new indicators
- hover mode, etc.
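To make one of the components above concrete, here is a sketch of cell truncation: a middle-truncation helper that keeps both ends of a long value (such as a resource name) visible so rows stay scannable. The function name and behavior are assumptions for illustration, not the shipped implementation.

```typescript
// Illustrative sketch: middle truncation for long table cell values,
// keeping both ends visible so IDs and suffixes remain scannable.
function truncateMiddle(value: string, max: number): string {
  if (value.length <= max) return value;
  const keep = max - 1; // reserve one character for the ellipsis
  const head = Math.ceil(keep / 2);
  const tail = Math.floor(keep / 2);
  return value.slice(0, head) + "…" + value.slice(value.length - tail);
}
```

In practice the full value would also be exposed on hover (tooltip) or in the expanded view, so truncation never hides information permanently.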
I assembled a team with representatives from development, project management, QA, operations, and customer support. I then assigned components to each participating member and ran a brainstorming session to generate initial ideas for each one. We ran those ideas through a validation pass to weed out usability, functional, technical, and other potential issues. We explored different design structures, interaction patterns, and techniques to optimize users’ workflow with data tables, streamline UI implementation and the collaboration between me and the developers, and ensure the scalability of our solutions.
Then I went back to work on a round of iterations to address the issues that had been raised. Throughout the process, we held check-in meetings to track progress and identify what was “done” and what needed more work; at the last meeting before the team presentation, I narrowed the options (including states) down for the final presentation.
After gathering feedback from the internal team, the approved components, patterns, and flows were sent to UX researchers to validate the solutions through quantitative user testing. Based on participant feedback, I made sure the remaining pain points were eliminated before proceeding to further development and documentation.
Here are the most commonly occurring problem scenarios and our solutions to them.
Calls to Action
Fixed headers and columns
Row to details
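A “row to details” interaction needs to track which rows are expanded. A minimal sketch of that state model, assuming rows are keyed by id (the class and method names here are hypothetical, not from the actual codebase):

```typescript
// Illustrative sketch: expansion state for a "row to details" pattern,
// tracking expanded rows by their id.
class ExpansionState {
  private open = new Set<string>();

  // Toggle a row's expanded state; returns the new state (true = expanded).
  toggle(id: string): boolean {
    if (this.open.has(id)) {
      this.open.delete(id);
      return false;
    }
    this.open.add(id);
    return true;
  }

  isOpen(id: string): boolean {
    return this.open.has(id);
  }
}
```

Keeping expansion state separate from the row data itself means sorting, filtering, or refreshing the table doesn’t accidentally collapse what the user has opened.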
Together, these patterns form an extensive toolkit for solving the most commonly occurring problems, along with new suggestions and some experimental ways of visualizing, managing, and performing actions when designing complex data-driven enterprise platforms.
Despite the wide array of problems these patterns might solve, it’s essential to validate these assumptions and hypotheses by teaming up with user researchers and running usability tests as early as possible. That ensures bad ideas are eliminated fast, or evolved into ones that work.
Crafting these experiences and mastering these kinds of table patterns is just a start, but the process taught us a lot. The key takeaways should always be:
- be customer-obsessed: always listen, invent new paradigms, and personalize experiences
- enable agile and dynamic environments to start small but scale fast
- iterate relentlessly: release, refine, repeat
- experiment and embrace failure: making bets is how you learn
- be open-minded, objective, and curious, with the ability to step back and see the problem as a whole
- have a bias for action: done is better than perfect
- be data-informed instead of data-driven when performing any kind of user testing
- be frugal in decision-making to ensure the sustainability of the development process.