Case Study: Simplifying data configuration workflows to scale


Challenge: Simplify the sensor data and analytics configuration process of an industrial asset management product suite to reduce time to value and address customers' long-term concerns about product sustainability and scalability.

Solution: A multi-phased workflow redesign that reduced the end-to-end time to set up sensor data and configure analytics from 2 weeks to 2 hours.

Duration: 7 months.

Team: 1 UX Designer, Product Managers and tech leads from 12 different product areas, VP of engineering, VP of product management.

My contributions: Design lead. Conducted user research and feedback sessions. Created artifacts (storyboards, prototypes, flow diagrams) to help stakeholders understand users’ pain points and created a phased UX vision to alleviate these. Partnered with Product Management and Engineering to help define the product roadmap.


  1. Designed a data configuration process and experience that reduces time to value and operational overhead for asset operators and admin users. This involved shifting from a services-led setup to a self-service one, reducing the time to set up data and analytics from 2 weeks to ~2 hours.
  2. Helped create empathy for the users and a sense of urgency to improve the workflow in order to retain customers and support their growing operations.
  3. Facilitated organizational alignment to identify the work needed to support the experience, along with its impacts and dependencies across the product portfolio.


A lot of the challenges in the Industrial Enterprise space involve scale. I worked on the setup and configuration process for an enterprise solution suite used to manage different aspects of industrial processes and equipment. A customer site can have thousands of data points that are collected and measured for the different machines being monitored, and this data needs to be set up and configured before it can be used in the software.

Historically, the product setup had been done through a series of paid engagements with professional services. This allowed many of the pain points of the data configuration process to remain invisible to most customers. As customers mature in their digital transformation journeys, they are shifting to doing this work on their own to remove dependencies on external teams.

Several customers provided feedback on the data setup process and found it overly complex and time-consuming. Many of these customers have their own roadmaps to deploy and scale their monitoring operations, and the inefficiencies in the process would add to their cost of ownership and become a real bottleneck for scaling. This feedback was also consistent with product audits and user research conducted by the UX team.


Understanding the problem

The key users are asset operators and implementation engineers. Most of their day-to-day is spent setting up the machines that need monitoring and making sure the data in the system is accurate. I set up interviews with these users to understand the current process and get to the root cause of their frustrations, such as the need for training to do the setup, duplicate setup steps, and the lack of tools to complete some pieces of the process on their own.

I also interviewed the professional services team, who were the expert users in the data configuration process. This provided a more complete picture of the process, and also helped me identify some tasks for which there were no product tools available, forcing users to develop their own tools to automate some of the repetitive steps.

I cross-pollinated the research findings from my interviews with user research done some years prior (still relevant, since much of the process remained unchanged). Many of the pain points stemmed from teams working in silos to deliver their functionality even when it created additional work for the user, and from overlapping configuration in acquired products that had not been properly integrated (for example, the data model required double configuration, and analytics had to be set up in two different places).

Concept definition, key considerations and validation

From the research I identified the key steps in the process, as well as the time on task to set up one machine in the product, add sensor data, and set up analytics for consumption. This served as the baseline for improving the process.

I learned there was a lot of variability in the frequency and size of the configuration tasks:

  1. Initial setup: This could range from hundreds of machines and 15 thousand data points to 10–20 machines and ~200 points. This was usually done over a long and continuous time period.
  2. Data maintenance: As new data points get created or details change, updates need to be made on a smaller scale. This happened infrequently, once every 2–3 months, so the steps to complete the task had to be relearnt.
  3. New machines added: For an existing setup, adding new machines to be monitored. This happened infrequently, with no predictable pattern for when a new item would need to be added.

With the task frequency and scalability concerns from customers in mind, I partnered with Product Managers for each of these areas and created storyboards for ways in which the process could be improved.

Low fidelity concept iteration leveraging storyboards

Through this exercise I defined some key principles to guide the design decisions:

  1. Provide data entry flexibility. Don’t restrict users to a single path for data entry, but allow UI and bulk data upload parity. Both should be easy and fast to do.
  2. Automate as much as possible. Move away from manual associations and setup, and look for ways of automating tasks based on rules or patterns, and provide opportunities for verification.
  3. Provide guidance when needed: Help users navigate through the workflow end-to-end, but stay out of the way when it is not needed.
  4. Use recognizable data formats: Instead of requiring the creation of many different file formats to upload data (like XML or JSON files), focus on formats users are comfortable with (like Excel).

The short term design concepts that emerged following these principles included quick fixes that would reduce the steps needed to complete the work. For example, to create an object in the application, the user had to fill out a basic form, navigate to the object, and open each property to modify its value. This meant a lot of waiting for page loads and navigating back and forth. So I identified the key fields that were needed 90% of the time and surfaced them in the initial creation form, a simple change that dramatically reduced the time to create an object.

By surfacing fields that were used 90% of the time, the time on task decreased dramatically
Sample workflow improvement gained from a small form redesign

Medium term concepts included guided workflows for single items as well as bulk, and consolidation of the setup process into a single view (instead of requiring multiple jumps in the application).

Sample explorations for data entry

The long term concepts required a complete reimagining of our data structures and the conceptual model by which users defined data relationships. In the current workflow, users had to recreate data that existed outside of our application, which made it difficult to maintain accuracy. As part of the redesign I explored ways of allowing users to make direct associations to data outside our system, automatically creating the data structures needed in the back-end.

Associating data directly from external sources

Because this involved big conceptual changes in the application, I held sessions with users to validate the storyboards and get their input on the general direction. The feedback was mixed: we got positive responses to the workflow improvements, but several users were confused by the terminology and new concepts. This insight drove a simplification of what gets exposed to the user, hiding all the “plumbing” specific to how the back-end works. I then fleshed out higher-fidelity mockups of the long term vision.

Defining the roadmap

Throughout the project I partnered with Product Management to define a phased approach to delivering the vision experience, based on available resources and the order in which problems needed to be solved. I fleshed out how the user experience would change in each phase and identified the expected time-on-task gains per phase.

To help align the teams internally on scope and determine the work they would need to do, I created a storyboard for each phase. This was key to illustrate in a concise way how the experience evolved with each phase, before drilling into the specific details. This also helped maintain a target view of resources and time needed to complete this work.

Sample of workflow evolution and time reduction at different milestones in the project

My takeaways

When dealing with many stakeholders, level-setting is key to having more productive discussions. I found that using storyboards and flow diagrams to help teams understand the current state and customer pain points was an effective way to generate empathy for the users. This got them interested in learning more, at which point they were ready to consume a more in-depth research report I had prepared.

In large projects like this, coming up with a design solution that will solve the problem is the easiest part. The hardest part is getting teams and leadership to understand why it is a problem and the value of solving it for the success of the organization. This requires a lot of communication, coordination and aligning priorities (and budgets) to plan out the execution.

Angelica Rosenzweig | Case Studies
