Human-Centered Data Viz for Data-Driven Humans

Is data visualization a branch of UX? It certainly doesn’t get the same attention that User Research or Design receives. I spent the first three years of my UX career with the title Data Visualization Engineer and often wondered if my experience really fit the mold of the “User Experience” industry.

Ultimately, I reasoned that only two questions mattered:

  1. Does my job involve taking a human-centered approach?
  2. Could my job benefit from taking more of a human-centered approach?

Data Visualization has many facets: data science, business analytics, engineering. But data, like software and web pages, has an audience, and that audience has needs and goals and pain points. Data tells a story; that story is either helped or hindered by the user’s experience around it.

In practice, my job did require a human-centered approach to design. Rarely, though, did I get the resources and client buy-in to use the skills I learned in the HCDE program to improve the experience for our audience. When I did, the difference was notable.

A User-Centered Case Study

Most clients approach me with exact requirements for a dashboard and no time, budget, or interest in probing questions. This client, though, wanted analytics for an education tool that had never tracked scores or metadata before. Even when I asked what they needed to know, they said they weren’t sure, but they’d be happy to find out.

An undefined set of users? An undefined set of features? No legacy designs to work from? Bingo. This was a prime chance to break out my UX toolkit. The first step was to make sure the client and I agreed on our problem statement:

How can we provide analytics whose results can influence schedule, curriculum, and regulatory requirements with data-driven evidence?

Research Process

I arranged 10 interviews with stakeholders the client helped me identify. The first few interviews were strictly Q&A, but they yielded contradictory suggestions and abstract answers that were difficult to turn into actionable insights. I was designing for a new space: the users didn’t have requirements or pain points yet. I needed a different approach. For my last seven interviews, I added card-sorting exercises.

The engineering team had given me a list of 18 metrics they knew they could produce metadata for. I wrote each metric on a separate card and asked my interview participants to perform the following tasks:

  1. Sort the cards into however many categories seem appropriate, then label the categories.
  2. Sort the cards in order of importance, from most important to least.
  3. If you could only pick five of these metrics, which would give you the most holistic view of the questions you want to answer?

By choosing these three tasks, I gained insight into which categories were most intuitive to users, which metrics were most and least useful to know, and how much the metrics overlapped. Most importantly, asking users to rank importance required them to vocalize their use cases, which had been harder to extract through open-ended interview questions.
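If you’re curious what tallying results like these can look like, here is a minimal sketch in Python. The metric names and responses are hypothetical placeholders, not the client’s actual data (the real exercise used 18 metrics and asked for five picks). It averages each metric’s rank across participants and counts how often each one made a participant’s shortlist.

```python
# Hypothetical card-sort tallying sketch -- placeholder metrics and responses,
# not the actual study data (the real exercise used 18 metrics and five picks).
from collections import Counter, defaultdict
from statistics import mean

# Each participant's ranking from task 2, most important first.
rankings = {
    "P1": ["completion rate", "avg score", "time on task", "retake count"],
    "P2": ["avg score", "time on task", "completion rate", "retake count"],
    "P3": ["time on task", "completion rate", "avg score", "retake count"],
}

# Each participant's shortlist from task 3.
top_picks = {
    "P1": {"completion rate", "avg score", "time on task"},
    "P2": {"avg score", "completion rate", "retake count"},
    "P3": {"completion rate", "time on task", "avg score"},
}

# Average rank position per metric: lower means more important overall.
positions = defaultdict(list)
for order in rankings.values():
    for pos, metric in enumerate(order, start=1):
        positions[metric].append(pos)
avg_rank = {metric: mean(p) for metric, p in positions.items()}

# How often each metric made a shortlist: a rough signal of shared priorities.
shortlist_counts = Counter(m for picks in top_picks.values() for m in picks)

for metric in sorted(avg_rank, key=avg_rank.get):
    print(f"{metric:16s} avg rank {avg_rank[metric]:.1f}  "
          f"shortlisted {shortlist_counts[metric]}/{len(top_picks)}")
```

Even a quick tally like this makes it easier to spot which metrics every stakeholder agrees on and which ones only matter to a single role.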

Telling the User’s Story

We categorized the stakeholders we interviewed into three archetypes based on their common use cases. From there, we organized our interview insights into user stories, highlighting what was important to each user and why.

List of user stories in the format “As a <user type>, I want/need to <requirement> so that I can <goal>.”
User Stories I created based on our interview insights. Sensitive information has been redacted.

Shifting Personalities

My original hypothesis was that our user archetypes would map to the job titles of our primary stakeholders. After we storyboarded some scenarios, though, we realized that the needs of each job title overlapped with the others. Mapping out these overlaps, we discovered that the archetypes were better defined by each user’s motivations than by the users themselves.

Matrix of user stories outlined on post-it notes. Columns are the user types. Rows are “Info,” “Why?”, and “Where?”
Creative Director Chris Hannon helped me organize the use cases into a post-it matrix.
Venn diagram showing the overlap of Efficiency, Curriculum Creation, and Engagement/Learning for each of the 3 user types
The second version of user archetypes: Efficiency, Curriculum Creation, and Engagement, each of which suited our stakeholders in different ways. Image credit: Chris Hannon

The Final Design

I used the insights from our research to deliver three separate data dashboards, each prioritizing one of our three archetypes: Efficiency, Curriculum Creation, and Engagement. Each stakeholder could bookmark the dashboard that matched their individual needs. Best of all, this three-pronged approach paralleled our problem statement: how can we influence schedule, curriculum, and regulatory requirements?

Unlike many of my projects, where data requirements are based on hypotheses and assumptions, these Power BI reports had user research evidence to back up every design decision, and I was proud to deliver them.

Power BI dashboard showing the “Question Effectiveness Overview.” The dashboard includes bar graphs, text, and dropdown menus
One of the three final Power BI dashboards we delivered to the client. The back/next navigation leads to the other two dashboards. Sensitive information has been redacted.

Conclusion

User Experience encompasses a wide range of disciplines, and UX practitioners spend a lot of energy arguing over what “counts” as UX. The truth is that your job title doesn’t make or break your ability to be a UX designer. There are plenty of ways to add user-centered processes to your work, and you and your colleagues will likely benefit from them.


Joe Bernstein
University of Washington Human Centered Design & Engineering Alumnus

UX designer, wordsmith, thought leader. Specializes in data viz, Figma, and design systems. Unwinds with trivia, softball, and crosswords. Resides in Seattle.