Telling a Story with User Behavior Data

Grace Greenwood
Jun 6, 2019 · 6 min read

How we derived an analytics approach that is working: theorize, segment out your users, and enrich your data with related metrics. Repeat.

TLDR: Data on how users interact with your app or website is easy to come by and exciting to review, but does not immediately answer one basic question: are users successfully navigating this thing I built? Using the review of a single form to set the stage, we demonstrate how pinpointing and combining specific behavioral data sets can help tell a story about your users, even when it doesn’t explicitly define them.

After launching an app or website, we immediately want to know how it is performing. Is it easy for my users to navigate around and find what they need? Are they happy and satisfied? At first, it may seem like quickly pulling data on user behavior will provide easy answers to these questions; just count pageviews on the site or app, track link and button clicks, and review the biggest referral pages or sites, and suddenly you have all the answers, right?

Right?!

Turns out understanding the user experience requires a deeper synthesis of data that goes beyond simple pageview tallies and session durations. At Braintree, our Knowledge Managers and Sites Designers have teamed up to create our own three-pronged approach to solving this problem and answering the ultimate question: is my app or website any good?

Learning on the job

To see how users interacted with our Help Contact Form (where users select the issue they need help with so we can point them to relevant documentation or route them to the right support team), we first pulled data from Google Analytics and Mixpanel. While it was interesting to see the raw statistics, like how many people visited the form, clicked on certain things, submitted it, or bounced away, we soon realized that none of these metrics specifically indicated whether the form was serving users well. In fact, we found ourselves in a common data conundrum: deciding which metrics should even be read as problematic. For example, was it bad if the conversion rate of users selecting an issue and then submitting the form was low? Did that imply users were giving up on a confusing form, or were they successfully clicking out of the form to find their own answers in our suggested documentation?
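To make that funnel question concrete, here is a minimal sketch of the kind of conversion math we started with, written in Python with pandas. The event names (form_view, issue_selected, form_submit, doc_link_click) and the tiny data set are invented for illustration; they are not the real Google Analytics or Mixpanel schemas.

```python
import pandas as pd

# Hypothetical event export: one row per tracked event, keyed by session.
events = pd.DataFrame({
    "session_id": [1, 1, 1, 2, 2, 3, 3, 4],
    "event": ["form_view", "issue_selected", "form_submit",
              "form_view", "issue_selected",
              "form_view", "doc_link_click",
              "form_view"],
})

def sessions_with(event_name: str) -> set:
    """Return the set of sessions in which the given event occurred."""
    return set(events.loc[events["event"] == event_name, "session_id"])

viewed = sessions_with("form_view")
selected = sessions_with("issue_selected") & viewed
submitted = sessions_with("form_submit") & selected

# Raw funnel numbers: easy to compute, but on their own they can't say
# whether a low select-to-submit rate means frustration or self-service.
print(f"Selected an issue: {len(selected) / len(viewed):.0%} of form viewers")
print(f"Submitted after selecting: {len(submitted) / len(selected):.0%}")
```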

While the big pile of data in front of us didn’t immediately illuminate problem areas with the form, we suspected there might be some valuable nuggets in there somewhere. Finding these nuggets of insight within the raw data was an interesting journey, and through it we found ways to improve the experience for thousands of form users. Ultimately, we discovered three key steps for turning raw behavioral data into insights: theorize, segment our users, and enrich the data.

Step 1: Theorize

We started with a hunch rather than a finding: some users were having a bad experience with the form, and certain behaviors would indicate it. While extremely vague, this hypothesis forced us to ask ourselves a few fundamental questions to determine what those bad-experience indicators might be:

  • What behaviors do we want our users to engage in?
  • Why are those behaviors important?
  • What other behaviors could users opt for instead?

In the case of the Help Contact Form, the answer to one question fed into the next. We wanted users to either click on our suggested documentation links to find their own answers, or to have a painless experience reaching out to our support teams for quick assistance. This all starts with users partaking in the important behavior of selecting an issue on the Help Contact Form; by doing so, users allow us to either redirect them to relevant documentation or route their question to the team best equipped to help.

So what behavior could users partake in that would completely subvert this intended experience? This is where our vague hypothesis became a solid theory: users who selected the “Get help with something else” issue (or the generic “Other” issue option) were having a bad experience on our site. These were the users who either read through our list of issues and didn’t find one that matched their actual problem, or those who (understandably) didn’t want to read through our robust list of issues in the first place.
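As a rough sketch of what that segmentation looks like in practice, the snippet below pulls the “Other” submissions out of a hypothetical export of form submissions. The column names, issue labels, and counts are all invented for the example.

```python
import pandas as pd

# Hypothetical export of form submissions and the issue each user selected.
submissions = pd.DataFrame({
    "submission_id": [101, 102, 103, 104, 105, 106],
    "issue": ["Refunds", "Other", "Disputes", "Other", "Other", "Declines"],
})

# Segment: the users who fell back to the generic option.
other_segment = submissions[submissions["issue"] == "Other"]

other_rate = len(other_segment) / len(submissions)
print(f"'Other' selected on {other_rate:.0%} of submissions")
print(other_segment["submission_id"].tolist())  # submissions to investigate further
```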

Steps 2 and 3: Segment and enrich the data

At this point, we segmented out the form submissions with the “Other” issue selected and enriched our original data set by pulling all of the support tickets those submissions generated. Our support teams assigned reason codes to these tickets, effectively defining the exact topics our users had questions about when they reached our form and couldn’t find what they needed. If a topic matched an issue already on the form, we knew our issue-selection UX needed some work; if a topic wasn’t listed on the form at all, we knew we needed to add new issues. In our case, we saw a lot of both.
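A sketch of that enrichment step might look like the following, assuming a hypothetical set of reason codes on the tickets generated by “Other” submissions. The issue labels, codes, and counts are invented; the comparison against the form’s existing issues is the point.

```python
import pandas as pd

# Issues that already exist on the form (illustrative labels only).
existing_issues = {"Refunds", "Disputes", "Declines"}

# Hypothetical support tickets created from "Other" form submissions,
# each tagged with the reason code our support team assigned.
tickets = pd.DataFrame({
    "submission_id": [102, 104, 105],
    "reason_code": ["Refunds", "Webhooks", "Webhooks"],
})

for reason, count in tickets["reason_code"].value_counts().items():
    if reason in existing_issues:
        # Topic is already on the form: the issue-selection UX needs work.
        print(f"{reason}: {count} ticket(s) -> make the existing issue easier to find")
    else:
        # Topic is missing from the form: add it as a new issue.
        print(f"{reason}: {count} ticket(s) -> add a new issue to the form")
```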

Suddenly, we had a process for identifying the specific types of questions our Help Contact Form wasn’t helping users answer. And knowing this gave us a clear, measurable course of action: iterate on issue content and design changes to decrease the number of users selecting the “Other” issue on the form.
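That course of action comes with an equally simple metric to watch over time. As a sketch (again with invented column names and values), the monthly share of submissions with “Other” selected can be computed like this:

```python
import pandas as pd

# Hypothetical submissions export with a month column for trending.
submissions = pd.DataFrame({
    "month": ["2017-12", "2017-12", "2017-12", "2018-06", "2018-06", "2018-06"],
    "issue": ["Other", "Other", "Refunds", "Other", "Refunds", "Disputes"],
})

# Share of submissions per month where the user selected "Other":
# the number we wanted each round of changes to push down.
other_rate = (
    submissions.assign(is_other=submissions["issue"] == "Other")
    .groupby("month")["is_other"]
    .mean()
)
print(other_rate)
```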

How’d we do?

Guided by this data, we made multiple rounds of improvements to the form, including:

  • Rewriting the issue content into quick snippets that would be easier to skim
  • Adding new issues for the common topics assigned to “Other” form submissions
  • Adding more (and more relevant) suggested documentation links for each issue
  • Redesigning issue selection with search

In December 2017, over half of the Help Contact Form submissions had the “Other” issue selected. In the first quarter of 2019, after we had implemented multiple rounds of these improvements to the form, that share had dropped to only 25%. On top of this, the average number of form submissions per business day remained consistent from month to month, while both the number of users viewing the form and the number of users clicking into our suggested documentation links steadily increased.

Given this, we can confidently say that our Help Contact Form users are now finding more of their own answers in our docs without waiting around for human assistance, or getting derailed by a frustrating form experience.

Conclusion

We still have so much to learn, but we derived an analytics approach that is working: theorize, segment out your users, and enrich your data with related metrics. Repeat.
