Published in Product Gals

The Art of Surveys: Analyzing Data to Improve the Docs Experience

Act 3

Introduction

Earlier this year, we conducted the first of many regular docs surveys. These surveys give us a holistic understanding of our docs' quality and of how satisfied users are with them.

This is the final act of “The Art of Surveys” series. In Act 1, we went over research best practices. In Act 2, we applied these to create the best research plan possible.

And finally, in this Act, let’s go over what we learned.

Analysis methods

By the end of the survey, we had collected over 1,000 responses — lots and lots of data. There are many techniques for analyzing data, but we found the following worked best for parsing through it all and extracting those golden nuggets.

Quantitative Analysis

To understand our audience from a quantitative perspective, we created charts, graphs, and tables. Pivot tables let us dive a little deeper into multi-factor analysis: we investigated whether a user's role, experience level, or preferred language influenced the overall quality metric, as well as each of its components.
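As a rough sketch of what that multi-factor slicing can look like in code (the column names and scores below are made up for illustration, not the survey's actual data), a pivot table over the responses might be built like this:

```python
import pandas as pd

# Hypothetical survey responses; roles, columns, and scores are illustrative.
responses = pd.DataFrame({
    "role": ["developer", "developer", "DBA", "student"],
    "experience": ["6+ yrs", "<3 yrs", "6+ yrs", "<3 yrs"],
    "easy_to_understand": [4, 3, 5, 4],
    "accurate": [5, 4, 4, 4],
})

# Average each quality component by role, as in a spreadsheet pivot table.
pivot = pd.pivot_table(
    responses,
    values=["easy_to_understand", "accurate"],
    index="role",
    aggfunc="mean",
)
print(pivot)
```

Swapping `index="role"` for `experience` (or adding both) is how a sketch like this would check whether other factors move the component scores.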

For our Quality Metric (explained here), we took the average of our 5-point components to create the main KPI that is the outcome of our survey.
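To make the arithmetic concrete (the component names and numbers here are invented, not our real results), the KPI is simply the mean of the per-component averages:

```python
# Hypothetical 5-point component averages across all respondents;
# names and values are illustrative only.
component_averages = {
    "accurate": 4.5,
    "easy to find": 4.25,
    "complete": 4.0,
    "easy to understand": 3.75,
}

# The quality KPI is the mean of the component averages.
quality_kpi = sum(component_averages.values()) / len(component_averages)
print(quality_kpi)  # 4.125
```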

"Easy to understand" is highlighted in yellow because that is the component where our Docs have the most room for improvement.

Using our questions around accessibility, we were able to derive a user journey map highlighting key stats for how our users navigate.

Qualitative Analysis

To understand our audience from a qualitative perspective, we used a UXR technique called affinity diagramming.

Affinity diagramming refers to organizing related facts into distinct clusters. Affinity diagramming is also known as affinity mapping, collaborative sorting, or snowballing.

- Nielsen Norman Group (source)

By doing affinity diagramming, we were able to extract broader themes and identify the biggest gaps our users were facing.

To understand our qualitative feedback at a high level, we created a word cloud to extract the most common phrases mentioned by survey participants.
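The counts behind a word cloud like this can come from a simple phrase-frequency pass. As a minimal sketch (the helper name and feedback strings are hypothetical, not our survey's responses):

```python
from collections import Counter
import re

def top_phrases(responses, n=2, k=3):
    """Count the most common n-word phrases across free-text responses."""
    counts = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())
        # Slide an n-word window over each response and tally every phrase.
        counts.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return counts.most_common(k)

# Hypothetical free-text feedback, not real survey data.
feedback = [
    "More code examples please",
    "The code examples are too sparse",
    "Troubleshooting steps need more code examples",
]
print(top_phrases(feedback))  # top entry: ('code examples', 3)
```

A word-cloud tool then just scales each phrase by its count; the counting step is the part that surfaces the most-mentioned topics.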

The main takeaway here was to revisit the documentation for these topics with a fine-tooth comb and identify what needs to be improved — whether that means bulking up the detail, providing more examples, adding a visualization, or something else!

Our Golden Nuggets

Who our users were

A huge insight from this research was understanding our user base: their characteristics and their behaviors.

Some characteristics we learned were:

  • 60% of our users identify as software developers.
  • Most participants have more than 6 years of programming experience, but less than 3 years of MongoDB experience.
  • The most commonly used programming languages are Node, Python, and Java.

Some behaviors we learned were:

  • 43% of users use the search functionality, while 36% use the left-hand table of contents.
  • Most participants visit the Docs at least once per week.
  • The majority of participants supplement their Docs learnings with Stack Overflow and YouTube.

Understanding all of these components that make up our audience really helps shape the future of our Docs, how we present content, and the way we communicate with our users.

How we were doing

Our docs scored an overall 4.18/5 for our quality KPI. If you’re interested in defining your own quality KPI, check out how we came up with ours here.

We also identified that our lowest-scoring component was "easy to understand." With this information, we know where we are missing the mark and where we should focus on improving in the upcoming year.

What our users need more from us

The biggest golden nugget from this research was identifying where our users felt we were missing the mark. Having a product is one thing, but continuously ensuring your users' needs are met is another thing entirely.

For us, we’ll be focusing on:

  • Improving our examples in terms of variety, frequency, and relevancy.
  • Elevating our level and detail of troubleshooting content.
  • Introducing new touchpoints of interactivity.

Conclusion

Conducting this research was really important for understanding holistically how our docs were performing. Many times, we get stuck in our work and forget to talk to, or check in with, the people this is all for: our users. To ensure your product is meeting user expectations and your time and work are going toward a key problem, user research is essential.

If you’re thinking about conducting your own type of research, check out these resources. They were extremely helpful for us and we hope that they are helpful for you as well.

Thanks for reading!

Allison Mui

Product Designer @ MongoDB
