Data makes better products, period! Part 2/3

Johannes Schauer · Sclable · Aug 11, 2023

Your analytics should be as well planned and executed as the front end of your digital product or service.

How to collect, interpret and leverage data

In life, success is subjective. When it comes to continuously improving a digital product or service, however, taking a data-driven perspective is essential. In my experience, setting up and testing analytics is as vital as setting up and testing the front end. The benefits include a true user focus, an objective understanding of performance, fair feature prioritization, transparency on goal achievement, quick identification of problems and feature requests, and the possibility to define behaviour-based user personas.

At Sclable, once goals and associated metrics are defined, we help clients set up and execute 10 key steps to ensure their analytics are in place and functional from day one of go-live (and onwards).

Sclable’s 10-step analytics process

  1. Derive analytics requirements from goals, metrics and moments-of-truth
  2. Define regular user feedback collection
  3. Clarify data privacy and security
  4. Select and implement analytics framework
  5. Implement analytics
  6. Implement feedback feature
  7. Test incoming data
  8. Implement dashboards and reports
  9. Set up process for working with analytics and feedback data
  10. Run usability tests (e.g. A/B tests)

Keep reading as I outline some of the common challenges we help clients solve at this stage.

Defining data points (and avoiding over-collection)

Though it’s tempting to collect all the data at your disposal, at Sclable we always advise clients against it. Instead, the goals and metrics defined earlier should guide data point definition: they let reporting teams check which user attributes, user events and system events are actually needed. This is also the moment to check what user segmentation requires in terms of additional user attributes, and to challenge which attributes each captured analytics event really needs.

Choosing data points like this ensures only the information really needed to measure the success of goals is collected. To learn how to define your goals, check out Part 1 of my series: “How to set relevant goals”.
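
To make this tangible, here is a minimal sketch in TypeScript (all names are hypothetical, not a specific framework's API) of a goal-driven tracking plan, where every event must trace back to a defined goal and metric before it may be collected:

```typescript
// A hypothetical tracking plan: every event must justify itself by
// naming the goal and metric it feeds, so nothing is collected
// "just in case".

interface TrackedEvent {
  name: string;        // e.g. "report_exported"
  goal: string;        // business goal the event supports
  metric: string;      // metric the event feeds
  attributes: string[]; // only the attributes the metric needs
}

const trackingPlan: TrackedEvent[] = [
  {
    name: "report_exported",
    goal: "Increase weekly active usage",
    metric: "Exports per active user",
    attributes: ["reportType", "userSegment"],
  },
  {
    name: "onboarding_completed",
    goal: "Reduce time-to-value",
    metric: "Median time from signup to first export",
    attributes: ["daysSinceSignup"],
  },
];

// Reject any event that cannot be traced back to a goal and metric.
function validateEvent(name: string): TrackedEvent {
  const event = trackingPlan.find((e) => e.name === name);
  if (!event) {
    throw new Error(`Event "${name}" is not in the tracking plan, do not collect it.`);
  }
  return event;
}
```

Anything not in the plan simply isn't tracked, which keeps over-collection in check by design.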

Example of a user flow in a feedback feature

Identifying moments of truth

It’s important to complement goal- and metric-based analytics by diving into user journeys and identifying events of added value. These “moments of truth” are points in the user journey with the potential to be decisive for user satisfaction. A prime example is user churn, i.e. the point at which a user stops using a product or service.

Once we’ve identified these moments of truth, it’s important to derive which analytics are required to understand a user action including, as far as possible, its potential reasons. In my opinion, qualitative feedback during moments of truth is indispensable for providing highly relevant additional information, especially when it comes to something as significant as user churn. To get this kind of feedback, I recommend adding a direct customer feedback option close to your moments-of-truth events.
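
As a sketch of how this could look in practice (the function names are assumptions, not a specific framework's API), a moment-of-truth handler might capture the quantitative event and then immediately ask for the qualitative “why”:

```typescript
// Hypothetical moment-of-truth handler: when a churn-risk event
// fires (e.g. the user cancels a subscription), capture the
// quantitative event and immediately ask for qualitative feedback.

type MomentOfTruth = "subscription_cancelled" | "export_failed";

function track(event: string, attributes: Record<string, string>): void {
  // Placeholder for your analytics framework's tracking call.
  console.log("track", event, attributes);
}

async function askForFeedback(question: string): Promise<string | null> {
  // Placeholder for an in-product feedback dialog; null = user skipped.
  return null;
}

async function onMomentOfTruth(moment: MomentOfTruth, userSegment: string): Promise<void> {
  track(moment, { userSegment });

  // The qualitative "why" that raw events alone cannot provide.
  const reason = await askForFeedback(
    "What made you take this step? Your answer helps us improve."
  );
  if (reason !== null) {
    track(`${moment}_feedback`, { userSegment, reason });
  }
}
```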

Choosing a framework

Once analytics requirements are collected, we often assist clients in choosing a suitable analytics framework. If one is already in use, we verify whether there are any gaps (in terms of requirements) and ensure everything surrounding data privacy and security follows best practices and regulations. In particular, it is critical to be transparent with users that their data is collected for the purpose of continuous improvement.
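
A minimal sketch of such a consent gate, assuming a hypothetical consent dialog and framework setup, could look like this:

```typescript
// Hypothetical consent gate: analytics only runs after the user has
// been told what is collected and why, and has explicitly agreed.

interface ConsentState {
  analytics: boolean; // product analytics events
  feedback: boolean;  // qualitative feedback submissions
}

let consent: ConsentState = { analytics: false, feedback: false };

function showConsentDialog(): ConsentState {
  // In a real product this renders a dialog explaining that data is
  // collected solely for continuous product improvement.
  return { analytics: true, feedback: true };
}

function initAnalytics(): void {
  consent = showConsentDialog();
  if (!consent.analytics) {
    return; // No consent, no collection: fail closed.
  }
  // ...initialize the chosen analytics framework here...
}
```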

The next step is to implement the analytics events for each feature you want to report on. Here, communication between data analysts and developers must be clear and direct. The ideal scenario, which we foster at Sclable, is having data analysts and developers create test cases jointly, especially for non-standard cases. As the focus shifts to testing analytics, we recommend doing this in parallel with testing features (e.g. a direct feedback feature).
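
Here is a rough idea of what such a jointly written test case could look like (hypothetical event names and a simple in-memory capture, not a specific testing framework):

```typescript
// Sketch of a jointly written test case: the developer wires the
// event, the analyst defines what a valid payload looks like,
// including a non-standard case (a missing required attribute).

import assert from "node:assert";

interface EventPayload {
  name: string;
  attributes: Record<string, string>;
}

const captured: EventPayload[] = [];

function track(name: string, attributes: Record<string, string>): void {
  captured.push({ name, attributes });
}

// Standard case: exporting a report fires exactly one event
// carrying the attributes the analyst needs.
track("report_exported", { reportType: "pdf", userSegment: "trial" });
assert.strictEqual(captured.length, 1);
assert.strictEqual(captured[0].attributes.reportType, "pdf");

// Non-standard case: a payload missing a required attribute
// should be caught before it pollutes the data.
const required = ["reportType", "userSegment"];
const missing = required.filter((a) => !(a in captured[0].attributes));
assert.deepStrictEqual(missing, []);
```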

Turning data into insights

In my experience, a thoughtfully developed dashboard and report structure does wonders when evaluating the quality and coverage of analytics and feedback data. At Sclable, we recommend enlisting the assistance of a designer here, since the usability of dashboards and reports is of high importance. Their job, together with the analyst, is transforming analytics and feedback data into actionable visualizations.

Before launch, you won't have a full range of data available, but we aim to connect dashboards to their appropriate data sources as soon as possible. This is a test in and of itself: it ensures all visualizations can be filled, and it is the time to run (analytics and reporting) user tests, document issues, identify gaps and define the required actions. We also develop monitoring views, which include alarms triggered, for example, by unexpected values. At this stage it is advisable to run performance/load tests on the analytics framework.
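
As an illustration, a monitoring check with an alarm on unexpected values could be as simple as the following sketch (the thresholds are assumptions, not recommendations):

```typescript
// Minimal monitoring sketch: compare today's event volume against a
// recent baseline and raise an alarm on unexpected values, which
// often indicate a silent tracking outage or a double-firing event.

function checkEventVolume(eventName: string, todayCount: number, baseline: number[]): void {
  const average = baseline.reduce((sum, n) => sum + n, 0) / baseline.length;

  // Alarm if volume drops below 50% or spikes above 200% of baseline.
  if (todayCount < average * 0.5 || todayCount > average * 2) {
    console.warn(
      `ALARM: "${eventName}" count ${todayCount} deviates from baseline average ${average.toFixed(0)}`
    );
  }
}

checkEventVolume("report_exported", 12, [95, 102, 88, 110, 97]); // triggers the alarm
```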

Another important element is dedicated usability testing, which also exercises the analytics and provides additional data. As we approach go-live, we usually prepare A/B tests (or similar) that randomly assign users to different versions of the same feature. Whichever version has the better performance indicators (defined before testing) goes live.
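
A common way to implement such a split, shown here as a sketch with hypothetical identifiers, is deterministic hash-based bucketing, so each user consistently sees the same variant across sessions:

```typescript
// Sketch of deterministic A/B assignment: hashing user + experiment
// keeps the assignment stable, so each user's behaviour can be
// compared against the performance indicators defined before testing.

import { createHash } from "node:crypto";

type Variant = "A" | "B";

function assignVariant(userId: string, experiment: string): Variant {
  const digest = createHash("sha256").update(`${experiment}:${userId}`).digest();
  return digest[0] % 2 === 0 ? "A" : "B";
}

const variant = assignVariant("user-42", "feedback-button-placement");
console.log(`user-42 sees variant ${variant}`);
```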

The bottom line: Do not launch your digital product or service without an ironclad, tested analytics system in place. Once you have one, the next step is making sure the key stakeholders in your company use all that data for continuous product and service improvement. Find out how we tackle this in Part 3 of my series, “How to optimize data-driven product management”.

Follow Sclable on LinkedIn for more content like this or check out our website to see the work we do!


As Director of Data & AI Transformation at Sclable, Johannes is dedicated to driving business impact by translating strategic goals into trusted data solutions.