Great product analytics is not about tools, it’s about you.

Focus on your culture and the stack will follow.

Clément Caillol
ManoMano Tech team
11 min read · Feb 14, 2019


ManoMano AAAA principles of product analytics: Agility, Autonomy, Accessibility, Agnosticity

I started working at ManoMano as a Product Manager (PM) in April 2018, after four years of data analysis at Google, consulting for top advertisers on the French market. Early on, I took the lead on a project aiming to improve the analytics capabilities of ManoMano's Feature Teams (FTs).

The PMs at ManoMano had been doing a fantastic job long before I joined, but when it came to product analytics, they made do with solutions that wouldn't really scale. The habit and spirit existed, to be sure, but access was restricted by default and data was difficult to extract, share and make informed use of. Overall, finding an interesting piece of data felt like a chore, and was avoided when possible.

Product analytics should be central to technology companies, informing product roadmaps and, along with qualitative user research, shielding PMs from the dreaded HiPPO effect (the Highest Paid Person's Opinion). But in my experience it is a lot like teenage sex: everybody talks about it, but few people really do it.

This stems from the novelty of the field and the scarcity of expertise amongst executives.

At ManoMano, after four months of reflection, many internal alignment meetings and a few external interviews, we eventually changed our stack of tools and came up with a new process to improve our product analytics.

Contrary to what some readers might assume, changing our stack of tools was not the solution in and of itself, but rather the result of a redefined data culture, revolving around four principles we call AAAA: Agility, Autonomy, Accessibility and Agnosticity.

Agility

Professionals who have had to set up product analytics processes know that it comes with a high cost of entry, because the first thing you have to do is build a measurement plan.

A measurement plan is a document that contains the definitions of your high-level KPIs, along with the events and event properties you need to record in order to populate those KPIs.

Say, for instance, that you want to know whether your conversion rate is affected by shipping costs (spoiler alert: it is). You will need five pieces of information:

  1. the total number of visits to your website, also called sessions
  2. the number of visits that start the purchase funnel (add to cart)
  3. the number of visits that end the purchase funnel (transactions)
  4. the number of visits that exit at the shipping stage (abandons)
  5. the shipping fees for both 3. and 4.

The measurement plan: translating business questions into recordable events and properties.

Apply the same logic to all of your business, UX and technical questions and you’ve got yourself a measurement plan.
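To make this concrete, here is a minimal sketch, in TypeScript, of what the shipping-cost slice of a measurement plan could look like as structured data. All event and property names are hypothetical; in practice a measurement plan often lives in a shared spreadsheet or versioned document rather than in code.

```typescript
// A measurement plan entry: one recordable event per business question.
interface EventDefinition {
  name: string;              // the event name that will be recorded
  trigger: string;           // when the event fires
  properties: string[];      // event properties needed to answer the question
  businessQuestion: string;  // the KPI or question this event feeds
}

// Hypothetical plan for the shipping-cost question above.
const shippingCostPlan: EventDefinition[] = [
  {
    name: "session_start",
    trigger: "first page view of a visit",
    properties: ["session_id"],
    businessQuestion: "Total number of visits (sessions)",
  },
  {
    name: "add_to_cart",
    trigger: "user adds a product to the cart",
    properties: ["session_id", "product_id"],
    businessQuestion: "Visits that start the purchase funnel",
  },
  {
    name: "transaction",
    trigger: "order confirmation page",
    properties: ["session_id", "shipping_fee"],
    businessQuestion: "Visits that end the funnel, with shipping fees",
  },
  {
    name: "checkout_abandon",
    trigger: "exit from the shipping step without purchase",
    properties: ["session_id", "shipping_fee"],
    businessQuestion: "Visits that exit at the shipping stage, with shipping fees",
  },
];
```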

At ManoMano, the product team was in debt because we had never paid that high cost of entry. And the legacy tool we used actually compensated for this flaw: instead of being precise about what it collected and why, it collected everything that happened on our website: clicks, hovers, scrolls, mouse movements, everything.

For relevant product analytics, collecting everything is not the right approach: it increases the ratio of noise (uninteresting bits of data) to signal (the juicy stuff), never provides a satisfying level of granularity, and slows down query execution. Data analysis becomes a chore and is avoided when possible.

Exhaustiveness shouldn't be the aim of your product analytics strategy. Instead, you should focus on measuring what really helps your decision-making and refine your tracking with small, regular, incremental improvements.

Product analytics is not a one-off effort; it is continuous. In your choice of tool, keep in mind that you should be able to ship regular improvements to your measurement plan at a low cost to your team.

Specialized consultancies can help kickstart the process, but in order to guarantee Agility, you also need to decide how your measurement plan is going to evolve and be maintained over time.

This raises the question of ownership: Who is responsible for maintaining the measurement plan?

Autonomy

What our experience and interviews highlighted is that, at most companies, product analytics is owned by a dedicated team of experts. This “tracking team” is solely responsible for maintaining the measurement plan in accordance with the FTs' roadmaps. But this is wrong for at least three reasons:

First, it creates a dependency on a single team and fills its backlog (at most companies the “dedicated” team has other assignments too) with work that could otherwise be parallelized across FTs.

Second, it establishes a transaction-like relationship between the FTs and the product analytics team, which is a recipe for resentment and a poor understanding of business issues.

Lastly, no expert in their right mind would join such a team: those who have the expertise want to set up processes (😉), certainly not take on a purely operational role. Settling for second-rate expertise translates into lower performance and higher employee turnover.

At ManoMano, instead of a dedicated team, ownership is given to each feature team, which is responsible for its own tracking, autonomously. At the FT level, tracking becomes part of the acceptance criteria, as sketched below: not only should the software bring value to the user and be bug-free, it should also be measurable by design.
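To make “measurable by design” concrete, here is a minimal, framework-free sketch of what such an acceptance check could look like. Every name in it is illustrative, not ManoMano's actual code.

```typescript
// A tracked event: a name plus arbitrary properties.
type TrackedEvent = { event: string; [key: string]: unknown };

// Events recorded while exercising the feature under test.
const recorded: TrackedEvent[] = [];

function track(e: TrackedEvent): void {
  recorded.push(e); // in production this would feed the tracking system
}

// ... exercise the feature, e.g. simulate an add-to-cart flow ...
track({ event: "add_to_cart", product_id: "ABC-123" });

// Acceptance criterion: the feature must emit its expected event.
console.assert(
  recorded.some((e) => e.event === "add_to_cart"),
  "add_to_cart event missing: feature is not measurable by design"
);
```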

This shift in ownership allows a similar shift in culture: success in product development is defined earlier and can be assessed by anyone.

Having each team responsible for its own product analytics makes sense for the reasons cited above; however, it exposes you to a well-known risk: silos. As your product team gets bigger and its analytics needs grow more complex, the threat is that your measurement plan becomes inconsistent and harder to understand. Paradoxically, to guarantee autonomy you still need a governing body responsible for keeping your product analytics accessible.

Accessibility

Accessibility of product analytics data is our most important principle. It is what makes a company truly data-driven. By which definition?

I would argue that data-driven companies stand out by the strength of their data culture and the depth of their employees’ data literacy.

Decisions are taken with data in mind, of course, but above all, data-driven companies provide an environment where data is ubiquitous, easy to understand and easy to make use of. In my opinion, you'd get a better measure of a company's data-driven-ness from the data proficiency of its janitors than from the achievements of its data department.

Google cafeteria (credit Roman Boed)

Google is arguably the most data-driven company on Earth, and what truly makes it so for me isn't necessarily the number of its PhDs in data science, but rather the fact that the food team is incentivized on satisfaction surveys. Menus are built and quarterly bonuses awarded based on the ratings spoiled Googlers give their free lunches. Early definition of success, and accountability from all.

By nature, ManoMano meets this criterion perfectly: it is a place where the most mundane conversations often end in a debate over impact, causality or representativeness.

To reinforce this nature in our product development and guarantee accessibility, we put in place a product analytics council. Its main objective is to make sure the measurement plan remains coherent and accessible: resources are not duplicated, and events and properties are easily understandable and usable by everyone. To that end, the council is responsible for defining the guidelines of our tracking and for validating changes to our measurement plan before they are enacted.

Event and trigger naming conventions defined by ManoMano’s product analytics council.
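The figure above holds the actual conventions. As a purely hypothetical illustration of the kind of rule such a council might enforce, a lowercase object_action pattern can even be checked mechanically before an event is accepted into the plan:

```typescript
// Hypothetical illustration only; ManoMano's real conventions are in the
// figure above. A typical rule: lowercase snake_case, <object>_<action>.
const EVENT_NAME_PATTERN = /^[a-z]+(_[a-z]+)+$/;

function isValidEventName(name: string): boolean {
  return EVENT_NAME_PATTERN.test(name);
}

console.log(isValidEventName("product_click")); // true
console.log(isValidEventName("ProductClick"));  // false: the council would reject it
```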

Agnosticity

Finally, the interviews we had with similar companies allowed us to identify a pitfall of traditional product analytics organizations, one which over time reduces their ability to adopt better tools: hardcoded tracking.

If each FT is responsible for its own tracking, every event it collects ends up explicitly written in the code, where most of the time it is forgotten. But what happens the day a superior product analytics tool appears on the market? You are stuck with years of event tracking buried deep in your code, facing months of work and significant migration costs, only to find yourself back at square one, locked into another tool.

To avoid this pitfall, tracking needs to be centralized and to live in an environment where it can easily be managed. It sounds counterintuitive to have each FT work autonomously on a central tracking system, but that is exactly what is needed to reduce dependency on your product analytics tool.

How do you achieve this distributed-centralisation oxymoron? That’s where we need to start talking about our stack of tools.

ManoMano’s product analytics stack and process

When it came to choosing the right tools for us, the main issue was reconciling our somewhat opposed principles of Agility, Autonomy, Accessibility and Agnosticity.

Our tracking system needed to be centralized, yet editable by all and supervised by our governing body. It shouldn't be buried in our code, it should allow us to ship regular, agile improvements, and it should guarantee our freedom to change our stack the day we choose. We identified two candidate solutions: Tag Management Systems (TMS) and dedicated data collectors like Segment.

Data collection: Segment vs Google Tag Manager

Data collector and 3rd-party integrator: Google Tag Manager vs Segment

In the world of product analytics, the advent of Segment has made some noise. It seems ubiquitous, and its adoption rate has been high for the past few years. From my understanding (full disclosure: I've never worked with it directly), its main selling point is integration: Segment collects data and distributes it to any 3rd-party solution you choose with no extra development on your side. It's fast and reliable (again, from the feedback I've gotten) and reduces the number of JavaScript calls made on your site.

However, the more we studied its pros and cons, the less obvious a choice it appeared for us. As a matter of fact, the data engineering team had already set up SnowPlow for log recording and implemented the free version of Google Tag Manager (GTM) to install marketing tags. Segment's main selling points, integration and data distribution, didn't seem like our priority at the time.

Tag Management Systems, of which GTM and Tealium are the best-known examples, emerged as an underrated solution to organize our product analytics. Generally used by business teams to implement advertising trackers, they also offer granular event triggering, on/off code management and intuitive user interfaces. At its core, GTM is code management software: users work on branches (called workspaces), ask peers for reviews (approval) and merge pull requests (publish).

We use GTM as an event-tracking aircraft carrier: all our events live there, where they can be easily monitored by our product analytics council, upgraded and eventually rerouted to any tool we want, the day a better solution emerges.
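Concretely, here is a hedged sketch of the mechanism (event names and property shapes are illustrative, not our actual code): application code only pushes generic events to GTM's data layer, and which vendors receive each event is configured centrally inside GTM.

```typescript
// Hardcoded tracking (the pitfall): the vendor SDK is called directly at
// every call site, so changing tools means rewriting all of them, e.g.:
//   vendorSdk.logEvent("add_to_cart", { productId: "ABC-123" });

// Agnostic tracking: the site pushes generic events to GTM's data layer;
// GTM decides, centrally, which vendor(s) receive each event.
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

export function trackEvent(
  event: string,
  properties: Record<string, unknown> = {}
): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event, ...properties });
}

// Swapping analytics vendors then becomes a GTM configuration change,
// not a months-long code migration.
trackEvent("add_to_cart", { product_id: "ABC-123", shipping_fee: 4.9 });
```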

Around GTM, we were able to put in place our agile enrichment process, with weekly iterations: from Monday to Thursday, any member of any FT is free to add events to a public branch (copying a pre-established template and observing strict naming guidelines); on Friday the council reviews the branch and pushes code into the world.

ManoMano Product Analytics process

The last question mark in our stack was the actual solution to visualize our product analytics data.

Dataviz: Google Analytics vs Tableau vs Amplitude

Product Analytics visualization tools: Google Analytics vs Tableau vs Amplitude

They say old habits die hard, and I can only agree. After my years working at Google, my answer to any analytics question had been two words: Google Analytics (GA).

I stand by it, because I still think GA is a fantastic tool for getting actionable insights on online marketing strategy and improving attribution. However, I came to realize that it fell short on product analytics, that is, in its ability to cross various dimensions and visualize them easily. Sure, it allows event tracking, complete with event properties (custom dimensions, custom metrics), but in order to visualize this data you often end up creating flat tables and exporting them to a spreadsheet, which sucks. Google Analytics 360, the enterprise version, claims to solve this problem by exporting website logs to a BigQuery instance: effectively drowning you in a sea of data that would cost you an entire data analysis team (twice the licence cost) to make use of.

As stated earlier, ManoMano already had a pipeline for collecting granular website data through SnowPlow. Data collected by SnowPlow is refined, versioned and channeled to our data lake, where it is exposed to end users by a popular data-viz tool: Tableau. I won't dwell on Tableau for lack of real expertise, but I know it too is an incredible tool. However, using Tableau for our product analytics needs created a dependency on our data engineering and data analysis teams, which were needed to build the relevant cubes of data. This contradicted our principles of autonomy and accessibility, and the option was discarded.

The solution we ended up choosing, Amplitude, came up a lot in our interviews. Along with Segment, it seems to be defining the standards of product analytics, most likely the result of a product company designing a solution for product people: dead simple, uncluttered, fast and intuitive. Although we are still in the process of mastering it and probably lack some perspective, it is already making waves internally, mostly thanks to its collaborative nature (visualizations can be bookmarked, shared and commented on), which fosters the type of data-driven mindset we value at ManoMano.

ManoMano product analytics stack to date (Q1 2019)

If you are curious about our feedback, dedicated pieces on Amplitude will be posted on this blog.

Conclusion: Continuous improvements

There is a widespread belief, amongst company executives faced with a problem, that the solution always lies with adopting a new tool.

It is an extremely simple and seductive idea: it's the salesperson's bread and butter, the executive's silver bullet. “We suck at Y or Z? Well, it's only because we are missing the miracle tool that will solve all our problems in an instant!” Thinking so is a way to avoid taking a hard, cold, analytical look at a company's weaknesses.

Startups, I discovered, cannot afford the luxury of escaping their reflection in the mirror.

At ManoMano, our tool didn't help, that's for sure, but it turned out that what we needed to solve our problem was greasy, painful, effortful culture change.

Every system has its flaws and blind spots. Some readers may argue that while we claim to be agnostic about tools, we are in fact heavily reliant on Google Tag Manager, and that would be fair. We could also mention that our governance took some iterations to get right, and that we assumed too much and still have to explain how the process works internally.

However tools aren’t the solution: it helps if you can acknowledge your shortcomings, confront your reflection in the mirror, get your hammer and nails and actively work towards improvement. But we are a DIY company and that’s just the way we see the world 🛠.

Appendix: ManoMano’s implementation

Even though there is no one-size-fits-all solution when it comes to product analytics, you can find our technical implementation in this GitHub repository and use it to kickstart your own.
