7 Components of Successful Web Analytics

Andy Semenihin
Mar 18, 2020


You might have already realized how important a web analytics measurement strategy is when it comes to making business and marketing decisions based on actionable data. In fact, the latter is nearly impossible without a consistent measurement and data culture in the company. Below I've gathered what I believe are the most crucial keys to a successful digital analytics strategy. They establish a robust data measurement foundation that can trigger a sizable uplift in your paid marketing efficiency.

Is your data actionable?

Before anything else, remember that the reason you are doing this is to make your data actionable. You are not collecting it just for the sake of having web analytics in place; you are using it to improve marketing effectiveness and UI optimization. Stakeholder trust in the collected data, and its actionability, is the justification for developing and implementing a structured measurement approach.

Business Goals, KPIs and Conversions

It's no secret that business goals should be directly translated into macro and micro conversions in the online measurement strategy. It isn't necessary to go too far down the rabbit hole, but a 'good enough' approach would at least entail aligning on the macro conversions, optimization objectives, transaction tracking accuracy, and consistency across multiple vendors if they exist in the mix.

This alignment means that your conversions should match the back-end CRM system (no revenue discrepancy). The marketing conversions for Google Ads, Floodlight, and paid social should use the same triggering logic. Finally, your UI/UX optimization should target those confirmed macro conversions as its primary objective.
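
A minimal sketch of what the shared triggering logic can look like in practice is below, assuming a GTM-style dataLayer; the event and field names are illustrative, not a prescribed naming convention. Each vendor tag (GA, Google Ads, Floodlight, paid social) would then be configured to fire on this single event.

```typescript
// Sketch of a single source of truth for the purchase conversion, assuming a
// GTM-style dataLayer. All marketing tags fire on this one event instead of
// each defining its own trigger, so revenue and counts stay consistent.

declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

export function trackPurchase(order: {
  transactionId: string;
  revenue: number; // should reconcile 1:1 with the back-end CRM figure
  currency: string;
}): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'purchase', // the single trigger all vendor tags listen to
    transaction_id: order.transactionId,
    value: order.revenue,
    currency: order.currency,
  });
}

// Fired once on the order confirmation page:
// trackPurchase({ transactionId: 'T-1001', revenue: 129.99, currency: 'USD' });
```

Because every tag listens to the same event carrying the same transaction ID and revenue value, discrepancies between platforms and against the CRM become much easier to trace.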

More importantly, the web analytics team should know the business context, not just the list of KPIs. Invite the people who put forward optimization proposals to higher-level business meetings so they understand the dynamics better.

Measurement Governance

Don't underestimate the necessity of documenting the measurement strategy. In the long run, having a governance process is the main prerequisite for scalability and informed advancement. Have your analytics team plan and develop a robust measurement plan (a tracking brief) based on the laid-out business requirements. This documentation should be updated with every prod release for the sake of future reference, regression testing, and knowledge sharing across the company.

For a reliable data roadmap, make sure to have these items developed:

  • Business requirements documentation (BRD) with the major KPIs laid out
  • Measurement roadmap (tracking brief) with the tracking logic explained
  • Metadata roadmap indicating the logic behind the collected custom attributes
  • Tag governance documentation describing the current state of your TMS, including your third-party vendor tagging logic
  • Event governance document listing all collected events, their locations, and corresponding attributes (a sketch of such an entry follows this list)
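
One lightweight way to keep the event governance document alive is to maintain it as a typed, version-controlled registry next to the tracking code. The shape below is only an illustration of what such an entry might contain, not a standard schema.

```typescript
// Illustrative shape of an event governance entry; the field names are
// assumptions, not a standard schema. Keeping the registry in version control
// helps with regression testing and knowledge sharing.
export interface TrackedEvent {
  name: string;         // event name as sent to the analytics tool
  location: string;     // page or component where the event fires
  trigger: string;      // user action or condition that fires it
  attributes: string[]; // custom attributes collected with the event
  owner: string;        // team responsible for keeping the entry accurate
}

export const eventRegistry: TrackedEvent[] = [
  {
    name: 'purchase',
    location: 'order confirmation page',
    trigger: 'server-confirmed transaction rendered',
    attributes: ['transaction_id', 'value', 'currency'],
    owner: 'web-analytics',
  },
];
```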

Meaningful Collection and Benchmarks

Don't waste your time, money, and resources on tracking 'everything' unless you are running ML on this data and need to capture as many signals as possible (trajectory analysis, churn rates, next best actions, etc.). Tracking every single element on the site doesn't always lead to incremental value and will impose a significant burden on the implementation teams. In all other cases, measure only as much as the business requires and enough to produce an insight that leads to action.

  • Start by tracking only essentials that can provide a valid input for UI/UX optimization.
  • Move faster, plan iterations, and deploy analytics in sprints synced with the production release schedule.
  • Set internal benchmarks for vital ‘Health of Business’ metrics.
  • Act on the data that you collect or stop wasting time on meaningless reporting for the sake of reporting.
  • Make changes, improve, repeat.

The key takeaway here is that collection has to lead to action. Don’t accept any reports without clear takeaways and subsequent next steps.

Integrations

Utilize platform integrations where necessary and aim for an end-to-end approach to web measurement.

  • Cross-platform data: aggregate customer journey data across web and app properties. You'll only get meaningful information if these flows are unified by a user identifier (deduplicate users).
  • Integrations for unsampled data: use BigQuery and API connectors; sampled data is useless if you want to be data-driven.
  • Integrate the back-end: expand the funnel by mapping online data to the offline CRM pipelines, and optimize against the objective that's further down the pipeline (see the sketch after this list).
  • Integrate marketing cost and impression data: add impression touchpoints to the online conversion paths for more sophisticated attribution, and stream marketing cost data into your web analytics solution. If native integrations aren't available, use click and impression trackers.
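
As a rough illustration of the back-end integration point, here is a sketch of joining unsampled web conversion data with CRM revenue in BigQuery. The project, dataset, and table names are hypothetical placeholders for your own export and CRM tables.

```typescript
// Sketch of joining unsampled web conversion data with back-end CRM revenue in
// BigQuery, assuming both tables share the same user identifier. The project,
// dataset, and table names below are hypothetical placeholders.
import { BigQuery } from '@google-cloud/bigquery';

async function webToCrmRevenue(): Promise<void> {
  const bigquery = new BigQuery();
  const query = `
    SELECT
      web.user_id,
      COUNT(DISTINCT web.transaction_id) AS online_conversions,
      SUM(crm.closed_revenue)            AS downstream_revenue
    FROM \`my_project.analytics_export.conversions\` AS web
    LEFT JOIN \`my_project.crm.opportunities\` AS crm
      ON crm.user_id = web.user_id  -- the shared identifier that deduplicates users
    GROUP BY web.user_id
  `;
  const [rows] = await bigquery.query({ query });
  console.log(`Fetched ${rows.length} joined rows of unsampled data`);
}

webToCrmRevenue().catch(console.error);
```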

Regression Testing

A successful implementation doesn't guarantee the stability and scalability of your web analytics tracking. The analytics measurement team needs to run regression testing cycles regularly to make sure web deployments don't break your tracking code.

  • Run regression testing of the analytics code in every release cycle during active analytics implementation, or at least once a quarter after all tracking is deployed, to keep it sustainable.
  • Implement automated testing of the most critical features, e.g., the ecommerce code on the confirmation page (a sketch of such a check follows this list).
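
For the automated check, a sketch using Playwright is shown below; the URL, the expected dataLayer fields, and the way the confirmation page is reached are assumptions you would adapt to your own checkout flow.

```typescript
// Sketch of an automated regression check for the ecommerce tracking code on
// the confirmation page, using Playwright. The URL and the expected dataLayer
// fields are assumptions to adapt to your own checkout flow.
import { test, expect } from '@playwright/test';

test('confirmation page pushes a purchase event to the dataLayer', async ({ page }) => {
  // A real suite would drive a test checkout first; here we assume a stubbed
  // confirmation page is reachable directly.
  await page.goto('https://example.com/order/confirmation?test=1');

  // Read back the dataLayer the tracking code should have populated.
  const events = await page.evaluate(
    () => ((window as any).dataLayer || []) as Array<Record<string, unknown>>,
  );

  const purchase = events.find((e) => e.event === 'purchase');
  expect(purchase).toBeTruthy();
  expect(typeof purchase?.transaction_id).toBe('string');
  expect(typeof purchase?.value).toBe('number');
});
```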

Test, Analyze, Act and Repeat

Once you have a reliable data pipeline, make use of it to run quick HADI cycles in your organization. Keep your optimization initiatives as fast and cheap as you can, and aim for a short turnaround with clear next steps to apply changes.

Hypothesis > Action > Data > Insight

  1. Start the cycle with a hypothesis laid out in an 'if… then…' format. Not all hypotheses require testing; choose those that have the most impact on the business (or a business growth metric) and require the least effort (cost the least).
  2. Act on the accepted hypothesis: run an A/B test to confirm the lift in the chosen metric over the original state.
  3. Collect the data to analyze the delta in the success metrics (a sketch of the lift calculation follows this list). If your technology allows tracking multiple objectives, analyze several critical KPIs.
  4. Generate insights based on the collected data. If the hypothesis is confirmed, scale the change; if not, analyze what didn't go as expected and why.
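
For step 3, here is a minimal sketch of computing the relative lift and its significance with a two-proportion z-test; the numbers are made up, and in practice your testing tool will usually report this for you.

```typescript
// Minimal sketch of measuring the delta between control and variant conversion
// rates with a two-proportion z-test. The inputs are made up; in practice your
// testing tool usually reports this for you.
function conversionLift(
  controlConversions: number, controlVisitors: number,
  variantConversions: number, variantVisitors: number,
): { lift: number; zScore: number } {
  const p1 = controlConversions / controlVisitors;
  const p2 = variantConversions / variantVisitors;
  // Pooled conversion rate under the null hypothesis of no difference.
  const pooled =
    (controlConversions + variantConversions) / (controlVisitors + variantVisitors);
  const standardError =
    Math.sqrt(pooled * (1 - pooled) * (1 / controlVisitors + 1 / variantVisitors));
  return {
    lift: (p2 - p1) / p1,              // relative uplift of the variant
    zScore: (p2 - p1) / standardError, // |z| > 1.96 is roughly significant at the 95% level
  };
}

// Example: 480/10,000 control vs 540/10,000 variant conversions.
const result = conversionLift(480, 10_000, 540, 10_000);
console.log(`Lift: ${(result.lift * 100).toFixed(1)}%, z = ${result.zScore.toFixed(2)}`);
```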

Keep running HADI cycles continuously to provide the business with steady incremental value over time. Ideally, this process should never stop.

Final Thoughts

In my experience, things go wrong when at least one of these pillars doesn't get enough attention: Process, Governance, Sustainability, and Actionability.

Put the right process and people in place, manage and document your measurement program, verify its stability over time, and act on the data you collect. This will build trust in the data and provide the business and marketing teams with incremental value.

Thank you!

Andy Semenihin | Head of Analytics @ DELVE
