The Lean Accounting Compass

(Join the conversation using the hashtag #Dotchat, @ByngSystems)

As part of our series of thought pieces on beating the competition with an agile and iterative mobile product strategy, we now look at one of the guiding principles of Lean Methodology: measuring what’s actually going on with your product.

If we accept that we’re not going to be right 100 percent of the time, then at some point we’ll need to change something that isn’t working. And if that’s the case, we need to track what’s working and what isn’t.

Tempting as it is to keep going back to our customers with intercept surveys to see whether they prefer version A or B of the product (known as split testing), can we really keep surveying time-crunched users? What happens when we want feedback on a weekly basis? Perhaps instead we can look at metrics in our business which indicate whether what we’re doing is better.

With digital engagement, users generate a whole host of eminently trackable information, and we can use this data to attach success criteria to our assumptions.

WE ALL KNOW ABOUT ANALYTICS, BUT WHAT TO MEASURE?

Lean Methodology avoids vanity metrics such as visitors, users or page views (these are often a function of marketing rather than the product) and instead draws on actionable measurement, such as metrics around retention and referrals — KissMetrics has a great 9-point list.

In the case of a financial services business, the key success factor for the marketing team is likely to be conversion; in the case of advertorial revenues, how long someone looks at the content (known as dwell time) is probably more important.

So what do actionable metrics look like?
Let’s see with a couple of assumptions…

Example Assumption 1)

Push notifications will drive repeat visits to the app

How we could measure this 1)

Percentage difference in average number of return visits over 30 days for users with push notifications enabled
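
To make that concrete, here is a minimal sketch of how the percentage difference could be computed from exported visit data. The `Visit` shape and its field names are hypothetical placeholders rather than any particular analytics tool’s schema.

```typescript
// Sketch only: the Visit shape and field names are hypothetical placeholders,
// not a real analytics export format.
interface Visit {
  userId: string;
  pushEnabled: boolean;     // did this user have push notifications switched on?
  returnVisits30d: number;  // return visits counted over a 30-day window
}

// Average number of return visits for one group of users.
function averageReturnVisits(group: Visit[]): number {
  if (group.length === 0) return 0;
  return group.reduce((sum, v) => sum + v.returnVisits30d, 0) / group.length;
}

// Percentage difference between push-enabled and push-disabled users.
function pushNotificationLift(visits: Visit[]): number {
  const withPush = averageReturnVisits(visits.filter((v) => v.pushEnabled));
  const withoutPush = averageReturnVisits(visits.filter((v) => !v.pushEnabled));
  if (withoutPush === 0) return 0; // avoid dividing by zero
  return ((withPush - withoutPush) / withoutPush) * 100;
}

// A clearly positive lift supports the assumption; flat or negative tells us to rethink.
const sample: Visit[] = [
  { userId: "a", pushEnabled: true, returnVisits30d: 6 },
  { userId: "b", pushEnabled: false, returnVisits30d: 4 },
];
console.log(`${pushNotificationLift(sample).toFixed(1)}% lift`); // 50.0% lift
```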

Example Assumption 2)

Providing an option to select the time for a call will reduce the number of calls

How we could measure this 2)

Number of people who click “contact” and opt to enter a preferred time to call rather than calling directly from the app.
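
And a similar sketch for the second metric, expressed as a share of “contact” clicks rather than a raw count so it stays comparable across test groups. The event names are illustrative and would need to match your actual tracking plan.

```typescript
// Sketch only: the event names are illustrative, not a real tracking plan.
type ContactEvent = "contact_tapped" | "call_time_selected" | "call_started";

// Share of "contact" clicks that end in a scheduled call rather than a phone call.
function callDeflectionRate(events: ContactEvent[]): number {
  const contacts = events.filter((e) => e === "contact_tapped").length;
  const scheduled = events.filter((e) => e === "call_time_selected").length;
  return contacts === 0 ? 0 : (scheduled / contacts) * 100;
}

// Two of three contact clicks chose a preferred time instead of calling.
const log: ContactEvent[] = [
  "contact_tapped", "call_time_selected",
  "contact_tapped", "call_started",
  "contact_tapped", "call_time_selected",
];
console.log(`${callDeflectionRate(log).toFixed(0)}% chose a call time`); // 67%
```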

In our workshop on 22/10 in London we will look at some detailed assumptions, with visuals to show how we can hook tests into the app build process.

NOT GETTING YOUR METRICS IN A TWIST…

Looking at the examples above, it’s pretty clear that these metrics are complex measurements rather than simple volume-based charts. We’re going to have to get what we learn from them reflected in our product: through new or amended features, via technology changes and in the associated development cycles.

Agile philosophy and methodologies help here, because metrics should drive the product requirements. Actionable metrics chosen as key project success factors sit alongside the requirements, are defined in a project initiation document (PID), and generate dependent tasks to implement tracking in the project by asking the following questions (a sketch of how a metric can sit alongside a requirement follows the list):

Can success be measured by an existing metric that we can split test with users?
Does this feature need a new metric to be measured? If so, what is it?
If it can’t be measured, is it actually of value?
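
One lightweight way to keep those questions honest is to store the metric definition in the same artefact as the requirement, so an unanswered question shows up as a gap. A minimal sketch, assuming a hypothetical internal `FeatureSpec` format rather than any standard PID template:

```typescript
// Sketch only: FeatureSpec is a hypothetical internal format, not a standard PID template.
interface FeatureSpec {
  name: string;
  requirement: string;
  existingMetric?: string;                            // question 1: an existing metric we can split test
  newMetric?: { event: string; definition: string };  // question 2: a new metric this feature needs
  splitTest: boolean;                                 // will we A/B test it with users?
}

// Question 3: a feature with no metric at all gets flagged for a value discussion.
function needsValueReview(spec: FeatureSpec): boolean {
  return !spec.existingMetric && !spec.newMetric;
}

const preferredCallTime: FeatureSpec = {
  name: "Preferred call time",
  requirement: "Users can pick a time slot instead of calling from the app",
  newMetric: {
    event: "call_time_selected",
    definition: "Share of contact clicks that schedule a call",
  },
  splitTest: true,
};

// false: the new metric becomes a dependent tracking task in the build.
console.log(needsValueReview(preferredCallTime));
```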

Using the principle of “One Metric That Matters” for a project or start-up, any metric we want to test should be compatible with our cohorts (different user groups), split tests, app versions and campaigns, and should be tracked in our monitoring tool so we can see whether any user activity is significant (in terms of success or not).
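
At the tracking level, that compatibility usually means every event carries the cohort, split-test variant, app version and campaign as properties, so the one metric that matters can be sliced by any of them. A sketch, with an illustrative `track()` function rather than a specific vendor’s SDK:

```typescript
// Sketch only: track() and the property names are illustrative, not a specific vendor's SDK.
interface EventContext {
  cohort: string;       // e.g. the week the user signed up
  variant: "A" | "B";   // which split-test variant the user saw
  appVersion: string;   // the release the event came from
  campaign?: string;    // acquisition channel, if known
}

// Every event carries the same dimensions, so any metric built on these
// events can be sliced by cohort, variant, version or campaign.
function track(event: string, userId: string, context: EventContext): void {
  // In a real stack this would forward to your analytics tool of choice.
  console.log(JSON.stringify({ event, userId, ...context }));
}

track("return_visit", "user-123", {
  cohort: "week-40 signups",
  variant: "B",
  appVersion: "2.4.0",
  campaign: "push-reminder",
});
```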

What is more, we’ve got different version releases going on, different split tests, different cohorts and a whole bunch of different channels feeding the apps. To make sense of all the metrics we’re tracking, we’re going to need to tool up with analytics software!

3 TOOLS FIT FOR THE JOB

So, what tools do we use to gain insight from our metrics? This motley blend forms our usual stack at Byng:

Google Analytics (or other visitor-tracking tools) — the starting point for user tracking, with powerful features and integrations to measure cohorts and hook into your existing stack.
Apptimize or Optimizely — for multivariate or split testing within the app. Some hybrid frameworks are also rolling out their own A/B testing tools (hello there Ionic). A minimal bucketing sketch follows this list.
KissMetrics or Mixpanel — to bring it together with other metrics as a reporting tool, but with specific actionable metrics and a focus on who is doing what.
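
The bucketing sketch mentioned above: tools like Apptimize and Optimizely handle variant assignment for you, but the underlying idea is deterministic bucketing, so a user always sees the same variant across sessions. The hash below is deliberately simple and is not any vendor’s algorithm.

```typescript
// Sketch only: a deliberately simple hash, not any vendor's assignment algorithm.
function assignVariant(userId: string, experiment: string): "A" | "B" {
  const key = `${experiment}:${userId}`;
  let hash = 0;
  for (let i = 0; i < key.length; i++) {
    hash = (hash * 31 + key.charCodeAt(i)) >>> 0; // keep it an unsigned 32-bit value
  }
  return hash % 2 === 0 ? "A" : "B";
}

// The same user in the same experiment always lands in the same variant,
// so the metric comparison between A and B stays clean across sessions.
console.log(assignVariant("user-123", "push-copy-test"));
```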

Our vote goes to KissMetrics for how closely their product aligns with this way of working. As a software engineering house our focus is on creating software which users love, and any platform that helps product owners get that insight is worth its weight — err, bytes — in gold.

Written By: Ollie Maitland, MD @ Byng

Byng is a tech consultancy and software engineering house based in London and Leeds, specialised in developing digital & mobile services and products that add value to business.

Follow us for more updates and insights:

https://www.twitter.com/byngsystems

https://www.linkedin.com/company/byng

https://www.byng.co
