Thinking of useful web KPIs, we discovered…

Michael Parker
Published in Planet 4
Oct 20, 2016

The Discovery phase is drawing to a close, and we have had some really interesting discussions around Key Performance Indicators (KPIs) for Planet 4 as part of the ‘Analytics Review and KPI Selection’ track.

The track leaders were myself and Pasquale. A fantastic group of volunteers from within and outside of our organisation were kind enough to join the discussions and contribute to this track.

Our mission was to conduct an audit of KPIs and metrics being used in Greenpeace at the moment and then to propose some new KPIs that we could measure in the future on the new and improved greenpeace.org website.

How we approached the task

  1. Looked at the current metrics and KPIs people were using in Greenpeace by sending out a small, focused survey
  2. Looked at the overarching objectives of the site
  3. Broke these objectives down into specifics
  4. Listed all of the desired interactions/actions people could take on the site to reach these objectives
  5. Listed all of the KPIs we wanted to measure in a brainstorm
  6. Listed the segments we would want to analyse with these KPIs
  7. Improved the list of KPIs by thinking about how useful they are and who they are useful for

Anyone who has ever attempted to devise a set of truly useful KPIs knows that it is not an easy task. What do we mean by useful anyway? Well, we had a go at determining what that meant.

What are useful KPIs anyway?

To start, the graph below, which plots KPIs by how insightful and how actionable they are, is helpful when thinking about what useful KPIs are.

[Figure: Are your KPIs truly useful?]

Useful KPIs should be insightful, actionable, and ideally both. Insightful KPIs are not always actionable: the Number of Unique Visitors/Users, for example, is insightful in telling us how many people are visiting our website, but it doesn’t really change the way we invest resources. On the other hand, a KPI such as MeRA (Members Returning for Action) gives us data that is both insightful and actionable: if we know that supporters are not coming back to the website and taking action on a regular basis, we might want to investigate why and think of ways to provide more engaging opportunities on the website for them.

Another way of thinking of metrics that are nice to have, but not necessarily useful, is to think of them as ‘Vanity Metrics’. The Mobilisation Lab and a group of people within the NGO sector produced a very interesting report on Vanity Metrics that we recommend reading for more information. In the report, Vanity Metrics are described as “data that are easily manipulated, are biased toward the short-term, often paint a rosy picture of program success, or do not help campaigners make wise strategic decisions”.

In an attempt to explain why Vanity Metrics should be avoided, the report listed “The Five Pitfalls of Vanity Metrics”:

  1. Flawed understanding, leading to poor decisions.
  2. Short-sighted decision-making.
  3. Bad staff incentives.
  4. Failure to engage members, leading to unsubscribes and people tuning out.
  5. Organisations that don’t live up to potential.

In addition, useful metrics should be benchmarked and have a set target in order to understand what is successful and what is not. For example, a metric such as the Number of Petition Signatures is not very useful by itself if we don’t have a clear target or historical data on hand to compare against.

Ideally you will also use segments/cohorts when analysing your KPIs. KPIs measured across an entire audience are not as useful as KPIs measured for specific segments of the total data set (e.g. returning users, users on a mobile device, users spending a long time on the site, etc.).

KPIs should be designed with the audience in mind as well. And by audience we mean the people who are going to use them. What good are metrics that no one will find helpful?

Scope of the track

We are only looking at what a user of the website would typically consider as the website: our home page and all linking pages from the menus, advocacy pages (e.g. petition pages), and donation pages. We have microsites, volunteer community platforms, crowdsourced campaigning websites, and a range of other portals, but for this exercise we chose not to focus on them.

To clarify, the people participating in this group felt that the main difference between a metric and a KPI is that we track many metrics, but only the most valuable ones for measuring performance become KPIs. We wanted to touch on metrics, but focus on having a set of KPIs as outcomes of this process.

The Process

1. What are we measuring at the moment?

The first task was to find out what sort of web KPIs we are measuring at the moment in the Greenpeace realm. We sent a survey to a group of identified staff within Greenpeace who measure and use web KPIs, to find out what they typically track.

This helped us understand the starting point before we began thinking about any new KPIs that would be useful to measure and track in future.

We found that most people in Greenpeace are still using “Vanity Metrics” that are not very actionable, insightful, compared against benchmarks, or segmented. We did, however, see that some people are measuring more useful metrics, such as new vs. existing petition signers, funnel conversion rate, % of petition signatures with a phone number, % of petition signers over 25 years old, number of users who made a donation after signing a petition, conversion by day of week, and conversion by hour of day.

There are certainly more useful KPIs in use across Greenpeace, but we couldn’t get a full audit done in the two weeks that we had. Nevertheless, we pushed on and began to think about the more useful KPIs we should be using moving forward.

2. The objective of Planet 4

We felt that it was important to start this process by thinking about, and agreeing on, the objective of Planet 4. Basically: why does this website exist, in one sentence? One sentence, because experience tells us that if you can’t sum it up into a single clear goal, it’s probably not clear enough.

We drew the goal from the article on Medium written by Matt Browner-Hamlin and we all agreed this to be the goal of Planet 4:

“The goal of Planet 4 is not to be only a vehicle for putting content on the internet, but for driving people to action.”

So in summary the goals are:

a.) A place that drives people to action, and

b.) A place for engaging content

3. Breaking up this objective into more specific objectives

From this overarching goal, we broke it down into more specific objectives based on the three pillars of engagement: breadth, depth, and openness. As Planet 4 will primarily be an engagement platform, it made sense to break up the objectives in this way.

4. What are the ways in which people could interact with the site

In this step, we took the six stated objectives and thought about the different ways in which people could interact with the website. The purpose of this exercise was to think about people’s behaviours through the lens of the three pillars of engagement. It was a good starting point before jumping into a KPI brainstorm, because it forced us to think of the behaviours first before thinking of ways to measure them.

5. The KPI brainstorm

The brainstorm took place over video conferencing, and we used a Google Doc to collaborate. I think it was a successful process, and I was happy to have the great minds we had on the call. We could have had more people attend, but it is always a challenge getting large numbers to join from many different time zones.

The brief to the group was to list all useful KPIs that they could think of given the over-arching and more specific objectives of Planet 4. We made sure to spend a few moments before the brainstorm to discuss what useful metrics or KPIs are so that we were clear what the expectations were.

We used the objectives table already mentioned in heading 3 to focus the brainstorm on the different levels of engagement and the overall objective of Planet 4. I won’t put the entire table into this blog post because it is quite large, but we had a lot of input and it was very valuable. Some of the KPIs ended up in the wrong boxes, which I expected; this is why we chose to start very top-line and get more granular once we all had a good idea of how to focus our thoughts and stick to the brief.

6. Which segments/cohorts to analyse

The purpose of this step was to help everyone think about the types of segments that they should analyse using the KPIs they came up with. This step is important because it helps us understand how we would use the data that we collect a bit more and it connects KPIs to real people — making it more personalised and powerful.

Here is what the group came up with:

  • For signups: new emails vs. existing emails
  • For email: users who have signed a petition at least once in the last 12 months
  • For email: users who have at least opened an email in the last 12 months
  • For signups: Users who have clicked share / donate buttons after signup
  • Users with traffic source = social media, Users with traffic source = email, Users with traffic source = Pay per click
  • Users with more than X visits (3 for example)
  • Mobile users
  • Desktop users
  • Users that have signed at least one petition
  • Donors
  • Super-cyber activists
  • App users
  • Users that have visited/converted from 2 different sources (email > social)
  • Users grouped by interests based on page sections
  • Age groups
  • Video fans (if it is possible to measure how much time a user spends watching videos)
  • Social sharers
  • Comment fans
  • Passively Engaged users (visited more than 3 pages AND on the site for more than 2 minutes BUT didn’t convert)
  • Participatory Engaged users (visited more than 3 pages AND on the site for more than 2 minutes AND converted)
  • Macro Converters (took ‘Macro’ conversions e.g. signed a petition, became a donor, etc.)
  • Micro Converters (took ‘Micro’ conversions e.g. shared on social media, watched full length of video, scrolled to the bottom of a blog page, etc. )
  • Job seekers
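As a rough sketch, the engagement segments above could be derived from per-user session summaries. The function and field names (pages, seconds, converted) are illustrative assumptions, not an agreed implementation; the thresholds come straight from the segment definitions in the list.

```python
# Sketch: classify a user into the engagement segments defined above.
# Field names (pages, seconds, converted) are illustrative assumptions.

def engagement_segment(pages: int, seconds: int, converted: bool) -> str:
    """Return the engagement segment for one user's session summary."""
    engaged = pages > 3 and seconds > 120  # >3 pages AND >2 minutes on site
    if engaged and converted:
        return "Participatory Engaged"
    if engaged:
        return "Passively Engaged"
    return "Other"

print(engagement_segment(pages=5, seconds=300, converted=False))  # Passively Engaged
print(engagement_segment(pages=5, seconds=300, converted=True))   # Participatory Engaged
```

The same pattern extends to Macro vs. Micro Converters by classifying on the type of conversion rather than on session depth.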

An interesting set of segments indeed. I think it was a healthy exercise for people to go through this thought process, thinking about the groups of people we are interested in analysing. Even more interestingly, it helped us go back to the list of KPIs and rethink whether or not they would be useful.

7. Who will use these, how could we benchmark, and how would we track these KPIs?

This step was more anecdotal and not completely resolved, but it was useful for further refining the quality of our brainstormed KPIs, to ensure that we are suggesting truly useful ones.

We are not entirely sure how we will track the metrics yet, but we have some pretty great ideas. Our most widely used web analytics platform is Google Analytics, and we believe we can use smart tracking implementations (e.g. using APIs, user IDs, etc.) and well-defined account customisations to gather most of what we need. It was clear that we need to start linking our database with our web analytics platforms (while ensuring that we meet privacy standards, of course).

For visualisation we would most probably use Google Data Studio as our Global Insights team have been doing some great experimentation and evaluation to assess the capabilities of the platform. The findings so far have been positive and exciting to say the least.

The Outcome

In an attempt to bring all of the brainstorming, conversations, and comments into something clear, concise, and easy to digest, we summarised the KPI recommendations in the following sections.

Macro & Micro Conversions

It’s helpful to split up the types of conversions as Macro and Micro conversions.

What do we consider Macro conversions? ‘Macro’ refers to conversions that are outcomes.

Macro (the main reason the site exists)

  • Sign a petition
  • Give money
  • Volunteer your services
  • Write article
  • *Start a petition
  • *Attend events
  • *Organise meeting/event

*We will prompt people to take these actions on greenpeace.org, but these actions will occur off-site e.g. Greenwire, GreenpeaceX, Facebook, etc.

Micro (More than why the site exists, for deeper analysis)

  • Share on social
  • Apply for job
  • Download content
  • Scroll depth
  • Page depth
  • Total attention time onsite: How long users spend engaging with content across the site
  • Total attention per piece: How long a user spent engaging with a single piece of content
  • ROE (Return on Engagement)
  • ROI (Return on Investment)

… And many others mentioned in this section.

Conversion rate

Conversion rates are important for measuring the success of content to drive people to action. However, we need to ensure that conversion rate is not a stand-alone KPI, but is broken down into more useful types of conversion rate.

Conversion rate … of landing page x … in comparison to other landing pages

Funnel conversion rate: the % of users who convert from step 1 through to conversion, e.g. from the main donation page to the donation success page

Assisted Conversion Rate (first interaction conversions / users): conversions that are on the conversion path but are not the last-interaction conversion

Task Completion Rate: A pop-up asks the user upon leaving the site, “Did you complete the task you set out to do?”
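As a minimal sketch of the funnel conversion rate idea, with made-up step names and counts (not real Greenpeace data):

```python
# Sketch: funnel conversion rate from step entry counts (invented data).

funnel = {
    "main donation page": 10_000,  # step 1
    "payment details":     2_500,
    "donation success":    1_200,  # final conversion
}

counts = list(funnel.values())
funnel_conversion_rate = counts[-1] / counts[0]
print(f"{funnel_conversion_rate:.1%}")  # 12.0%

# Step-to-step drop-off is often more actionable than the overall rate,
# because it shows exactly where in the funnel users abandon.
pairs = list(funnel.items())
for (name_a, a), (name_b, b) in zip(pairs, pairs[1:]):
    print(f"{name_a} -> {name_b}: {b / a:.1%}")
```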

Re-engagement with users

In order to understand the breadth and depth of the website, we need to understand how many new vs. existing users are accessing the site, how often, and what types of actions they are taking. Below are some example KPIs for measuring this:

New Users to Database (First Engagement): When a member takes an action (i.e. supplies an email), we need to know, within 24 hours of signup, whether they are new to our database. This is one of the most in-demand KPIs that offices are struggling to set up technically.

Members Returning for Action (MeRA) (Re-engagement): Number of returning users that took action (returning users are identified by syncing with EN to set the userID).

User Frequency: How many repeat visits there have been on the site by users (unique visitors)

User Recency: The number of days between the visitors’ latest visit and the visit before that

User Conversion Frequency: How many repeat actions there have been on the site by users

User Conversion Recency: The number of days between the visitors’ latest action and the action before that

Viral Actions: Users that visited from shared content to take action
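User Frequency and User Recency, as defined above, can be computed from a per-user list of visit dates. A sketch with invented data; the same logic applies to conversion frequency and recency by swapping visit dates for action dates:

```python
# Sketch: User Frequency and User Recency from one user's visit dates.
# The dates are invented for illustration.
from datetime import date

visits = [date(2016, 9, 1), date(2016, 9, 20), date(2016, 10, 15)]

user_frequency = len(visits)  # number of visits by this user

visits_sorted = sorted(visits)
# Recency: days between the latest visit and the visit before that.
user_recency_days = (visits_sorted[-1] - visits_sorted[-2]).days

print(user_frequency)     # 3
print(user_recency_days)  # 25
```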

Conversion or Engagement Score

It is suggested that we place a value on all conversions in order to keep an ‘engagement score’ of users. This can be achieved in Google Analytics by assigning a numerical value to conversions. We could invent an ‘engagement currency’ that would help us weight the levels of engagement for all actions taken by users on the site.

User Engagement Score Variables:

These are variables that could be used to determine the engagement value of a single user of the website. They would play a role in the analysis of user engagement and could possibly hold a numerical ‘engagement value’.

Estimated Recurring Donor Lifetime Value: At the point of recurring-donor signup, the estimated long-term financial value of that donor (recurring donation signup amount × average number of months of retention).
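With invented figures, the lifetime-value formula above works out as:

```python
# Sketch: Estimated Recurring Donor Lifetime Value (figures are invented).
monthly_signup_amount = 10.0  # assumption: donor signs up at 10/month
avg_months_retention = 24     # assumption: historical average retention

estimated_ltv = monthly_signup_amount * avg_months_retention
print(estimated_ltv)  # 240.0
```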

Gift amount: The amount of the one-off donation given.

Total Financial Value: The sum of all gifts over time + Any recurring donation value.

User Frequency: How many repeat visits there have been on the site by users (unique visitors)

User Recency: The number of days between the visitors’ latest visit and the visit before that

User Conversion Frequency: How many repeat actions there have been on the site by users

User Conversion Recency: The number of days between the visitors’ latest action and the action before that

Total Macro conversions: A value assigned for each type of macro conversion, e.g. donation = 50 points, petition signature = 30 points, etc.

Attention time, scroll depth, page depth, etc.
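A simple sketch of the ‘engagement currency’ idea, using the example weights above (donation = 50 points, petition signature = 30 points); the micro-conversion weights are invented for illustration:

```python
# Sketch: an 'engagement score' as a weighted sum of a user's conversions.
# Donation = 50 and petition = 30 come from the text; the rest are invented.

POINTS = {
    "donation": 50,
    "petition_sign": 30,
    "social_share": 5,    # invented micro-conversion weight
    "video_complete": 3,  # invented micro-conversion weight
}

def engagement_score(conversions: dict) -> int:
    """conversions maps a conversion type to its count for one user."""
    return sum(POINTS.get(kind, 0) * count for kind, count in conversions.items())

user = {"petition_sign": 2, "social_share": 4, "video_complete": 1}
print(engagement_score(user))  # 2*30 + 4*5 + 1*3 = 83
```

In Google Analytics this weighting would typically be configured via goal values or event values, so the score accumulates automatically per user.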

Content Consumption/Engagement

If our website is to be a place for engaging content, then we need a set of KPIs to measure how users consume and interact with content.

One of the most important points made was that we need to move away from page views when measuring content engagement and start working with the industry standard, Attention Time. Attention time covers all of the content experiences that keep users engaged and on the page for longer. This includes the time the audience spends engaging with the comments section of an article.

These are some suggested KPIs that measure how people engage and interact with content on the website.

Total attention time onsite: How long users spend engaging with content across the site

Total attention per piece: How long a user spent engaging with a singular piece of content

Scroll Depth: How far down the page visitors scroll (%)

Page Depth: Number of pages user viewed during session

Bounce Rate: by channel or by page. A ‘bounce’ needs to be defined and configured, e.g. a bounce is anybody who lands on a page and neither scrolls more than 10% down the page nor clicks on any elements
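Reading the custom bounce definition above as “neither scrolls past 10% nor clicks anything”, it can be expressed directly; the field names and session data here are assumptions for illustration:

```python
# Sketch: custom bounce definition — a bounce lands on a page,
# doesn't scroll past 10%, and doesn't click any elements.

def is_bounce(max_scroll_pct: float, clicks: int) -> bool:
    return max_scroll_pct <= 10 and clicks == 0

# Invented sessions as (max scroll %, number of clicks):
sessions = [(5, 0), (40, 0), (8, 2), (2, 0)]

bounces = sum(is_bounce(scroll, clicks) for scroll, clicks in sessions)
bounce_rate = bounces / len(sessions)
print(f"{bounce_rate:.0%}")  # 50%
```

In practice this means replacing the default analytics bounce (single-page session) with scroll and click events, so that engaged readers of a single long article are no longer counted as bounces.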

% Reading by section: Content is grouped into sections and split into a pie chart of the most popular

Most popular tags

Top rated evergreen content / URLs per month / year

Total Comments per Day / Month

Total Comments Votes Up / Down (Sentiment) — Disqus

Total Facebook page likes/shares from web page

Total Twitter follows/tweets from web page

A note on Benchmarking

It is clear that KPIs are not useful unless there are benchmarks to gauge whether a KPI is telling us good or bad news. We need to keep track of these KPIs so as to draw on them for comparison, and thus need good templates and processes in place to accommodate this.

And that was it!

That sums up the end of the discovery phase for the KPI track, but it doesn’t end there. What do you think? Would you recommend any other KPIs?

We would love to hear your thoughts. Drop a comment in the comments section below or if you want to reach out directly, you can contact me on michael.parker@greenpeace.org.

Keep an eye out for an update on the next phase of this track, the Concept Phase.

Special thanks to the people who participated in the discussions and added their valuable input and feedback. We couldn’t have done it without you: Chris Nolan, Cody Skinner, Davin Hutchins, Diego Solari, Gabor Galgocz, Kelli Tolen, Kritsana Srithanomwong, Michiel de Brieder, Nadav Savio, Natasja Zwier, Osvaldo Gago, Sina Nägel, Stefanus Wongsodiredjo, and Sylvia van Heck.


Michael Parker
Planet 4

Engagement Support Manager at Greenpeace International. We provide support to Greenpeace offices and project teams to engage with their audiences.