An[other] apology for digital metrics
Several months ago I was invited to speak at APG’s Noisy Thinking in defence of digital effectiveness from the point of view of a digital agency. My talk can be viewed here, and its core point is “people moan about digital metrics because they have taken them for granted, not questioned them and used them improperly”.
But this article is not about that. It is about this: an article written by Professor Mark Ritson and entitled “Why can’t marketers see that digital metrics are bullshit”. A clickbait-y title if I ever saw one, but I clicked. If you work in digital and have a bit of respect for your job, this article needs to be thoroughly analysed and humorously combatted (we don’t want to seem tense, right?). Why? Because it makes a big sweeping statement in the title and brings some shoddy arguments afterwards, and we know very well that we live in an age where people read the titles and often don’t bother to read the arguments.
So let’s start with the title. Digital metrics are bullshit. What does that mean: are they incorrect? Are they used incorrectly? Are they measuring the wrong thing? In theory, what he could be saying is that digital metrics do not help us get a clear view of the effectiveness of our marketing. That is a pretty serious accusation, and one that people seem to think is echoed by the likes of Les Binet, whose paper The Long and the Short of It supposedly suggests the same thing. I read that paper, by the way, and it does not say that. All it says is that some people use certain digital metrics incorrectly to demonstrate the wrong type of effectiveness.
Now, onto Professor Ritson’s arguments, humorously clad in a very entertaining comparison of him measuring his penis every day and comparing it to his friend’s. Here are the arguments for his sweeping statement:
- Facebook made a lot of mistakes with measuring stuff (I’m assuming he means all the video metric scandals we all know about) — unlike him, whose measuring of his penis with a tape measure every morning should be completely trusted and not questioned at all.
- Facebook overstates the number of users they have — paragraphs 7 and 8 in his article, again relating to recent scandals exposed by the media world, and again unlike him, who, when questioned, gives everyone his penis’s exact measurements.
- Digital metrics are inelegant and too numerous — Facebook has 220 metrics that need to be checked, as opposed to his penis, whose only relevant metrics are length and girth (let me tell you, there’s a feminist article in here about what men think is relevant when it comes to their winkies, but let’s leave that for the time being, shall we?).
- Facebook does not let anyone double-check their metrics.
- Facebook will not allow anyone to compare their metrics with anyone else’s, so that we could have similar benchmarks across multiple digital platforms.
First, may I point out a pattern here? We speak of “digital metrics”, and yet all the examples used come from a single platform. YouTube is name-dropped a couple of times, but mostly the culprit is Facebook. I find it slightly unnerving that one can argue against digital metrics using a single player as an example. It speaks to our unhealthy bias for simplification, which I’ve spoken about here.
Second, let’s all agree that measuring the viewing habits of millions of people at the same time requires a bit more computing power than taking a tape measure and reaching down to measure your penis. I love a good comparison, but this is not a good one. A better comparison would have been if Mr. Ritson had sent tape measures to all his friends and asked them to FaceTime him simultaneously as they were ALL measuring their penises, shouting the measurements back to Mr. Ritson. I wager that might have generated a bit more confusion.
Third, too many metrics and oh-so-confusing. As I pointed out in my APG speech, these metrics are NOT God-given. Most of the platforms work with them because they are legacy metrics from the olden days of the Internet, and the platforms have had to come up with makeshift metrics as video consumption became a thing. Nobody is preventing the “advertising establishment” from deciding that the way we measure effectiveness is through reach and affinity, and telling platforms they need to ladder their metrics up to those two. Nobody is preventing us from making better metrics. I have championed a more thorough understanding of video completion rates in talks with all the platforms. In my work with Facebook, I frequently urge them to present meaningful combinations of metrics, such as video completion percentages for engaged audiences, and to provide benchmarks for different industries and marketing objectives. It’s up to us to look at the metrics that make sense or to demand better ones.
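To make the idea of a “meaningful combination of metrics” concrete, here is a minimal sketch of one such derived metric: the completion rate among engaged viewers only, rather than among everyone the video was served to. The function name, the data shape, and the 3-second engagement threshold are all my own illustrative assumptions, not anything the platforms actually expose.

```python
# Hypothetical derived metric: of the viewers who were "engaged"
# (watched at least `engaged_threshold_s` seconds), what share
# watched the video to completion? All names and thresholds here
# are illustrative assumptions, not real platform fields.

def engaged_completion_rate(views, engaged_threshold_s=3.0):
    """views: list of (seconds_watched, video_length_s) tuples."""
    engaged = [v for v in views if v[0] >= engaged_threshold_s]
    if not engaged:
        return 0.0
    completed = [v for v in engaged if v[0] >= v[1]]
    return len(completed) / len(engaged)

# Four viewers of a 30-second video: one bounced after 1s,
# three were engaged, and only one of those finished.
views = [(1.0, 30.0), (5.0, 30.0), (30.0, 30.0), (12.0, 30.0)]
print(engaged_completion_rate(views))  # 1 of 3 engaged viewers
```

The point of a metric like this is exactly the laddering-up argument above: it answers a question a marketer actually has, instead of reporting a raw count inherited from the olden days of the Internet.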
Finally, cross-platform comparisons and comparisons with other mediums. This is pure lack of information on Mr. Ritson’s part. Of course most metrics can be compared across platforms, and they regularly are. When we look at PCAs (post-campaign analyses), we compare reach across platforms, completion rates across platforms and formats, and many other things. To say that digital metrics are not transferable and that no general picture can be gleaned from looking at them in aggregate is patently untrue. Moreover, when I was working at Google, our Polish office had built a success model to compare YouTube metrics to TV’s own, which made it possible to calculate incremental reach and also to compare VTRs (view-through rates).
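For readers unfamiliar with the term, “incremental reach” in this context is simply the slice of the audience one channel reaches that another did not. The sketch below shows the arithmetic with made-up audience IDs and numbers; the real success model mentioned above was of course far more involved than set subtraction.

```python
# Illustrative sketch of incremental reach: the share of a target
# audience reached by YouTube that TV did not reach. The audience
# members and sizes below are invented for the example.

tv_reach = {"p1", "p2", "p3", "p4"}        # audience members reached by TV
yt_reach = {"p3", "p4", "p5", "p6", "p7"}  # audience members reached by YouTube
audience_size = 10                         # total target audience

combined = tv_reach | yt_reach             # reached by either channel
incremental = yt_reach - tv_reach          # reached by YouTube only

print(len(combined) / audience_size)       # combined reach: 0.7
print(len(incremental) / audience_size)    # incremental reach from YouTube: 0.3
```

Once both channels' audiences are expressed in a common currency like this, comparing them stops being mysterious: it is ordinary set arithmetic on who was reached where.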
I always enjoy Professor Ritson’s articles. He writes with gusto and is incredibly adept at highlighting what the industry secretly moans about. And this particular moan is correct in some places. The platforms have become large enough to no longer feel accountable. Some people in the industry have become lazy and stopped asking questions. But this does not mean “digital metrics are bullshit”. Some platform mistakes need to be (and have been) called out, and we need to work together to fix them. We can definitely get better at measuring effectiveness on digital. But what we should never do is question an entire industry and its approach to effectiveness on the basis of a single example, which is really more about business practices than about metrics.