Ways to #$@! Up Your Marketing Analytics

Marketing analytics is complicated. Exciting, potentially powerful, but very complicated. In this episode of Marketing Made Difficult, we talk about a few of the ways you can run into trouble managing your analytics, and, hopefully, a few of the ways to shift your methods to make your work both more efficient and more productive.

There are a lot of things that help explain how you might be engaging effectively with your marketing analytics, or ways that you might be engaging ineffectively. There are a few patterns that we have seen emerge in talking with lots of people about how they think about marketing analytics, and we want to talk about four of those patterns today.

We also talk about:

The Trouble With Endless Data

That’s how it became 15 pages; we actually started out at 4 pages, which still can be a lot. Every time somebody would ask a question in our weekly meeting or e-mail me something, I would say that at least three or four times out of ten, that resulted in something additional being added to the weekly reporting.

What Happens When You’ve Overconfigured

This specific example was the most sophisticated Google Tag Manager, Google Analytics setup I’ve ever seen. Like you’ve essentially built an app on top of both. It’s really impressive if you’re into tag management and capturing every event imaginable. But it was completely debilitating for this client.

How Many is Too Many KPIs?

Let’s say you’re tracking the right amount of data. You got all of your configuration set up in the right ways, but how do you know if you’re actually looking at the right performance indicators, and what happens if you pick performance indicators that actually overlap?

Managing Unrealistic Technical Expectations

A lot of people think that, automatically, you can just jump back and forth between what steps a person entered on and where they left off, and segment out by age and by time, or by device.

Full Transcript

Vincent Barr:

Welcome to Marketing Made Difficult. I’m Vincent Barr and I’m here with my co-host, Justin Dunham. On this podcast, we’re going to talk about stuff marketers don’t usually talk about: analytics, code, process, and what we can learn from engineers, designers, and, really, everyone else we work with. A lot of marketing content focuses on the right way to do analytics, but the thing is, marketing analytics is pretty difficult. With that in mind, we’re going to pause and talk about bounce rate for a minute. Justin, how do you feel about bounce rate?

Justin Dunham:

Yeah, so I wanted to bring this up because bounce rate is like the simplest measurement. You log in to Google Analytics, it’s right there. Intuitively, it feels like a really good way to measure, let’s say, the homepage of your site. I remember when I was first interviewing for marketing jobs and I got questions about websites that people ran and they said, “All right. Well, how would you tell whether this page is successful or not?” I would say, “Oh well, look at the bounce rate.” Bounce rate, just to define that for all of our listeners, is the rate at which a visitor comes to your site and then immediately leaves, i.e. without visiting another page on your site. If 90% of your visitors only look at one page or only look at, say, the homepage, the bounce rate for that page is 90%. I thought it was really-

Vincent:

Yeah, and no interaction on the page too.

Justin:

No interaction. Now, that’s another good question, which we should come back to, because people do, I think, define it in different ways. I mean, what do you count as interaction?

Vincent:

For example, if someone is on the homepage and you have a video that’s part of your homepage banner and someone clicks play, they may not visit another page, but there’s one school of thought that says that’s an interaction, so it’s technically not a bounce. If they left, that would actually be counted toward the exit rate, and that would be where they exited, but not a true bounce.

Justin:

So already there’s one wrinkle in this seemingly simple measurement: the fact that Google Analytics, when you send an event to it, gives you an option to determine whether that event counts as a bounce or not a bounce, just like you’re saying. You used to think it was really simple. Vincent just added a wrinkle to it. Then actually, when you and I started working together, you had another idea, which was, well, bounce rate actually … can be good in a way, because if you’re bouncing visitors who are not interested in your content, who you don’t really want to engage with, isn’t that actually a lot better?
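
Here’s a minimal sketch (not from the episode) of the option Justin is describing, using the classic analytics.js syntax. The event category, action, and label are made-up examples; nonInteraction is the actual Google Analytics field that controls whether the hit counts toward bounce rate.

```javascript
// Hypothetical example: a visitor plays the homepage banner video.
// With nonInteraction: true, this hit does NOT stop the visit from counting
// as a bounce; set it to false (or omit it) and a single-page visit that
// fires this event is no longer a bounce.
ga('send', 'event', 'Video', 'play', 'Homepage banner', {
  nonInteraction: true
});
```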

Vincent:

Yeah. I think you could signal two things, right? One is that you’re disqualifying the traffic that wouldn’t be relevant to you, the same way your web form should qualify leads for a sales team if you’re a B to B company. On the other hand, it raises the question of, “Well, are we attracting the wrong traffic and where from?” If that’s paid advertising, for example, and it’s a high bounce rate, that’s really bad. I think bounce rate leads to a lot of additional questions, but one of my general rules of thumb is that I typically don’t pay that much attention to bounce rate, and I think there are some metrics superior to bounce rate. One of them is average page depth, or conducting qualitative studies on the homepage.

Justin and I were talking about bounce rate and whether that’s a good metric for measuring the performance of a new homepage or comparing the performance of two homepages. Justin, do you think bounce rate is a good metric for measuring homepage performance, for example?

Justin:

Yeah, so I’m in step three of my journey here. I started out thinking it was great as a metric and then I moved to thinking it was not so good. Now, I’m kind of back to thinking that bounce rate might actually be a pretty good metric after all, because how else are you really going to compare, for example, two different homepage designs? Your idea of using average page depth I think is great. How often do visitors come in and look at three pages versus four pages or two pages? How engaged are they really with your content? But that’s something you’re going to improve just in the course of improving your site or, by the way, you could also improve that by just splitting your site into tons and tons of additional pages.

But I’m now thinking that bounce rate actually, because it’s so simple, might be a good thing to look at.

Vincent:

Yeah, absolutely. Good point about splitting things into multiple pages, because you could just superficially optimize for page depth and drive curiosity clicks, as I call them, where you’re kind of leaving your audience hanging just to encourage the action of a next step. I don’t think … Yeah, I could see cases where page depth isn’t optimal, but my argument against bounce rate is that it doesn’t quite give you any information as to why people are leaving or whether it’s really a problem. Another metric I would prefer over bounce rate on the homepage would be task completion rate, or the speed of completing a task, or the speed at which people can find the information they’re looking for.

I know in working together in the past, we set up some user tasks where we would compare the company’s homepage to several competitors. We might ask people who are completing the user tasks a few questions about what services the company provided, who it was for, who our competitors might be, our value proposition. And the content comprehension piece is really important too, because you can’t expect all visitors to be returning visitors, especially if they’d only visit the homepage, even though marketers … The reason a lot of people add the modal windows and popups to capture e-mail is to have a way to bring users back to your site.

Of course, you can use advertising and retargeting, but generally, if you’re just doing that on the homepage, that’s a pretty low intent audience and you don’t know for what reason they found your site. I like the color that some of the metrics like task completion time and task completion rate provide when you’re looking at the homepage.

Justin:

Marketing analytics is complicated, as you can tell from the fact that Vincent and I literally just spent several minutes talking about bounce rate. There are a lot of things that help to explain how you might be engaging effectively with your marketing analytics or ways that you might be engaging ineffectively. There are a few patterns that we have seen emerge in talking with lots of people about how they think about marketing analytics, and we want to talk about four of those patterns today.

The first one that we really want to get into is what I like to call “endless data.” I have worked with a company, it was a B to B software company, where their weekly reporting was about 15 pages long. Not only was it very time-consuming to prepare, but I think the major problem with the endless data anti-pattern, if you will, is that it lets you act as if you are being data-driven because you are just looking at so many numbers, but it doesn’t cause you to actually think about what you’re doing with that data, which, as we saw with the bounce rate discussion, can be important.

The first segment in our show today is about this anti-pattern I like to call endless data. I am working with a company where our weekly reporting is huge. It’s dozens of pages long, and these are things that I spit out every week for people to look at. It’s a problem, as you can probably imagine. I think the fundamental problem is that having too many numbers that you’re looking at at once lets you appear to be data-driven without actually thinking about what you’re doing, which is kind of the whole point of actually looking at the data in the first place. Vincent, what do you think about this?

Vincent:

I totally agree. One thing I’m curious about is, did you guys start with a 15-page report or did it gradually emerge into this endless data problem? Was it at the onset that, “Hey, we want to capture all the data that’s available to us?” Or is it that over time it was, “Hey. Actually, it would be nice to measure this metric as well,” and maybe different teams came to you and asked you to monitor additional metrics. I’m curious how it became 15 pages.

Justin:

That’s how it became 15 pages; we actually started out at 4 pages, which can still be a lot. Every time somebody would ask a question in our weekly meeting or e-mail me something, I would say that at least three or four times out of ten, that resulted in something additional being added to the weekly reporting.

Vincent:

I think that that can be challenging too, because on the one hand, you want to provide people with answers to the questions they have, and if they’re not able to serve themselves within the analytics platform, that obligation and that responsibility falls on you. On the one hand, I think the intention behind having endless data is good. It’s that you want to have a well-informed team where everyone has all the possibilities and data they would need to make informed decisions, but I’m curious: if you have 15 pages of analytics and reporting, what percentage of that is actionable? The second thing is, you have Hick’s Law, which basically states that the more decisions you have, the longer it’s going to take you to make a decision. Imagine, that must definitely be in effect with 15 pages, because you just have more things to prioritize too.
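
As a rough illustration of Hick’s Law (this sketch is ours, not from the episode), decision time grows with the logarithm of the number of choices, T = b * log2(n + 1), where b is an empirical constant we’ve left as a placeholder.

```javascript
// Hick's Law sketch: more items to scan means longer decisions.
// The coefficient b is hypothetical; only the shape of the curve matters here.
function decisionTime(numChoices, b = 1) {
  return b * Math.log2(numChoices + 1);
}

console.log(decisionTime(4));  // ~2.32 "units" for a 4-page report
console.log(decisionTime(15)); // 4 "units" for a 15-page report
```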

Justin:

Right. There’s a lot to prioritize, and each new number that you put into what you’re reporting leads to more questions about those numbers. If you have your pattern set up so that every question you get results in more weekly data that you have to prepare, it gets a lot worse and not better. I think part of the way endless data emerges, as you’re pointing out, Vincent, is that it’s great when people ask questions, but it is important to be pretty structured about the responses to those questions and look at the downsides of actually adding all that data, which isn’t just, “Oh, I’m going to be spending more time preparing these things.” It’s that a lot of those things are impossible to act on because you have to choose between millions of different data points.

I think there are some others as well too, Vincent. When we sort of talked about this in prep for this episode, you mentioned the idea of having a lot of different sources of truth.

Vincent:

Yes. I’ve seen a lot of cases where we’re building dashboards on top of dashboards, and we have different reporting mechanisms and so forth, and there’s no real source of truth. I think this gets even more complex, though, when, say, you’re trying to reconcile data between different advertising platforms. You’re trying to look at third-party data alongside first-party data, so maybe you have lots of customer data about how frequently people purchase, demographic information, age information, personally identifiable information that obviously you don’t have on third-party platforms, and you’re trying to figure out, “Okay. How do we transform and clean this data in a way that it’s easy to interpret and understand?” That can be really, really hard and also really frustrating, because as you empower other teams with data, if you’re an organization where everyone has access to the analytics platforms or Looker and they’re running SQL queries, it’s probably common that everyone is going off of different sources of truth, or they have different means by which they get at the end metrics they understand or monitor. Trying to reconcile that across platforms is just a huge effort. It’s really time-consuming and it causes a lot of communication overhead that can be a little bit frustrating. That’s another thing I see.
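
One small, hypothetical example of the reconciliation work Vincent is describing (the platform names and numbers are invented): rather than silently picking one source of truth, flag when two platforms disagree about the same metric by more than some tolerance.

```javascript
// Hypothetical counts for the same metric reported by two platforms.
const conversions = {
  googleAnalytics: 412,
  adPlatform: 455,
};

// Flag discrepancies above a tolerance instead of quietly trusting one source.
function flagDiscrepancy(a, b, tolerance = 0.05) {
  const diff = Math.abs(a - b) / Math.max(a, b);
  return diff > tolerance
    ? `Mismatch of ${(diff * 100).toFixed(1)}%: investigate before reporting`
    : 'Within tolerance';
}

console.log(flagDiscrepancy(conversions.googleAnalytics, conversions.adPlatform));
```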

Justin:

Another argument against having too much data is that it can be really painful for everyone. I think one of the things that’s sort of ironic about this is, as much as, Vincent, you and I love numbers and we love marketing analytics, I think part of what we’re saying is kind of like, “Yeah. I mean, ask questions, but be thoughtful about what you’re going to do with the answers, and be thoughtful about creating a culture where just looking at more numbers is viewed the same as actually thinking critically about what you’re doing.”

Vincent:

Yeah, absolutely. I think far more important than capturing this huge range of numbers and maintaining that dashboard and that reporting is just providing good context for the few numbers that you do monitor, because that’s really time-consuming. That’s you taking your memory of the last four weeks of changes or updates that you’ve made, maybe to marketing campaigns, to web campaigns, to UX. There are so many changes that could have influenced a spike or a dip in analytics. I think it’s really important to focus more time on asking why, because the numbers will kind of show you what’s happening, but it’s your job to then investigate. I think most of the time should be really focused on investigating what’s going on with the critical few metrics that you decide on. But the challenge is that, yeah, we have too many numbers and we’re looking at way too much stuff, so it’s really distracting, and you kind of want to strike the balance of having as little data as possible while still being informed and not making any egregious mistakes.

Justin:

Having as little data as possible lets you also spend a lot of time getting as much context as you can for the data that you do collect, so context is super, super important.

I think the second thing that we want to talk about which is kind of related to this is, Vincent, you had a great story about a Google Tag Manager installation that you dealt with that had sort of taken on a life of its own and its own aspirations and hopes and dreams.

Vincent:

Yeah. I think, yeah, this idea of overconfiguration comes from the desire for endless data. It’s like we need to collect everything, so how do we do it? Most platforms don’t do that out of the box, and most platforms won’t answer all your attribution questions out of the box. Heap Analytics actually is pretty awesome at doing those things. That said, I started working with a company and their current analytics system and process basically wasn’t working for them, in the sense that it was very hard to intuit what was going on. The reporting structure, the naming conventions, kind of every element you could imagine was very difficult, so the team ended up following a lot of the protocol that had been given to them, but they just weren’t able to extract any insights from it.

They have the burden of having to maintain all this reporting, but they’re just unsure how to act on it or how to dive deeper and answer the questions that it prompts. This specific example was the most sophisticated Google Tag Manager, Google Analytics setup I’ve ever seen. Like you’ve essentially built an app on top of both. It’s really impressive, I guess, if you’re into tag management and capturing every event imaginable. But it was completely debilitating for this client. Really, my job was not to under-configure, but to bring things back down to a level where, “Hey. We don’t need to know the specific sub, sub, sub, sub, sub topic or idea of this particular campaign,” and kind of reset a lot of the notation and process that they used so that it was much more intuitive and actually able to be acted upon.
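
To make the “reset the notation” idea concrete, here’s a hypothetical before-and-after (not the client’s actual setup) using Google Tag Manager’s standard dataLayer.push call; the event names and fields are invented.

```javascript
// Overconfigured: every sub-sub-sub-topic crammed into the event name itself,
// so nothing can be filtered or compared in reports.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  event: 'cmp_q3_emea_webinar_followup_sub3_video_play_25pct',
});

// Reset: one intuitive event, with the detail moved into a few well-named
// parameters that reports can actually segment on.
window.dataLayer.push({
  event: 'video_play',
  campaign: 'q3-webinar-followup',
  region: 'emea',
  progress: '25%',
});
```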

This, to me, was a great example of overconfiguration because really, I mean, there was a brilliant implementation, but to manage that, you would absolutely need a full-time analyst to be able to use that data effectively. It also comes down to understanding your users and their skill sets and what they want out of that.

Justin:

Which is something I think that people often forget. First of all, whenever you’re measuring anything, going back to our point about having enough context, you need somebody to be able to understand those measurements, to pull them, and then to investigate them. Every other page that you add to your weekly marketing report kind of needs somebody to support that page. Similar to that, I think about, as a marketing technologist, constantly hearing, “Well, we should layer in this system. We should layer in this piece of technology. We should layer in this piece of technology.” What people forget is that whenever you add technology and whenever you add analytics, you have to have at least some time set aside to support that. My rule of thumb is every new marketing technology system requires between half and an entire full-time employee to actually make use of it.

I want to tell a quick story too. We were working at a previous engagement with a paid search agency. The paid search agency insisted on setting things up so that for every dollar that we won, and this is a B to B company, so every dollar that we win is the product of dozens of touches, the efforts of a bunch of different people over a month or two or three, but that for every single one, we could tell what specific ad that went back to. They really wanted to set up this whole thing that would do that, and so I said, “Okay. That makes sense.” I advised my client to let them go ahead and do that and put it in Google Tag Manager.

Number one was, over the course of working with this for over a year, we never used the data one time. But the other thing was that one day, the website broke and we couldn’t figure out why there was a giant blank space showing up on our homepage. Well, it turned out there was actually a small bug in the JavaScript that they had written in order to track these things, and this particular bug had been triggered by a change that was made somewhere else, which was a totally legitimate change. Another danger of overconfiguration is: you collect data, you set things up to get data, and it causes problems somewhere else because it’s not being maintained properly.
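
One common defensive pattern that guards against exactly this failure (a sketch on our part, not what the agency did): isolate custom tracking code so its errors are logged rather than allowed to break the page. trackAdRevenue here is a hypothetical stand-in for whatever attribution logic got injected.

```javascript
try {
  // Hypothetical attribution snippet injected through Google Tag Manager.
  trackAdRevenue();
} catch (err) {
  // A bug in tracking code should never take the homepage down with it.
  console.error('Tracking script failed:', err);
}
```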

Meanwhile, as you said, Vincent, this use case is not one that was appropriate for this particular company. They were just getting started with paid search. There was absolutely no need to track revenue all the way through to what ad was actually being clicked. It was certainly good enough to just understand performance in aggregate and which ads were generating the most leads.

Vincent:

Yeah, and that’s a hard problem to solve too. I mean, just attribution in B to B and B to C companies, there are a lot of stats that you think in theory, “Oh, that should be easy to collect because it would be so important to have,” but that’s a known problem. Then accounting for word of mouth and lots of other variables that can have an outsize effect on what’s actually working and what’s not. I mean, there are some people that argue too, “Hey, don’t even bother with attribution.” What are your thoughts on that? Do you think, in hindsight, it’s an exercise that you guys completely just should have skipped?

Justin:

Well, I think attribution is really important, obviously. I think there are different ways to approach it. There’s first touch, influence, last touch; there’s that part of the model. Then there’s also things like, “Well, do we just need to attribute things to a specific source like organic search or paid or social or whatever? Or do we need to attribute to a specific tweet? Do we need to attribute things to other dimensions like geography and other things like that?” I mean, how much attention do we want to pay there? I think you just have to think critically about it. To me, for a lot of smaller B to B companies, it’s definitely enough to do it by channel to start out with, and then for the individual channels, you measure them by their success on their own terms.

For example, with AdWords, don’t try to go in to figure out which ad is driving the revenue, just figure out that it’s the AdWords channel. Then within the AdWords channel, figure out which ads are driving the most leads, which is a different measure. I think it’s really important, but that’s sort of where I want to start: something that’s as simple as possible while still being plausible. I think the other part of this, too, is first touch versus last touch versus influence, as I said before. I actually think that’s really important, and part of what you have to be influenced by is how easy it is and how natural it is to set these things up.
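
A tiny, hypothetical sketch of that channel-first approach (the data and field names are invented): attribute revenue at the channel level, then rank ads within a channel by lead count rather than revenue.

```javascript
// Invented lead records; in practice these would come from your CRM.
const leads = [
  { channel: 'adwords', ad: 'brand-search-01', revenue: 5000 },
  { channel: 'adwords', ad: 'competitor-02', revenue: 0 },
  { channel: 'organic', ad: null, revenue: 12000 },
];

const revenueByChannel = {};
const leadsByAd = {};

for (const lead of leads) {
  // The channel gets credit for revenue...
  revenueByChannel[lead.channel] = (revenueByChannel[lead.channel] || 0) + lead.revenue;
  // ...while individual ads are judged by the leads they generate.
  if (lead.ad) {
    leadsByAd[lead.ad] = (leadsByAd[lead.ad] || 0) + 1;
  }
}

console.log(revenueByChannel); // { adwords: 5000, organic: 12000 }
console.log(leadsByAd);        // { 'brand-search-01': 1, 'competitor-02': 1 }
```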

Those measurements are very easy to set up and work consistently and well in Salesforce, versus if I wanted to do something like geographic attribution. I’m not even sure how that would work, or some other non-intuitive measure. That would be much harder to set up and that would give me a lot more pause.

Vincent:

Yeah, absolutely. Yeah, I think knowing what technology is actually available, what you need to build in-house, and then what assumptions you’re making are all … can save people a lot of time as far as how far we should go with attribution. I’ve worked at both ends of the spectrum on the attribution side, so yeah. I have a few stories where, for every single marketing activity, at least 10% of the work was just making sure that it was going to play well with the attribution partner, especially when testing a new channel, marketing network, marketing partner, any big changes in strategy. Then kind of understanding why measurement didn’t align with attribution. There are lots of moving pieces to manage, and I think those are some of the problems with overconfiguration: one is that it takes a lot of effort to maintain.

To your point, every new major piece of marketing technology or analytics technology you add is arguably half to a full-time role. Then the second thing is that, if you are going to overconfigure, you really need to write out your steps and your thinking process behind doing it. What I’m getting at is, the more data you collect, the more documentation you require. I think generally, marketers can be very averse to documentation. There are definitely some exceptions, but Segment has this blog post about why tracking plans are really important, and they are, because a tracking plan communicates expectations and settings and configurations between marketers and developers. It’s accounting for both languages.

As a marketer, it’s not really enough for you to just document, “Hey. This tells us average order value and how many people … what blog posts get the most Twitter shares from our webpage,” or whatever it may be. Whatever landing page gets the most engagement, you have to be able to communicate those events to a developer as well and know how that event is actually configured. What’s the ID of the element you’re selecting, maybe? There are a lot of things to consider, and the more precise you are, the easier it is for people in the future to be able to make changes and understand what you did. Then also, to your point, not have things break.
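
As a hypothetical example of what a single tracking-plan entry might look like when it speaks both languages (the event name, properties, and selector are invented; the analytics.track call follows Segment’s documented shape):

```javascript
// Tracking plan entry (shared doc between marketing and the web team):
//   Event:      "Blog Post Shared"
//   Fires when: visitor clicks the share button with the ID #share-twitter
//   Properties: postSlug (string), network (string)
analytics.track('Blog Post Shared', {
  postSlug: 'ways-to-mess-up-your-marketing-analytics',
  network: 'twitter',
});
```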

Justin:

I also want to talk about, let’s say you got your configuration right and let’s say you have the amount of data that you’re collecting right. There is this phrase that I hear a lot which drives me crazy, and the phrase is “dual focus,” which is a contradiction in terms as far as I’m concerned. I mean, I could see where that might be a useful idea in some cases, but it’s one that is overused and oftentimes used to avoid making decisions. Let’s say you’re tracking the right amount of data. You’ve got all of your configuration set up in the right ways, but how do you know if you’re actually looking at the right performance indicators, and what happens if you pick performance indicators that actually overlap?

Vincent:

In my opinion, if you pick KPIs that overlap, just focusing on that last part: if you have overlapping KPIs or competing KPIs, then in my opinion, it makes it much harder to make clear decisions because you’re always weighing trade-offs where you don’t have a clear priority. For example, I think it’s common where a customer will say, “Hey. We want to reduce our customer acquisition cost, but we also want to maximize sales.” What they’re kind of saying is they want to maximize revenue and they want to maximize profit, which are two very different things, because you could maximize profit by selling one product at the highest profit margin that you would ever achieve.

Then from there, every sale is going to have an incremental cost, and it’s going to cost a little bit more to acquire a new customer, typically, at a certain volume in e-commerce. Optimizing for both those goals means you never really fail and you never really succeed. On the one hand, it kind of keeps you from ever completely failing, because you could say, “Okay. Well, profitability was way down, but our sales are way up,” or vice versa. I think it makes it hard to really do a good job in either direction. That said, I think you do want kind of safety metrics to make sure that you’re not going wildly in one direction at the cost of another.

With revenue, for example, yes, maximizing revenue sounds good, but if it’s at a profit margin that isn’t competitive, or it doesn’t take into account your customer lifetime value, that could be a problem. I would recommend having some threshold where it’s, “Hey. We’re going to maximize revenue within this profit margin range.” You have these guardrail metrics to make sure you’re on the right course, and so long as you’re in between those and you have one clear KPI that you’re optimizing for, I think it just allows you to go a lot faster.
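
Here’s an illustrative sketch of that guardrail idea (the numbers and the 20% margin floor are made up): one KPI to maximize, with a threshold that stops you from chasing it at any cost.

```javascript
const MARGIN_FLOOR = 0.2; // guardrail: stay above a 20% profit margin

function evaluateCampaign({ revenue, cost }) {
  const margin = (revenue - cost) / revenue;
  if (margin < MARGIN_FLOOR) {
    return 'Off course: revenue is growing at an unacceptable margin';
  }
  return `On course: keep maximizing revenue (margin ${(margin * 100).toFixed(0)}%)`;
}

console.log(evaluateCampaign({ revenue: 100000, cost: 85000 })); // off course (15%)
console.log(evaluateCampaign({ revenue: 100000, cost: 70000 })); // on course (30%)
```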

Justin:

Yeah. I think that’s a really good point, and there are lots of examples of these kinds of overlapping KPIs, Vincent. You talked about customer acquisition cost. We talked about revenue, profitability, all those other things. Now, there are places where it might be okay to have different KPIs. For example, we are trying to maximize revenue and we think that social engagement is going to be a really important part of the way that we bring in new customers and, therefore, accomplish that. That’s, to me, an example of where you might have a dashboard that says, “Here is our revenue and here is our engagement on social media.” If you are optimizing for one, there isn’t necessarily a trade-off with the other, but some examples that we just talked about are ones where there is indeed more of a trade-off and you have to be more careful.

Vincent:

I mean, another problem just with competing KPIs is that it’s sort of the first step into having endless data. It’s like, “Okay. We want to just-”

Justin:

Right, totally.

Vincent:

“We want to balance traffic acquisition or audience development, so we’re looking at new visitors alongside bounce rate.” You have to be pretty careful there, because trying to optimize for those two things at the same time is sort of like trying to brainstorm and edit at the same time: you don’t really do a good job at either. I think this touches on vanity metrics versus actionable metrics. With vanity metrics, you might follow things like social media followers, net new followers, maybe likes: social media signals that don’t provide you with a lot of information. Then there are engagement metrics that I think are really valuable.

Avinash Kaushik describes some of those as amplification metrics, so that would be something like a retweet or a share, where you’re getting a lot more mileage out of organic content than you would if you didn’t have that following or didn’t have people sharing your content, and it also gives you a signal of what content and topics people are interested in. It can be a much more cost-effective way to spread a message than paying for sponsored content across the internet.

Justin, you have a lot of experience in marketing for B to B companies specifically. What are some of the competing metrics and overlapping metrics that you see there?

Justin:

Well, I think what tends to happen is companies tend to optimize for adjacent metrics in the revenue funnel a lot. One thing that will happen is we want to maximize traffic to the website, but we also want to maximize leads. You think you can do both of those at once and you kind of can, except that again, sometimes, you want to be more targeted with who you actually bring to the website so that you can have a higher conversion rate. But also again, there isn’t really enough time to do both of those things, so you end up doing both of them pretty ineffectively. I also find that just maximizing for any two things in the revenue funnel can be a little bit difficult. I see teams in marketing at B to B companies doing that a lot.

Vincent:

The final segment that we want to talk about is unrealistic technical expectations. We touched on this a little bit earlier when we were talking about the challenge of attribution projects and when enough is enough, and where you should draw the line in the sand for, “Hey. This is the amount of data, or the granularity of data, we need to make decisions.” Another pitfall we see very commonly with poorly run analytics programs is just a poor understanding of the technical requirements and the technical expectations needed to run and maintain and learn from analytics.

Justin:

Right. The thing that marketing technology vendors especially don’t want you to know is that analytics are pretty fragile, especially things like conversion funnel analytics, where you’re trying to track every single step in a purchase process, especially in B to B, where the purchase process is 60 days long and may go across multiple systems: your website and your marketing automation tool and your Salesforce.com-like tool. That’s one area that’s really fragile. Are there other ones that you’ve kind of seen, Vincent, where it’s sort of surprising, like you’re trying to measure something and people think it’ll just work and actually, it takes a little bit more effort than that?

Vincent:

Yeah, absolutely. With conversion funnel analytics, yeah. A lot of people think that, automatically, you can just jump back and forth between what step a person entered on and where they left off, and segment out by age and by time or by device. But you need the infrastructure in place first to be able to do that, and you need to already have a tracking plan that clearly outlines all of those steps. That’s reinforcing the point that analytics are fragile. The second thing is that when you’re doing that, it assumes that everyone has perfect communication across the organization, so that when you’re adding steps or you’re adding pages within a checkout flow, or within a web form that has multiple steps across pages, your developers and your marketers are all on the same page. It makes things really …

It’s very easy for things to go untracked or undetected, so you might find that you have 1,000 add-to-carts, but you only have 200 visits to your product detail page, which probably happens pretty frequently. That can be frustrating, and it can take teams a surprisingly long time to detect, especially if they’re inundated with way too much data. That’s another reason that you shouldn’t have too much data.
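
A minimal sketch (our illustration, with invented step names and counts) of the kind of automated sanity check that catches this sort of thing: a later funnel step should never report more events than the step before it.

```javascript
const funnel = [
  { step: 'product_detail_view', count: 200 },
  { step: 'add_to_cart', count: 1000 },
  { step: 'checkout_start', count: 150 },
];

for (let i = 1; i < funnel.length; i++) {
  // An increase from one step to the next usually means something upstream
  // has gone untracked, not that visitors teleported into the funnel.
  if (funnel[i].count > funnel[i - 1].count) {
    console.warn(
      `Tracking problem? "${funnel[i].step}" (${funnel[i].count}) exceeds ` +
      `"${funnel[i - 1].step}" (${funnel[i - 1].count})`
    );
  }
}
```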

Justin:

Right, because it’s a lot harder to make sure it’s all accurate and working properly.

Vincent:

If you only have a few critical metrics and you’re gathering context, I think you’ll just organically collide with those sorts of issues quickly, whereas otherwise, you might not notice them until you absolutely need them. I think another area where technical expectations can be unrealistic is around customer lifetime value. That can be really challenging. When does a customer’s lifetime actually end? What’s the number of orders they’ve made after their first purchase? Some of those metrics require infrastructure and planning, and then estimates and assumptions that are stated and that the entire team is comfortable with.

Justin:

Coordination across the team, making sure that your funnels are set up correctly, making sure that metrics are actually pulling correctly: all very important stuff and surprisingly difficult to do. I would even add a couple of other things too. One is that, even within a specific tool, it can be very surprising how data is gathered and what its limitations are. One example I’d like to give with this is Google Analytics, which is a very widely used tool, supposedly extremely basic. I mean, we started off this podcast talking about bounce rate and seeing that in Google Analytics, but it often happens that somebody will check a number in Google Analytics. They’ll say, “Number of website visits from organic between January 1st, 2016 and December 31st.”

They’ll check that number, and then they’ll go back to Google Analytics and actually add up, for example, the individual months because they want a monthly view of the performance, and they don’t add up to the same number. This is a really interesting example. The reason that happens in Google Analytics is because Google Analytics does statistical sampling on all of its numbers, and so, for larger date ranges, you actually are getting an approximate view of the data. They kind of mention that to you, but don’t really tell you. That’s one really interesting thing. Then another thing is, there are all these other really weird limitations. Let’s say you’re trying to look at things across data sources like, what did Google Analytics tell you in terms of your number of conversions versus what is your marketing automation tool telling you, Marketo or Pardot or Eloqua or whatever?

It might be hard to match those numbers up, too. Number one, because Google Analytics natively has no way of telling whether somebody is reconverting, i.e. they were already in your database and they’re simply converting again. Number two, because lots of people block the Google Analytics JavaScript, but they don’t block the Marketo JavaScript or Pardot JavaScript because they don’t know about it, so you get different numbers from that perspective. Or there are slightly different technical implementations, or there are ways that tools hook into different events on your website, like form submissions and things like that, which means they’ll work in some cases and not others.

That’s just another really interesting thing as well that feeds into this unrealistic technical expectations thing: you have to be aware that a lot of times, the exact numbers you’re going to get are not super reliable, but what you want to pay attention to is the changes and the orders of magnitude and the ranks that things are in.
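
To illustrate the sampling mismatch Justin describes (the figures here are invented), a quick check is simply to compare the full-range total against the sum of the smaller-range queries, and treat a large drift as a cue to trust trends rather than exact totals.

```javascript
const fullYearSessions = 98200; // one query covering the whole year
const monthlySessions = [       // twelve separate monthly queries
  9100, 7800, 8200, 8900, 8400, 7600,
  7300, 7900, 8600, 9400, 9700, 9950,
];

const summed = monthlySessions.reduce((a, b) => a + b, 0);
const drift = Math.abs(summed - fullYearSessions) / fullYearSessions;

console.log(`Sum of months: ${summed}, full range: ${fullYearSessions}`);
if (drift > 0.02) {
  console.warn('Likely sampling: trust changes and orders of magnitude, not exact totals');
}
```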

Vincent:

Yeah, absolutely. I think a few of the other things you’re touching on require fairly deep expertise in each of those platforms to know the nuances that make measurement and reporting inconsistent across them, because if you have those spread out across teams, you may not even realize that. The default attribution windows for advertising platforms all vary, as does the way they’re calculated. One specific example: you can measure attribution by click or by impression on Facebook and then set a threshold, but their thresholds aren’t as flexible or as robust as Google Analytics’. Then if you look at AdWords, for example, if you’re running display and search campaigns, the way that search and display get credit when someone both sees an ad and clicks and converts can throw the data off as well.

Unless you have explicitly said, “Hey. We’re aware of this assumption,” or “We’re aware of this flaw in attribution and we’re okay with it, and this is how we’re going to make decisions.” But yeah, there are so many nuances that make things more complex, and it also reinforces the point that the more marketing technology you add, the more expertise you need. It really helps, too, if you have a centralized source that can oversee all the platforms and also detect, “Hey. Here is where we’re going to run into inconsistencies, or where we should expect them, and here is our plan for dealing with them.”

The fewer metrics you have to do that with, the better. I think it also helps, yeah, just prevent teams from coming to a meeting with four sets of data that all appear to be right, where everyone went in with the intention of getting the right data. But with Google Analytics specifically, you could arrive at three different outcomes just by taking different paths to get the data.

Justin:

Right, exactly. Just to sum up some of the things we talked about on this episode, some of the little pieces of advice that we maybe have: you want to use as little data as you can, but no less. Why do you want to use as little data as you can? Because analytics systems are brittle, because you want to be focused on getting context around your metrics, because you want people to be able to focus on one or two things instead of trying to optimize for a million things, and because all these analytics systems require expertise and they require people to understand them and set them up properly. Those are some of the things that I’ve heard as to why you really want to be pretty focused.

Is there anything else that we kind of missed, Vincent? Are there any other little pieces of advice, or reasons we want to encourage that focus, that I haven’t just mentioned?

Vincent:

I think that about covers it, yeah. Keep things simple, establish a single source of truth, or a primary source of truth and a backup, communicate that verbally, and then just have a document where you state all of these things. Make it easy and clear for people to access the data that you are collecting or the dashboards that you do have. I would focus on those things and remember that context is more important than just monitoring and collecting data.

That about wraps it up for us. Thanks for listening to Marketing Made Difficult. I’m Vincent Barr and this is my co-host, Justin Dunham. For our full episodes and transcripts, check out our Medium site. We are at medium.com/marketing-made-difficult, and we’ll see you next time. Thanks.

Justin:

Thanks a lot. Bye.