Know The Truth, Don’t Just Believe It

Tim O’Connor
Mar 9

ANSWERS, NOT DATA. Why some marketers succeed with analytics and others don’t. (A Medium book). Chapter 2.

[Chapter opening photo]

“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

Mark Twain, often called America’s greatest humorist

When most people, even many marketers, describe marketing, they tend to focus on branding, advertising, digital performance marketing, and demand generation. Together, you could group these as promotion-related activities. But classically there are four Ps of marketing: product, price, place (channels), and promotion. When I was Senior VP of Marketing and Merchandising (CMO) for Unisource, then a $5B portfolio company owned by Bain Capital, I led all four, and one year we implemented a major companywide price increase project.

It was challenging not so much because of the data, but because of internal resistance: a key person wanted to believe something that simply wasn’t true, and if his belief had been followed, it would have been a disaster. For a brief period, he got others to believe him. They all believed it to be true because Chuck (not his real name) believed it to be true. Yet none of them knew it to be true. And that is the focus of this chapter: the difference between just believing something and knowing it for sure, and the ramifications for marketers trying to succeed with analytics.

Just believing something came close to costing $9 million

Here’s the story. When the pricing project was completed, we had increased EBITDA by $9M in one year through strategic price optimization on over one million item/customer combinations, with minimal customer churn. At the time, Unisource’s annual EBITDA was about $50M (it was a wholesale distribution business going through a turnaround), so this was close to a 20% EBITDA improvement.

However, when Unisource’s CEO first decided to do this project, the President of one of the businesses, Chuck, was against it and would repeat over and over something like, “We can’t raise prices; we’re already losing business because our prices are too high.” He was dead set against the project. Notwithstanding Chuck’s objections, the CEO approved it.

Now fast forward a few months after we made the price changes. The CEO of Unisource comes into my office and asks me to explain why sales for Chuck’s business are down, because Chuck is telling him and others that it’s the price increases. It was such a big issue that Chuck was lobbying the entire executive leadership team for a complete rollback of every price increase.

The three of us got together, and when I asked Chuck how he knew the sales decline was due to the price changes, I remember it as if I were in the room today. Chuck said something like, “You raised prices, and now my sales are down. There’s nothing else it could be. You’ve messed up my business.”

I bet many of you have been there before on the receiving end.

I’ll come back in a later chapter to how to handle flak like Chuck’s, but he believed what he believed, and in his mind, it was so. However, he didn’t know it to be so. There is a Grand Canyon between those two words: believing and knowing. Many marketers trip over them too.

That day, I dived into the data and found that sales were up, in the aggregate, for the item/customer combinations where we increased prices. However, sales for item/customer combinations where we did not increase prices were down, and that decrease canceled out the favorable price increase benefits. The net sales reduction wasn’t, after all, due to the price increases.

When I showed the data to the CEO and Chuck, I remember Chuck looking like a toddler who got caught in a lie, as he just responded something like “OK,” and that was that. We didn’t roll back the price increases, and over the year we increased EBITDA by $9 million.

Chuck isn’t unique. People like him can be found everywhere, from the boardrooms of major companies down to individual contributors. Candidly, I’ve fallen into the same behavior, saying I know something when in reality I only believe it to be so. We humans tumble into self-created traps, lying to ourselves when we do that. Let’s dive into why we so often settle for believing.

Why we confuse knowing and believing

The gold standard worldwide for a high school student to take college-level courses and earn college credit, advanced placement, or both is the College Board’s AP Program. After a class, you complete an AP test. Here’s an example of a multiple-choice question one might find on the AP World History test.

Question: Which of the following countries was not a central belligerent in the Crimean War?

  1. England
  2. France
  3. Germany
  4. The Ottoman Empire
  5. Russia

The correct answer would be 3, Germany.

But how do you know? You might say it’s because you read it in the previous chapter. But how do you know what I told you was true? Or, for that matter, if you were a high school student and you were taking the AP World History test, how would you know the answer? You’d probably say your teacher told you. But how do you know they told you the truth? For that matter, how do they know it’s the truth?

While we live in an era of science, of evidence-based reasoning, of calm, cold analysis, the truth is that for most of the things we think to be accurate, we probably have no direct evidence ourselves. How do you know the Earth circles the Sun and that Kopernikus/Copernicus was correct?

And what about Neil Armstrong and Buzz Aldrin landing on the Moon? How can you be sure a Hollywood studio didn’t take the photo at the start of this chapter?

This lack of first-hand evidence exists even among highly trained scientists, physicians, and researchers. We naturally can’t possibly have direct evidence for everything we need to know to get through modern life. And I freely admit we can never be 100% certain of anything.

Therefore, we humans have to use various shortcuts so our brains don’t blow up, or we’d sit dazed in a chair, questioning everything we see, hear, and read. For me, I believe that the Earth circles the Sun and that Neil and Buzz were on the Moon, but I can’t say I know either for certain. I believe it because that is what I was taught in school, and I think that if these beliefs were wrong, surely by now a wave of credible astronomers, or members of the supposed film crew, would have blown either or both stories open. That’s my shortcut for accepting these as truths.

But shortcuts can lead us astray when we come up against something we really ought to know for sure, or at least have a high degree of first-hand proof for, and not just believe to be so.

There is a litany of shortcuts: relying on our trusted social network of friends and colleagues; abrogating our responsibility to trusted brands, so-called experts, and the media; confirmation bias, where we lean towards agreeing with things that reinforce what we already believe; and, in Chuck’s case, believing something to avoid the possible pain of bad news, such as learning that the real issue might be ineffective sales management.

These shortcuts are evolutionary traits built into our DNA over the ages, since we simply don’t have the time to reason quantitatively through every decision we have to make. When our ancestors were walking through the jungle and saw the bushes move, they didn’t stop to think through, “Is it a kitty cat or a lion?”

But we don’t face lions in the jungle anymore. So what can you do as a marketer to find the truth yourself, or at least get closer to it, rather than outsourcing it to others, especially on critical questions such as why your sales are down?

It’s quite easy. It’s a philosophy and an operating habit. It’s less about how you do it and more about it being something you want to do, not something you do begrudgingly. Let me share four examples of very successful people I’ve worked closely with on marketing topics who modeled getting to knowing rather than stopping at believing. They each used different approaches, but they all shared a common zest for getting to the truth.

Four approaches to the truth

Kevin Gilligan is a former manager of mine who promoted me into my first marketing manager role at Honeywell. He went on to big things, including President of Honeywell’s $10 billion Global Controls business and CEO of Capella Education, one of the USA’s largest online universities. He’s currently Vice Chairman of Strategic Education, Inc., which merged with Capella. What I remember about Kevin is that when evaluating new product opportunities (product, one of the marketing 4Ps), he used to ask me three simple but powerful questions: “Why is the opportunity real? Why will we win? Why is it worth it for us to pursue?” Remember those questions from the previous chapter?

A different approach came from Dr. Doris Larmann, who was VP of Communications at Siemens USA when I was VP of Marketing for the global CIO organization. Doris went on to head corporate communications at three major Swiss companies: Zurich Financial, SIX Group, and Swiss Post. Currently, she owns an artisan cheese company in the Swiss Alps. That’s got to be a cool gig! Her method of avoiding the belief trap was to socialize things with many different people, seeking various perspectives, opinions, and insights, and trying to tease out whether she was missing anything.

Then there is Tom Pitera’s approach. I had the good fortune to work for Tom three times (he’s the best boss I’ve ever worked for, which is saying a lot, as I’ve been blessed to work with many terrific ones). He initially hired me into my first private equity job at Unisource, where he was President of the Packaging and Supplies business (Chuck led a different business) and I was VP of Marketing. Tom has been a CEO, COO, or President of five billion-dollar-plus companies, and he’s currently CEO and Chairman of Russell Hendrix, Canada’s largest restaurant equipment and service company. He had many ways of getting to the truth, but one especially stands out: wanting to see the correlation and R-squared behind any data analysis. If R-squared is new to you, we’ll be covering it in chapter 4.

And finally, there’s the approach of Mark Curcio, whom I worked with when he was at Bain Capital and I was at Unisource. Mark is now Vice Chairman of Hollywood Studios Holdings, and he is one of the smartest people I’ve ever worked with. His method was a classical Jesuit critical-thinking approach: following each answer with a why, and then another why, and then another. If you wanted to hire someone to either prove or disprove the existence of God, you’d want to hire Mark. As Mark used to say, “I’m just wanting to get to the truth.”

There you have it: four different approaches, all used within marketing, all sharing a common theme. The four executives didn’t take data at face value. They all had a critical operating philosophy of seeking to know and not just believe. And they all wanted to find the truth, even if the facts weren’t what they wanted to hear. They knew there is a big difference between knowing and believing. It reminds me of a quote by Dr. Carl Jung, one of the great pioneers of modern psychology. Jung said the following in an interview when asked whether he believed in God.

“I know. I don’t need to believe. I know.”

Carl Jung, from 1959 BBC Face to Face Interview

That is a powerful distinction. Knowing is first-hand experience and deeper understanding, not just surface understanding of facts. You get knowledge through observation and inquiry, such as that demonstrated by Kevin, Doris, Tom, Mark, and Carl Jung.

Believing, on the other hand, isn’t founded in in-depth understanding. It may even have facts that seemingly support it, which makes it particularly hazardous. It’s shallow.

True deep knowing, though, comes from more than just knowing a fact from rote learning. When your statements and actions come from a level of true deep knowing, there is a conviction, certainty, and calm behind such action. You’ll feel it in your body, and your inner voice will be still and at ease. We’ll come back to inner knowing in chapter 7.

On the other hand, Chuck just believed, and yet he was a smart person. I don’t fault him for going to the easy answer, since even quants with terabytes of information and teams of analysts sometimes stop at believing. It happened in the 2008 housing crisis.

Even super-smart people believe and don’t know

Maybe you’ve seen the Academy Award-winning movie The Big Short. The film is an adaptation of Michael Lewis’s book of the same name. Lewis dives into the 2007 housing market crash and global financial crisis and exposes how a few Wall Street contrarian outcasts made hundreds of millions of dollars by being curious, getting into the weeds, and looking deeply for patterns in what was happening in the USA mortgage loan market. I want to briefly visit this story, as it’s a great example of how even quants sometimes only believe when they think they know.

One of the contrarians was an eccentric hedge fund manager, former physician Dr. Michael Burry (played by Christian Bale), who started to focus on the subprime market. Through his deep, extensive analysis of mortgage lending practices in 2003 and 2004, he correctly predicted that the real estate bubble would collapse as early as 2007.


Here’s why. The investment banks had bundled thousands of mortgage loans together, creating the collateralized mortgage loan market, and sold the bundles to pension funds, insurance companies, and others looking for stable returns higher than USA Treasury bonds. The mortgage salespeople, mortgage companies, real estate closing lawyers, and investment banks made massive amounts of money. But before long, every good borrower who could borrow had. The gravy train for everyone who made money was going to dry up.

What do you do?

To keep the money flowing, banks began to reduce borrowing standards and offer easier access to debt. Borrowers got into high-risk mortgages such as option ARMs (adjustable-rate mortgages), and they qualified for mortgages with little or no documentation. Even people with bad credit could be eligible as subprime borrowers. Lenders were approving no-documentation and low-documentation loans, sometimes not even requiring verification of a borrower’s income and assets. Loan quality began to deteriorate.

By actually reading the prospectuses, which covered thousands of loans, and doing the math, Burry saw in the deep data that instead of the collateralized loans being 65% triple-A rated, the highest rating possible, they were crap. When the ARMs adjusted on the riskiest borrowers, many of them would have trouble making payments. When that happened, they’d have to sell their house or go through foreclosure. That would lead to housing supply going up, demand going down, and home prices dropping. The cycle would then accelerate. The risky borrowers would not only be unable to afford the monthly reset payments, they’d also be sitting on a home underwater, meaning the value of the house would be less than the amount owed on it. Thus more reason to walk away. It’s straightforward if you look.

Burry’s analysis led him to persuade Goldman Sachs and other major investment firms to create a credit default swap market that did not yet exist, allowing him to bet against the mortgage-backed securities. The banks were interested, as they made another commission.

The book and film highlight how, when the mortgage market did collapse, Burry’s long-term bet, exceeding $1 billion, produced an overall profit of over $2.69 billion.

But here’s the curious thing. Like Burry, the banks had all the data too (it was their data), plus they had thousands of analysts and super-smart people at the helm. But the banks got it wrong. They believed the mortgage market would never fall, and, here’s the big point, just like Chuck, they didn’t want to know anything other than their belief. Knowing would have meant knowing bad news.

The Big Short is a funny movie, with an all-star cast, about a complicated story that at its core reminds us how often we humans fall into the snare of believing things that ain’t so. That even goes to the quote at the start of this chapter, as the film opens with that famous Mark Twain quote as an epigraph to suggest the theme of the movie.

Have you ever seen that Mark Twain quote before? Have you ever quoted it? Do you know there is no evidence, zero, zilch, nada, none, in any of his books, essays, letters, or speeches that he ever said or wrote those words?

I wonder if the producers knew it wasn’t a Twain quote. Surely if they did, they would have shared with the public how clever they were in adding this cheeky hidden joke. My guess is they fell into the trap of believing because someone told them it was a Twain quote. But I don’t know, and I’m now on the slippery slope of assuming. But heck, it is just a quote and not a sales decline in a billion-dollar business. I’m OK if they just believed on this one, though it might also have been an inside joke they kept secret. Who knows? And for that matter, why should you believe me either, when I say that Twain never said it? Oh, the webs we weave.

Let’s go a bit deeper into the weeds with another example of believing something that ain’t so and show how a ‘Rebel who’s a Quant’ solved it and got others to know the truth. That rebel was me.

Getting to knowing through NPS

There’s a terrific metric called Net Promoter Score (NPS) that is often described as the single best predictor of a company’s growth.

It goes like this. You ask your customers: “How likely is it that you would recommend us to a friend?” The choices are 0 to 10, with 0 meaning “not at all likely to recommend” and 10 meaning “extremely likely to recommend.” To calculate NPS, you take the percentage of Promoters (the 9s and 10s) and subtract the percentage of Detractors (the 0s through 6s). What remains is your Net Promoter Score. It looks something like this.

[Diagram: the 0-10 scale with Detractors (0-6) and Promoters (9-10); NPS = % Promoters minus % Detractors]
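
To make the arithmetic concrete, here’s a minimal sketch of the calculation in Python. The survey scores are invented for illustration; only the 9-10 and 0-6 cutoffs come from the definition above.

```python
def net_promoter_score(scores):
    """Compute NPS from a list of 0-10 'likely to recommend' ratings."""
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)    # the 9s and 10s
    detractors = sum(1 for s in scores if s <= 6)   # the 0s through 6s
    # NPS = % Promoters minus % Detractors (the 7s and 8s drop out)
    return 100 * (promoters - detractors) / total

# Hypothetical survey responses
ratings = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]
print(net_promoter_score(ratings))  # 50% promoters - 30% detractors = 20.0
```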

A tremendous number of studies have shown that, regardless of industry, when NPS grows, a company’s sales typically grow. On the flip side, as NPS declines, a company’s sales usually decline. NPS has also proven valid at the level of an individual customer or a group of customers. Let me share a personal experience.

TTT Holdings, then a private-equity-owned $700M wholesale tire distributor, was a client of mine when I was at Marketing Rules. Tom Pitera, who had moved on from Unisource, was their President and COO. He hired me to bring NPS thinking into the company. He and I knew from previous companies that to get everyone in the company, including the CEO, buying into NPS, you need to make NPS more granular and less academic-sounding, so that people can see the relationship between high versus low scores and their effect on the business.

What we did was capture TTT’s Net Promoter Score as rated by each of their independent tire dealer customers. Then we grouped customers into one of 11 year-over-year (YOY) growth groups, such as “grew 5% to 10%” or “declined 5% to 10%.” Then, for each growth group, we calculated that group’s combined NPS, roughly as in the sketch below.
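
Here’s a rough sketch of that grouping using pandas. The column names, sample values, and bucket edges are hypothetical; TTT’s actual data and its exact eleven groups aren’t reproduced here.

```python
import pandas as pd

# Hypothetical customer-level data: each customer's NPS rating and YOY sales growth
df = pd.DataFrame({
    "customer_id": range(1, 9),
    "nps_rating": [10, 9, 8, 6, 4, 9, 3, 7],
    "yoy_growth_pct": [12.0, 8.0, 3.0, -2.0, -12.0, 6.0, -20.0, 1.0],
})

# Bucket customers into growth groups (illustrative edges, not TTT's actual 11 groups)
bins = [-100, -10, -5, 0, 5, 10, 100]
labels = ["< -10%", "-10% to -5%", "-5% to 0%", "0% to 5%", "5% to 10%", "> 10%"]
df["growth_group"] = pd.cut(df["yoy_growth_pct"], bins=bins, labels=labels)

def combined_nps(ratings):
    """NPS for one group: % promoters (9-10) minus % detractors (0-6)."""
    return 100 * ((ratings >= 9).mean() - (ratings <= 6).mean())

# Combined NPS for each growth group
print(df.groupby("growth_group", observed=True)["nps_rating"].apply(combined_nps))
```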

Here is the chart of NPS data from TTT and about 2,000 customers. As you see in the graph, as NPS declines, YOY sales growth declines and becomes negative.

[Chart: combined NPS by year-over-year sales-growth group for about 2,000 TTT customers]

Now as much as I’m a big fan of NPS, on its own at this level it’s not very actionable.

NPS goes up. It goes down. But what does it tell you? Sure, if it goes down, it’s saying there are issues. And if it goes up, it’s telling you that you must be doing something right. But in either case, you don’t know what is causing the change. You’re stuck with merely having to believe you know what is causing the increase or decline. Is it prices, product features, quality, customer service, marketing? Who knows? Let’s return to Unisource when I worked with Tom Pitera.

Finding the truth via a regression analysis

One of Tom’s businesses, like Chuck’s, was also having a sales decline, but this time sales management believed fulfillment was the cause. In one corner was one of Tom’s SVPs of Sales, who thought the issue was that products were out of stock. His sales team couldn’t sell if the products weren’t available for purchase. That’s an entirely plausible hypothesis. And he had data to back him up: feedback from the sales managers he spoke to about what customers were telling the front-line sales reps. Yes, it’s anecdotal data. But it could be representative truth. Then again, it could be cherry-picked and biased data.

The President of Supply Chain also had data. The company had about a 98% fill rate on lines. A line is a unique product being purchased, regardless of quantity. With five lines on average per order, the complete-order fill rate would be 98% to the 5th power, or about 90%, which was a record high for the division over the previous few years. Being aware of that information, the President of Supply Chain, too, had a belief: the sales team couldn’t sell. Was that fair or a cheap shot?
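
The jump from a 98% line fill rate to a roughly 90% complete-order fill rate assumes each line on an order is filled independently; a quick sketch of that arithmetic:

```python
line_fill_rate = 0.98   # probability any single line on an order ships complete
lines_per_order = 5     # average number of lines per order

# If lines are filled independently, an entire order ships complete only when
# every one of its lines does: the line fill rate raised to the number of lines.
order_fill_rate = line_fill_rate ** lines_per_order
print(round(order_fill_rate, 3))  # 0.904, i.e. roughly 90%
```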

To help find the truth, I decided to incorporate into our quarterly NPS survey a twist beyond just asking customers the “would you recommend us” NPS question. We also asked a series of other questions, such as rating the company on product availability, sales rep responsiveness, pricing, and delivery. By doing that, we could then perform a multivariate regression, estimate each additional question’s explanatory power in driving the Net Promoter Score up or down, and possibly uncover the truth.

You might be asking yourself: what is a multivariate regression, and what is individual explanatory power?

A regression is a statistical model that shows how much the change in one thing tracks the change, or variance, in another. Outdoor temperature and the purchase of hot coffee drinks is an example. Ideally, you’re looking at past data to help you come up with a model that predicts future performance.

For example, let’s say you’re a performance marketer for a coffee chain, and you know cold weather is approaching. From your regression, you know temperature and hot coffee purchases are inversely and highly correlated (temperature goes down, hot coffee sales go up). You might use that information to skip a marketing promo, since the colder outdoor temperature itself may drive sales and you won’t need the promo.

A multivariate regression is a regression model that takes multiple inputs into account. Using our coffee example, you might look at not only how much a change in weather correlates with the change in hot coffee sales, but also how the day of the week and the month of the year relate to sales changes.

I’ll come back to multivariate regression analysis in chapter 4, but for now, what our regression at Unisource told us was: 1) how much sales rep responsiveness influenced the rise or fall of NPS, and 2) how much product availability did.
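
As a rough illustration of the approach (not Unisource’s actual model or data), you could fit an ordinary least squares regression of the 0-10 recommend score on the other survey ratings, then compare the coefficients and the overall R-squared. The column names and responses below are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey responses: the NPS question plus the driver questions
survey = pd.DataFrame({
    "recommend":      [9, 10, 7, 4, 8, 3, 9, 6, 10, 5],
    "availability":   [8,  9, 7, 5, 8, 4, 9, 6,  9, 5],
    "responsiveness": [9, 10, 6, 5, 8, 2, 9, 5, 10, 4],
    "pricing":        [7,  8, 7, 6, 7, 6, 8, 7,  8, 6],
    "delivery":       [8,  9, 8, 6, 7, 5, 9, 7,  9, 6],
})

# Regress the recommend score on the candidate drivers
model = smf.ols(
    "recommend ~ availability + responsiveness + pricing + delivery",
    data=survey,
).fit()

print(model.rsquared)  # share of the variance in the recommend score explained
print(model.params)    # each driver's estimated contribution
```

In practice you’d standardize the ratings (or compare standardized coefficients) before reading relative “explanatory power” across questions, and you’d want far more than ten responses.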

The correlation showed that, indeed, out-of-stocks were causing a decline in NPS, something operations couldn’t see in their data. They knew the fulfillment percentage was up, but they couldn’t see that customers were unhappy.

But we also found that sales rep responsiveness, or the lack thereof, was an even bigger contributor to the change in NPS. In other words, what was having the most significant negative impact on sales was the sales team.

What no one was able to see, until we had more in-depth data and then discussed it all, was the impact of sales reps concentrating more time on new accounts as opposed to existing customers. In the past, when there were order-fulfillment issues, customers would call the sales reps directly, not customer service, and the reps would personally concierge the problem, masking the issue and making the customer feel special. This personal service turned lemons into lemonade.

Now when customers called the sales reps, it was taking longer to get a return call and answers. Customers got hotter as the special care they formerly received disappeared.

To some extent, the SVP of Sales and the President of Supply Chain were both correct, each believing he knew for sure what the issue was: the other guy. Neither thought it could also be his own organization. They didn’t know, and that caused problems.

The moral of the Unisource story is that for a marketer to be a master of analytics, you have to be continuously uncomfortable and not stay in the comfort zone where everything is okay or where you think you know something when in truth you only believe it. To some degree this is irrational, purposely making yourself uncomfortable. But it is necessary.

On the other hand, I get it: you can’t question everything. You have to be judicious as to when you need to act like an NCIS investigator, as at some point you do need faith in others, in something. In chapter 7 we’ll discuss using intuition to help you.

But for now, let’s close with an example of significant consequences when someone just believed, and didn’t take the time to inquire deeper. It’s something well beyond the sales growth challenges of some Unisource executives.

A high cost in just believing

On the morning of Sunday, December 7, 1941, six USA Army Air Corps B-17 Flying Fortress bombers were flying from San Francisco, California, scheduled to arrive later that day in Oahu, Hawaii. At the same time, a Japanese task force of six aircraft carriers, the Akagi, Hiryū, Kaga, Shōkaku, Sōryū, and Zuikaku, was also en route to Oahu, intending to launch its 408 aircraft to attack the USA Navy Pacific Fleet and other military installations.

At 7:02 am Honolulu time, two Army privates reported seeing an enormous blip on radar coming in 137 miles from the north. They called in the finding, and finally, 18 minutes later at 7:20 am, they heard back from their commanding officer, Lieutenant Kermit Tyler, who told them not to worry about it.

He never asked the radar operators: From what direction is the blip still coming? How large is it now? How close is it now? Has the blip’s direction changed since you first called it in? How many planes do you think it is? What else can you tell me? And so on.

Interestingly, the radar operators thought the blip on their screen represented about 50 planes, but they never volunteered that information to Tyler, and he never asked. Lieutenant Tyler, in an interview with The San Diego Union-Tribune, said,

“Common sense said, well, these are the B-17s. So I told them, don’t worry about it.”

Lt. Kermit Tyler, Radar CO at Pearl Harbor

Don’t worry about it? He had data in front of him but didn’t inquire. He wasn’t curious; after all, he said it was just common sense what was going on with the radar. That sounds like just believing.

But if it were the B-17s coming in from San Francisco, they would likely be coming in from the Northeast and not the North. That’s a big difference in direction when something is a hundred miles out. You’d notice it on the radar. But why bother? It had to be the B-17s. Right?

It was an uncommon catastrophe.

At 7:48 am, a full 46 minutes after radar picked up the impending attack, the shooting began. Ninety minutes later, 2,335 American service members were dead and 1,143 wounded. Eighteen ships were sunk or run aground, including five battleships: the USS Arizona, California, Nevada, Oklahoma, and West Virginia.


I don’t mean to pick on Lt. Tyler, as he had to bear a lifetime of weight for his fateful decision. Who knows what would have happened if the warning had gone out and he had sent up some planes to check. Maybe Pearl Harbor would have turned out differently. I don’t know. Most historians say the USA and Japan were headed for war in any event. It was just a matter of time.

But it’s not an unreasonable hypothesis that fewer Americans would probably have died that day. Maybe tourists today wouldn’t be going to see Battleship Row at Pearl Harbor, where 1,177 sailors’ and officers’ lives ended on the USS Arizona.

Later on, the Army held hearings and concluded it wasn’t Tyler’s fault, as he had never received training on how to operate the radar. Having no training, he didn’t have even the slightest idea of what to ask the radar operators. All he could do was guess, and believe he knew for sure something that just wasn’t so. Maybe, maybe not.

As I’ve said before, I’ll freely admit that I’ve fallen into the trap of believing and not taking the time to inquire and know. And that often bites me in the butt. So the message is clear. For marketers to be masters at analytics and insights, you can’t just believe. You’ve got to know. And that starts with being curious and knowing the right questions to ask when searching for the truth. Let’s dive into that.

