The Futile Attempt To Cram Analog Data Into A Digital Straitjacket.

Jun 8, 2017

Companies Should Stop Trying To Squash Oddly-Shaped Bits Of Customer Feedback Into One-Size-Fits-All Digital Boxes

By David Grace (www.DavidGraceAuthor.com)

This is Part 2 of a post on the futility of trying to wrench digital answers from analog situations.

Part 1 was: Why There Are So Few Good Movies. The Futility Of Trying To Jam A Digital Peg Into An Analog Hole.

We live in a world where businesses think that the effective way to collect and report product data and customer-satisfaction feedback is in an ordered, pre-designed, digital format. Much of the time, the truth is just the opposite.

Digital Customer Surveys

It’s common to buy a product and later be asked to take a survey:

  • On a scale of 1 to 10 how satisfied are you with this product?
  • On a scale of 1 to 10 how likely would you be to buy this product again?
  • If your answer was less than 5, is your reluctance to purchase this product again because:
  • The price was too high
  • It failed to perform up to your expectations
  • Etc.

These sorts of surveys are an attempt to coax digital answers from analog situations and they are largely doomed to failure.

Why People Use Digital Surveys

The driving force behind digital surveys is that they are easy to distribute, easy to answer and easy to process. People check a few boxes, each of which has a numerical score, the computer totals up the numbers, and BAM, you have instant data that you can take to the boardroom.

“87% of our customers were at least ‘very satisfied’ with the new XL 500 and 92% were more likely than not to purchase it again. The XL 500 is exceeding all our expectations. We’re doing a great job. Yippee.”

OK, maybe they wouldn’t say the “yippee” part, but you get the idea.
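
Just to show how little machinery sits behind those boardroom numbers, here is a minimal sketch in Python of how checkbox scores get totaled into percentages. The scores and cutoffs are invented for illustration.

```python
# Minimal sketch of how checkbox survey scores become boardroom percentages.
# The scores and cutoffs here are hypothetical, not from any real survey.

satisfaction_scores = [9, 8, 10, 7, 9, 6, 8, 9, 10, 7]   # "1-10 satisfaction" answers
repurchase_scores   = [8, 9, 7, 6, 9, 8, 10, 7, 8, 9]    # "1-10 likely to buy again" answers

VERY_SATISFIED = 8      # assumed cutoff for "at least very satisfied"
LIKELY_TO_BUY  = 6      # assumed cutoff for "more likely than not"

pct_satisfied = 100 * sum(s >= VERY_SATISFIED for s in satisfaction_scores) / len(satisfaction_scores)
pct_repurchase = 100 * sum(s >= LIKELY_TO_BUY for s in repurchase_scores) / len(repurchase_scores)

print(f"{pct_satisfied:.0f}% at least 'very satisfied', "
      f"{pct_repurchase:.0f}% more likely than not to buy again")
```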

For many reasons, in real-world terms the data from these surveys ranges from almost worthless to misleading.

The Digital Survey Process Is Based On The Fallacy That The Manufacturer Can Read The Customer’s Mind In Advance

In order to create the survey in the first place, the manufacturer has to figure out in advance what the customer wants to tell it. Very often the manufacturer’s predictions are wrong.

The company is bound to be wrong much of the time because people are variable, their product experiences are variable, and the range of their responses is wider and less predictable than any survey designer can anticipate.

What if people hate your product because it has a sharp edge near the on-off button that hurts their finger? That’s not going to be one of the questions on your survey.

What if the power-on light is too bright or it’s placed someplace where it’s difficult to see? That’s not going to be one of the questions on your survey.

There are a hundred reasons why people might not like your product and 95% of them won’t be on your digital survey.

The two principal reasons I try to avoid McDonalds are:

  • The electronic menus are terrible. Really, really bad.
  • I dislike the large NO LOITERING signs that they plaster all over the walls.

Are there going to be questions about either of these things on McDonalds’ customer satisfaction survey? No.

So, first off, digital surveys fail to collect a great deal of valuable, important customer feedback.

It Takes Too Many Steps To Get To A Meaningful Answer

If you asked your customer analog questions: “What do you think of our fries?” — “Do you like one of our competitor’s fries better than ours?” — “If so, whose fries do you prefer and why?” you’d get some pretty useful answers.

If you tried to digitally format the “What do you think of our fries?” question, you would have to break it down into:

  • “Rate our fries on a scale of 1 to 10.”
  • “If you rated our fries at less than 5, was it because of their () size, () texture, () taste, etc.?”
  • “If it was taste, do you think that they were () too salty, () not salty enough, etc.?”

and even then you wouldn’t get nearly as much useful information as you would from just asking the open-ended question.

People Are Bad At Accurately Translating Their Feelings Into A Digital Score

If your question is, “On a scale of 1 to 10, how much do you like the layout of our store?” you’re living in a fantasy land if you think that most people are going to be able to give you a meaningful number.

“On a scale of 1 to 10, how would you rate the taste of a Big Mac?”

How can you ever expect to get a meaningful number back?

Bottom line: the data derived from most of these surveys materially omits important information (questions not asked), returns off-target replies, and gives you numbers that bear little relation to reality.

Digital Survey Drawbacks Summarized

  • You can’t predict all the answer categories in advance.
  • The questionnaire becomes too long and complicated and thus quickly tires out/deters the customer.
  • You won’t know when the customer has just gotten frustrated and started randomly checking the boxes.
  • There is a basic inability to accurately translate human attitudes into a meaningful digital score.

It’s Much More Effective To Get Analog Data

How do you get useful data? You ask the customer straight out:

  • “What, if anything, do you especially like about our product?”
  • “What, if anything, do you especially dislike about our product?”
  • “If you could, what changes would you make to our product?”
  • “If you could, what features would you add to our product?”
  • “What features do you like about our competitors’ products that we don’t have?”
  • “Why did you pick our product instead of a similar product from one of our competitors?”
  • “If you could make this purchase over again, would you still buy our product? If not, why not?”

You also want to question the people who bought your competitor’s product:

  • “Why did you buy our competitor’s product instead of ours?”

I would put a prominent link to a “Customer Suggestions & Feedback” page on my website, and that page would have a list of open-ended questions, each with an associated answer text box.

  • “What do you like about McDonalds?”
  • “What do you dislike about McDonalds?”
  • “What changes would you make to McDonalds if you could?”

The customer could elect to give a narrative answer to any question and then click SEND.
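
As a rough illustration of such a page, and not anything the author describes building, here is a minimal sketch in Python using Flask. The question wording, route, and storage file are all assumptions.

```python
# Minimal sketch of an open-ended "Customer Suggestions & Feedback" page.
# The questions, route names, and storage file are assumptions for illustration.
import json
from flask import Flask, request

app = Flask(__name__)

QUESTIONS = [
    "What do you like about our product?",
    "What do you dislike about our product?",
    "What changes would you make to our product if you could?",
]

@app.route("/feedback", methods=["GET"])
def feedback_form():
    # Render one free-form text box per open-ended question, plus a SEND button.
    fields = "".join(
        f"<p>{q}</p><textarea name='q{i}' rows='4' cols='60'></textarea>"
        for i, q in enumerate(QUESTIONS)
    )
    return f"<form method='post' action='/feedback'>{fields}<p><button>SEND</button></p></form>"

@app.route("/feedback", methods=["POST"])
def feedback_submit():
    # Store whichever narrative answers the customer chose to give.
    answers = {q: request.form.get(f"q{i}", "").strip() for i, q in enumerate(QUESTIONS)}
    with open("feedback.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(answers) + "\n")
    return "Thanks for your feedback!"

if __name__ == "__main__":
    app.run(debug=True)
```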

The reluctance to solicit free-form narrative responses is based on the concern that it will be difficult to extract useful information from nonstandardized input.

Interpreting Analog Data Costs More But The Results Are Much More Useful

Yes, to mine free-form text responses the company would have to hire humans to read the answers, sort them into categories, and collate them into trends and themes across many responses.

But it wouldn’t take long for commonalities to appear. After reviewing only a few dozen responses, themes would start to emerge.

With open-ended customer input the manufacturer would pretty quickly find out about the inconvenient on/off switch, the fact that customers are irritated that the center of the bread always toasts darker than the outer edge, that the menus are confusing and incomplete, that there isn’t enough sauce on the pizzas or that the sauce is too salty, and so forth.
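
As a sketch of that first pass, here is how a reviewer (or a small script helping one) might tag responses against a starter list of themes and count how often each one comes up. The theme names, keywords, and sample responses are invented for illustration.

```python
# Minimal sketch of a first-pass theme tagger for free-form responses.
# Theme names, keywords, and sample responses are invented for illustration.
from collections import Counter

THEMES = {
    "controls": ["on/off", "switch", "button", "sharp edge"],
    "toasting": ["toast", "darker", "uneven", "burnt"],
    "menu":     ["menu", "confusing", "incomplete"],
    "sauce":    ["sauce", "salty"],
}

def tag_response(text: str) -> list[str]:
    """Return every theme whose keywords appear in the response."""
    lower = text.lower()
    return [theme for theme, words in THEMES.items()
            if any(w in lower for w in words)]

responses = [
    "The on/off switch is in an awkward spot and the edge near it is sharp.",
    "The center of the bread always toasts darker than the edges.",
    "Your menus are confusing and there isn't enough sauce on the pizza.",
]

# Count how often each theme is raised across all responses.
counts = Counter(theme for r in responses for theme in tag_response(r))
print(counts.most_common())
```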

Not Soliciting Analog Customer Feedback Is Penny-Wise & Pound-Foolish

A company spends hundreds of millions of dollars, maybe billions of dollars, to make and market its product, but it doesn’t want to spend an extra million or two to find out how it can improve the product so that it can increase its sales by five or ten percent?

If I were selling the Chevy Cruze I’d be contacting every Toyota Corolla, Ford Focus and Nissan Sentra buyer I could find to ask them why they picked those cars instead of the Cruze. I would ask them, “What changes could Chevrolet make to the Cruze that might have convinced you to pick the Cruze over the [Corolla, Focus, Sentra]?”

If I were selling the Chevy Cruze, then on the one-year anniversary of each car’s sale I’d be contacting every person who bought a Cruze asking them, “If you could wave a magic wand that would make any changes you’d like to your Cruze, what changes would you make?”

You Don’t Have To Process Every Single Customer Response

If you’re collecting millions of responses, the first step is to sort them by major demographic categories. Once that’s done, you can randomly select one or two thousand responses from each major demographic and perform a detailed review on just those responses.
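
Here is a minimal sketch of that kind of stratified sampling, assuming each stored response carries a demographic label; the field names, sample size, and toy data are assumptions.

```python
# Minimal sketch of sampling a fixed number of responses per demographic group
# for detailed review. Field names, sample size, and toy data are assumptions.
import random
from collections import defaultdict

def sample_by_demographic(responses, per_group=2000, seed=42):
    """Randomly pick up to `per_group` responses from each demographic bucket."""
    buckets = defaultdict(list)
    for r in responses:
        buckets[r["demographic"]].append(r)

    rng = random.Random(seed)
    return {
        group: rng.sample(items, min(per_group, len(items)))
        for group, items in buckets.items()
    }

# Example with toy data: three demographic groups, tiny per-group sample.
toy = [{"demographic": d, "text": f"response {i}"}
       for i, d in enumerate(["18-34", "35-54", "55+"] * 5)]
for group, picked in sample_by_demographic(toy, per_group=3).items():
    print(group, [p["text"] for p in picked])
```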

Use An AI To Crunch The Data For You

IBM’s Watson system can read English text. Make a deal with IBM to send all the narrative customer responses to Watson and have it distill them into a coherent report.
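
Watson’s actual interface is outside the scope of this column, so as a stand-in here is a minimal sketch that uses scikit-learn’s off-the-shelf LDA topic model to pull recurring themes out of a pile of narrative responses. The sample responses and the number of topics are invented; treat this as an illustration of the idea, not a Watson integration.

```python
# Stand-in illustration (not a Watson integration): distill recurring themes
# from free-form responses with scikit-learn's LDA topic model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [
    "The on/off switch is awkward and the edge near it is sharp.",
    "The power light is too bright and hard to see from the front.",
    "The menu screens are confusing and incomplete.",
    "There is not enough sauce on the pizza and the sauce is too salty.",
    "The switch placement is bad and the button edge hurts my finger.",
    "The sauce tastes too salty for me.",
]

# Turn the free-form text into word counts, dropping common English stop words.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(responses)

# Fit a small topic model; the number of themes is an arbitrary choice here.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(counts)

# Print the top words for each discovered theme.
words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Theme {i + 1}: {', '.join(top)}")
```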

Processing structured, digital input is cheap but the result is not very useful. Processing unstructured analog input is somewhat more expensive but the informational value of the result is an order of magnitude greater.

If you provide a simple way for your customers to tell you how to make your products better, they will.

– David Grace (www.DavidGraceAuthor.com)

To see a searchable list of all David Grace’s columns in chronological order, CLICK HERE

To see a list of David Grace’s columns sorted by topic/subject matter, CLICK HERE.
