When digital engagement gets too personal

Hans van Dam
6 min read · Jun 5, 2015


For the past few months I have been working at CX Company, which just launched DigitalCX: a platform that delivers personal engagement across all digital channels. It uses the customer’s context to create a personal and relevant customer journey.

When I sat down with Rogier Kranenbarg, the brand manager of CX Company, he explained to me the dangers of doing NLP very well. He had some great stories about what can happen when your self service operation feels so natural that it actually gets people to pour their hearts out. When you read through the logs, you stumble upon funny, weird and heartbreaking strings of text that have been fired at the virtual assistants deployed at various online self service operations.

The following is an article that we published together for the CX Company blog. It discusses some ethical and philosophical questions that come up when you have humans interact with machines. It shows the difficulties that arise when Digital Customer Engagement gets incredibly personal.

Assisted Digital Engagement done very well

The natural language processing (NLP) technology that forms the basis of our virtual assistants and the DigitalCX platform is getting better and better. We understand most questions and have little trouble providing a correct, personal and relevant answer. But some questions are so complex and personal that we have to direct customers to live chat, where a human agent can help them accordingly. This will never change, no matter how well your technology is built.

Changing from self service to live chat, however, is a tricky transition. It is the moment when customers are confronted with the fact that the friendly service assistant that has been helping them so far is in fact not a person but a machine. It breaks the natural flow of their customer journey, and that interruption can be off-putting.

This is especially true when the query is emotionally charged and people really need help. They feel surprised and confused, and it can evoke an emotional reaction that is sometimes hard to repair.

BEING CONFRONTED WITH THE EXISTENCE OF TECHNOLOGY

Customers were engaged in a good conversation, but then they were told that self service was not sufficient and they were directed to a real person. They were chatting away freely until they discovered that they had been talking to something that is ‘just’ a machine. That is the problem of doing customer engagement technology very well.

It compares well to the movie Her (Spike Jonze, 2013), in which Theodore falls in love with a self-learning OS (operating system). When Theodore discovers that Samantha, the OS played by Scarlett Johansson, has been talking to thousands of other men as well, he feels betrayed. He obviously knew she was an OS. He knew they were living out a modern fantasy, which helped him get past a tough breakup. But he chose to ignore the downside of dating a scalable piece of software.

People want to be fooled.

At the end of the day, people know they are dealing with an intelligent piece of software. Still, that does not keep us from sharing our thoughts and feelings with it. We know the answer is automated, but it feels good to get things off our chest, and the answer we are given, thanks to contextual data, is personal and relevant.

PEOPLE OPEN UP TO VIRTUAL ASSISTANTS MORE THAN YOU THINK

You would be surprised by the questions people ask our virtual assistants. It is a strange mixture of relevant questions and the shameless sharing of emotions and frustrations.

How do I cancel my subscription?

What are you doing Friday night?

Do you know what it’s like to be lonely?

Do you have a loyalty program?

How do I upgrade my flight to Business Class?

… obla di obla da

They type in everything that pops into their heads. It sometimes reminds us of the AOL search logs released in 2006, and user 927 specifically. That anonymous user poured her heart out in a long, incomprehensible string of questions and remarks directed at a search engine.

Our virtual assistants have gotten so good that the conversation is remarkably natural. It is not just a question-and-answer format: the assistant will ask follow-up questions, use your name to make it personal, and draw on all the information it has gathered before.

How much luggage can I bring on my flight to LA?

Hi John, as a frequent flyer you can bring 35kg.

Thanks!

No problem, John. Do you want the vegetarian meal again?

Sure. How do you know I want that?

You had it on your last two flights, John. Consider it done. Have a great flight tomorrow and if there is anything else we can assist you with, you can also connect with us on Twitter.

Alright, looking forward to it!
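To make the mechanics of an exchange like this a little more concrete, here is a minimal, hypothetical sketch of context carry-over in a single dialogue turn. The profile fields and function names are assumptions made for illustration; they are not the DigitalCX API.

```python
# A minimal, hypothetical sketch of context carry-over in a dialogue turn.
# The profile fields and function names are illustrative assumptions,
# not the DigitalCX API.

from dataclasses import dataclass
from typing import List

@dataclass
class CustomerProfile:
    name: str
    frequent_flyer: bool
    last_meals: List[str]  # meal choices on previous flights

def build_reply(question: str, profile: CustomerProfile) -> str:
    """Answer a baggage question and reuse known context for a follow-up."""
    if "luggage" in question.lower():
        allowance = "35kg" if profile.frequent_flyer else "23kg"
        reply = f"Hi {profile.name}, you can bring {allowance}."
        # Follow-up suggestion based on earlier behaviour, not on the question.
        if profile.last_meals[-2:] == ["vegetarian", "vegetarian"]:
            reply += " Do you want the vegetarian meal again?"
        return reply
    return "Let me connect you with a colleague who can help with that."

john = CustomerProfile(name="John", frequent_flyer=True,
                       last_meals=["vegetarian", "vegetarian"])
print(build_reply("How much luggage can I bring on my flight to LA?", john))
```

The point of the sketch is simply that the personal touch comes from stored context (name, status, past choices), not from anything in the question itself.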

Such advanced technology makes it easy for people to get lost in a conversation. What starts as a playful exchange born from curiosity, loneliness or boredom becomes a real chat in which people sometimes reveal their true selves, because the virtual assistant doesn’t judge or comment.

THE EMOTIONS BEHIND BASIC QUESTIONS

The emotions behind a question are not always obvious, and it can be difficult for the assistant to pick up on them. Not everyone wants to cancel their subscription simply because they found a better deal online. Some people cancel because they are going through a divorce, or they cancel on behalf of a deceased relative. The emotional load behind a question can differ, and the challenge is to pick up on this quickly and act accordingly.

When the assistant discovers there is more going on than a regular question, it will have to make the switch to live chat. Roughly 75% of all questions can be solved with a standardised answer because the situation occurs often; the other 25% have not been formulated properly or are too complex for a generic answer. The longer it takes to pick up on the real essence of a question, the more uncomfortable the transition from bot to live chat becomes, and the greater the risk of putting off your customer.
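As a rough illustration, a routing rule along these lines could look like the sketch below. The thresholds, intent labels and the emotional_load signal are assumptions made for the example, not CX Company’s actual logic.

```python
# Hypothetical routing sketch: give a standardised answer when the intent is
# recognised with enough confidence, hand off to live chat otherwise.
# Thresholds, labels and the emotional_load signal are illustrative assumptions.

from typing import Optional

def route(intent: Optional[str], confidence: float, emotional_load: float) -> str:
    if intent is not None and confidence >= 0.75 and emotional_load < 0.5:
        return f"standard_answer:{intent}"   # the roughly 75% of common cases
    return "handoff_to_live_chat"            # unclear, too complex or emotional

print(route("cancel_subscription", confidence=0.92, emotional_load=0.1))
print(route("cancel_subscription", confidence=0.88, emotional_load=0.9))
```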

HOW WE PICK UP ON EMOTIONS BETTER

The trick to dealing with these kinds of questions does not lie in the question itself; it is hidden in the context. Who is asking the question, have we talked to them before, what is their click route on our website, what is their online persona? If the customer recently changed their relationship status from ‘married’ to ‘single’ on Facebook, a question about cancelling a subscription can easily be emotionally charged.

In the example above you may want to direct the person to live chat sooner, or at least make it extra clear that they are talking to a machine.
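A toy version of such a context-based signal might look like this. The signal names and weights are invented for illustration; a real system would tune or learn them rather than hard-code them.

```python
# Toy sketch of estimating emotional load from context rather than from the
# question text alone. Signal names and weights are invented for illustration.

def emotional_load(signals: dict) -> float:
    """Return a 0..1 score; a higher score means escalate to live chat sooner."""
    score = 0.0
    if signals.get("relationship_status_change") == "married_to_single":
        score += 0.5   # e.g. the Facebook status change mentioned above
    if "bereavement" in signals.get("click_route", []):
        score += 0.4   # browsed pages about cancelling for a deceased relative
    if signals.get("repeat_contact_within_24h"):
        score += 0.2   # came back quickly, possibly frustrated
    return min(score, 1.0)

signals = {
    "relationship_status_change": "married_to_single",
    "click_route": ["account", "cancel-subscription"],
    "repeat_contact_within_24h": True,
}
print(emotional_load(signals))  # 0.7, so route to live chat sooner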

WHY THIS IS BECOMING MORE IMPORTANT

A couple of years ago this wasn’t really an issue. The virtual assistants on most websites were not very good, and it was clear that you were not dealing with a real person. Instead of having a real dialogue with the assistant in which questions and answers feel natural, you just typed in some keywords and were directed to the FAQs. Easy as that.

I want to cancel my subscription.

Follow this link to cancel. Please give us a call if you need more information or need any help.

But now our technology is so advanced that people do not always comprehend that it is indeed technology. We create the illusion of a human-to-human conversation, but this illusion can be hurtful and disappointing to some customers. And it goes without saying that this can harm your brand.

As we keep improving our ability to understand customers and keep improving our self service operations, it becomes more important to think about how we make sure that people understand they are dealing with technology and not with human beings. When developing technology with a human touch, it is important to recognise that the human touch can be so real that it creates a grey area between what is real and what is not.

Some final words

True, this article is a little branded, but the core topic is interesting enough that I believe it’s worth sharing here as well. Let me know if you agree by hitting the recommend button or sharing it online. Sharing is caring.


Hans van Dam

Founder Conversation Design Institute. We train and certify conversation designers around the world!