Opus Research Intelligent Assistants Conf — Day 2

Dominique Boucher · Published in CXinnovations · Sep 21, 2017

In my previous post, I gave a recap of day 1 of the Intelligent Assistants Conference organized by the analyst firm Opus Research this week in San Francisco. Here is a recap of the talks I attended during day 2 of the conference (I missed a few).

Ask FedEx

The first talk of the morning was given by FedEx, and they presented their virtual assistant “Ask FedEx”. They launched it in 2015 in the US, and later in other countries. If I recall correctly, it is now available in 18 languages. The VA is accessible only on the web (it’s a sidebar tool on the right of the web page). It was developed using the Nina technology from Nuance, leveraging the experience and knowledge the team had with their IVR. Using “Ask FedEx”, customers can make inquiries about their shipments (it is integrated with FedEx’s tracking API), among other things.
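To make the tracking integration concrete, here is a purely illustrative sketch of what a “where is my package” intent handler behind such an assistant might look like. The endpoint, parameters, and response fields are hypothetical placeholders, not FedEx’s actual tracking API.

```python
# Purely illustrative: a shipment-tracking intent handler behind a virtual
# assistant. The endpoint and response fields are hypothetical placeholders,
# not FedEx's actual tracking API.
import requests

TRACKING_ENDPOINT = "https://api.example.com/track"  # placeholder URL

def handle_track_shipment(tracking_number: str) -> str:
    """Turn a tracking API response into a conversational reply."""
    resp = requests.get(TRACKING_ENDPOINT,
                        params={"number": tracking_number}, timeout=5)
    resp.raise_for_status()
    data = resp.json()
    # Assume the (hypothetical) API returns a status and an ETA.
    return f"Your package is {data['status']} and should arrive on {data['eta']}."
```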

FedEx put in place a continuous improvement process, which is paramount to a successful VA. They admitted that it is not yet integrated with the other channels to provide a seamless omnichannel experience, but they are working on it, preferring to take baby steps.

A few interesting statistics were shared: more than 70% of their contacts in the IVR come from customers who were using digital channels, 52% of all interactions do not require human intervention, and they have an 81% first-call resolution rate.

Storytelling wins when it comes to brands

Nike’s Jordan Brand presented their Breakfast Club Training chatbot, developed by Snaps. Jordan’s objective is to help kids get better at sports, and this project was about finding out how to do that, and which platform and channel to use. The chatbot, available on FB Messenger, helps users complete a 30-day training program, with proactive notifications and engagement. Why use Messenger when it could have been an app? Easy: that’s where the kids are. So brands must be where consumers are.

They shared some of the lessons they learned during the project: you need to provide value, create fun, high-quality content and engaging scripts, and the tone needs to be personal. All of this works far better than pitching AI and technology for their own sake. Also, chatbots need to be seen as a companion in a journey.

And interestingly, they found that it’s more effective NOT to let customers ask anything they want. Customers like carousels, hence the importance of creating a strong narrative.
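For readers who haven’t built on Messenger: a carousel corresponds to the “generic template” of the Messenger Send API, a horizontally scrollable list of cards with images and buttons. Here is a minimal sketch of what sending one could look like; the access token, recipient ID, and card content are placeholders.

```python
# Minimal sketch of sending a Messenger carousel (the Send API's "generic
# template"). The access token, recipient ID, and card content below are
# placeholders; error handling is omitted.
import requests

PAGE_ACCESS_TOKEN = "<PAGE_ACCESS_TOKEN>"  # placeholder
SEND_API = "https://graph.facebook.com/v2.6/me/messages"

payload = {
    "recipient": {"id": "<USER_PSID>"},
    "message": {
        "attachment": {
            "type": "template",
            "payload": {
                "template_type": "generic",
                "elements": [  # each element is one card in the carousel
                    {
                        "title": "Day 1: Ball handling",
                        "subtitle": "A 15-minute drill to get started",
                        "image_url": "https://example.com/day1.jpg",
                        "buttons": [
                            {"type": "postback",
                             "title": "Start drill",
                             "payload": "START_DAY_1"},
                        ],
                    },
                ],
            },
        }
    },
}

requests.post(SEND_API,
              params={"access_token": PAGE_ACCESS_TOKEN},
              json=payload)
```

Guiding users through cards and buttons like these keeps the conversation on the narrative rails instead of relying on open-ended free text.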

Building the business case for IAs and chatbots

The next panel was quite interesting. Tobias Goebel from Aspect opened it with a call to change how we consider IA/VA/chatbot projects, arguing that we should see them much more like workforce projects (with an onboarding process, treating IVR scripts as a way of training the chatbot on business rules, seeing developer tools as workforce management tools, etc.).

Aspect’s experience with the Edward assistant (Radisson Hotel in London) confirmed that, far from demotivating real employees who might see chatbots as a threat, the assistant leaves them more engaged, with more time to serve customers in person. And customers love to self-serve. In fact, customer satisfaction is higher than when waiting in a phone queue for 20 minutes…

Peter McKean, CEO of Synthetix, reiterated the importance of going to the cloud, because you can’t control how many customers will come to your site or to your bot (unlike IVR systems, where you do have that control).

Finally, and I don’t remember who said it, the conclusion was that the industry is moving out of PoCs and experimentation, and that it needs to be more confident about the value of IAs and chatbots, even if we don’t know all the answers yet (there are still soft arguments to be made…).

Feeding the brain

IP Australia, a government agency that administers intellectual property rights in Australia, showed us how they became a fully digital organization in just 2 years, after realizing that customers want information now, in the format they want. Helped by Datacom, they developed Alex, a virtual assistant directly accessible from their web site, backed by a live chat team. To drive adoption, they make sure that every interaction with customers involves Alex in some way (this can be just a reference to Alex in an email). They now have 88% customer satisfaction, calls have declined to a third of the volume they had prior to this transformation, and they reached 99.6% digital adoption in 4 years.

An interesting aspect of their system is the use of a human-assistance platform called Alex Coach, where people help Alex make the right decision in near real time (40 seconds or less).
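The talk didn’t go into implementation details, but the general pattern is easy to sketch: route low-confidence answers to a human reviewer within a small time budget, and fall back to a safe handover otherwise. Everything below (function names, thresholds, the ask_human callable) is hypothetical and does not reflect how Alex Coach is actually built.

```python
# Hypothetical sketch of a human-in-the-loop fallback: when the assistant's
# confidence is low, ask a human coach to confirm or correct the answer
# within a time budget, otherwise hand over gracefully. This does not
# reflect how Alex Coach is actually implemented.
from concurrent.futures import ThreadPoolExecutor, TimeoutError

HUMAN_TIME_BUDGET_S = 40      # the "40 seconds or less" figure from the talk
CONFIDENCE_THRESHOLD = 0.8    # made-up threshold

_coach_pool = ThreadPoolExecutor(max_workers=4)  # pending human-review requests

def answer(question: str, nlu_result: dict, ask_human) -> str:
    """nlu_result: {'answer': str, 'confidence': float}; ask_human: callable."""
    if nlu_result["confidence"] >= CONFIDENCE_THRESHOLD:
        return nlu_result["answer"]
    future = _coach_pool.submit(ask_human, question, nlu_result)
    try:
        # Wait up to the time budget for the human's decision.
        return future.result(timeout=HUMAN_TIME_BUDGET_S)
    except TimeoutError:
        # No answer in time: hand the customer over to the live chat team.
        return "Let me connect you with our live chat team."
```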

Technology should be invisible to the customer

Royal Bank of Canada (RBC), Canada’s largest bank, then presented the work they’ve been doing with Omilia’s conversational technology. They showcased their system live (risky business!). It was really impressive! We were able to see the natural language understanding system operate in real time, displaying what the caller was saying. (Unfortunately, this was only an experimental version of their system.) They report a 59% reduction in abandonment rate compared to DTMF, with less than 1% negative feedback from their customers (over more than 1.5M calls).

Among the many lessons they learned from the project (quoted from their slides): NLU/AI testing methodology is not the normal way QA engineers think, semantic tuning is key as there are so many ways people can ask for the same thing, and selecting the proper voice talent is extremely important for a conversational feel.
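The semantic-tuning point lends itself to a concrete illustration: much of conversational QA boils down to checking that many phrasings of the same request resolve to the same intent. Here is a hypothetical pytest-style sketch, where classify_intent stands in for whatever NLU engine the project actually uses.

```python
# Hypothetical NLU regression test: many phrasings of the same request
# should all resolve to the same intent. `classify_intent` is a stand-in
# for whatever NLU engine or API the project actually uses.
import pytest

from my_nlu import classify_intent  # hypothetical module

CHECK_BALANCE_PHRASINGS = [
    "what's my balance",
    "how much money do I have in my checking account",
    "can you tell me my account balance please",
    "balance please",
]

@pytest.mark.parametrize("utterance", CHECK_BALANCE_PHRASINGS)
def test_check_balance_paraphrases(utterance):
    # Every paraphrase should map to the same intent.
    assert classify_intent(utterance) == "check_balance"
```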

Reflections on UI

The next panel reflected on the use of different UI elements for creating compelling user experiences. We first learned that Comcast handles more than one billion voice commands per month.

Here are a few quotes I gathered:

  • Voice can create a stronger user experience because it conveys emotion, which is not the case with text. When you have a conversation with someone face to face, you decode non-verbal cues that help you better understand the other party.
  • Christian Petersen, from Comcast, would like bots to think like a computer, but interact like a human. Humans are highly specialized for communication, while machines can do computations and processing more effectively. An ideal system would combine both.
  • According to Cathy Pearl, even if it’s directed, a dialogue can be pretty effective.
  • Building trust and authenticating bots is crucial. ID management and authentication will need to be part of IAs.

Kicking email’s “ass”

The last case study of the day was from L’Oréal, the global beauty company, on their use of chatbots to drive marketing. They presented the work they’ve done with Automat.ai, a Montreal-based company specializing in conversational marketing (which is, for Andy Mauro, founder and CEO of Automat, very different from customer service). They showcased Beauty Gifter, a chatbot that helps you find the perfect beauty gift for someone.

Their advice is really to look for specific marketing, e-commerce, and AI features, not just a generic chatbot builder.

They showed impressive statistics demonstrating the effectiveness of a chatbot-based marketing strategy compared to emails, with 6–7x response rates! Conversational marketing lets brands build personalized, one-on-one relationships with their customers that vastly outperform traditional digital marketing campaigns.

Demystifying AI

In the last session, panelists were asked for their thoughts on the current state of AI, and its use in IA technologies.

To Jay Wilpon, from Interactions, AI is a buzzword, not much more than good programming. Most systems on the market are still rules-based, and too directed. And we (as an industry) are not building challenging applications that would force us to go beyond rules-based systems. But we are at an early stage; the industry is still in its infancy (during the first day, someone said that IAs and chatbots are at the same stage as the web was in 1996).

Another interesting comment was made to the effect that AI algorithms haven’t changed that much in years. It is their availability that has changed everything. Now a few dedicated developers can use TensorFlow (or any other open-source ML framework), for instance, to build a model and experiment with new AI applications, start businesses, etc.
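As an illustration of how low the barrier has become, here is a toy sketch of an intent classifier built with TensorFlow’s Keras API. The utterances, labels, and layer sizes are made up; a real system would need far more data and tuning.

```python
# Toy sketch of an intent classifier with TensorFlow/Keras. The training
# data and model sizes are made up for illustration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Tiny, hypothetical training set: utterances and their intent labels.
utterances = np.array([
    "where is my package",
    "track my shipment",
    "I want to change my delivery address",
    "update the address on my order",
])
labels = np.array([0, 0, 1, 1])  # 0 = track_shipment, 1 = change_address

# Turn raw text into padded integer sequences.
vectorizer = layers.TextVectorization(output_sequence_length=8)
vectorizer.adapt(utterances)

# A small embedding + pooling classifier.
model = tf.keras.Sequential([
    layers.Embedding(input_dim=len(vectorizer.get_vocabulary()), output_dim=16),
    layers.GlobalAveragePooling1D(),
    layers.Dense(16, activation="relu"),
    layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(vectorizer(utterances), labels, epochs=30, verbose=0)

# Predict the intent of a new utterance.
print(model.predict(vectorizer(np.array(["has my parcel shipped yet"]))))
```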

AI (machine learning) can also be used to tune systems, hence the importance of collecting data. For instance, Interactions, through their human-assistance platform, collects billions of curated, manually tagged sentences every year.

Final comments

I think the main takeaways from this conference are:

  • The industry is maturing quickly.
  • There are many large-scale deployments that now prove the real benefits of intelligent assistants.

Thanks to Opus Research for organizing it!
