Q&A: Dashbot On Why Conversational Analytics are the Future
This month we’re chatting with Arte Merritt, CEO and Cofounder of Dashbot, a bot analytics platform that enables publishers and developers to increase engagement, acquisition, and monetization. In conversation with Arte, we get the low-down on the future of conversational analytics, how voice-controlled bots like Alexa and Google Home are changing the game, and the surprising stats around people sending scandalous selfies to bots (yes, really). Check out the full conversation below.
Give us a quick primer on Dashbot.
Dashbot is a bot analytics platform with additional tools to take action on the data. When we refer to “bots,” we mean any conversational interface, whether more text based—like Facebook or Slack—or voice based—like Alexa or Google Home.
We support Facebook Messenger, Alexa, Google Home, Slack, Kik, and any other conversational interface.
The conversational data is much richer and more actionable than traditional analytics. It's all unstructured data; users can send in images, videos, and audio, but more importantly, it's their own words, their own voice, saying what they want from the bot, as well as what they think of it afterwards. This ultimately allows our customers to take some pretty interesting actions based on the data, like inserting a live person to help someone through to conversion.
Bots have become very popular on the aforementioned platforms, which are reaching billions of people. Where do you see the future of bots, and how does Dashbot fit into that vision?
We are strong believers that conversational interfaces are the future. If you remember all the videos of two-year-olds swiping on iPhones and iPads, the same thing is happening right now with devices like Alexa and Google Home — little kids already know how to interact with the technology.
What got us even more excited is that the data is much richer and more actionable. In addition to different types of content, users are sending in their own words and voices, saying exactly what they want from the bot. This is much different from the predefined links and buttons of apps and websites.
And it provides actionable insights, like inserting a live person when necessary, or sending a push notification based on audience segmentation. In these cases, real-time analytics actually matter: you can help lead a person through to conversion, or whatever the success metric may be.
How would you describe the state of intelligence in bot technology right now? Are bots capable of understanding the underlying language, or is it more of a call-and-response type of logic?
We see all types of bots — from AI-based, to regular expression/decision trees, and even ones that are just really well done script writing.
To us, it doesn’t matter if the bot relies on AI or not. Our analytics can still be used to help build a better bot to increase user acquisition, engagement, and monetization — identifying where the logic is breaking down as well as opportunities for new features and functionality.
The space is relatively early, so most bots are not using AI or natural language processing, and the ones that are tend to have limits.
Given that users can ask anything of the bot, it's difficult to capture all the possible use cases, which is why a service like Dashbot is needed. A developer or brand can see what users are saying and how their bot responds, and then decide to add support or adjust the responses.
As you noted, Dashbot works with popular voice-controlled assistants like Amazon Alexa and Google Home. They're bots in a way, but much more powerful on many levels. Is there anything that's especially novel about applying your expertise to these assistants vs. other "bots?"
Voice interfaces provide additional types of data, which leads to even more interesting reports and actions to take.
Both Alexa and Google Home provide the intents and context of the requests users make.
However, Google goes one step further and provides the raw input text too. For example, if you ask Google, “what’s the weather in San Francisco?”, you would get the raw text, “what’s the weather in San Francisco” as well as the intent (“check weather”) and the context (“San Francisco”).
The raw text is super helpful as it enables developers to see where their intents might be breaking down or where there are opportunities for new types of use cases.
For example, not all users may phrase questions the same way — one user may ask “When are the Warriors playing?” whereas another asks, “Are the Warriors playing tonight?” Having the raw text enables developers to see how users are phrasing inputs and add support for the different formats.
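As a rough illustration of why the raw text matters, here is a minimal Python sketch (not Dashbot's or Google's actual pipeline; the intent name and patterns are invented) that maps both of those phrasings onto the same intent:

```python
import re

# Hypothetical intent patterns a developer might grow over time after
# reviewing raw input text in their analytics. Purely illustrative.
INTENT_PATTERNS = {
    "check_schedule": [
        r"when (are|do) the (?P<team>\w+) play(ing)?",
        r"(are|is) the (?P<team>\w+) playing (tonight|today)",
    ],
}

def match_intent(raw_text):
    """Return (intent, context) for a raw utterance, or (None, {})."""
    text = raw_text.lower().rstrip("?")
    for intent, patterns in INTENT_PATTERNS.items():
        for pattern in patterns:
            m = re.search(pattern, text)
            if m:
                return intent, m.groupdict()
    return None, {}

# Both phrasings from the example resolve to the same intent and context:
print(match_intent("When are the Warriors playing?"))
print(match_intent("Are the Warriors playing tonight?"))
```

Seeing which raw utterances fall through to `(None, {})` is exactly the kind of signal that tells a developer which new pattern to add next.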
Can you talk about any interesting or unique ways that companies are using Dashbot to elevate their business?
One of our customers is a sports bot — for cricket scores.
The founders view the analytics every day to see if there are any new types of requests users are making that the bot is not currently supporting, making heavy use of our message funnel report.
This has led to some interesting new features. For example, the bot originally provided only game scores, but the founders noticed users asking about players, so they added support for player info.
A more interesting feature they added, though, is the stop/mute functionality. Cricket games can go on for days. They found that when users' teams were losing, users would get upset receiving score updates and would block the bot. The company paid to acquire these users, and then had to pay to reacquire them after they blocked the bot. They saw this in the analytics and added a "mute" option that let users silence score updates when their teams were losing, and thus they retained the users instead of losing them.
Your Message Funnel feature has gotten a lot of praise. Why is this information so helpful to businesses and what kind of improvements does that knowledge translate to later on?
At O’Reilly’s Bot Day conference last October, Amir Shevat, the Developer Relations Manager for Slack, described the report as “the holy grail for analytics.”
The Message Funnel is a great way to learn how users are interacting with the bot as well as where the bot may be breaking down. The learnings can be fed right back into the bot to improve the overall experience.
For example, a developer can look at the bot's error or "I don't know" response and see all the user messages that led to it. Using that info, they can add support for the missing functionality or write better responses.
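A minimal sketch of this kind of analysis (the session transcripts and fallback message are invented; this is not Dashbot's actual report code) might count which user messages most often immediately precede the bot's fallback reply:

```python
from collections import Counter

# Hypothetical session transcripts: (speaker, message) pairs.
FALLBACK = "I don't know what you mean"

sessions = [
    [("user", "hi"), ("bot", FALLBACK), ("user", "scores"), ("bot", "3-1")],
    [("user", "hi"), ("bot", FALLBACK)],
    [("user", "player stats"), ("bot", FALLBACK)],
]

def messages_before_fallback(sessions):
    """Count user messages that immediately preceded a fallback reply."""
    counts = Counter()
    for session in sessions:
        # Walk each adjacent (message, reply) pair in the transcript.
        for (spk1, msg1), (spk2, msg2) in zip(session, session[1:]):
            if spk1 == "user" and spk2 == "bot" and msg2 == FALLBACK:
                counts[msg1.lower()] += 1
    return counts

print(messages_before_fallback(sessions).most_common())
# [('hi', 2), ('player stats', 1)]
```

Ranking the results tells the developer which missing response to fix first; here, the bot not handling "hi" would be the top priority.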
Dashbot monitors messaging content from both high-level (aggregated) and close-up views. What are some funny or surprising things you've noticed about people's interactions with bots?
We publish industry write-ups on the data we’re seeing to try and help the whole industry move forward. For example, publishing the most common messages users send to bots, analyzing the effects of UI elements on engagement, and examining retention rates.
The most common message into Facebook bots turns out to be “hi” followed by “hello” — this may seem obvious, as it’s conversation, but not all bots support even these basic messages. At one of our meetups, a developer asked if the panel would review his bot, and as soon as we typed in “hi,” the bot responded with an “I don’t know what you mean” message.
Earlier in the year, we analyzed the images users were sending to bots. We ran them through Amazon’s image recognition service.
The surprising result is that the most common image sent to a bot is the selfie. We dove deeper into gender differences and noticed cars, vehicles, and motorcycles showing up more for men, while selfies remained the top category for women.
The crazy thing, though, is that people do send naked selfies to bots. Luckily, it's a very small percentage, about 1–2% of images. However, when they do it, they do it a lot! A "normal" image is sent on average once, whereas an NSFW image is sent on average five times. In fact, during a two-month period, one person sent their naked selfie 266 times!
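The "sends per image" stat boils down to simple grouping. A self-contained sketch with an invented event log (in practice the NSFW flag would come from an image-moderation API, and the numbers here are illustrative, not Dashbot's data):

```python
from collections import defaultdict

# Hypothetical log of (image_id, is_nsfw) events, one row per send.
events = [
    ("img_a", False), ("img_b", False), ("img_c", True),
    ("img_c", True), ("img_c", True), ("img_d", False),
]

def average_sends(events):
    """Average number of times each unique image is sent, split by NSFW flag."""
    sends = defaultdict(int)
    nsfw = {}
    for image, flag in events:
        sends[image] += 1
        nsfw[image] = flag
    groups = {True: [], False: []}
    for image, count in sends.items():
        groups[nsfw[image]].append(count)
    return {flag: sum(c) / len(c) for flag, c in groups.items() if c}

print(average_sends(events))  # {True: 3.0, False: 1.0}
```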
We’re trying to be a thought leader in this space.
We publish industry reports to help the whole space move forward. Hopefully the learnings and insights we’re seeing can help others improve their bots.
We get a lot of positive feedback on the reports. We see them referenced by other writers and often hear speakers at events referencing the data as well. The writing has also helped drive organic signups.
In addition to writing, we also host events.
We host monthly meetups in San Francisco, and occasionally in New York, to help bring the community together to learn and share from each other.
We also hosted our first conference in May — the Super Bot Conference. We had 300 attendees and speakers from Google, Facebook, Slack, Kik, NBC, Fandango, AOL, and many more companies.
Could you talk a little bit about your Live-Control feature? When might a company want to take the conversation off auto-pilot and insert the human element?
Conversational interfaces are a great example of where having real-time data and analytics is actually meaningful.
With our Live Person takeover feature, if a session is going awry, a live person could pause the bot and help lead the user through to completion.
Imagine the case of a hotel booking bot where a user is getting stuck, perhaps because the sentiment is growing negative or they asked for help. You can pause the bot, help lead the user through to conversion, and then turn the bot back on. It doesn't necessarily have to be for a negative condition, either; over time we may learn that certain conditions are ripe for a takeover.
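A takeover trigger like the one described could be sketched as a simple rule (hypothetical logic, not Dashbot's actual feature; the phrases and threshold are invented):

```python
# Pause the bot when recent sentiment trends negative or the user
# explicitly asks for help. All values here are illustrative.
HELP_PHRASES = {"help", "agent", "human", "representative"}

def should_take_over(sentiment_scores, last_message, threshold=-0.3):
    """Decide whether a live person should take over the session.

    sentiment_scores: per-turn scores from -1 (negative) to +1 (positive).
    """
    if any(word in last_message.lower() for word in HELP_PHRASES):
        return True
    recent = sentiment_scores[-3:]  # only look at the last few turns
    return bool(recent) and sum(recent) / len(recent) < threshold

print(should_take_over([-0.5, -0.6, -0.4], "I want a room for two"))  # True
```

In a real system the threshold and window would be tuned per bot, and the decision might only flag the session for a human rather than pause it automatically.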
What possibilities do you see for bots outside the most obvious places like Facebook Messenger and Slack?
There are a lot of opportunities with voice-enabled interfaces on Alexa and Google Home, not just in the home, but in retail, travel, the office, and more.
Imagine walking into a store and instead of trying to find a salesperson to locate an item, you can just walk up to the device and ask where the item is located.
What are your thoughts on where bot technology will go in the next 5–10 years?
We are strong believers in conversational interfaces — they are the future of human/computer interaction.
While there are plenty of exciting use cases and devices already, we are still in the early days, and AI and NLP will continue to improve over the coming years.