Botiquette Matters.
With chatbots in the fast lane to becoming a larger part of our daily lives, more questions are popping up about the acceptable rules of engagement between bots and humans.
Bots have it rough. The conversational nature of our engagement with them opens up a highway for free-flowing abuse, from profanities and insults to propositions of all inappropriate sorts.
While we may know deep down that there isn’t a super-fast-typing human on the other side, we — being human — often fall victim to the Eliza effect and subconsciously assume that the messages we receive from a chatbot are coming from a place of real emotion.
I’m not sure the Eliza effect applies in reverse, but if we assume on some level that a bot’s messages come from a place of emotion, does that same assumption hold when we fling abuse at it? Is it always simple curiosity about how the bot has been programmed to react, or is it just pure bully mentality?
(Insert deep pondering moment).
Eliza effect aside, chatbots are designed to replicate how we would communicate with an actual human. With that in mind, should we not then treat them with similar courtesy? Sure, they won’t get offended — technically — but accepting abusive behavior towards a human-like technology surely does nothing to nurture good behavior towards actual humans.
This isn’t just a melodramatic idea. Kate Darling, a leading expert in robot ethics at MIT, believes that how you treat a chatbot could say a lot about you.
“We’ve actually done some research that shows that there is a relationship between people’s tendencies for empathy and the way that they’re willing to treat a robot,” she said.
The other issue that comes into play when users start abusing chatbots is that they learn from the language we expose them to. You know how a child, out grocery shopping with a parent, innocently blurts out a completely inappropriate word picked up from that parent’s phone conversation? Sort of like that. Except, in the case of chatbots, it isn’t accidental.
It’s us who need to teach them how to behave.
Just look at what happened with Microsoft’s Twitter chatbot, Tay, in March 2016. Tay was designed as an experimental research project in conversational understanding. Within 24 hours of learning from public data and from how users engaged with it, it had become a raging racist, anti-Semitic, sexist disaster and was quickly deactivated.
In a statement, Microsoft said: “The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it.”
It’s cases like these that fuel the belief that there needs to be a code of behavior between bots and humans — ‘Botiquette’.
This not only includes a set of guidelines for how we need to behave towards bots, but also how chatbots behave towards us.
While respectful language seems like an obvious rule of Botiquette for users, chatbots also need to mind their manners and bear their context in mind. While a cheeky fashionista bot aimed at a young audience may greet you with a disapproving comment about your outfit, the same attitude isn’t going to land quite the same way coming from a financial-advisor bot.
Even abusive behavior towards a bot shouldn’t justify reciprocated abuse. There’s actually a weather chatbot called Poncho that calls you out if you’re rude to it and gives you the silent treatment for 24 hours if you don’t behave better. Simply ignoring abuse may in fact be the best route, since it’s the least likely to provoke further abuse from someone hoping for a reaction.
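A "silent treatment" policy like Poncho's is straightforward to sketch. The following is a hypothetical, minimal illustration — not Poncho's actual implementation — where the bot name, the toy keyword list, and the `PoliteBot` class are all assumptions for the example; a real bot would use a trained abuse classifier rather than a word list.

```python
import time

# Toy word list for the example -- a real bot would use a classifier.
ABUSIVE_WORDS = {"stupid", "idiot", "useless"}
COOLDOWN_SECONDS = 24 * 60 * 60  # 24-hour silent treatment

class PoliteBot:
    """Hypothetical bot that mutes abusive users instead of retaliating."""

    def __init__(self):
        self.muted_until = {}  # user_id -> unix timestamp when mute ends

    def is_abusive(self, message):
        # Naive keyword check, purely for illustration.
        return any(word in message.lower() for word in ABUSIVE_WORDS)

    def reply(self, user_id, message, now=None):
        now = time.time() if now is None else now
        # Silent treatment: say nothing to users still in their cooldown.
        if now < self.muted_until.get(user_id, 0):
            return None
        if self.is_abusive(message):
            # Call the user out once, then start the 24-hour mute.
            self.muted_until[user_id] = now + COOLDOWN_SECONDS
            return "That's not very nice. Let's talk again tomorrow."
        return "Happy to help! What would you like to know?"
```

The design choice worth noting is that the bot responds to abuse exactly once, with a polite boundary, and then disengages entirely — it never mirrors the user's tone.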
Honesty about a chatbot’s non-human status is another commonly recommended best practice, as is keeping a user’s personal information confidential, unless stated otherwise.
Ultimately, it comes down to respect. Sure, we’re talking about conversations with algorithms, but conversations with chatbots are really conversations with ourselves and with each other, and the bots will continue to learn from them. So, best we mind our manners.