3: Bots and speech

Now for something a little different.

The rise of the bots

Lots has been written about the recent explosion in bots. Matt Galligan wrote a nice summary here, and recent research has pointed out how bots could improve the generally underwhelming experience of online services.

Bots provide a human-computer interaction that aims to feel like you’re chatting with an ultra-efficient human. They use natural language processing to turn your text into clear instructions. Bots either follow a script or use artificial intelligence for more open-ended interaction. Scripts are simpler; AI is more intriguing, suggesting the ability to learn a user’s behaviours and quirks.
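
To make the script-versus-AI distinction concrete, here is a minimal sketch of the scripted approach: a hypothetical energy bot that matches a handful of hard-coded patterns and replies from a fixed script. The intents, replies and the handle_message function are all illustrative, not taken from any real product.

```python
import re

# A purely scripted bot: a fixed list of patterns mapped to canned intents.
# Every intent name and reply here is an illustrative example.
SCRIPT = [
    (re.compile(r"\bbalance\b|\baccount\b", re.I), "balance",
     "Your account is £12.40 in credit."),
    (re.compile(r"\busage\b|\bconsumption\b", re.I), "usage",
     "You used 8.2 kWh yesterday, about 10% less than last week."),
    (re.compile(r"set .*thermostat.*?(\d+)", re.I), "set_thermostat", None),
]

def handle_message(text):
    """Walk the script in order and return the first matching reply."""
    for pattern, intent, reply in SCRIPT:
        match = pattern.search(text)
        if not match:
            continue
        if intent == "set_thermostat":
            return f"OK, setting the thermostat to {match.group(1)} degrees."
        return reply
    return "Sorry, I didn't understand that."

print(handle_message("What's my balance?"))
print(handle_message("Set thermostat to 22 degrees"))
```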

The great thing about bots is that, once built, they are relatively easy to deploy to new platforms. And using third-party tools like wit.ai and api.ai gives your bot a wealth of intelligence and, dare I say it, personality.
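
As a flavour of the third-party route, a bot can hand the raw text to wit.ai’s message endpoint and act on whatever intent comes back. A rough sketch follows; the token is a placeholder, and the exact shape of the JSON (entities versus intents) depends on how the wit.ai app is trained and which API version you hit.

```python
import requests

WIT_TOKEN = "YOUR_SERVER_ACCESS_TOKEN"  # placeholder; comes from the wit.ai app settings

def parse_with_wit(text):
    """Send the user's message to wit.ai and return the parsed JSON."""
    resp = requests.get(
        "https://api.wit.ai/message",
        params={"q": text},
        headers={"Authorization": f"Bearer {WIT_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()

parsed = parse_with_wit("Set the thermostat to 22 degrees")
# Depending on training, the response contains the detected intents and
# entities (for example a temperature value), which the bot then maps onto
# an action such as calling a thermostat or billing API.
print(parsed)
```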

Bots have many potential energy applications, but two are obvious. The first is that the simple UX allows users to get information about their account and consumption, thus increasing their level of ‘energy literacy’.

SMS-based energy bot

The second is to use bots to push notifications or to provide feedback.

“Set thermostat to 22 degrees”

That will cost you an extra £2.50 every day

Obviously, translating this into improved energy efficiency still requires action on the part of the consumer, but bots can drastically ease access to information, which is vital for informed decisions.
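
For the “£2.50 a day” style of feedback, the bot needs some way of turning a setpoint change into a cost estimate. Below is a very crude sketch; the baseline cost and the ten-percent-per-degree heuristic are illustrative assumptions, not a real tariff or thermal model.

```python
# Crude illustration of turning a thermostat change into cost feedback.
# Both numbers below are illustrative assumptions, not real figures.
BASELINE_DAILY_COST_GBP = 12.50  # assumed heating cost at the current setpoint
EXTRA_COST_PER_DEGREE = 0.10     # assumed +10% cost for each extra degree

def cost_feedback(current_setpoint, requested_setpoint):
    """Return a short message estimating the daily cost of the change."""
    delta = requested_setpoint - current_setpoint
    if delta <= 0:
        return "That shouldn't cost you anything extra."
    extra = BASELINE_DAILY_COST_GBP * EXTRA_COST_PER_DEGREE * delta
    return f"That will cost you an extra £{extra:.2f} every day."

print(cost_feedback(20, 22))  # -> "That will cost you an extra £2.50 every day."
```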

Speech

In addition to text-based messaging services, bots can use natural language processing to provide voice interaction.

Siri and Google Now do this, but by far the best application of speech today is Amazon’s Echo. The device, a foot-long black cylinder, sits silently in a corner waiting for its “Alexa” wake word. The Alexa Skills Kit, which powers the Echo device, parses speech and turns it into instructions. Connect it to a smart thermostat and you have a totally new way to control the heating in your home.
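
To give a flavour of how a skill turns a parsed utterance into an instruction, here is a rough sketch of an intent handler written with the ASK SDK for Python (one of several ways to build a skill). The intent name, the Temperature slot and the set_thermostat function are assumptions for illustration; a real skill would wire this up to an actual thermostat API.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

def set_thermostat(degrees):
    """Placeholder for a call to a real smart-thermostat API."""
    pass

class SetTemperatureIntentHandler(AbstractRequestHandler):
    """Handles a hypothetical 'SetTemperatureIntent' with a Temperature slot."""

    def can_handle(self, handler_input):
        return is_intent_name("SetTemperatureIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots
        degrees = slots["Temperature"].value  # Alexa has already parsed the speech
        set_thermostat(degrees)
        speech = f"OK, setting the thermostat to {degrees} degrees."
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(SetTemperatureIntentHandler())
handler = sb.lambda_handler()  # entry point when the skill runs on AWS Lambda
```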

A raft of partners for the UK launch allow you to search for recipes, call a taxi or order takeaway. You can now also use Alexa to control your heating, submit meter readings and get an update on your energy use.

Amazon Echo and skills for UK launch

Why is this useful?

I’ve been using Echo for the last six months and, while it has its annoyances (or rather, it’s my children who, predictably, ask it incessantly where poo comes from), the experience is generally awesome. Alexa mostly hears accurately and responds correctly; it is noticeably better than Siri. It allows you to get information easily, without having to interrupt your flow to reach for your phone. Sure, I’m talking to an inanimate black cylinder, but it has freed my thumbs and given me much better understanding of, and control over, my energy use.

Who knows what the ‘average’ consumer will think about this. There’s only one way to find out, right? One thing is for sure: this could be revolutionary for customers with visual or mobility impairments. Imagine: instead of struggling to find the thermostat inside the cupboard under the stairs, you simply say “Alexa, tell the thermostat I’m cold”.

The future

This stuff is actually quite exciting! I see a number of important developments:

  1. Bots will get really good. In two years you will be interacting with bots daily, although you may not realise it.
  2. Bots will get personal and smart: they will recognise your personal preferences and remember your routines. If Alexa hears my voice at 9am on a Tuesday in January, she will know I’m working from home and will turn the heating on, but only in my office.
  3. User interfaces will change: you could adjust the heating using tiny body movements with Google’s Project Soli, or it could respond to biometric data coming from Bragi’s Dash earbuds.