Automating Alexa Skill Dialog Testing

Alexa skill testing can be a long and laborious task, with possibly hundreds of dialogs to run through manually. In this post, we will walk you through a concept for testing Alexa skill dialogs without speaking to Alexa, completing the tests in a matter of seconds.

While chatting with one of my colleagues about a project she is currently on, she told me about an Alexa skill she is testing and how it supports hundreds of dialogs between the skill and the user. This got me wondering whether there was a tool that could help automate Alexa skill dialog testing.

After a few searches on the web, I was unable to find anything that could appropriately test skills; however, I did find a few technologies that I could use to create such a tool myself: specifically, Bespoken and Mocha.

The goal of the tool would be to ease a QA's life and make it possible to test hundreds of skill invocations in less than a minute without having to speak to Alexa. This addresses a problem that voice interfaces have only recently introduced: with mobile apps, it's quite easy to test at your desk, but with a voice interaction experience a QA would have to speak to the skill aloud, disturbing their colleagues or becoming frustrated when Alexa doesn't quite catch what was said because of background noise. Automated dialog tests solve this without sacrificing valuable meeting room space for a dedicated quiet testing area.

The tool would also help improve the quality of skills, since testing all the dialogs of a complex feature could be achieved within seconds, with no human effort required to repeat the tests after new changes are introduced. You can imagine just how beneficial it would be to use this tool with continuous integration and be informed about a bug immediately.

To test this concept, I created a simple yet practical skill that walks a doctor through the UK NHS summary flowchart to aid diabetes diagnosis. This has similar specifications to my colleague's project and is complex enough to demonstrate the concept's viability.

Using Bespoken's tools with the Mocha testing framework, I was able to create tests that exercise the skill's dialogs. Afterwards, I moved the test definitions into a separate file in the easy-to-read YAML format, making it possible to modify and define new tests without writing any code.

Here is the testing code:

Here is the YAML file with the test definitions:
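The actual file is not reproduced here; as a sketch, a YAML test definition might look something like the following. The field names and dialog content are illustrative assumptions, not the real schema.

```yaml
# Illustrative sketch only -- field names are assumptions, not the actual schema.
- test: Diabetes flowchart, symptomatic patient
  dialog:
    - user: "start diabetes check"
      alexa: "Does the patient have symptoms of diabetes?"
    - user: "yes"
      alexa: "What is the patient's random plasma glucose level?"
```

Because a dialog is just an ordered list of utterance/reply pairs, a QA can add new test cases by editing this file alone, without touching any JavaScript.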

And here are the successful test results!

Obviously, this simple testing logic does not cover all the complex cases of real-world Alexa skills, but it proves that this concept can make Alexa skill testing more efficient. Both the Alexa skill and the testing code can be found in the following repository.

My original post at blog.