Keep Calm and Eat On with Alexa
There has been increasing interest in developing natural language interactions for bots and applications ever since Amazon released Alexa in 2014 and Google Home in 2016, with companies like API.ai also making their presence known in the space. The benefits and possibilities are endless. During CES 2016, numerous companies, Ford, LG, and GE to name a few, showed their own take on how they would integrate Alexa into their products.
At Wildebeest, we like to be on the forefront of technology, and we do our very best to embrace challenges. When Alexa came out in 2014, the skills (voice commands) were very limited and the documentation was extremely sparse, which made any Alexa development slow going. So we set out to build a new Alexa skill that would help solve a basic problem: where to eat lunch. First world problems, right?
Here at Wildebeest we take everything pretty seriously, and there’s nothing more serious than finding a great place to eat. Ok, maybe that’s a little exaggerated, but deciding on a place to eat had always been a chore for the team. It was usually the result of everyone being too polite and indecisive about where to go. So we thought, what better solution than using Alexa as an impartial decision maker?
The Robocop of lunches everywhere.
The goal was to be able to ask Alexa a simple question, “where should we eat today?” Alexa would then decide, based on some criteria, which restaurant would best fit our team and text the restaurant’s information. With our goal set, we decided to use Yelp’s API because of its ease of use and wide range of endpoints, and we leaned on Twilio’s SMS feature to send the selected restaurant’s information to the user.
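Our original skill code no longer matches Amazon’s current requirements, but the core “pick and text” idea can be sketched roughly like this. The rating cutoff and message format below are simplified, illustrative stand-ins for whatever criteria you’d actually use, and the input is assumed to mirror the shape of the `businesses` list returned by Yelp’s search endpoint:

```python
import random


def pick_restaurant(businesses, min_rating=4.0):
    """Pick a random well-rated restaurant from Yelp-style search results.

    `businesses` mirrors the shape of the Yelp search response's
    "businesses" list: dicts with "name", "rating", and "location" keys.
    """
    candidates = [b for b in businesses if b.get("rating", 0) >= min_rating]
    if not candidates:
        candidates = businesses  # fall back to anything nearby
    return random.choice(candidates)


def format_sms(business):
    """Build the text message body to hand off to Twilio."""
    address = ", ".join(business["location"]["display_address"])
    return f"Lunch Lady says: {business['name']} ({business['rating']} stars) - {address}"
```

With real credentials, you would fetch `businesses` from Yelp’s business search endpoint and pass the `format_sms(...)` string as the message body of a Twilio SMS send.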
*More information on using the two APIs can be found at the links provided at the bottom.
I won’t go into the details of what it took to develop this specific skill, because Amazon has since changed the process and requirements. Shortly after, Amazon also released a similar Yelp skill that was incorporated into Alexa’s core functionality. However, if there’s enough interest and likes, I’ll be happy to follow up this post with a tutorial on building an Alexa skill.
In the end we created a skill called Lunch Lady.
Alexa, ask Lunch Lady for the best sushi joint.
Alexa, where should we eat today?
When the skill was invoked, it would return a selected restaurant and prompt the user for their phone number so the information could be texted to them. The skill helped facilitate and expedite the decision-making around finding a place to eat. Now we were able to resume our normal lives as developers without any blockers. Keep Calm and Eat On!
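For the curious: under the hood, a custom Alexa skill just receives a JSON request and returns a JSON response containing the speech to play back. A minimal sketch of that request/response flow is below; the dispatch logic and the `pick_restaurant` callable are hypothetical simplifications, not our exact production code:

```python
def build_response(speech_text, should_end_session=True):
    """Wrap plain text in the Alexa custom-skill JSON response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": should_end_session,
        },
    }


def handle_request(event, pick_restaurant):
    """Dispatch an incoming Alexa request to the lunch-picking logic.

    `event` mirrors the Alexa request JSON; `pick_restaurant` is any
    zero-argument function that returns a restaurant name (e.g. one
    backed by a Yelp search).
    """
    request_type = event["request"]["type"]
    if request_type == "LaunchRequest":
        # Keep the session open and wait for the user's question.
        return build_response("Ask me where you should eat today.", False)
    if request_type == "IntentRequest":
        name = pick_restaurant()
        return build_response(f"You should eat at {name}.")
    return build_response("Sorry, I didn't catch that.")
```

The real skill also collected the user’s phone number via a follow-up prompt before handing the restaurant details off to Twilio, which this sketch omits.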