Machine learning in Alexa

Shiramshettysamarth
5 min read · Feb 12, 2020


Amazon Alexa is one of the most popular and capable virtual assistants. Developed by Amazon, it was first used in the Amazon Echo and Amazon Echo Dot smart speakers built by Amazon Lab126. It can handle many tasks that would otherwise require physical effort from us: setting alarms, playing audiobooks and podcasts, telling us about the weather, playing music or movies with a few recommendations, and answering general questions, such as about a place or a word. The important question here is “how?” is it able to do that. The basic answer is that it has some knowledge or information on areas relatively close to our question. But how is that knowledge transferred into Alexa? How does it keep updating its existing knowledge? And how does it match keywords to answer our questions? The answer to all of these is artificial intelligence and machine learning. We can illustrate how AI and ML come into play with the following example: when we say ”fɔr ti taɪmz”, it can mean

  1. For Tea times
  2. For Tee times
  3. Forty times
  4. Four Tea or Tee times

These are cases where the pronunciation is the same but the meanings are totally different from one another. In such tricky cases, Alexa considers the total context of the sentence and the manner in which a word is used. In “Multiply two forty times”, for example, it treats “forty” as the number forty applied to whatever numbers you give it to multiply. Comparatively, Alexa is an excellent assistant for any kind of task: its understanding, its analysis of keywords, and the appropriateness of its results stand up well against any other assistant. When I decided to do my research on Alexa, in the beginning stages articles and news reports were all the information I had, but when I actually started talking to people about it, I learned a lot. I immediately installed the Alexa app on my phone and ran many tests on it in different aspects. I ran the same tests not only on Alexa but also on other assistants such as Google Assistant, Cortana, and Bixby. Surprisingly, in roughly 85–90% of cases Alexa did better than the rest. How, again? The Skills sector in Alexa is a plus point here: skills allow Alexa to evolve every day as new capabilities are attached to it.
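The contextual disambiguation described above can be sketched in miniature. A real speech recognizer scores homophone candidates with a trained language model; in this illustrative toy, a hand-made bigram table (all values invented) stands in for that model:

```python
# Illustrative sketch: choosing among homophone candidates by scoring each
# against the surrounding context. A real ASR system uses a trained language
# model; this hand-made bigram table is a stand-in for illustration only.
CANDIDATES = ["for tea times", "for tee times", "forty times"]

# Toy "language model": plausibility of a word following the previous word.
BIGRAM_SCORE = {
    ("multiply", "two"): 0.7,
    ("two", "forty"): 0.8,
    ("forty", "times"): 0.9,
    ("book", "tee"): 0.7,
    ("tee", "times"): 0.6,
}

def score(candidate: str, context: str) -> float:
    # Sum bigram scores over the context followed by the candidate,
    # with a small default for unseen word pairs.
    words = (context + " " + candidate).lower().split()
    return sum(BIGRAM_SCORE.get((a, b), 0.01) for a, b in zip(words, words[1:]))

def disambiguate(context: str) -> str:
    # Pick the candidate that fits the context best.
    return max(CANDIDATES, key=lambda c: score(c, context))

print(disambiguate("multiply two"))  # the arithmetic context favors "forty times"
```

With the context “multiply two”, the bigrams (two, forty) and (forty, times) score highly, so “forty times” wins; with no such context, all candidates fall back to near-equal default scores.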

Food ordering with Amazon Alexa
Food ordering with Google Assistant

Here’s a small example of how things differ between Google Assistant and Alexa. We can clearly see that Alexa shows appropriate results and asks us to install the skill required for the given command or task, whereas Google Assistant gives us a list of a few nearby restaurants.

The results of the two assistants differ because of how each one uses keywords. Google Assistant in this case worked like a browser, searching on the keyword “Food”, which made it display restaurants instead of food-delivery apps or websites. Alexa, by contrast, utilized the whole sentence: it divided the utterance into fragments, compared and analyzed them against proper sources, and so suggested Domino’s food delivery for our food-ordering command. This sentence division and proper analysis is how the machine learning concepts in Alexa work: proper algorithms giving efficient results.
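The contrast between the two strategies can be sketched as code. Everything here is hypothetical (the patterns, intent names, and responses are invented), but it shows the difference between matching a lone keyword and parsing the full utterance into an intent plus slots, the way an Alexa skill model does:

```python
# Hypothetical sketch: keyword lookup (browser-style) vs. parsing the whole
# utterance into an intent and slot values. Names and patterns are invented.
import re

def keyword_search(utterance: str) -> str:
    # Browser-style: react to a single keyword, ignore the rest of the sentence.
    if "food" in utterance.lower():
        return "nearby restaurants"
    return "web results"

INTENT_PATTERNS = {
    "OrderFoodIntent": re.compile(r"order .*?(?P<item>pizza|burger|food)"),
    "PlayMusicIntent": re.compile(r"play (?P<item>.+)"),
}

def parse_intent(utterance: str):
    # Full-sentence parse: identify the intent and extract the slot value.
    for intent, pattern in INTENT_PATTERNS.items():
        m = pattern.search(utterance.lower())
        if m:
            return intent, m.group("item")
    return "FallbackIntent", None

print(keyword_search("order some food for me"))  # -> nearby restaurants
print(parse_intent("order some food for me"))    # -> ('OrderFoodIntent', 'food')
```

The keyword matcher can only say “food was mentioned”, while the intent parser knows the user wants to *order* food, which is what lets an assistant route the request to a delivery skill rather than a map search.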

Coming to the entertainment aspects, let’s consider music as the example here. This one is tricky: no digital assistant works with every music streaming application, and it’s the same with video apps. Siri works agreeably with Apple Music, yet that is the sole music app it can support. Google Assistant is somewhat better, simply because it has some support for Spotify; however, it still truly wants you to use YouTube or Google Play Music. Cortana is comparable — simply replace Google’s services with Microsoft’s little-known Groove Music.

Comparatively, Alexa makes things the most effortless. It defaults to Amazon’s Music Unlimited service, but at the same time it is the most sensible about Spotify’s strength: it even gives you the chance to make Spotify your default player.

Alexa’s AI research division is pursuing semi-supervised and unsupervised techniques, in which AI systems learn to make predictions without ingesting gobs of annotated data. Semi-supervised and unsupervised learning have their limitations, too, but both promise to supercharge Alexa’s capabilities by imbuing it with a human-like capacity for inference.
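One common semi-supervised recipe is self-training: a model trained on a small labeled set pseudo-labels the unlabeled examples it is confident about, then retrains on the enlarged set. The sketch below uses a trivial one-dimensional nearest-centroid “model” with invented numbers; real systems use neural acoustic models, but the loop has the same shape:

```python
# Minimal self-training (semi-supervised) loop: confidently classified
# unlabeled points are absorbed into the labeled set, then the model
# (here: per-class centroids) is re-estimated. All numbers are invented.

def centroid(points):
    return sum(points) / len(points)

labeled = {"noise": [0.1, 0.2], "speech": [0.9, 1.0]}  # small labeled set
unlabeled = [0.15, 0.95, 0.5, 0.88]                    # cheap unannotated data

for _ in range(3):  # a few self-training rounds
    centers = {cls: centroid(pts) for cls, pts in labeled.items()}
    for x in list(unlabeled):  # iterate over a copy while removing
        dists = {cls: abs(x - c) for cls, c in centers.items()}
        best = min(dists, key=dists.get)
        # Adopt a pseudo-label only when the model is confident (clear margin).
        if abs(dists["noise"] - dists["speech"]) > 0.3:
            labeled[best].append(x)
            unlabeled.remove(x)

print(sorted(labeled["speech"]))  # confident unlabeled points were absorbed
```

The ambiguous point (0.5) never crosses the confidence margin and stays unlabeled, which is exactly the point of the margin: pseudo-labels are only trusted where the model is already sure.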

In one example of unsupervised learning transforming how Alexa’s models are trained, Amazon researchers described a technique that tapped 250 million unannotated customer interactions to reduce speech recognition errors by 8%. Two semi-supervised learning techniques yielded greater gains: using an acoustic model trained on 7,000 hours of labeled data and 1 million hours of unannotated data, Amazon scientists managed to cut error rates by 10% to 22%. Meanwhile, a separate team reduced errors by 20% with 800 hours of annotated data and 7,200 hours of “softly” labeled data that contained artificially generated noise.

Each interaction you have with Google, Amazon, Apple, or Microsoft simultaneously builds their pool of knowledge as well as the personal context they have with you as a user. You can think of a smart assistant as a search engine that only gives one result: voice agents do not have the temporal luxury to utter ten choices from which a user may choose, so the answer is either right or wrong. Still, their AI learns when you pose a follow-up question, especially if the phrasing is similar. Here is a famous example I came across on Alexa throughout my research:

You: “Order more pods.”
(Alexa’s AI): “I see a potential match in purchase history, but I’m not sure…”
Alexa: “Sorry, I don’t know pods.”
You: “Order more Tide.”
(Alexa’s AI): “Aha! Pods means Tide Pods!”
Alexa: “I found Tide Pods for $19.97…”

Even a human might not know that “pods” means “Tide Pods”, so Alexa, like a human, must be taught. It is trained in such a way as to understand human behavior in every aspect and react accordingly.
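The “pods” → “Tide Pods” exchange above amounts to learning an alias from a failed request followed by a similar successful one. Here is a hypothetical sketch of that pattern; the catalog, prices, and function names are all invented for illustration:

```python
# Hypothetical sketch of learning "pods" -> "Tide Pods" from a follow-up:
# when an utterance fails and a similar request then succeeds, remember the
# mapping so the short form resolves next time. Catalog contents are invented.
CATALOG = {"tide": "Tide Pods ($19.97)"}
ALIASES = {}  # learned: unknown term -> catalog key

def order(term, previous_failed_term=None):
    term = term.lower()
    key = ALIASES.get(term, term)  # resolve through any learned alias
    if key in CATALOG:
        # A success right after a failure teaches us an alias for the old term.
        if previous_failed_term:
            ALIASES[previous_failed_term.lower()] = key
        return f"I found {CATALOG[key]}"
    return f"Sorry, I don't know {term}."

print(order("pods"))                               # first try fails
print(order("tide", previous_failed_term="pods"))  # succeeds, learns the alias
print(order("pods"))                               # now resolves via the alias
```

After the second call, the assistant no longer needs the full product name: the failed term has been bound to the catalog entry, mirroring how the follow-up question taught Alexa what “pods” meant.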

Can Alexa also be used by disabled people, such as the visually impaired? Yes, absolutely. All the examples given above show that it can be used independently by people with disabilities. Listening to music, having a story or riddle narrated, playing games like quizzes, and detecting and playing songs are a few of the operations that can be driven by voice command alone. Future technology like a smart cap using Alexa is also in development, intended to let visually impaired people navigate independently.
