Interswitch Alexa Challenge

Interswitch Engineering
Interswitch Engineering Blog
Oct 29, 2018

By Ayodeji A for TEAM OMNICODENT

As part of our engineering culture of driving innovation among our software engineers, Fridays are set aside for everyone to take part in Mind the Engineering Fridays, where teams work on diverse in-house products to encourage team bonding and practical solution building.

To top off the fun, hackathons are held at the end of each month with a full focus on product inception. The most recent was a 24-hour hackathon tagged “The Alexa Challenge”, with the goal of building innovative payment solutions by integrating any of Interswitch’s products with Amazon’s Echo voice assistant, Alexa.

Meet Alexa:

Alexa, as described above, is Amazon’s voice-activated virtual assistant, available on the Amazon Echo smart speaker. With the right commands, Alexa can play music, set alarms, and read out information from the internet about the weather, news, sports and more. With the right skills, she can be extended to do much more.

Introducing Alexa Skills:

Alexa can be extended by building Skills: voice-driven programs that can be loaded onto Alexa. A Skill consists of:

  • Invocation: the word(s) used to get Alexa to start your Skill at any time.
  • Intents: the actions users of the skill can ask Alexa to perform.
  • Utterances: the various phrasings users are expected to use when interacting with your skill.

All of these make up the Interaction Model. For example, “betting odds” could be an invocation name, “GetBettingOdds” one of its intents, and “betting odds for {club}” a sample utterance for that intent (see the model in Step 1 below). Once properly implemented, each skill is ready to be installed and used. Visit https://developer.amazon.com/alexa to learn more about building skills for Alexa.

At the start of the hackathon, each team received an Amazon Echo to help with implementation. The rules were simple:

  • 10 teams
  • 24 hours to hack
  • Build something to demo
  • Integrate at least one solution with any of Interswitch’s product APIs.

A typical architectural flow for integration

With teams settling in to work with the APIs of the following products: QuickTeller, Interswitch Lending Service (ILS), Mobile Banking and VTUcare, several innovative products were built, with the winning team creating solutions for Voice Banking and Betting Odds.

  • Voice Banking covered airtime top-up, funds transfer and name enquiry.
  • Betting Odds let users get betting odds for EPL matches.

Our research showed there are two ways to build a Skill for Alexa:

  1. With an AWS Lambda function
  2. With a Web Service

We used both approaches, since different skills were being built.

The Lambda Function approach: AWS Lambda is a service that lets you write and run code without managing your own servers. This approach was relatively simple, but the biggest challenge came with signing up on AWS: getting a verification code to verify ourselves on the platform proved very difficult.

After several tries, a code was generated, accounts were set up and we went on to write the skills. With Lambda functions, we were able to leverage the Alexa Skills Kit SDKs for Java, Node.js and Python.
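For context, here is a minimal sketch of the Lambda entry point using the ASK SDK for Python (ask-sdk-core). Handler classes like the GetBetOddsHandler shown in Step 2 below are registered on a SkillBuilder, and the function returned by lambda_handler() is configured as the Lambda function’s handler. The LaunchRequestHandler and its welcome text here are illustrative, not taken from our skill.

# Minimal sketch of a skill's Lambda entry point with the ASK SDK for Python.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type


class LaunchRequestHandler(AbstractRequestHandler):
    """Greets the user when the skill is launched (illustrative handler)."""

    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "Welcome! Which club do you want betting odds for?"
        return handler_input.response_builder.speak(speech).ask(speech).response


sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())
# Intent handlers (e.g. GetBetOddsHandler from Step 2) are added the same way.
handler = sb.lambda_handler()  # set this as the Lambda function's handler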

The Web Service approach: This was also relatively simple; all we needed was a web server. You can build an app with an embedded web server, or host a web server somewhere and deploy the application to it, whichever works.
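As a sketch of what the web-service route can look like (our actual server setup isn’t shown in this post), one common option at the time was the flask-ask library, which maps slot values to handler-function arguments by name. The intent and slot names below mirror the interaction model defined in Step 1. Note that Alexa requires a web-service endpoint to be served over HTTPS with a valid certificate.

# Hypothetical flask-ask web service for the same skill (a sketch, not our exact code).
from flask import Flask
from flask_ask import Ask, question, statement

app = Flask(__name__)
ask = Ask(app, "/")  # Alexa POSTs skill requests to this route


@ask.launch
def launch():
    return question("Welcome to betting odds. Which club are you interested in?")


@ask.intent("GetBettingOdds")
def get_betting_odds(club):
    # flask-ask passes the "club" slot value in as the matching argument
    odds_text = "Fetching odds for {}.".format(club)  # call the betting API here
    return statement(odds_text)


if __name__ == "__main__":
    app.run(debug=True)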

To implement Betting Odds specifically, we made use of Bhawlone, an external betting API vendor. Here’s how:

Step 1 — Defining the Interaction Model:

This involved defining the Invocation name, Intents, slots and sample utterances that would trigger an intent. Here’s how we achieved ours:

{
  "interactionModel": {
    "languageModel": {
      "invocationName": "betting odds",
      "intents": [
        {
          "name": "AMAZON.FallbackIntent",
          "samples": []
        },
        {
          "name": "AMAZON.CancelIntent",
          "samples": []
        },
        {
          "name": "AMAZON.HelpIntent",
          "samples": []
        },
        {
          "name": "AMAZON.StopIntent",
          "samples": []
        },
        {
          "name": "AMAZON.NavigateHomeIntent",
          "samples": []
        },
        {
          "name": "GetBettingOdds",
          "slots": [
            {
              "name": "club",
              "type": "EPL_CLUBS",
              "samples": [
                "{club}"
              ]
            }
          ],
          "samples": [
            "betting odds for {club}",
            "{club}"
          ]
        }
      ],
      "types": [
        {
          "name": "EPL_CLUBS",
          "values": [
            {
              "name": {
                "value": "arsenal",
                "synonyms": [
                  "gunners"
                ]
              }
            },
            {
              "name": {
                "value": "Watford"
              }
            }
          ]
        }
      ]
    }
  }
}

View the complete file here.

Note: Some intents are created by default and come with every skill. They can be identified by the “AMAZON” prefix.

Step 2 — Defining handlers for each intent:

For each intent defined in the interaction model, a handler also had to be defined; the handler runs the logic tied to an intent whenever a user says one of that intent’s sample utterances. Here’s how:

import logging

from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

import data  # project module holding the skill's response strings

logger = logging.getLogger(__name__)


class GetBetOddsHandler(AbstractRequestHandler):
    """Handler for getting betting odds."""

    def can_handle(self, handler_input):
        # type: (HandlerInput) -> bool
        return is_intent_name("GetBettingOdds")(handler_input)

    def handle(self, handler_input):
        # type: (HandlerInput) -> Response
        logger.info("In GetBetOddsHandler")
        attr = handler_input.attributes_manager.session_attributes
        slots = handler_input.request_envelope.request.intent.slots
        logger.info("Slots are: {0}".format(slots))

        response_builder = handler_input.response_builder

        slot = slots.get("club")
        if slot.value:
            club_name = slot.value.lower()
            logger.info("Club name input: {0}".format(club_name))
            res = self.odds(club_name)  # helper that calls the betting API (see Step 3)
            handler_input.attributes_manager.session_attributes = attr
            response_builder.speak(res).ask(data.ASK_AGAIN)
        else:
            logger.info("User requested a non-EPL club")
            response_builder.speak(data.NON_EPL_CLUB).ask(data.ASK_SPECIFIC_CLUBS)
        return response_builder.response

Step 3 — Integrating APIs of the betting sites.

The external API we used, Bhawlone, came with great API documentation and was easy to integrate with. Our intents were made to call its APIs to:

1. get betting odds for EPL matches

2. gather data from the odds

3. have Alexa read out this information.
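The self.odds(club_name) call in the handler above is where this integration happens. Below is a rough, hypothetical sketch of what such a helper could look like; the actual Bhawlone endpoint, query parameters and response fields aren’t shown in this post, so the URL and field names are placeholders.

import requests

# Placeholder endpoint; the real Bhawlone API details are not shown in this post.
ODDS_URL = "https://api.bhawlone.example/v1/odds"


def get_epl_odds(club_name):
    """Fetch odds for the club's next EPL fixture and build a speakable reply."""
    response = requests.get(
        ODDS_URL, params={"league": "EPL", "club": club_name}, timeout=5
    )
    response.raise_for_status()
    fixture = response.json()[0]  # assume the first entry is the next fixture
    return "The odds for {} against {} are {} to {}.".format(
        club_name, fixture["opponent"], fixture["club_odds"], fixture["opponent_odds"]
    )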

As above, we defined the interaction model and wrote the handlers for each intent; once the skill was created, its endpoint was configured on the Amazon Developer Console to point to where the skill’s code was hosted.

Here’s a link to the project’s codebase on GitHub; feel free to play around.

P.S.: The next step would typically have been to let Alexa place bets on matches, but we were unable to complete this bit due to time constraints.

Step 4 — Testing

After integrating the APIs, the next step was to test the skills. A lot of testing was carried out on the Developer Console with the Alexa Simulator. There are two ways to test:

1. Text based: A text command is typed in and Alexa responds based on what your skill tells her to do.

2. Voice simulator: As the name implies, a command is spoken and Alexa gives a verbal response.

Step 5 — Setting up the Echo to use the custom skills

To do this:

  1. Getting the APK for the Amazon Alexa mobile app was a tough one to crack due to location restrictions, but we found a way around it, downloaded it to our testing device (a mobile phone) and registered with the email address used on the Developer Console.
  2. Once signed in, all skills already set up on the Developer Console account can be seen, and they are installed on the Echo device automatically once the mobile app is set up.

Step 6 — Demo!

Try your skill on your Echo device with your invocation name. That’s it!

As said above, the goal of the hackathon was to build innovative payment solutions by integrating any of our products with Amazon’s Echo voice assistant, Alexa. To that end, some amazing ideas also showed up from other teams, including:

  • A skill for monitoring internal services at Interswitch and their performance
  • A skill for managing spending habits
  • A skill named Red Robot which helps women manage their cycles.

Like every other hackathon, there were lessons to be learnt:

  1. Always demo something that works: The demo is THE MOST IMPORTANT part of any hackathon; no matter how amazing your idea is, if your demo isn’t polished, your chances of winning are low. As the winning team, we had a few things working perfectly, but had to mock and hard-code some things just for the demo. The demo is the first impression you give the judges, so aim for the best. If you have an unstable or unreliable solution, record a video of it while it’s stable and demo that; it could save your presentation.
  2. Stand out: Creating payment solutions may sometimes sound unexciting, and in events like this there is a real chance that most teams will build similar solutions. With that in mind, it’s strongly advised to have a wow factor that makes your product stand out. For us, it was taking the Voice Banking solution live. The energy in the room when one of the judges was able to use the service for themselves and verify that it was working could not be matched. Finding a wow factor often takes a lot of brainstorming, but it frequently ends up being a small feature that’s easy to implement.
  3. Teamwork and planning: Each team was amazing and practiced a lot of teamwork; everyone was quick to do their research, identify what needed to be done, divide the work accordingly and work tirelessly overnight. When anyone hit a challenge, the team quickly came together to resolve it, willing to do whatever it took to achieve the goals set out.
  4. Easy-to-use and readily available APIs: The importance of well-documented, easy-to-use APIs cannot be overemphasized; this was probably the greatest challenge any team faced. APIs need to be easy to understand and use. It’s really that simple.
  5. Endless Possibilities with building for virtual assistants!

And there were prizes. Yes, you guessed right: each member of the winning team went home with a well-deserved Amazon Echo device.

Team OmniCodent — Winning Team and Software Engineers at Interswitch
