Create your own personal assistant in Azure

Mihaita Tinta
ING Hubs Romania
8 min read · Jun 11, 2021

Adding an AI capability to your application can now be a simple process. I tried to build a simple chat bot with Azure, and it worked with little effort. No previous AI knowledge is required. It can do small talk and answer some weather questions, all in 120 languages.

You can set up your free account here. We will create the bot using the Bot Framework together with the QnA Maker cognitive service. There are different options for the programming language you want to use; we will go with Spring Boot and Java, based on this sample project. The source code is available here. I also followed this tutorial to better understand what steps to take.

We will use a new resource group so we can easily delete everything at the end.

The first step is to create a QnA Maker resource. It will hold the knowledge base for our chat bot: all the questions and answers for our domain will be based on the information we import here.

After everything is deployed successfully, we can go to QnA Maker.

There are multiple ways to create a knowledge base. The fastest approach for me was to import a file containing questions and answers (the supported formats are listed here). After we train our knowledge base, we can test it.
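For illustration, one of the supported formats is a simple tab-separated file with a question column and an answer column. The content below is made up for this example; check the supported-formats page for the exact layout your QnA Maker version expects:

```tsv
Question	Answer
What is your name?	I am mihBot, your personal assistant.
What can you do?	I can do small talk and answer weather questions.
```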

There are some silly questions we can already ask.

If everything looks OK, we can publish our knowledge base and create the Spring Boot bot that will answer our questions.

Create your Java bot by following all the steps. If you encounter issues, don’t forget to add the debug flag to the command. In my case, I had to upgrade my subscription to be able to add a new deployment, due to the error below:

This region has quota of 0 PremiumV2 instances for your subscription. Try selecting different region or SKU.

Fortunately, I was still able to use the free credit :). Finally, we can see the bot in action:

There are multiple ways to interact with the bot. The simplest approach is to use the Test Web Chat channel. There are many other options, like Teams, Slack, Email, Facebook, Alexa etc., which you can configure to reach your audience. You can also generate an iframe to embed the chat bot in your existing website:

<iframe src='https://webchat.botframework.com/embed/mihBotApp?s=SECRET_KEY_HERE'
        style='min-width: 400px; width: 100%; min-height: 500px;'></iframe>

Depending on the bot name, if we access a link similar to https://<your-bot-name-here>.azurewebsites.net/ we will get a sample chat bot implementation that answers questions on the /api/messages endpoint.

As I said, we will use the sample Spring Boot Java code. To test our local instance, we also need to run the Bot Framework Emulator. After we add our keys to the application.properties file, we are good to go.
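As a sketch, the keys in application.properties look roughly like this. The property names follow the conventions in the Bot Framework Java samples and may differ between sample versions; all values are placeholders:

```properties
# Bot registration credentials from the Azure portal
MicrosoftAppId=<your-bot-app-id>
MicrosoftAppPassword=<your-bot-app-password>

# QnA Maker knowledge base connection
QnAKnowledgeBaseId=<knowledge-base-id>
QnAEndpointKey=<endpoint-key>
QnAEndpointHost=https://<your-qna-resource>.azurewebsites.net/qnamaker

# Local port the bot listens on
server.port=3978
```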

Our local bot instance is running on the server.port we configured. On the right-hand side we can see (and inspect) the API calls initiating the conversation with our bot.

All the interaction is possible thanks to the BotFrameworkHttpAdapter bean, which enables the interaction via the /api/messages endpoint (see com.microsoft.bot.integration.spring.BotController).

Our ActivityHandler processes new messages from our users, and we can easily delegate each question to a QnAMaker instance connected to our knowledge base:

// The actual call to the QnA Maker service.
return qnaMaker.getAnswers(turnContext, options)
    .thenCompose(response -> {
        if (response != null && response.length > 0) {
            return turnContext.sendActivity(MessageFactory.text(response[0].getAnswer()))
                .thenApply(sendResult -> null);
        } else {
            return turnContext.sendActivity(MessageFactory.text("No QnA Maker answers were found."))
                .thenApply(sendResult -> null);
        }
    });

This scenario is very simple: we may or may not get an answer (with a score) that we can use in our reply to the user’s message.
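The score check can be sketched without the SDK. The `Answer` class and the threshold value below are illustrative stand-ins for a QnA Maker result, not part of the Bot Framework API:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public class AnswerPicker {

    // Simplified stand-in for a QnA Maker result: answer text plus confidence score.
    static class Answer {
        final String text;
        final double score;
        Answer(String text, double score) { this.text = text; this.score = score; }
    }

    // Return the highest-scoring answer, but only if it clears the threshold.
    static Optional<Answer> best(List<Answer> candidates, double threshold) {
        return candidates.stream()
                .filter(a -> a.score >= threshold)
                .max(Comparator.comparingDouble(a -> a.score));
    }

    public static void main(String[] args) {
        List<Answer> answers = List.of(
                new Answer("It is sunny.", 0.82),
                new Answer("I do not know.", 0.10));
        // Falls back to the "no answers" message when nothing clears the threshold.
        System.out.println(best(answers, 0.5).map(a -> a.text)
                .orElse("No QnA Maker answers were found."));
    }
}
```

In the real bot, the same filter would sit right before the `sendActivity` call, replying with the fallback message when the top score is too low.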

With a simple mvn command we can deploy our code to Azure. Don’t forget to run az login beforehand to connect to your account.

mvn azure-webapp:deploy

We can see the application is now up and running.

The next step is to add the language understanding service so our bot can understand some commands. We will use api.weatherapi.com to find out what the weather is like in a specific location.

LUIS (Language Understanding Service)

When we open the page for the first time, we have to connect our Azure account. You can read more about how LUIS works.

We will have to create an app that understands Weather questions.

We can add the prebuilt domain:

This is very useful because it provides real examples of how to organise a model. There are some best practices we have to consider, such as keeping each domain simple.

Based on the user input, we compute the user intent and extract the entities. You can explore some examples below:

After we train the app, we can also test it from the UI with a few clicks. If everything is OK, we can publish it so that our chat bot can use it.

Because we have to keep the domains defined in a LUIS app or a QnA knowledge base simple, we can apply the dispatcher pattern. This means we process the user input, compute a score for each possible user intent, and then let our bot service reply with the best answer.
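The routing step of the dispatcher pattern can be sketched as follows. The intent names mirror the ones used in the dispatch configuration later in this article (`l_weather`, `q_chit-chat`); the service labels are hypothetical:

```java
import java.util.Map;

public class Dispatcher {

    // Pick the top-scoring intent and route to the matching backend service.
    static String route(Map<String, Double> intentScores) {
        String topIntent = intentScores.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse("none");
        switch (topIntent) {
            case "l_weather":
                return "weather-service";   // delegate to the LUIS weather app
            case "q_chit-chat":
                return "qna-service";       // delegate to the QnA knowledge base
            default:
                return "fallback";          // no confident match
        }
    }
}
```

In the real bot the scores come from the dispatch LUIS app, and the routing targets are the QnAMaker and weather handlers rather than strings.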

Please note that Orchestrator is now the recommended Azure solution for intent resolution. We will stick with our approach since it is the only one available in the Java samples. We need to run some commands to create the LUIS dispatch instance:

dispatch init -n dispatch-luis --luisAuthoringKey "key-from-your-luis-profile-page" --luisAuthoringRegion westeurope
dispatch add -t qna -i "qna-id" -n "qna-name" -k "your-qna-key" --intentName q_chit-chat
# add as many domains as you want
dispatch create

We now have a LUIS dispatcher that can route messages.

Our dispatcher has the configuration below. It can answer weather questions and do small talk.

We can also test again and see if the mapping is correct:

We have to add the connection details for our LUIS instance to application.properties
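For reference, the LUIS connection details could look like this. The property names follow the conventions used across the Bot Framework samples and may vary; all values are placeholders:

```properties
LuisAppId=<dispatch-app-id>
LuisAPIKey=<luis-prediction-key>
LuisAPIHostName=westeurope.api.cognitive.microsoft.com
```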

and the LUIS maven dependency:

<dependency>
    <groupId>com.microsoft.bot</groupId>
    <artifactId>bot-ai-luis-v3</artifactId>
    <version>4.13.0</version>
</dependency>

The LuisRecognizer class is used to query the LUIS service using the configuration set in LuisRecognizerOptions.

dispatch = new LuisRecognizer(recognizerOptions);

Depending on the message we get in the turnContext, we can use the result from the LUIS service via the NamedIntentScore, which provides a score for each intent.

For example, if the result belongs to the weather domain, we can extract from the recognizerResult the entities mentioned by the user:

how is the weather tomorrow in Bucharest?

In the code above we pass the location and the time to the weatherService, which returns the temperature. You can check here what else weatherapi.com can return.
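Under the hood, the weather lookup boils down to a simple GET request. A sketch of building the forecast URL, with the `key`, `q` and `days` query parameters as documented by weatherapi.com (the API key is a placeholder):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class WeatherUrl {

    // Build the weatherapi.com forecast endpoint URL for a given location.
    // The location is URL-encoded so multi-word city names are handled.
    static String forecastUrl(String apiKey, String location, int days) {
        return "https://api.weatherapi.com/v1/forecast.json"
                + "?key=" + apiKey
                + "&q=" + URLEncoder.encode(location, StandardCharsets.UTF_8)
                + "&days=" + days;
    }
}
```

The resulting URL can then be fetched with RestTemplate (as in the languages call shown later) and the temperature read from the JSON response.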

You can also notice the CompletableFuture chain we build with each processing step.

To get the answers from the QnA service, we just have to pass the turnContext containing the question. We can use the first answer in our reply. This is also a good place to check whether the score indicates the answer is good enough.

So far we can see the conversation is working: small talk and weather questions.

With the Translator resource we can enable our bot to speak multiple languages. We have to create one:

From the samples we can see how to enable the translation middleware. Messages going in or out get processed by this layer. We can also add conditions to disable the translation when needed.
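Such a condition can be as simple as a guard that skips the Translator call when the user already speaks the bot’s language. This is a minimal sketch; the bot default language and the method name are illustrative, and in the real middleware the user language comes from the Translator detection API:

```java
public class TranslationGate {

    // Illustrative default: the language the knowledge base and LUIS app are written in.
    static final String BOT_LANGUAGE = "en";

    // Translate only when the detected user language differs from the bot's language.
    static boolean shouldTranslate(String userLanguage) {
        return userLanguage != null && !userLanguage.equalsIgnoreCase(BOT_LANGUAGE);
    }
}
```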

The integration is not very complex. We just need the translator key in the calls we make towards the Translator API.

If we want to add all the available languages to our bot, we can retrieve them with a similar call:

// Lang and LanguageResponse are simple DTOs defined in the sample project.
new RestTemplate()
    .getForEntity("https://api.cognitive.microsofttranslator.com/languages?api-version=3.0&scope=translation",
        LanguageResponse.class)
    .getBody()
    .translation
    .entrySet()
    .stream()
    .map(e -> new Lang(e.getKey(), e.getValue().get("name")))
    .collect(Collectors.toList());

We can also return complex objects in the conversation, like cards with possible answers.

After we choose Spanish, we can see the bot understands what we ask.

This is an important aspect, because we didn’t define a single Spanish word in our knowledge base or LUIS app. With minimal effort we can interact with our users in 120 languages.

While this is fun, you should also consider the costs involved. Below you can see some charges taken from the free credit during the few days I played with this.

You can also add some cost alerts and budgets.

We didn’t mention anything about the bot state or how to build a more complex conversation. Ideally, the bot should guide the user and solve a problem in a minimum number of steps. One could also consider persisting unanswered questions to improve the knowledge base over time.

The source code is here.


A new kind of plumber working with Java, Spring, Kubernetes. Follow me to receive practical coding examples.