Make Watson a member of your team by customizing your cognitive app

joe russo
Sep 5, 2018 · 9 min read

Welcome back!

In Part 1, we introduced you to the basics of developing apps using Watson Work Services and walked you through an example app that incorporates Watson technology. In Part 2, we'll explore the capabilities of Watson in more detail, and you'll get a chance to apply what you've learned to your own AI-infused app.

Imagine what it would be like to have the Watson supercomputer as a member of your team. That's the easiest way to think about adding Watson to your app: Watson functions as an active participant in, and contributor to, the conversation.

A user posts a message to the conversation, and all the users, including Watson, can see it. Watson annotates that message, a bit like making personal notes in a notebook, which allows Watson to keep track of it.

Now, if you want Watson to perform specific functions, such as telling you the local weather or fetching the latest news, you can train Watson using Watson Assistant.


Using Watson Assistant you can create Intents, Entities and Dialogs.

Let's start with the basics. Watson Assistant lets you build apps that understand natural language and use machine learning to interact with and respond to users. A Watson Assistant Workspace (not to be confused with Watson Workspace) is where all the natural language processing actually occurs; it's basically a container for all the artifacts that define the conversation flow for your app.

Next, to understand how you can use Watson Assistant to give your app superpowers, you need to understand intents, entities, and dialogs and the roles they play.

An intent represents the purpose of a user’s input. You can think of intents as the actions your users might want to perform with your application.

Entities represent a class of object or a data type that is relevant to a user's purpose. By recognizing the entities mentioned in the user's input, your app can respond with the specific actions needed to fulfill an intent.

A dialog defines the flow of your conversation in the form of a logic tree. Each node of the tree has a condition that triggers it, based on user input.

You can learn more, and get started creating your first Watson Assistant Workspace, with the tutorial Getting Started with Watson Assistant.


Let’s create!

Now we’re ready to build a simple app that can understand conversation and provide information about the weather using Watson Assistant.

First, you’ll need to create a Watson Assistant workspace.

  1. Go to https://assistant-us-south.watsonplatform.net/instances.
  2. Click Workspaces.
  3. Name your workspace.
  4. Click Create.

Now, let’s create intents. Remember, these are the actions users want to take in your app.

In order to train Watson, you need to add some actual statements users are likely to say to initiate the action.
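To give a flavor of what this training data looks like, here is a rough sketch of intents and example statements as they might appear in a workspace export. The intent names and phrasings below are our own illustrations, not part of the original tutorial; in practice you create these through the Watson Assistant tooling rather than by hand.

```json
{
  "intents": [
    {
      "intent": "weather",
      "examples": [
        { "text": "is it cold?" },
        { "text": "what's the temperature outside?" }
      ]
    },
    {
      "intent": "forecast",
      "examples": [
        { "text": "will it rain this weekend?" }
      ]
    }
  ]
}
```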

Now let's create an app to test what we've done so far. This will load your space with the custom Watson assistant, and we can see how it affects Moments.

  1. Go to developer.watsonwork.ibm.com/apps.
  2. Click Create new app.
  3. Add the app name; here we're using WeatherBotJ.
  4. Click Create.

That’s it! You’ve created your app. Make sure to copy the secret somewhere that you can reference later.

Now, let’s add in the Watson Assistant you just created.

  1. Click on Make it Cognitive.
  2. Enter your Watson Assistant ID, Watson Assistant username, and Watson Assistant password.

To get this information:

  1. Go to your Bluemix dashboard (https://console.bluemix.net/dashboard/apps) and click on the Watson Assistant service you just created.
  2. In the section marked Credentials, click Show to see and copy your username and password.
  3. Click Launch Tool and then click the drop-down overflow menu.
  4. Choose View details.
  5. Copy your Workspace ID.

Once you’ve copied all the information into the appropriate fields in Make it Cognitive, click Connect. This will add Watson Assistant to your App.

Let’s test it out.

  1. Go to a space, or create a new one, in Watson Workspace.
  2. Tap the space name or settings icon to access Space settings.
  3. Click Apps and look for the app you just created. Any apps you create will automatically show up in your list of All Apps for all your spaces.
  4. Add your app to your space.

In the conversation, type “Is it cold?” See how your app responds.

Now click the moments icon in the top right to view the moments for the space.

Look at the most recent moment. See how Watson identified this text as a question?

Congratulations!

You have connected a custom Watson Assistant that added a message focus to your question about weather, and the Watson technology built into Watson Work Services automatically identified the message as a question. Just think: you did all of this without writing a single line of code! But there is so much more you can do; this is where the fun begins.


See what Watson does under the covers

First, as the diagram above indicated, when your custom Watson Assistant detects something it knows about, it adds annotations to the message. Let's look at what's actually happening here.

Using the GraphQL Tool that is part of the developer experience, I can get the space id for the space I want to use.

I used this query to get a list of the first 5 spaces, with their titles and ids.

query getSpaces {
  spaces(first: 5) {
    items {
      title
      id
    }
  }
}

I copy the space id of the space I want to use. Now, I can get details about the space, such as a message. To do this, we need the message id.

query getSpace {
  space(id: "enter the space id") {
    title
    conversation {
      messages {
        items {
          content
          id
        }
      }
    }
  }
}

Now we’ll take the message id and get the detail around this message, specifically the annotations.

query getMessage {
  message(id: "enter message id") {
    id
    content
    contentType
    annotations
  }
}
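One quirk worth knowing: in the query result, each entry in the annotations field arrives as a serialized JSON string, so it has to be parsed before you can inspect it. A small helper, as a sketch (it assumes each annotation carries a "type" field whose value is "message-focus" for the annotation we care about; verify against your own query results):

```javascript
// Sketch: pull the message-focus annotation out of a getMessage result.
// Each annotation is assumed to be a JSON string with a "type" field.
function findMessageFocus(message) {
  const annotations = message.annotations || [];
  return annotations
    .map(a => (typeof a === 'string' ? JSON.parse(a) : a))
    .find(a => a.type === 'message-focus') || null;
}
```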

If you look at the result, you’ll see a section for message-focus. This is the annotation we care about, in particular:

"lens": "weather",
"phrase": "is it cold?",
"confidence": 1.0,

This means the lens is "weather", which is what we saw in the Moments summary above, and the match has a confidence score of 100%.

To make this work, we first had to know that someone is likely to be curious about the weather, and we had to decide whether to display the current weather or the near-term forecast. Then we needed to listen for when an annotation is added to a message. When it was, we extracted the confidence score and returned either the current weather or the forecast.

Watson Work Services lets you listen for events in a space. In this case, we need to listen for when an annotation is added and, if that annotation lines up with what we're looking for, fire our app's function (or not).

We could do this by filtering and pulling those elements to make a decision about the weather; however, we can also go one step further.

In Watson Assistant, we can use the Dialog to create logic and build in a trigger. In other words, when it detects that a user has asked for the weather or for a forecast, Watson Work Services automatically annotates that message with a message-focus. Your app can then listen for the annotation added AND the action taken, and that action-taken item will indicate the action you set up in Watson Assistant, thus firing your action.
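The filtering described above can be sketched as a single decision function. The event field names used here (type, annotationType, annotationPayload) are assumptions based on the Watson Work Services webhook format, so check them against your own webhook payloads before relying on them.

```javascript
// Sketch: decide whether an incoming webhook event should fire our app,
// and if so, which action it maps to.
function actionForEvent(event) {
  if (event.type !== 'message-annotation-added') return null;
  if (event.annotationType !== 'message-focus') return null;
  // The annotation payload arrives as a JSON string inside the event.
  const focus = JSON.parse(event.annotationPayload);
  if (typeof focus.confidence === 'number' && focus.confidence < 0.9) {
    return null; // ignore low-confidence matches
  }
  if (focus.lens === 'weather') return 'GET_WEATHER';
  if (focus.lens === 'forecast') return 'GET_FORECAST';
  return null;
}
```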


Creating Dialogs for Triggers

Let's go ahead and create some dialogs. We'll create a node for Weather, which gets the current temperature, and a node for Forecast. When making your dialogs, you need to add the intents, then open the JSON editor and paste in your custom actions. Here's how to do it.

Create two conditional blocks, one for if Weather and one for if Forecast.

Click to customize Weather.

Click the overflow menu (three vertical dots)…

…and then click Open JSON editor. Once opened, add this JSON:

{
  "output": {
    "text": {
      "values": [],
      "selection_policy": "sequential"
    },
    "actions": [
      "GET_WEATHER"
    ]
  }
}

This will provide an action which Watson Work Services will handle in the case of detecting that someone has said something that seems to match the Weather intent. We’ll see how that works in a moment. While we’re here, let’s also add an action for Forecast.

{
  "output": {
    "text": {
      "values": [],
      "selection_policy": "sequential"
    },
    "actions": [
      "GET_FORECAST"
    ]
  }
}

Now that we’ve added the dialog logic and action JSON, let’s see what this does.

Go back to your space, and ask again about weather — you’ll see now that Watson Work Services automatically underlined the text, indicating it recognizes actionable language and that there is an action that the user can take.

Let’s break this down:

Using the same process as above, we can get the message id and pull the annotations.

Now the key elements are:

"lens": "forecast",
"phrase": "will it rain this weekend?",
"actions": ["GET_FORECAST"],

You can key your code off of the value your app gets back in the actions element.
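A minimal sketch of that dispatch is below. The getCurrentWeather and getForecast helpers are hypothetical stand-ins; in a real app they would call a weather API of your choice.

```javascript
// Hypothetical helpers; in a real app these would call a weather service.
function getCurrentWeather() { return 'It is 40°F right now.'; }
function getForecast() { return 'Rain is likely this weekend.'; }

// Key the app's behavior off the actions array in the message-focus annotation.
function dispatch(focus) {
  const actions = focus.actions || [];
  if (actions.includes('GET_FORECAST')) return getForecast();
  if (actions.includes('GET_WEATHER')) return getCurrentWeather();
  return null;
}
```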


Stay tuned for more

This is just the tip of the iceberg. Next time, we'll describe another annotation, actionSelected, and write the Node.js code to handle the action, creating a custom experience for end users. In the meantime, visit https://developer.workservices.ibm.com to learn more about developing apps with Watson.

IBM Watson Workspace

Everything you need to do your best work
