API.AI in Swift 3

Jayven N
iOS App Development
7 min read · Feb 8, 2017

Build a bot like a boss.

“I have a voice.” — Bot

Purpose

I hope that by the end of this post, you will have learned how to implement API.AI into your own project in Swift 3 like a boss.

This post is a compilation of information that I’ve gathered over time and thought would be helpful to share. It’s a post for people who are interested in building their own bot or have found insufficient or outdated information on the internet.

If you have read my previous blog post, you know how I feel about the lack of relevant support for API.AI: outdated resources and a whole lot of digging. This is a straight-line guide that you can follow.

Prerequisite

Before going forward, a decent understanding of API.AI is highly recommended. If this is your first time hearing about API.AI, then check out Introduction to API.AI. Other than that, let’s get started.

Project Idea

After writing the header above, I decided to take 15 minutes to come up with a simple and meaningful AI project idea.

I have decided that we are going to build a traffic lights bot app. I want a simple and applicable project.

We’ll have a bot that understands us and changes the color on screen according to our input. We’ll also have a bot that speaks to us.

Setting Up API.AI

Once again, if you have not checked out the API.AI guide, I highly recommend that you do before going further. Here.

Let’s begin by opening up API.AI. Then sign in and we should be at our main console.

Now create an agent called “Traffic-Master.” Create an intent called “change.color.”

Create a “colors” entity with red, yellow, and green entries. Also add synonyms like stop, slow down, and go. We may want to refer to these colors with verbs like “show me the stop color.”
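For example, the entries and synonyms could map like this:

red: stop
yellow: slow down
green: go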

Now go to the “change.color” intent we created a moment ago. Type in user expressions for your intent.

Example:
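A few sample expressions (use any natural phrasings you expect from users):

Show me red
Show me the stop color
It’s time to go
Slow down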

Cool, now let’s scroll down to the “Response” section. Add these two lines under text response:

Bam! I’m $colors!
Roses are red. Violets are blue. I just changed to the color $colors just for you.

We just taught our bot how to respond in text to the change color intent. We gave it two response variations so our bot isn’t so bland and behaves more like a human. The more variations you add, the more ways your bot can respond to you. It’s pretty cool.

Testing On API.AI

Now we’re going to test out our bot. Look to the right and you should see a space for you to test. Press the microphone button and say “show me red.” Or just type it in.

Then try “it’s time to go.” Remember, we set go as a synonym for green. Also notice the two different responses from our bot.

Cool, our Traffic Master understands us!

Now put “change.color” as our action and press save. The purpose of this will become clear when we code up our app. This acts as an identifier.

Setting Up Xcode

Go ahead and create a new Single View Application project in Xcode. I am going to name my project “Traffic Lights AI.”

Now create a new podfile in your project by opening up Terminal. In your Terminal, go to the directory of your project and type in pod init. Next, open up the podfile and add this pod:

pod 'ApiAI'

The podfile should look something like this:

# Uncomment the next line to define a global platform for your project
# platform :ios, '9.0'

target 'Traffic Lights AI' do
  # Comment the next line if you're not using Swift and don't want to use dynamic frameworks
  use_frameworks!

  # Pods for Traffic Lights AI
  pod 'ApiAI'
end

If it does look like that, save the file. Go back to your terminal and run this command:

pod install
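Once the installation finishes, open the .xcworkspace file that CocoaPods generated (not the .xcodeproj) so Xcode can see the pod.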

Get Going With Xcode

Storyboard

Now we are going to need our View Controller to look like the image below. I have a UITextField, a UIButton, and a UIView. Drag them into your storyboard and give them constraints. As long as they look relatively the same, we’re good.

Note: UIView’s background color is white.

Now rename the button’s title to Send and connect the UIKit objects from your storyboard to your View Controller (a code sketch of these connections follows the list):

UITextField — outlet — tfInput

UIView — outlet — viewColor

UIButton — action — btnSendDidTouch
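In code, those connections should look roughly like this (the names are the ones listed above):

@IBOutlet weak var tfInput: UITextField!
@IBOutlet weak var viewColor: UIView!

@IBAction func btnSendDidTouch(_ sender: Any) {
}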

Cool, now that we’ve finished the starter plates, it’s time for the main course of our meal: the code.

App Delegate

Import ApiAI at the very top of your AppDelegate.swift file. We need to configure our API.AI:

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
    let configuration: AIConfiguration = AIDefaultConfiguration()
    configuration.clientAccessToken = "YOUR_CLIENT_ACCESS_TOKEN"
    let apiai = ApiAI.shared()
    apiai?.configuration = configuration
    return true
}

This is how API.AI identifies which agent we’re referring to. To get the client access token, go to your main console and press the settings button right next to your agent’s name.

Then scroll down on the page and you should see your client access token.

View Controller

Ok, let’s go to the ViewController.swift file. Import ApiAI and AVFoundation. We need AVFoundation because we want our device/simulator to speak the response text aloud.
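The top of the file should now look something like this:

import UIKit
import ApiAI
import AVFoundation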

Declare and initialize the following constant at the top of your class where your outlets are:

//For our device to speak
let speechSynthesizer = AVSpeechSynthesizer()

Then add these two functions into your View Controller class:

//Device speak
func speak(text: String) {
    let speechUtterance = AVSpeechUtterance(string: text)
    speechSynthesizer.speak(speechUtterance)
}

//Animation with color change on UIView
func changeViewColor(color: UIColor) {
    viewColor.alpha = 0
    viewColor.backgroundColor = color
    UIView.animate(withDuration: 1, animations: {
        self.viewColor.alpha = 1
    }, completion: nil)
}
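If you want a quick sanity check of the speech side, you can call speak from viewDidLoad, for example:

override func viewDidLoad() {
    super.viewDidLoad()
    speak(text: "I have a voice.")
}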

Inside of btnSendDidTouch, add this:

let request = ApiAI.shared().textRequest()

if let text = self.tfInput.text, text != "" {
    request?.query = text
} else {
    return
}

Here we initialize a request, then check whether the text field is empty. If it is, we just return; otherwise we set the request’s query to whatever was typed.

Now, right below that code, add the completion handler for the request. We are more aware of what’s going on when we type it out ourselves anyway, so type away.
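Here is a sketch of what the handler might look like. It assumes the mapped response types from the ApiAI pod (AIResponse, AIResponseParameter) and the “colors” parameter name we gave our entity, so adjust if your setup differs:

request?.setMappedCompletionBlockSuccess({ (request, response) in
    guard let response = response as? AIResponse else { return }
    //Check the action we set on the intent in the API.AI console
    if response.result.action == "change.color" {
        //Safely unwrap the "colors" parameter from the result
        if let parameters = response.result.parameters as? [String: AIResponseParameter],
            let color = parameters["colors"]?.stringValue {
            switch color {
            case "red":
                self.changeViewColor(color: .red)
            case "yellow":
                self.changeViewColor(color: .yellow)
            case "green":
                self.changeViewColor(color: .green)
            default:
                self.changeViewColor(color: .black)
            }
        }
    } else {
        //Black means the bot did not understand us
        self.changeViewColor(color: .black)
    }
    //Safely unwrap the text response and speak it
    if let textResponse = response.result.fulfillment.speech {
        self.speak(text: textResponse)
    }
}, failure: { (request, error) in
    //Print the error for debugging
    print(error?.localizedDescription ?? "Unknown error")
})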

We made a call to API.AI. Once API.AI has responded, we check the response action. If the action is “change.color,” we safely unwrap the result parameters and change the view’s color accordingly. If not, we just change the view’s color to black; black lets us visually see that the AI did not understand us.

After that, we safely unwrap the text response and have it spoken aloud. Then, if for whatever reason the call to API.AI is unsuccessful, we print out the error for debugging.

The last two lines and their explanation:

//Send the request to API.AI
ApiAI.shared().enqueue(request)
//Empty the text field after the send button is tapped
tfInput.text = ""

Just like that. Done.

We have ourselves a bot that understands us, speaks to us, and changes colors for us.

Test It Like It’s Hot

Max out the volume. Make sure to hook up some quality speakers for the full AI experience. You’ll need it.

Go ahead. Type in your commands for the bot and press send. If the bot understands you, the UIView’s color will fade out and fade in to red, yellow, or green. Otherwise, it will just change to black, and you can add new user expressions to the AI’s knowledge pool. That’s it. I’m heading to bed now.

Last Remarks

I hope you have enjoyed and learned something valuable from my article. If you have then let me know by hitting that ❤ button and follow me on Medium. Also, share this article so that your circle can make some knowledge gains too.

Lastly if you have any comment, question, or recommendation, feel free to drop them below. Tell me what you want to learn next!

Feel free to check out my recommended articles:

Master Functions in Swift 3

Introduction to API.AI
