Advanced Conversational AI with Amazon Connect and Cognigy

Peter Nann
Cognigy.AI
Apr 14, 2020

Amazon Connect is a cloud contact centre that can be up and running in minutes, on both telephone and chat channels. It has a pay-as-you-go pricing model that many businesses are starting to find attractive.

A powerful feature of Connect is that it supports Amazon Lex out-of-the-box for natural language understanding (NLU), covering both spoken input on the telephone and typed input in chat. With Amazon Connect so easy to access, the power of Conversational AI can suddenly be employed in contact centres of all shapes and sizes.

Or can it?

NLU is not enough

You see, NLU is only one part of building a high quality conversational experience. It is only the ‘understanding’ part, and much more is needed to get the user what they need quickly and reliably.

NLU just captures user input. To use a website analogy, working NLU only means the user has a working mouse and keyboard; there is a lot left to design and build for a compelling experience. Conversational AI solutions also require us to build the complex and subtle processes of handling a multi-turn, bi-directional conversation: formulating responses that are natural and helpful to the user at all times, handling problems gracefully, and drawing on multiple external data sources to bring relevant data to bear on the conversation.

At Cognigy we recently completed an integration with Amazon Connect, bringing together the power of these two platforms. Connect brings a wealth of Contact Centre features that are standard or can be employed with a few clicks, such as call recording, and full chat session handover to agents. Cognigy.AI brings a powerful, low-code tool-set for building and maintaining advanced automated conversations. You can see a demonstration of the Cognigy.AI platform here.

Standard Amazon Connect / Lex tooling

While Connect’s in-built ‘Contact Flow’ tool can be used to create simple conversational flows, it uses a basic flowchart paradigm with many limitations at this time. Alternatively, the Lex framework can be used on its own to create complex conversations, but that typically requires significant amounts of back-end code development which often becomes opaque and hard to maintain over time. Sometimes a mixture of the two approaches is used, requiring two skill-sets and two processes to maintain.

Connect & Lex logic tooling — A basic flow tool, and/or back-end Lambda code

Cognigy.AI integrated with Amazon Connect

The Cognigy.AI platform is a perfect complement to the NLU facilitated by Lex. It enables the creation and deployment of advanced, enterprise-class AI-driven conversations which can be very easily integrated with existing back-end systems, such as CRM, ERP or RPA. It solves the limitations of Lex & Connect, with an enterprise-grade, graphical, low-code tool for complex conversations, with many advanced features.

The Cognigy.AI Flow editor — Many advanced features

Moreover, with just a few clicks, a conversation built in Cognigy.AI can be deployed to over 15 other channels, such as Alexa, Facebook, SMS or Apple Business Chat. Your organisation can share skill-sets and prior work across multiple conversational projects, reaching a level of organisational maturity not often seen in this space to date.

Our integration with Amazon Connect utilises standard features of Lex, which is itself a standard feature of Connect. By using Endpoint Transformers and NLU Transformers available in the Cognigy.AI platform, no internal changes were needed, which is an excellent demonstration of the platform’s flexibility.

Full integration code is available

A GitHub Gist is available with all the code needed for this integration. To complete a running test you will need access to a Cognigy.AI platform version 3.6 or greater, your own Amazon Connect instance, a Lex bot to test, and an AWS Lambda function you’ll create.

But please read the full post before you jump into the code! The easiest order of deployment is described near the end of this post.

The integration with Connect

The following graphic shows the overall architecture.

Amazon Connect & Cognigy.AI integration architecture

As illustrated above, a small AWS Lambda function is used, since Lambda is the only back-end integration mechanism supported by Lex. This Lambda is essentially a ‘shim’ of just a few lines of code, which translates the Lambda event from Lex into a request to a Cognigy.AI REST Endpoint:

The crux of the Lambda function ‘shim’

There is a little more to it than the few lines above, which you can see in the Gist that accompanies this post, but the code shown illustrates the core behaviour of the Lambda function.

Cognigy.AI’s Endpoint Transformers are used on the REST Endpoint to translate the NLU result from Lex format into the expected Cognigy REST format.

An NLU Transformer (a new feature in Cognigy.AI v3.6) is also used to achieve some increased convenience with NLU handling in the Cognigy.AI Flows. More on this below.

NLU — We have options

Both Lex and Cognigy.AI can carry out NLU. For this blog post we decided to use Lex for NLU processing. For an organisation invested in the Amazon Connect ecosystem, we believe this will be a common choice. This means that Intents and Slots will be extracted by a fully-formed Lex NLU model, before being passed into the Cognigy.AI REST Endpoint. The raw text string of what the user typed or spoke is also included.
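For orientation, a Lex (V1) fulfilment Lambda event has roughly the following shape; the field values here are invented for illustration:

```javascript
// Illustrative Lex V1 Lambda event (values are made up; the shape follows
// the Lex developer guide for Lambda input events)
const sampleLexEvent = {
  messageVersion: '1.0',
  invocationSource: 'FulfillmentCodeHook',
  userId: 'connect-session-abc123',
  sessionAttributes: {
    CustomerNumber: '+61400000000', // passed through from Connect
    ContactId: 'uuid-of-this-call',
  },
  bot: { name: 'LexTest', alias: '$LATEST', version: '$LATEST' },
  outputDialogMode: 'Voice',
  currentIntent: {
    name: 'CheckBalance',              // the Lex-detected intent
    slots: { AccountType: 'savings' }, // Lex-detected slots
    confirmationStatus: 'None',
  },
  inputTranscript: 'what is my savings balance', // raw text the user typed or spoke
};
```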

Another option for NLU would be to use the raw text string supplied from Lex, and process that within Cognigy.AI to derive intents and slots. This could utilise the in-built Cognigy NLU, or another NLU engine available in the platform. Contact us at support@cognigy.com for more information on this option.

Passing Lex NLU results into Cognigy

The Lambda code snippet above shows that we pass the full Lambda event into Cognigy.AI, which is not expecting such a format.

In fact, Cognigy.AI does not typically expect NLU information to come in via a channel Endpoint.

We solve the format problem with an Input Transformer configured within a REST Endpoint, which simply takes the Lex Lambda event and processes it into data structures that we then pass into our conversational Flows.

The core of the Input Transformer function is quite simple:

Cognigy Input Transformer

See the Gist for full details.

The ‘data’ property is a powerful feature used across Cognigy.AI to attach arbitrary data for many and varied uses.

The critical part is that the full Lex event, held in the variable ‘lex’, is passed through into the platform in ‘data’. It will be accessible as ‘ci.data.lex’ in a Flow and, critically, is also available in an NLU Transformer as ‘data.lex’.

An Execution Finished Transformer is also configured, to turn aggregated Flow outputs (mainly the response to speak) into a format that Lex will understand. This is a little more complex and not essential to show here, so see the Gist for details.

Note that CustomerNumber (the calling phone number) and ContactId (a UUID for the specific call) as seen above should be passed through from Connect — we show you how below. The code above defaults these to random UUID values if not supplied.

Using Lex intents in place of Cognigy intents

The remaining challenge is using the Lex-detected intents inside our Cognigy Flows.

We wanted a solution that capitalises on some of the convenient features of Cognigy NLU, namely Default Replies attached to intents, so that we need no Flow logic to handle simple user queries with simple ‘one-shot’ answers, often referred to as ‘FAQs’.

This is where the new NLU Transformer feature was employed. Above, we stored the Lex result as ‘data.lex’, and this can now be used in an NLU Transformer.

We configure the REST Endpoint to use Cognigy NLU, but then ‘Transform’ the result to essentially cause the Lex intent result to overwrite the Cognigy intent result, with the following NLU Transformer code:

NLU Transformer — Lex intent overrides Cognigy intent

With this in place, any Cognigy intents with names matching Lex intents will emit their ‘Default Replies’ as configured in the Cognigy.AI GUI. This is a simple, powerful way to answer basic questions from users (FAQs) without having to build an explicit flow for each.

You may also notice above that Lex slots are ‘merged’ in with the Cognigy slots, so both are available. This allows Flows to take advantage of Cognigy’s powerful and automatic functionality of Slot and Keyphrase extraction, for use in all decision Flows, as well as Lex-detected slots.
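Stripped of the transformer plumbing, the override-and-merge logic can be sketched as a plain function. The property names here are illustrative; the real NLU Transformer hook is in the Gist.

```javascript
// Hedged sketch of the NLU Transformer logic: the Lex intent overrides the
// Cognigy intent, and Lex slots are merged alongside Cognigy's own slots.
function mergeLexResult(cognigyResult, lexEvent) {
  const lexIntent = lexEvent.currentIntent && lexEvent.currentIntent.name;
  return {
    ...cognigyResult,
    // Lex's detected intent wins over whatever Cognigy NLU found...
    intent: lexIntent || cognigyResult.intent,
    intentScore: lexIntent ? 1 : cognigyResult.intentScore,
    // ...while both slot sets remain available to the Flow
    slots: {
      ...cognigyResult.slots,
      ...(lexEvent.currentIntent ? lexEvent.currentIntent.slots : {}),
    },
  };
}
```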

The final piece of the puzzle

As a recap, the steps to utilise Cognigy.AI from Amazon Connect, in the most convenient sequence to deploy, are:

  1. In your Cognigy.AI 3.6 project, create a new NLU Connector of type ‘Cognigy’, insert the NLU Transformer code from the Gist, and be sure to enable the ‘Post NLU Transformer’ with the toggle above the code
  2. Create a REST Endpoint, using the NLU Connector created above, insert the Endpoint Transformer code, and enable the ‘Input’ and ‘Execution Finished’ Transformers with the toggles
  3. Create an AWS Lambda function, using the code from the Gist, and alter the doPOST() URL to be the ‘Endpoint URL’ from the REST Endpoint configured above
  4. Create/Use a Lex bot — Each intent should use the above Lambda function for fulfillment
  5. Ensure Connect can access your Lex bot. Head to the Amazon Connect service console, select your instance, and navigate to ‘Contact Flows’. There should be a section called Lex where you can add your bot.
  6. Now, finally: access the Lex bot from your Connect ‘Contact Flow’. See below!

The final step is to integrate the Lex bot into a Connect Flow. In the Connect ‘Contact Flow’ editor, add a ‘Get customer input’ block. It will end up looking something like this:

Where ‘LexTest’ will be replaced by your Lex bot’s name, and the region may be different. Configure the block as shown below. Note that the very first prompt played by the bot is supplied in this block in Connect; also select your Lex bot in place of ‘LexTest’.

To pass through some useful session attributes from Connect, through Lex, and to Cognigy.AI, also add these session attributes exactly as shown to the block:
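The exact keys are shown in the screenshot above; as an illustration only, a typical mapping of Connect contact data into Lex session attributes uses Connect’s JSONPath contact attributes for the caller’s number and the contact ID:

```
CustomerNumber = $.CustomerEndpoint.Address    (the caller's phone number)
ContactId      = $.ContactId                   (UUID for this specific contact/call)
```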

Ready to run!

If all is well, you can now test the phone or chat experience in Connect.

Your first input should be processed by your Cognigy.AI Flow, and the conversation should then follow your Flow logic, utilising Lex-detected intents.

Seeing what’s what

If you have your Flow running, that’s great!

If not, we need to see what’s what.

You may have noticed that the code samples in the accompanying Gist include ‘console.log()’ lines to emit key information. So look in your Cognigy.AI project logging, turn on ‘Debug’ view, and look for lines starting with ‘########’. These should help you see what is happening at key points in this integration. If you need to dig deeper, add or alter console.log() lines to print more details.

You may want to delete the lines once you are happy with your integration.

Epilogue

This concludes the groundwork for this integration.

Where you take your solution, with the power of Cognigy, Connect and Lex, is up to you!
