Get Started with Dialogflow

Hemantjain
10 min read · Aug 18, 2019


What is Dialogflow?

Dialogflow is a Google-owned platform for human-computer interaction based on natural language conversations, which enables developers to build intelligent chatbots quickly.

Three main reasons to use Dialogflow in an application:

· Intuitive and easy to use interface

· Build a chatbot using Dialogflow’s one-click integration

· Use the fulfillment system to give the bot non-conversational capabilities (e.g. looking up the weather or retrieving information via third-party APIs).

Benefits of Dialogflow

Fast Coding

Coding does not have to take a long time. With Dialogflow, code-related tasks can be completed quickly because the platform includes an in-line code editor. This gives developers the tools to link their agents with applications via Cloud Functions for Firebase or, if they prefer, to create a custom webhook and host it in the cloud or on-premises.

Machine Learning Powered

Dialogflow is powered by Google’s machine learning technologies. This gives developers the means to train their agents to understand the user’s intent by extracting data from the conversation. The platform also includes more than 30 pre-built agent templates that developers can use as foundations.

Natural Conversations

Although users are essentially talking to a machine and a set of algorithms, it is still critical to give them natural conversations so that support feels personal and tailored. Developers can do this with the help of Dialogflow, since the platform supports building chatbots that can carry on natural conversations.

Small Talk

With Dialogflow, developers can create algorithms for their agents to make small talk with users. The platform lets them define set phrases for different topics or lines of conversation such as emotions, confirmations, and more. This allows developers to create interactive conversations with users that are not related to service requests and queries.

Dialogflow Features:

· Agent Creation & Management

· Intents

· Entities

· Training

· Integrations

· Analytics

· Fulfillment

· In-Line Code Editor

· Cross Platform Support

· Multi-Lingual Agent Support

· Small Talk

The rest of this article covers the basics of Dialogflow that you can apply in your own agents.

Agents:

A Dialogflow agent is a virtual agent that handles conversations with your end-users. It is a natural language understanding module that understands the nuances of human language. A Dialogflow agent is similar to a human call center agent. You train them both to handle expected conversation scenarios, and your training does not need to be overly explicit.

Agent settings

To access these settings:

1. Go to the Dialogflow Console

2. Select your agent near the top of the left sidebar menu

3. Click the settings button next to the agent name

General

· Description: Description of your agent. Displayed in the Web Demo for your agent.

· Default Time Zone: Default time zone for agent.

· Google Project:

· Project ID: GCP project linked to agent.

· Service Account: Service account used for authentication; a minimal client setup using it is sketched after this list.

· API Version: API version for agent. Select V2 API for all new agents.

· Beta Features: Toggle to enable beta features for your agent.

· Log Settings:

· Log interactions to Dialogflow: Read more in Training.

· Log interactions to Google Cloud: Read more about Google Stackdriver. This option is only available if Log interactions to Dialogflow is enabled. Disabling Dialogflow’s logging will also disable this setting.

· Delete Agent: Completely deletes agent and cannot be undone. If the agent is shared with other users, those users must be removed from the agent before you can delete it.
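These Google Project settings are what your own code relies on when calling the Dialogflow API. As a minimal sketch (assuming the google-cloud-dialogflow Python client library and a downloaded service-account key for the linked project; the key path is a placeholder), authentication is typically handled by pointing GOOGLE_APPLICATION_CREDENTIALS at the key before creating a client:

```python
# Minimal sketch: assumes the `google-cloud-dialogflow` package is installed and a
# service-account key for the agent's GCP project has been downloaded (path is a placeholder).
import os

# Application Default Credentials pick up the service account from this variable.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

from google.cloud import dialogflow

# The client now authenticates as the service account shown in the agent settings.
session_client = dialogflow.SessionsClient()
```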

Languages

Add multiple languages and their respective locales to make your agent multilingual.

Choose a language from the list and click the Save button. To add a locale, if available, hover over the listed language and click + Add locale.

ML settings (machine learning)

Dialogflow agents use machine learning algorithms to understand end-user expressions, match them to intents, and extract structured data.

An agent learns both from training phrases that you provide and the language models built into Dialogflow. Based on this data, it builds an algorithm for making decisions about which intent should be matched to an end-user expression. This algorithm is unique to your agent.

Dialogflow updates your agent’s machine learning algorithm every time you make changes to intents and entities, import or restore an agent, or train your agent.
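Training normally runs automatically when you save changes in the console, but the same build can also be triggered from code. A rough sketch, assuming the v2 Python client and a placeholder project ID:

```python
from google.cloud import dialogflow


def train_agent(project_id: str) -> None:
    """Rebuild the agent's machine learning model (project ID is a placeholder)."""
    client = dialogflow.AgentsClient()
    # train_agent returns a long-running operation; result() blocks until training finishes.
    operation = client.train_agent(request={"parent": f"projects/{project_id}"})
    operation.result()
    print("Agent training finished.")


# train_agent("my-gcp-project")  # hypothetical project ID
```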

Export and import

Note: Import and export should be used for backing up agents or transferring them from one account to another. While you can edit the JSON files directly and re-import them, editing should be done using the Dialogflow console or API. This ensures that changes are validated by the system and keeps troubleshooting to a minimum.

The exported ZIP file does not include:

· Inline editor files (“package.json” and “index.js”).

· Integration settings for the agent.

· Export as ZIP: Exports the agent as a zip file. (A programmatic equivalent is sketched after this list.)

· Restore from ZIP: Overwrites the current agent with the supplied zip file.

· Import from ZIP: Adds intents and entities to the current agent from the supplied zip file. If any existing intents or entities have the same name as those in the zip file, they will be replaced.
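For completeness, the export can also be scripted. A rough sketch, assuming the v2 Python client and a Cloud Storage bucket you control (the project ID and the gs:// URI are placeholders):

```python
from google.cloud import dialogflow


def export_agent_to_gcs(project_id: str, gcs_uri: str) -> None:
    """Export the agent as a ZIP to a Cloud Storage URI such as gs://bucket/agent.zip."""
    client = dialogflow.AgentsClient()
    # export_agent is a long-running operation; result() waits until the ZIP is written.
    operation = client.export_agent(
        request={"parent": f"projects/{project_id}", "agent_uri": gcs_uri}
    )
    response = operation.result()
    print("Agent exported to:", response.agent_uri)


# export_agent_to_gcs("my-gcp-project", "gs://my-bucket/agent-backup.zip")  # placeholders
```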

Intents

(Screenshot: the intent list of the SKF-Chatbot agent in the Dialogflow Console)

An intent categorises an end-user’s intention for one conversation turn. For each agent, you define many intents, where your combined intents can handle a complete conversation. When an end-user writes or says something, referred to as an end-user expression, Dialogflow matches the end-user expression to the best intent in your agent. Matching an intent is also known as intent classification.

Layman’s Language: Whenever a user types a query, Dialogflow tries to find the correct intent for the query using the training phrases and responds accordingly.

· Training phrases: These are example phrases for what end-users might say. When an end-user expression resembles one of these phrases, Dialogflow matches the intent. You don’t have to define every possible example, because Dialogflow’s built-in machine learning expands on your list with other, similar phrases. You just have to provide some examples so that Dialogflow can get the gist of the intent.

· Action: You can define an action for each intent. When an intent is matched, Dialogflow provides the action to your system, and you can use the action to trigger certain actions defined in your system.

· Parameters: When an intent is matched at runtime, Dialogflow provides the extracted values from the end-user expression as parameters. Each parameter has a type, called the entity type, which dictates exactly how the data is extracted. Unlike raw end-user input, parameters are structured data that can easily be used to perform some logic or generate responses.

· Responses: You define text, speech, or visual responses to return to the end-user. These may provide the end-user with answers, ask the end-user for more information, or terminate the conversation.

In Layman’s Language: Responses are the text that Dialogflow returns to the user based on the user’s query.
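To tie intents, actions, parameters, and responses together, here is a minimal sketch of sending one end-user expression to an agent, assuming the google-cloud-dialogflow Python client (the project ID, session ID, and example expression are placeholders). The session ID simply groups related turns of one conversation; any reasonably unique string works.

```python
from google.cloud import dialogflow


def detect_intent_text(project_id: str, session_id: str, text: str,
                       language_code: str = "en") -> None:
    """Send a single end-user expression and print what the agent matched."""
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    # Wrap the end-user expression for the detect_intent call.
    text_input = dialogflow.TextInput(text=text, language_code=language_code)
    query_input = dialogflow.QueryInput(text=text_input)

    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    print("Matched intent:", result.intent.display_name)  # intent classification
    print("Action:", result.action)                        # action defined on the intent
    print("Parameters:", result.parameters)                # structured data extracted as entities
    print("Response:", result.fulfillment_text)            # text returned to the end-user


# detect_intent_text("my-gcp-project", "test-session-1", "book a haircut on Friday")
```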

Entities

Each intent parameter has a type, called the entity type, which dictates exactly how data from an end-user expression is extracted.

Entity terminology

The term entity is used in this documentation and in the Dialogflow Console to describe the general concept of entities. When discussing entity details, it’s important to understand more specific terms:

· Entity type: Defines the type of information you want to extract from user input. For example, vegetable could be the name of an entity type. Clicking Create Entity from the Dialogflow Console creates an entity type. When using the API, the term entity type refers to the EntityType type.

· Entity entry: For each entity type, there are many entity entries. Each entity entry provides a set of words or phrases that are considered equivalent. (A sketch of creating an entity type with its entries via the API follows this list.)
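Continuing the vegetable example, an entity type and its entries can also be created through the API. A sketch assuming the google-cloud-dialogflow Python client (the project ID and the entries are placeholders):

```python
from google.cloud import dialogflow


def create_vegetable_entity_type(project_id: str) -> None:
    """Create a 'vegetable' entity type with a couple of entity entries."""
    client = dialogflow.EntityTypesClient()
    parent = dialogflow.AgentsClient.agent_path(project_id)  # projects/<project_id>/agent

    entity_type = dialogflow.EntityType(
        display_name="vegetable",
        kind=dialogflow.EntityType.Kind.KIND_MAP,  # entries map synonyms to a reference value
        entities=[
            # Each entry is a reference value plus the synonyms treated as equivalent.
            dialogflow.EntityType.Entity(value="carrot", synonyms=["carrot", "carrots"]),
            dialogflow.EntityType.Entity(value="pepper", synonyms=["pepper", "bell pepper"]),
        ],
    )
    response = client.create_entity_type(
        request={"parent": parent, "entity_type": entity_type}
    )
    print("Created entity type:", response.name)
```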

Knowledge connectors

Knowledge connectors complement defined intents. They parse documents (for example, FAQs or articles) to find automated responses. To configure them, you define one or more knowledge bases, which are collections of documents. You can enable knowledge bases for your agent, so all detected intent requests may find automated responses using your knowledge bases. Alternatively, you can specify one or more knowledge bases in your individual detect intent requests.

It is common for an agent using knowledge connectors to also use defined intents. Knowledge connectors offer less response precision and control than intents. When using both intents and knowledge connectors, you should define your intents to handle complex user requests that require special handling and precision, and let knowledge connectors handle simple requests with responses automatically extracted from your documents. When you identify content in FAQs that you want to expand on, you can convert the questions into defined intents, giving you full control.

Enable beta features

You may need to enable the beta API:

1. Go to the Dialogflow Console

2. Select an agent

3. Click the settings button next to the agent’s name

4. Scroll down while on the General tab and ensure that Beta Features is enabled

5. If you have made changes, click Save

Create a knowledge document

You can use the web UI (Dialogflow Console), the REST API (including the command line), or the client libraries to create a knowledge document; a client-library sketch follows the console steps below.

Use the Dialogflow Console to create a knowledge document:

1. If you are not continuing from steps above, navigate to your knowledge base settings:

a. Go to the Dialogflow Console

b. Select an agent

c. Click Knowledge on the left sidebar menu

d. Click your knowledge base name

2. Click New Document or Create the first one

3. Enter a document name

4. Select text/html for Mime Type

5. Select FAQ for Knowledge Type

6. Select URL for Data Source

7. Enter the URL in the URL field

8. Click CREATE
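The same document can be created with a client library instead of the console. A tentative sketch, assuming the v2beta1 Python client (knowledge connectors are still a beta feature) and an existing knowledge base; the project ID, knowledge base ID, and FAQ URL are placeholders:

```python
from google.cloud import dialogflow_v2beta1 as dialogflow


def create_faq_document(project_id: str, knowledge_base_id: str,
                        display_name: str, faq_url: str) -> None:
    """Add an FAQ document (text/html, fetched from a URL) to an existing knowledge base."""
    client = dialogflow.DocumentsClient()
    knowledge_base_path = dialogflow.KnowledgeBasesClient.knowledge_base_path(
        project_id, knowledge_base_id
    )
    document = dialogflow.Document(
        display_name=display_name,
        mime_type="text/html",                                    # step 4: Mime Type
        content_uri=faq_url,                                      # steps 6-7: URL data source
        knowledge_types=[dialogflow.Document.KnowledgeType.FAQ],  # step 5: Knowledge Type
    )
    # Document creation is a long-running operation; result() waits for parsing to finish.
    operation = client.create_document(parent=knowledge_base_path, document=document)
    created = operation.result(timeout=180)
    print("Created document:", created.display_name)


# create_faq_document("my-gcp-project", "MY_KB_ID", "Product FAQ", "https://example.com/faq")
```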

Integrations

Dialogflow integrates with many popular conversation platforms like Google Assistant, Slack, and Facebook Messenger. If you want to build an agent for one of these platforms, you should use one of the many integration options. These integrations provide platform-specific features for building rich responses. Direct end-user interactions are handled for you, so you can focus on building your agent.

Dialogflow also provides agent import and export features for other natural language understanding platforms, such as Amazon Alexa and Microsoft Cortana.

(Screenshot: featured integrations in the Dialogflow Console)

Fulfillment

If you are using one of the integration options, and your agent needs more than static intent responses, you need to use fulfillment to connect your service to your agent. Connecting your service allows you to take actions based on end-user expressions and send dynamic responses back to the end-user. For example, if an end-user wants to schedule a haircut on Friday, your service can check your database and respond to the end-user with availability information for Friday.

Each intent has a setting to enable fulfillment. If an intent requires some action by your system or a dynamic response, you should enable fulfillment for the intent. If an intent without fulfillment enabled is matched, Dialogflow uses the static response you defined for the intent.

When an intent with fulfillment enabled is matched, Dialogflow sends a request to your webhook service with information about the matched intent. Your system can perform any required actions and respond to Dialogflow with information about how to proceed. The following steps show the processing flow for fulfillment; a minimal webhook sketch follows them.

1. The end-user types or speaks an expression.

2. Dialogflow matches the end-user expression to an intent and extracts parameters.

3. Dialogflow sends a webhook request message to your webhook service. This message contains information about the matched intent, the action, the parameters, and the response defined for the intent.

4. Your service performs actions as needed, like database queries or external API calls.

5. Your service sends a webhook response message to Dialogflow. This message contains the response that should be sent to the end-user.

6. Dialogflow sends the response to the end-user.

7. The end-user sees or hears the response.
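To ground steps 3 to 5, here is a minimal webhook sketch. Flask is my own choice here, not something the article prescribes; any HTTPS service that accepts the Dialogflow webhook JSON works the same way. The intent name schedule.haircut and the date parameter are hypothetical.

```python
# Minimal webhook sketch (assumes Flask is installed; intent and parameter names are hypothetical).
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/webhook", methods=["POST"])
def webhook():
    # Step 3: Dialogflow POSTs a JSON body describing the matched intent and parameters.
    req = request.get_json(silent=True, force=True) or {}
    query_result = req.get("queryResult", {})

    intent_name = query_result.get("intent", {}).get("displayName", "")
    parameters = query_result.get("parameters", {})

    # Step 4: perform whatever lookups or external API calls the intent needs.
    if intent_name == "schedule.haircut":            # hypothetical intent name
        day = parameters.get("date", "that day")     # hypothetical 'date' parameter
        reply = f"We have openings for a haircut on {day}."
    else:
        reply = "Sorry, I could not find an answer for that."

    # Step 5: return the dynamic response Dialogflow should send to the end-user.
    return jsonify({"fulfillmentText": reply})


if __name__ == "__main__":
    app.run(port=8080)
```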

Thanks for reading! This is my first blog post. Please do clap if you liked it; you can clap more than once 😉. Comment with reviews or doubts, and share it 😄

You can connect with me on Github, Linkedin, Twitter 😄
