Deconstructing Chatbots: How to integrate Dialogflow with BigQuery
Introduction
In this article we will learn how to connect Dialogflow with BigQuery and store information collected during the conversational experience. We will use the same "Appointment Scheduler" agent that we built in previous labs. In the agent's GCP project we will create a dataset and a table in BigQuery. Then we will edit the original fulfillment to add the BigQuery dataset and table IDs. Finally, we will test whether the interactions are being recorded in BigQuery.
Here is the sequence diagram of the events from user to the fulfillment and BigQuery.
What you’ll learn
- How to create a dataset and a table in BigQuery
- How to set up BigQuery connection details in Dialogflow fulfillment
- How to test the fulfillment
Prerequisites
- Basic concepts and constructs of Dialogflow. For introductory Dialogflow tutorial videos that cover basic conversational design, check out the Deconstructing Chatbots series.
- As I mentioned earlier, we will be using the same “Appointment Scheduler” chatbot that we built in this article and add the integration with BigQuery.
- Also go through the “Deconstructing Chatbots: Understanding Fulfillment by integrating Dialogflow with Google Calendar” article to get some background on Dialogflow fulfillment.
- If you would rather follow along with a visual walkthrough of this integration, check out this video.
Step 1: Create Dataset and Table in BigQuery
- Navigate to the Google Cloud Console
- In the Cloud console, go to the menu icon ☰ > Big Data > BigQuery
- Under Resources in the left pane, click on the project ID. Once it is selected, you will see CREATE DATASET on the right.
- Click CREATE DATASET and give the dataset a name.
- Once the dataset is created, click on it from the left panel. You will see CREATE TABLE on the right.
- Click CREATE TABLE, provide a table name, and click Create table at the bottom of the screen.
- Once the table is created, click on it in the left panel. You will see an Edit Schema button on the right.
- Click Edit Schema, then click Add Field. Add a "date" field, and repeat the same for "time" and "type".
- Take note of the “DatasetID” and “tableID”
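For reference, the three fields added above can be expressed as the following table schema. Treating all three as STRING columns is an assumption made here for simplicity; DATE and TIME types would also work if your fulfillment formats the values accordingly.

```json
[
  { "name": "date", "type": "STRING", "mode": "NULLABLE" },
  { "name": "time", "type": "STRING", "mode": "NULLABLE" },
  { "name": "type", "type": "STRING", "mode": "NULLABLE" }
]
```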
Step 2: Add BigQuery connection details to Dialogflow fulfillment
- Open Dialogflow Agent and enable Fulfillment inline editor. Refer to the previous article if you need help with this .
- Make sure "package.json" in the Dialogflow fulfillment inline editor contains the BigQuery dependency, e.g. "@google-cloud/bigquery": "0.12.0". Use the latest version of the BigQuery client library available at the time you follow this article.
- In index.js create “addToBigQuery” function to add the date, time and appointment type in the BigQuery table.
- Add the projectID, datasetID and tableID in the TODO section of the index.js file to properly connect your BigQuery table and the dataset to your fulfillment.
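As a rough sketch, the "addToBigQuery" helper might look like the following. This is not the lab's exact code: the placeholder project, dataset, and table IDs, the parameter names, and the row layout are assumptions, so adjust them to match your own index.js. The destructured `{BigQuery}` import matches recent versions of the client library.

```javascript
// Sketch of an addToBigQuery-style fulfillment helper (assumed shape,
// not the lab's exact code). Replace the placeholder IDs with the
// values noted in Step 1.

// Build the row to insert. Kept as a separate pure function so the
// payload shape is easy to inspect and test.
function buildRow(date, time, appointmentType) {
  return { date: date, time: time, type: appointmentType };
}

function addToBigQuery(date, time, appointmentType) {
  // Required lazily so this file still loads in environments where
  // the client library is not installed.
  const { BigQuery } = require('@google-cloud/bigquery');

  const bigquery = new BigQuery({ projectId: 'your-project-id' }); // TODO: projectID
  return bigquery
    .dataset('your_dataset_id') // TODO: datasetID
    .table('your_table_id')     // TODO: tableID
    .insert([buildRow(date, time, appointmentType)]);
}
```

The `insert` call performs a streaming insert, so new rows appear in the table within seconds of the conversation ending.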
Sequence of events:
- The intent map calls the "makeAppointment" function to schedule an appointment on Google Calendar.
- Within the same function, a call is made to the "addToBigQuery" function to log the appointment data in BigQuery.
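The sequence of events above can be sketched as follows. The dependency-injected `deps` object and the parameter names are illustrative assumptions used to keep the sketch self-contained; in the lab, "makeAppointment" calls the Calendar API and "addToBigQuery" directly.

```javascript
// Illustrative wiring of the sequence above: makeAppointment books
// the calendar event first, then logs the same details to BigQuery,
// then replies to the user. The deps object stands in for the real
// Calendar and BigQuery calls.
function makeAppointment(agent, deps) {
  const { date, time, type } = agent.parameters; // assumed parameter names
  return deps
    .createCalendarEvent(date, time, type)
    .then(() => deps.addToBigQuery(date, time, type))
    .then(() => agent.add('Ok, let me see if we can fit you in. ' +
                          date + ' at ' + time + ' is fine!'));
}
```

An intent-map entry such as `intentMap.set('Schedule Appointment', makeAppointment)` then routes the matched intent to this handler.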
Step 3: Test Your Chatbot and the BigQuery table!
Let's test our chatbot. You can test it in the simulator, or use the web or Google Home integrations we learned about in previous articles.
- User: “Set an appointment for vehicle registration at 2pm tomorrow”
- Chatbot response: "Ok, let me see if we can fit you in. August 6, 2 PM is fine!"
- Check the BigQuery table after the response. Use the query "SELECT * FROM `projectID.datasetID.tableID`"
Congratulations!
You created a fulfillment using the inline editor and integrated it with BigQuery. Think of more integrations like these and try them out!
Next steps
- Check out the Dialogflow website here for documentation, APIs and more integrations.
- Follow the Deconstructing Chatbots video series and subscribe to the Google Cloud Platform YouTube channel.
- Try Dialogflow; it is free to build a bot! Check it out.