Graphing data from Airtable

This guide walks you through connecting to an Airtable base from a Jupyter notebook, pulling in your data, and plotting it, all in Python.

Kyle
Published in Analytics Vidhya
Sep 25, 2019 · 5 min read


Introduction

Airtable is an awesome tool for centralizing data and running many different parts of your business. However, sometimes we have so much tabular data that it is hard to grasp week-to-week developments.

It is possible (on the Pro plan) to build charts using Airtable’s blocks. Another option is to export to Excel. But to really supercharge your analytics, this work should be as automated as possible. So this guide will show how you can fetch Airtable data from a Jupyter notebook, then manipulate and visualize that data, all in Python.

This post is for you if:

  • You are already using Airtable for some specific knowledge management — your sales CRM, for example — but are frustrated with the built-in data visualization features.
  • You think your team could benefit from a more automated process for tracking the metrics that impact decision-making.

Getting started

Airtable Python uses Requests; you can install both by running:
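The install command was an embed in the original post. Assuming the Airtable Python package as published on PyPI (the package name is an assumption), the install is:

```shell
pip install airtable requests
```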

Note that you will need to generate an API key for your account if you haven’t already. You can do this in your Airtable account settings.

It is bad practice to hard-code API keys into the code you are running. Instead, we can set an environment variable in our ~/.bash_profile. Copy your API key to the clipboard, open your bash profile in your preferred editor, and enter the following:
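The snippet was an embed in the original post; the line to add to ~/.bash_profile is:

```shell
export AIRTABLE_API_KEY="API_KEY"
```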

replacing “API_KEY” with the string you just copied.

From the command line, run:
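This reloads your profile so the new variable is available in the current shell:

```shell
source ~/.bash_profile
```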

Airtable’s REST API interface can be found here: https://airtable.com/api. We are now ready to start using Airtable Python.

In Jupyterlab

Fetching our data

For the purpose of this guide, we will work with one of Airtable’s sample templates — the Sales Pipeline base. Let’s fetch the data:
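The fetch code was an embedded notebook cell in the original post. Here is a minimal sketch that calls Airtable's REST API with Requests directly; the base ID placeholder and the table names ("Opportunities", "Interactions") are assumptions based on the Sales Pipeline template, so substitute your own.

```python
import os
import requests

# Assumes the AIRTABLE_API_KEY environment variable set earlier.
AIRTABLE_API_KEY = os.environ.get("AIRTABLE_API_KEY", "")
BASE_ID = "YOUR_BASE_ID"  # listed for each base at https://airtable.com/api


def fetch_records(table_name):
    """Fetch every record in a table, following Airtable's pagination."""
    url = f"https://api.airtable.com/v0/{BASE_ID}/{table_name}"
    headers = {"Authorization": f"Bearer {AIRTABLE_API_KEY}"}
    records, params = [], {}
    while True:
        response = requests.get(url, headers=headers, params=params)
        response.raise_for_status()
        payload = response.json()
        records.extend(payload["records"])
        if "offset" not in payload:  # no offset means this was the last page
            return records
        params["offset"] = payload["offset"]


# Hypothetical table names from the Sales Pipeline template:
# opportunities_response = fetch_records("Opportunities")
# interactions_response = fetch_records("Interactions")
```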

where ‘AIRTABLE_API_KEY’ is the environment variable we just set in our bash profile and ‘BASE_ID’ is the ID of the specific base we are working with. At https://airtable.com/api you can access a list of all your Airtable bases; each base provides its own API documentation for creating, reading, updating, and deleting records.

We have pulled in two different responses that correspond to the two tables inside our Sales Pipeline base.

Converting the response into a Pandas DataFrame
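The conversion cell was an embed in the original post. A minimal sketch, assuming the list-of-records shape the API returns (each record is an (ordered) dictionary with an "id" and a nested "fields" mapping); the sample values here are made up:

```python
import pandas as pd

# Illustrative records in the shape Airtable returns; in the notebook this
# would be the response fetched from your base.
opportunities_response = [
    {"id": "rec1", "fields": {"Opportunity name": "Acme deal", "Deal value": 12000}},
    {"id": "rec2", "fields": {"Opportunity name": "Globex deal", "Deal value": 8000}},
]

# Each record nests its cell values under "fields", so pull that key out.
opportunities_df = pd.DataFrame([record["fields"] for record in opportunities_response])
```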

Above, we’ve assigned our responses to variables and read them into their respective DataFrames. The response from Airtable comes back as nested (ordered) dictionaries, so we have to parse out each record’s fields before building the DataFrame.

Visualizing our data

Now it is time to start gaining insights from our data, but first up:

Define your business questions

What business questions do you want to answer? We’ve laid out some example questions for our sample sales pipeline data.

  1. What is the value of our closed deals? Who has secured the most revenue?
  2. What stages are other deals at in the pipeline? What is the weighted value of these deals?
  3. What is our actual monthly revenue? What are our forecasted targets?
  4. How are our sales reps doing? Are they hitting their quotas?

Install the required packages

At Kyso, we work a lot with Plotly. For use in JupyterLab, we need to install the jupyterlab and ipywidgets packages:
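The install commands were embedded in the original post. For Plotly in JupyterLab as of 2019, that meant roughly the following (the labextension step is an assumption based on Plotly's JupyterLab setup at the time):

```shell
pip install jupyterlab ipywidgets
jupyter labextension install @jupyter-widgets/jupyterlab-manager
```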

Plotly has a pandas (the data manipulation library) wrapper called Cufflinks, which is currently having compatibility issues with Plotly’s latest version. The workaround at the moment is to downgrade Plotly and install Cufflinks with the following commands. Once you see how easy it is to generate graphs with Cufflinks, you’ll understand why we go through this hassle.

As of September 2019 this will get you up and running:
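The commands were an embed in the original post. The exact version pin is an assumption; at the time, Cufflinks required a pre-4.0 Plotly, so something along these lines:

```shell
pip install plotly==3.10.0
pip install cufflinks
```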

Now we’re ready to answer our questions above:

1. Closed deals by contract size and sales representative
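The chart code here was an embedded notebook cell in the original post. Below is a minimal sketch with made-up data; the column names ("Owner", "Stage", "Deal value") are assumptions, not the template's exact field names. The aggregation is plain pandas, and the commented `.iplot` line shows how Cufflinks would render it in the notebook.

```python
import pandas as pd

# Hypothetical pipeline rows standing in for the Opportunities DataFrame.
df = pd.DataFrame({
    "Owner": ["Ana", "Ben", "Ana", "Cal", "Ben"],
    "Stage": ["Closed-won", "Closed-won", "Qualified", "Closed-won", "Proposal"],
    "Deal value": [12000, 8000, 5000, 20000, 7000],
})

# Keep only closed deals, then total the revenue each rep has secured.
closed = df[df["Stage"] == "Closed-won"]
revenue_by_rep = closed.groupby("Owner")["Deal value"].sum().sort_values(ascending=False)

# With Cufflinks set up (import cufflinks as cf; cf.go_offline()):
# revenue_by_rep.iplot(kind="bar", title="Closed revenue by sales rep")
```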

2. Deal valuations by stage & sales representative
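Again a sketch with invented numbers, since the original cell was an embed; the per-stage close probabilities are an assumption used only to illustrate the weighted-value calculation.

```python
import pandas as pd

# Hypothetical open deals and assumed close probabilities per stage.
df = pd.DataFrame({
    "Owner": ["Ana", "Ben", "Ana", "Cal"],
    "Stage": ["Qualified", "Proposal", "Negotiation", "Proposal"],
    "Deal value": [5000, 7000, 9000, 4000],
})
probability = {"Qualified": 0.2, "Proposal": 0.5, "Negotiation": 0.8}

# Weight each deal by its stage's probability, then pivot by stage and rep.
df["Weighted value"] = df["Deal value"] * df["Stage"].map(probability)
by_stage = df.pivot_table(index="Stage", columns="Owner",
                          values="Weighted value", aggfunc="sum", fill_value=0)

# by_stage.iplot(kind="bar", barmode="stack", title="Weighted pipeline by stage")
```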

3. Actual revenue & Forecasted deal valuations
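The original chart cell was an embed. A sketch with hypothetical monthly figures; in practice these would be aggregated from the close-date and forecast fields in the base.

```python
import pandas as pd

# Hypothetical monthly actuals and forecast targets.
months = pd.period_range("2019-01", periods=4, freq="M").to_timestamp()
revenue = pd.DataFrame({
    "Actual": [30000, 42000, 38000, 51000],
    "Forecast": [35000, 40000, 45000, 50000],
}, index=months)

# Positive variance means we beat the forecast that month.
revenue["Variance"] = revenue["Actual"] - revenue["Forecast"]

# revenue[["Actual", "Forecast"]].iplot(kind="line", title="Actual vs forecast revenue")
```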

4. Success rates of our sales reps
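The original cell was an embed; here is one hedged way to compute a success rate, treating it as the share of each rep's closed deals that were won. The stage labels and sample outcomes are made up.

```python
import pandas as pd

# Hypothetical closed-deal outcomes per rep.
df = pd.DataFrame({
    "Owner": ["Ana", "Ana", "Ben", "Ben", "Ben", "Cal"],
    "Stage": ["Closed-won", "Closed-lost", "Closed-won",
              "Closed-won", "Closed-lost", "Closed-lost"],
})

# Win rate = fraction of a rep's closed deals that were won.
closed = df[df["Stage"].isin(["Closed-won", "Closed-lost"])]
win_rate = closed["Stage"].eq("Closed-won").groupby(closed["Owner"]).mean()

# win_rate.iplot(kind="bar", title="Win rate by sales rep")
```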

Conclusion

And there you have it. Easy-to-build data dashboards are now right at your fingertips! Note that you can check out this guide on GitHub here, with an example notebook and the Python code ready for a quick setup. To really automate this workflow, you could fork the repository and edit the notebook (i.e. insert your own Airtable API key, the name of your base, and your table names).

Follow this guide on syncing your repository to your Kyso account.

Now, every time you push a new commit — like running the notebook weekly — these changes will be reflected on the Kyso post, where your team can learn and collaborate. Remember — data is only useful to you if you actually use the insights gained!


CMO & Data Science at Kyso. Feel free to contact me directly at kyle@kyso.io with any issues and/or feedback!