Send your data to BigQuery with the ETL tool Dataddo

George Chalikiopoulos
Jan 21, 2023


Step #1: Create a Google Cloud Account

First and foremost, go to https://cloud.google.com/, where you will be prompted to create a Google Cloud account.

Step #2: Complete the account information

At the end of the form, you will be asked to enter your credit card details. No worries, they will not charge you anything, even after the trial credits are spent.

I usually add my Revolut card when I create Google Cloud accounts, as it makes things easier to manage, but it’s up to you.

Step #3: Create a Project in GCP (Google Cloud Platform)

Click on the project dropdown and select “New Project”.

Give your project a name (e.g. “gsc-collect-data”) and press “Create”.

A notification will appear informing you that your project has been created. Press “Select Project”.

Step #4: Create a Dataset inside BigQuery

Now that you have selected your project, we need to create a dataset. That’s where our data will be stored as tables.

Go to the Google Cloud menu and click on BigQuery.

When the BigQuery console opens, you will see the project we created (“gsc-collect-data”). Click on the three-dot menu next to it and select “Create dataset”.

Give your dataset a unique ID (you can use letters, numbers, and underscores).

That’s it! We have now created an empty dataset that will store our data.
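
If you prefer SQL over clicking through the console, the same dataset can also be created with a DDL statement. A minimal sketch, using the project name from our example (“gsc_dataset” is just a placeholder dataset name):

-- Create the dataset via SQL DDL instead of the UI
-- ("gsc_dataset" is a placeholder; pick the location that suits you)
CREATE SCHEMA IF NOT EXISTS `gsc-collect-data.gsc_dataset`
  OPTIONS (location = 'EU', description = 'Google Search Console data');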

Now that we have created our dataset, we need to populate it with data. We can do that manually (export/import) or use an ETL tool to handle the connection for us. You can also use an open-source tool (like Airbyte) to build your own connectors for free, although you will need some technical knowledge.
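
For the manual route, BigQuery can load a CSV export straight from Cloud Storage with a LOAD DATA statement. A minimal sketch, where the bucket URI and table name are hypothetical placeholders:

-- One-off manual import of a CSV export into BigQuery
-- (bucket URI and table name are placeholders, not part of this tutorial's setup)
LOAD DATA INTO `gsc-collect-data.gsc_dataset.gsc_manual_export`
FROM FILES (
  format = 'CSV',
  skip_leading_rows = 1,
  uris = ['gs://your-bucket/gsc-export.csv']
);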

For our example, we will use Dataddo, a cost-efficient, no-code tool that gets the job done in minutes. You can try Dataddo for free for 14 days.

Step #5: Create a Dataddo account (or use another ETL tool)

This is pretty straightforward: go to https://www.dataddo.com/ and start your free trial.

After you have created your account, we need to create a flow. Meaning: connect a source (Google Search Console) and a destination (BigQuery) and get the data moving!

Step #6: Create a Flow with GSC as a source and BigQuery as a destination in Dataddo

Go to Flows and click Create Flow.

First, you need to add a Source. A source is a platform from which you want to extract data.

This could be Facebook Ads, Google Analytics, or (in our case) Google Search Console.

We click Add Source and then Create New Source.

Find and select the Google Search Console source.

Then it will open the configuration page for this source.

You can give it a name and choose a dataset. I prefer the data to be as granular as possible, so I choose Performance by page, query, country, and device.

Then you will be redirected to the Account page where you will have to select the GSC account that you want.

You will choose your email and then the site URL (your email must have at least an editor role in the specific GSC account).

After that, you will go to the Attributes and Metrics page. Select All and click Next.

Finally, you will be prompted to set how frequently you want the data to be updated. Choose daily and Save. You have created the source.

Now it is time to add the destination (BigQuery).

Click on Add Destination and then Add New Destination.

And we choose BigQuery.

Then you are prompted to choose a Google account. Use the same account with which you created your Google Cloud account in step #1.

After that, you will be prompted to select Project and Dataset. Select those that we created in steps #3 and #4.

Click Save.

Finally, you will be asked to create a database table (just give it a name) where all the GSC data will be stored.

After that, click “Create Flow”.

Step #7: Populate the data table you created with historical data

Right now, you have created the table and the flow that will populate the data in your database… but it’s empty. What you can do is populate it with past data so that you can start analyzing right away.

In order to do that, go to your Flows and request a manual data insert.
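
Once the manual insert has run, a quick sanity check in BigQuery is to look at the date range that actually arrived. A sketch, assuming Dataddo writes a date column, and using our placeholder table name (adjust both to your schema):

-- Check that the backfill covered the period you expect
-- (table name and `date` column are assumptions; adjust to your schema)
SELECT MIN(date) AS first_day, MAX(date) AS last_day, COUNT(*) AS row_count
FROM `gsc-collect-data.gsc_dataset.gsc_table`;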

Step #8: Go to BigQuery and check that your data are populated correctly

As we can see, the data table and the schema have been created. But do they contain data?
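
If you want to inspect the schema without clicking around, you can also query INFORMATION_SCHEMA (the dataset and table names below are our running placeholder examples):

-- List the columns of the table Dataddo created
SELECT column_name, data_type
FROM `gsc-collect-data.gsc_dataset`.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'gsc_table';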

In order to check whether the data were properly populated, we need to write an SQL query.

SELECT * FROM `project_name.dataset_name.data_table_name`

Yes! The data are there! Now you can start analyzing your data with SQL queries!
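
For example, a first analysis could be the top queries by clicks over the last 28 days. A sketch, assuming the column names follow the GSC report fields (query, clicks, impressions, date); adjust them to whatever your Dataddo table actually contains:

-- Top 10 queries by clicks over the last 28 days
-- (column names are assumptions based on the GSC report fields)
SELECT query, SUM(clicks) AS clicks, SUM(impressions) AS impressions
FROM `gsc-collect-data.gsc_dataset.gsc_table`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
GROUP BY query
ORDER BY clicks DESC
LIMIT 10;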
