How to make fast and convenient visualizations in Data Studio based on BigQuery

Andrey Osypov
Beards Analytics 🇺🇦
7 min read · May 6, 2020

Google Data Studio is one of the simplest visualization tools available today. It is free and actively evolving, and in terms of user numbers it is among the leaders.

Data Studio has many connectors to different data sources, including most Google products. Google BigQuery is one of the main entry points into Google Cloud.

You can also use paid connectors, for example from Supermetrics, or open-source connectors.

The set of visualization types is also growing actively. You can find interesting options in the Community Gallery.

It is worth looking there from time to time to discover new kinds of solutions. To be able to use them, you need to enable Community Visualization Access in the data source settings.

Simple and not the most effective way

Actually, we can simply connect the table we need by selecting BigQuery in the list of available connectors.

In this case, you get access to the whole Firebase / App + Web export table, with metrics and parameters displayed in an improved way. Unlike the raw tables in the BigQuery interface, you immediately get metrics such as users and event count.

All event parameters and user properties are already laid out, and by creating your own calculated fields you can access their values.

For example, to access the page_location parameter of the page_view event, you can create a new Page field with this formula:

case when Event Name = 'page_view' and Event Param Name = 'page_location' then Event Param Value (String) else '' end

Funnel visualization

What was lacking was the ability to build a funnel based on events or pages with the right conditions. Recently, the people from powermyanalytics.com added this feature:

For example, if we want to build a funnel based on certain pages, we can create several metrics by specifying these values:

CASE
  WHEN Event Name = 'session_start' AND Event Param Name = 'page_location' AND CONTAINS_TEXT(Event Param Value (String), 'https://web-analytics.me/') THEN 1
  ELSE 0
END

Using filters, we can include only a certain segment of the audience, for example only Google traffic:

After adding the necessary metrics, each of which will correspond to a separate step in our funnel, we can get a similar result:

When necessary, we can count unique users rather than the number of page views. To do this, we change the formula to:

case when Event Name = 'page_view' and Event Param Name = 'page_location' and Event Param Value (String) = 'https://web-analytics.me/' then User Pseudo ID else '' end

And set the aggregation to Count Distinct:

Time distribution

Using the visualization from https://www.clickinsight.ca/ we can build line charts with the distribution of users by hour:

And with the help of filters, we can look for interesting patterns. To extract the hour from the time parameter, we create another field using the formula:

HOUR(Event Time)

Google maps

Long awaited: Data Studio finally added the ability to plot data on Google Maps without extra difficulties:

Since this is a built-in visualization, you get many display options.

Gauge chart

Although not the most popular, this is a nice visualization. It lets us see at a glance whether our indicators are within the normal zone, which makes them easier to evaluate.

Most often, we work with tables where, among other things, it is possible to compare current indicators with a previous or fixed period. To enable this feature, select the comparison date range in the upper right menu: Previous period:

And at the output, we get something like this:

Images in the table

A great way to evaluate the effectiveness of banners is to look at the creatives as pictures alongside the indicators in the table. For this, the corresponding field should contain the URL of the image. In the field settings, change the Type to URL and set Show as to Image:

There is a more complicated, but flexible method.

Since we have access to the raw data in BigQuery, we can compose our own SQL queries over the data we need and visualize the result. This is especially useful if, in addition to our App + Web data, we have Google Ads data, Search Console data, Facebook Ads data, and sales data from a database.

We can combine the data we need and calculate metrics in SQL across several tables at once, while Google Data Studio simply visualizes the results of these calculations.
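As a minimal sketch of such a combination (the project, dataset, and table names here are made up for illustration), ad cost data could be joined to session data like this:

```sql
-- Hypothetical tables: daily sessions aggregated from the App + Web export,
-- joined with ad cost data loaded from another source.
SELECT
  s.date,
  s.source,
  s.sessions,
  c.cost,
  SAFE_DIVIDE(c.cost, s.sessions) AS cost_per_session
FROM `my-project.reports.sessions_by_source` AS s
LEFT JOIN `my-project.reports.ads_cost` AS c
  ON s.date = c.date AND s.source = c.source
```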

However, even the super-fast BigQuery can take a long time to process a significant amount of data, and then our visualizations will not be quick.

The easiest way to add your own query is to select the Custom Query option when adding a data source:

In this case, we do not access the whole table but filter the results using custom parameters:

This will greatly speed up data retrieval. This is especially true for clustered and partitioned tables:

For date-sharded tables (the @DS_START_DATE and @DS_END_DATE parameters arrive as YYYYMMDD strings, so they can be compared to _TABLE_SUFFIX directly):

WHERE
_TABLE_SUFFIX >= @DS_START_DATE
AND _TABLE_SUFFIX <= @DS_END_DATE

For partitioned tables (_PARTITIONTIME is a TIMESTAMP, so convert it to a DATE before comparing):

WHERE
DATE(_PARTITIONTIME) >= PARSE_DATE('%Y%m%d', @DS_START_DATE)
AND DATE(_PARTITIONTIME) <= PARSE_DATE('%Y%m%d', @DS_END_DATE)

By the way, it recently became possible to work not only with dates but also with other parameters, including your own.

Alternatively, we can create a view based on the query by clicking the Save view button:

Querying such a view will not be faster than running the underlying query, but it is more convenient for storing ready-made queries.

After creating and checking the query, we can save its result as a real table. To do this, in the More section, select Query settings:

This is, in fact, a ready, precomputed table, and working with it will be faster than executing the entire query.

If your data is not static, the table usually needs to be recalculated regularly, for example once a day. Scheduled queries, a feature that calculates tables on a schedule, are well suited for this:

When composing such queries, we can work not with the entire dataset but only with the last day, using the Append to table write option:

And in our queries, we select only yesterday's data:

where parse_date('%Y%m%d', event_date) = date_sub(current_date(), interval 1 day)
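As a sketch, a full scheduled query with Append to table might look like this (the export table name is a placeholder; filtering on _TABLE_SUFFIX means only yesterday's shard is scanned):

```sql
-- Hypothetical export location; aggregates only yesterday's shard.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS date,
  event_name,
  COUNT(*) AS event_count,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY date, event_name
```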

On live projects, I append data to partitioned tables daily, or more often, through scheduled queries.

Each such table contains all the metrics and parameters needed to build the visualization on a single page in Data Studio. Moreover, when connecting these tables, I use custom parameters. This reduces the time and resources needed to create such tables and significantly increases the speed of the reports themselves.

Join our Telegram channel: https://t.me/firebase_app_web_bigquery
