Decoding BigQuery Expenses: The Ultimate Query for Analyzing Your Analysis Costs

Shweta Shrestha
2 min read · Jan 29, 2024
Photo by 1981 Digital on Unsplash

Analyzing BigQuery costs can often feel like navigating a black box.

While the billing dashboard in Google Cloud Platform provides some insights into your costs, it may not be sufficient to fully understand what’s driving your BigQuery expenses. The SKU grouping in billing reports offers a glimpse into the cost breakdown between storage and analysis, yet the real challenge lies in deciphering the impact of individual queries and jobs on your expenses.

To address this challenge and begin a targeted cost-optimization process, we need to dig deeper. Enter the ultimate query, built on the system view <region>.INFORMATION_SCHEMA.JOBS_BY_PROJECT.

Using Information Schema

For a more nuanced understanding of your BigQuery analysis costs, the SQL query below lets you pinpoint the most expensive queries and jobs, giving you a granular view of how your costs are distributed. Google's documentation¹ on this view is comprehensive, but for a quick, dollar-level insight, the query below offers a streamlined starting point.

SELECT
  DATE_TRUNC(DATE(creation_time), MONTH) AS date,
  query,
  user_email,
  COUNT(start_time) AS count_query_runs…
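Since the query is truncated in this excerpt, here is a minimal sketch of the same idea, assuming on-demand pricing: it aggregates billed bytes per query text and user from INFORMATION_SCHEMA.JOBS_BY_PROJECT and converts them to an estimated dollar amount. The `region-us` qualifier and the $6.25-per-TiB rate are assumptions; substitute your own region and your region's current on-demand price.

```sql
-- Sketch: estimated monthly analysis cost per query and user,
-- assuming on-demand pricing at $6.25 per TiB billed (adjust per region).
SELECT
  DATE_TRUNC(DATE(creation_time), MONTH) AS month,
  query,
  user_email,
  COUNT(*) AS count_query_runs,
  SUM(total_bytes_billed) / POW(1024, 4) AS tib_billed,
  ROUND(SUM(total_bytes_billed) / POW(1024, 4) * 6.25, 2) AS estimated_cost_usd
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND statement_type != 'SCRIPT'  -- parent script jobs would double-count child bytes
GROUP BY month, query, user_email
ORDER BY estimated_cost_usd DESC;
```

Note that this estimates cost from total_bytes_billed, so it only reflects on-demand analysis spend; projects on capacity-based (slot) pricing would instead look at total_slot_ms.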
