Using BigQuery for Billing Alerts

Martin Beranek · Ackee · Jul 27, 2021

Like almost any cloud user these days, we were facing the well-known issue: how to get warned once expenses run higher than expected. Of course, you can always use the Budgets & alerts included in the billing dashboard. Those are more or less reactive: by the time something bad is already happening, the notification arrives too late. We were looking for something more sensitive. We wanted to be notified even about a slight issue, just to be sure the bill wouldn't run too high. That's why this experiment happened and we created a Terraform module containing the code that notifies us and sends billing alerts once the data stops following the overall trend.

Also, we did not need a notification for every new SKU entry in the billing data. Getting notified once a day was enough for us. But even that posed its challenges: not all SKUs are present during the evaluation at a particular time. Some of them are inserted into billing with a delay of a day, depending on the unit and how it's billed to the customer.

Our issues were almost always related to a SKU going above its expected value and staying there for a few days until it reached the budget limits. A check comparing the long-term average with the most recent values would do the trick. Therefore we decided to compute a 14-day average for each SKU and compare it with the average of yesterday and the day before yesterday. The comparison takes place every morning, so we wouldn't lose that much in case any SKU rocketed to the moon a day before.
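In code, the decision boils down to something like this sketch (the function name and the threshold are placeholders of ours, not fixed values from the module):

```python
def is_suspicious(avg_14d: float, avg_last_2d: float, threshold: float = 1.5) -> bool:
    """Flag a SKU whose recent daily spend runs well above its 14-day average."""
    return avg_last_2d > threshold * avg_14d
```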

Computing averages over the billing data exported to BigQuery is surprisingly easy with analytic functions. All the exported data is already partitioned, so you won't pay extra for the data processed by the query. The SQL statement ended up as two nested selects.

The first inner query selects the sum of costs for each SKU:
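A sketch of that inner part; the table path is a placeholder for your billing export table:

```sql
SELECT
  sku.description AS sku_description,
  -- the analytic SUM computes the daily total per SKU without
  -- collapsing rows, so the value repeats on every billing entry
  SUM(cost) OVER (PARTITION BY sku.description, DATE(usage_start_time)) AS daily_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE DATE(_PARTITIONTIME) >= DATE_SUB(CURRENT_DATE(), INTERVAL 14 DAY)
```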

The problem is that the line repeats itself for each SKU entry because of the PARTITION BY statement. That's why there is a second select, which computes the average, maximum, and minimum for each SKU:
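Putting the two together, the whole statement looks roughly like this sketch:

```sql
SELECT
  sku_description,
  -- note: the inner rows repeat per billing entry, so this AVG weighs
  -- days by their entry count; dedupe per day first if that matters
  AVG(daily_cost) AS avg_cost,
  MAX(daily_cost) AS max_cost,
  MIN(daily_cost) AS min_cost
FROM (
  SELECT
    sku.description AS sku_description,
    SUM(cost) OVER (PARTITION BY sku.description, DATE(usage_start_time)) AS daily_cost
  FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
  WHERE DATE(_PARTITIONTIME) >= DATE_SUB(CURRENT_DATE(), INTERVAL 14 DAY)
) AS daily_costs
GROUP BY sku_description
```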

Once we were done creating the SQL, everything got packaged into Python code and deployed to Cloud Functions. For warnings, we simply use a Slack channel. With a Slack webhook configured, sending data to it is as simple as this one-liner:
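A minimal sketch, assuming the requests library and a placeholder webhook URL:

```python
import requests

webhook_url = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder
message = "SKU 'Cloud Run CPU' is running above its 14-day average"

# Slack incoming webhooks accept a JSON payload with a "text" field.
requests.post(webhook_url, json={"text": message})
```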

For a few projects, we prefer to notify support directly. Our whole on-call stack is managed by OpsGenie, so we created a new OpsGenie API integration and submit the messages like this:
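Along these lines; the endpoint and the GenieKey header come from the OpsGenie Alert API, while the key and the message text are placeholders:

```python
import requests

opsgenie_api_key = "xxxx"  # API key of the OpsGenie API integration

requests.post(
    "https://api.opsgenie.com/v2/alerts",
    headers={"Authorization": f"GenieKey {opsgenie_api_key}"},
    json={"message": "Billing alert: SKU above its 14-day average"},
)
```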

Setting up a cron job in GCP can be done with Cloud Scheduler. Don't forget that the call to the Cloud Function should be authorized; the HTTP call from the scheduler can use an OIDC token:
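A sketch of the scheduler job in Terraform; the names, the schedule, and the referenced function and service account are assumptions standing in for the module's actual resources:

```hcl
resource "google_cloud_scheduler_job" "billing_check" {
  name     = "billing-check"
  schedule = "0 7 * * *" # every morning

  http_target {
    http_method = "POST"
    # hypothetical references: the Cloud Function and the invoker
    # service account are defined elsewhere in the module
    uri = google_cloudfunctions_function.billing_check.https_trigger_url

    oidc_token {
      service_account_email = google_service_account.invoker.email
    }
  }
}
```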

The whole Terraform module is available on GitHub. In case you are wondering about the Python code, it's also in the module.

What could be done in the future

As it’s mentioned in the repository, SA key is just an input variable. Terraform uploads the key file directly to the cloud function source code. That means almost anyone who has access to the Cloud Functions in the project can steal the key file. The thing is that we couldn’t do it differently, the customer did not add permissions to our SA, just gave us the key json file. The next step should be to at least load the key from the secret manager.

We are aware that a cloud function checking the billing is not the smartest way to get alerted. The best way would probably be to classify outliers and work with those. BigQuery also supports machine learning functions: we could predict future values with a regression and get the warning much sooner. But for our use case, having a simple cloud function is enough.
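For illustration only, a hypothetical sketch of such a model in BigQuery ML, using a time-series model (ARIMA_PLUS) rather than a plain regression; the dataset, table, and model names are placeholders:

```sql
-- Train a per-SKU time-series model on daily costs...
CREATE OR REPLACE MODEL `my-project.billing.sku_cost_forecast`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'usage_day',
  time_series_data_col = 'daily_cost',
  time_series_id_col = 'sku_description'
) AS
SELECT
  DATE(usage_start_time) AS usage_day,
  sku.description AS sku_description,
  SUM(cost) AS daily_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
GROUP BY usage_day, sku_description;

-- ...and forecast the next few days for every SKU.
SELECT *
FROM ML.FORECAST(MODEL `my-project.billing.sku_cost_forecast`,
                 STRUCT(3 AS horizon));
```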

Hopefully, you found something interesting here. In case I messed up something or misunderstood anything, please let me know.

Originally published at https://www.ackee.cz on July 27, 2021.


Martin Beranek
I am an Infra Team Lead at Shipmonk. My main interest is Terraform, mostly in GCP. I am also enthusiastic about backend and related topics: Golang, TypeScript, ...