Monitor IAM binding changes on BigQuery dataset

Lorenzo Caggioni
Google Cloud - Community
3 min read · May 18, 2023

BigQuery is a powerful, serverless, and cost-effective enterprise data warehouse that works across clouds and scales with your data. Access control is fine-grained thanks to IAM bindings. Knowing and being able to query historical IAM bindings is crucial to understand who can access which data. In this article, I show you how to create a Looker Studio dashboard to query your historical IAM binding information.

IAM binding between a Cloud Identity and BigQuery

Google Cloud tracks all IAM binding changes in Cloud Logging as part of the Admin Activity audit logs. Admin Activity audit logs are always written; you can’t configure, exclude, or disable them. They are generated even if you disable the Cloud Logging API.

Logs are accessible through the Logs Explorer console using the Logging query language.

Before querying this data from a BI tool such as Looker Studio, perform the following steps.

Prerequisites

  • A Google Cloud Platform (GCP) project

Admin Log

Check that Admin Audit Logs are enabled for BigQuery:

Audit Log console

Log Bucket

Create a Log bucket, enable Log Analytics on it, link it to a BigQuery dataset, and set the appropriate retention policy.

Creating a Log Bucket

Log Sink

Create a log sink at the project, folder, or organization level (as needed), point it to the newly created Log bucket, and add the filter you need, for example:

protoPayload.methodName : "google.iam.v1.IAMPolicy"
protoPayload.serviceName : "bigquery.googleapis.com"
Creating a Log Sink

Query Logs

You can now query your logs with standard SQL commands!

  • From Log Analytics: no additional cost.
  • From BigQuery: you pay the BigQuery query execution cost.
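
As a quick sanity check that logs are flowing into the linked dataset, a minimal query like the one below should return recent BigQuery audit entries. It assumes the linked dataset is named custom_folder_1 (as in the full query later in this article) and the standard Log Analytics schema, where AuditLog fields are exposed under proto_payload.audit_log:

-- Latest BigQuery audit log entries from the linked Log Analytics dataset
SELECT
  timestamp,
  proto_payload.audit_log.method_name AS method,
  proto_payload.audit_log.resource_name AS resource
FROM `PROJECT_ID.custom_folder_1._AllLogs`
WHERE proto_payload.audit_log.service_name = 'bigquery.googleapis.com'
ORDER BY timestamp DESC
LIMIT 10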

From the logs you can extract the information you need to create a dashboard to monitor IAM bindings on your BigQuery resources.

SELECT
  initiator,
  resource,
  bd.action,
  bd.member,
  bd.role,
  timestamp
FROM (
  -- Dataset-level IAM binding changes
  SELECT
    proto_payload.audit_log.authentication_info.principal_email AS initiator,
    ARRAY(
      SELECT AS STRUCT
        JSON_VALUE(x, '$.action') AS action,
        JSON_VALUE(x, '$.member') AS member,
        JSON_VALUE(x, '$.role') AS role
      FROM UNNEST(JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, '$.datasetChange.bindingDeltas')) x
    ) AS bindingDeltas,
    JSON_VALUE(proto_payload.audit_log.metadata, '$.datasetChange.dataset.datasetName') AS resource,
    timestamp
  FROM `PROJECT_ID.custom_folder_1._AllLogs`
  WHERE ARRAY_LENGTH(JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, '$.datasetChange.bindingDeltas')) > 0
  UNION ALL
  -- Table-level IAM binding changes
  SELECT
    proto_payload.audit_log.authentication_info.principal_email AS initiator,
    ARRAY(
      SELECT AS STRUCT
        JSON_VALUE(x, '$.action') AS action,
        JSON_VALUE(x, '$.member') AS member,
        JSON_VALUE(x, '$.role') AS role
      FROM UNNEST(JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, '$.tableChange.bindingDeltas')) x
    ) AS bindingDeltas,
    JSON_VALUE(proto_payload.audit_log.metadata, '$.tableChange.table.tableName') AS resource,
    timestamp
  FROM `PROJECT_ID.custom_folder_1._AllLogs`
  WHERE ARRAY_LENGTH(JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, '$.tableChange.bindingDeltas')) > 0
) t, UNNEST(bindingDeltas) bd

For example, with the query above you extract:
  • the Cloud Identity that initiated the change
  • the resource affected
  • the action performed (ADD or REMOVE)
  • the member involved
  • the role assigned
  • the timestamp

Query results tracking IAM binding changes on BigQuery resources
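
For a dashboard-friendly aggregate, you can build on top of the query above. The sketch below assumes you saved that query as a view, for example by prefixing it with CREATE OR REPLACE VIEW `PROJECT_ID.monitoring.iam_binding_changes` AS (the monitoring dataset and view name are just placeholders, and the view must live in a regular dataset because the linked dataset is read-only). It counts ADD and REMOVE actions per day and initiator:

-- Daily count of IAM binding changes per initiator and action.
-- `PROJECT_ID.monitoring.iam_binding_changes` is a hypothetical view
-- wrapping the query shown above.
SELECT
  DATE(timestamp) AS day,
  initiator,
  action,
  COUNT(*) AS changes
FROM `PROJECT_ID.monitoring.iam_binding_changes`
GROUP BY day, initiator, action
ORDER BY day DESC, changes DESC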

Create a Looker Studio Dashboard

Log Analytics data linked to a BigQuery dataset can be accessed by any BI tool, such as Looker Studio. Create a data source pointing to the linked dataset and build a dashboard on top of it: you can report any information you need about changes to your BigQuery IAM bindings!
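
In Looker Studio you can point the BigQuery connector either at a view directly or at a custom query. A minimal custom-query sketch, reusing the hypothetical iam_binding_changes view from the previous section and limiting the dashboard to the last 90 days:

-- Example custom query for the Looker Studio BigQuery connector.
-- Reuses the hypothetical `PROJECT_ID.monitoring.iam_binding_changes` view.
SELECT
  timestamp,
  initiator,
  resource,
  action,
  member,
  role
FROM `PROJECT_ID.monitoring.iam_binding_changes`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)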

Conclusion

By following these steps, you can create a Looker Studio dashboard to query your historical IAM binding information in BigQuery. This will allow you to track who has access to your data and when their access was granted or revoked.
