Using GCP Cloud Asset Inventory Export to keep track of your GCP resources over time

Lorenzo Caggioni
Google Cloud - Community
3 min read · Sep 1, 2021

Google Cloud Asset Inventory is a great service that allows you to view, monitor, and analyse your GCP assets, giving you the option to export a snapshot of your entire inventory at any point in time (up to 35 days in the past).

Cloud Asset Inventory: discover, monitor, and analyse all your assets in one place.
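
You can already take a one-off snapshot from the command line. A minimal sketch, assuming the gcloud CLI is authenticated, the Cloud Asset API is enabled, and the target dataset already exists (project, dataset and table names below are placeholders):

# One-off export of all resources in a project to a BigQuery table.
gcloud asset export \
  --project=YOUR_PROJECT \
  --content-type=resource \
  --bigquery-table=//bigquery.googleapis.com/projects/YOUR_PROJECT/datasets/cai/tables/assets_snapshot \
  --output-bigquery-force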

Sometimes you want to keep track of your assets for longer, for dashboarding or analysis. The code shown below implements a solution that exports your Cloud Asset Inventory to BigQuery on a weekly basis.

The example is designed to export resources at the project level. This was done to keep the example simple and self-contained. In actual use some changes would of course be needed:

  • scoping the export, and the Cloud Function’s IAM custom role/bindings, to export data at organisation/folder level (see the sketch after this list);
  • tuning the function’s retry and error handling;
  • integrating the export with the dashboarding tool you like the most.
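
For instance, a hand-run equivalent of an organisation-level export looks like this; a minimal sketch, assuming you hold the cloudasset.assets.exportResource permission at the organisation node and have a billing project to charge the call to (all IDs below are placeholders):

# Export all resources in an organisation instead of a single project.
gcloud asset export \
  --organization=123456789012 \
  --billing-project=YOUR_PROJECT \
  --content-type=resource \
  --bigquery-table=//bigquery.googleapis.com/projects/YOUR_PROJECT/datasets/cai/tables/assets_org \
  --output-bigquery-force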

This is a high-level overview of the resources that will be created by the example code:

Architecture created by the example: Cloud Scheduler, Pub/Sub, Cloud Function, Asset Inventory, BigQuery.

To run the example, start by cloning the Cloud Foundation Fabric repository and moving to the blueprints/cloud-operations/scheduled-asset-inventory-export-bq folder.

git clone https://github.com/terraform-google-modules/cloud-foundation-fabric.git
cd cloud-foundation-fabric/blueprints/cloud-operations/scheduled-asset-inventory-export-bq

Once in the right folder, create a terraform.tfvars file with a text editor of your choice and specify at least the following variables:

cai_config = {
  bq_dataset = "cai"
  bq_table   = "assets"
}
project_id      = "YOUR_PROJECT"
billing_account = "111111-222222-333333"
root_node       = "folders/112233445566"

The script deploys all resources in an existing project; if you want it to create a new project instead, take a look at the variables file and set your variables accordingly.

Initialise Terraform and apply the changes:

terraform init
terraform apply

When Terraform has finished, a job is scheduled to run on a weekly basis to extract the data and store it in BigQuery. If you want to run a test, open the Cloud Scheduler panel and hit the Run now button.

Google Cloud Scheduler console.
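
You can also trigger the job from the command line; a minimal sketch, where the job name and location are assumptions (check the console or the Terraform outputs for the real values):

# List the scheduled jobs to find the one created by the example...
gcloud scheduler jobs list --project=YOUR_PROJECT
# ...then force an immediate run.
gcloud scheduler jobs run asset-inventory-export --location=europe-west1 --project=YOUR_PROJECT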

Once the task ends, you’ll be able to check the data in BigQuery:

Example statement to query Cloud Asset Inventory data exported to BigQuery
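
If you prefer the command line to the console, a query along these lines gives a first feel for the data; a minimal sketch, assuming the dataset and table names from the terraform.tfvars above (asset_type and name are standard columns in resource exports):

# Count exported assets by type, most common first.
bq query --use_legacy_sql=false \
  'SELECT asset_type, COUNT(*) AS assets
   FROM `YOUR_PROJECT.cai.assets`
   GROUP BY asset_type
   ORDER BY assets DESC'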

Next Steps

Once you have exported your Cloud Asset Inventory data to BigQuery, you can create pipelines or dashboards to analyse or monitor your GCP assets.
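
To track assets over time rather than keep a single rolling snapshot, one option is to export into a table partitioned on the snapshot read time; a minimal sketch, assuming the --partition-key flag of gcloud asset export and the readTime column it adds to the destination table (table names are placeholders):

# Write each weekly snapshot into its own read-time partition.
gcloud asset export \
  --project=YOUR_PROJECT \
  --content-type=resource \
  --bigquery-table=//bigquery.googleapis.com/projects/YOUR_PROJECT/datasets/cai/tables/assets_history \
  --partition-key=read-time \
  --output-bigquery-force

# Chart asset counts per snapshot over time.
bq query --use_legacy_sql=false \
  'SELECT DATE(readTime) AS snapshot_date, COUNT(*) AS assets
   FROM `YOUR_PROJECT.cai.assets_history`
   GROUP BY snapshot_date
   ORDER BY snapshot_date'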

