GCP Cloud Asset Inventory Feed: Get real-time notifications on resource changes

Sandeep Bihani
Google Cloud - Community
6 min read · Nov 23, 2022

In an organization with many cloud assets, tracking the entire inventory of resources is challenging. You could list all the resources in each project individually, but that is time-consuming, error-prone, and tedious. Even if you managed to pull the list of every resource in every project, tracking how those resources change over time would be a nightmare, and obtaining the state of your resources at any given time would be nearly impossible. Checking whether each resource adheres to the enterprise’s security and compliance standards is more challenging still.

Here’s where GCP Cloud Asset Inventory can help. Cloud Asset Inventory keeps track of your GCP resources over time: it maintains a five-week history of metadata for each asset and lets you query your inventory as of any particular point in time.

Below are some of the notable features of Cloud Asset Inventory:

  1. Search asset metadata.
  2. Export the inventory at any given timestamp, or export change history for a certain time range.
  3. Subscribe to real-time notifications for changes to assets using asset feeds.
  4. Use the IAM Policy Analyzer to check who has access to what.

In this article, we will see how to subscribe to real-time notifications for changes to our assets and get an alert when a Google Compute Engine instance with a public IP is created.

Table of Contents

  1. Architecture
  2. Pre-requisites
  3. Create Asset Feed
  4. Setup Alerts
  5. Test & Verify
  6. Clean Up
  7. Conclusion

1. Architecture

The figure below shows the high-level architecture for this solution.

We will use Cloud Asset Inventory, Pub/Sub, and Cloud Functions to generate real-time notifications.

When a Compute Engine instance is created with a public IP, the change is captured by Cloud Asset Inventory in real time. Through the asset feed, this information is published to a Pub/Sub topic. A Cloud Function subscribes to the Pub/Sub topic and sends a notification to a configured Slack channel.

2. Pre-requisites

As a first step, enable the Cloud Asset, Pub/Sub, Resource Manager, and Cloud Functions APIs in your project.

gcloud services enable cloudasset.googleapis.com pubsub.googleapis.com cloudfunctions.googleapis.com cloudresourcemanager.googleapis.com

Now, let’s create a Pub/Sub topic, which will be the target for the asset feed.

gcloud pubsub topics create gce-public-ip-feed-topic --project=PROJECT_ID

3. Create Asset Feed

Now let us create an asset feed. A feed is a stream of changes that you can subscribe to. Each level (project, folder, organization) can have up to 200 feeds.
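Because that quota is counted per level, it can be useful to check which feeds already exist before creating a new one. A standard way to do that with the same gcloud command group:

```shell
# List existing asset feeds in a project. Use --folder=FOLDER_ID or
# --organization=ORG_ID instead to list feeds at those levels.
gcloud asset feeds list --project=PROJECT_ID
```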

So let’s create a feed that captures compute instance creation in real time.

gcloud asset feeds create gce-public-ip-feed --project=PROJECT_ID \
--content-type=resource \
--asset-types="compute.googleapis.com/Instance" \
--pubsub-topic="projects/PROJECT_ID/topics/gce-public-ip-feed-topic" \
--condition-expression="temporal_asset.prior_asset_state == google.cloud.asset.v1.TemporalAsset.PriorAssetState.DOES_NOT_EXIST"

Note: An asset feed can only be created through the CLI or the API; there is no option to create it through the Cloud Console.

For argument reference, check out the documentation for gcloud asset feeds.

For the list of supported asset types, refer to the documentation here.

By default, any change (create/update/delete) to the resource results in a message being published to the Pub/Sub topic. We can use the --condition-expression flag to filter for specific changes to the underlying asset type. In the example above, I have filtered on new instance creation only. You can refer to the documentation on using --condition-expression here.

You can also use the --asset-names flag to restrict the feed to specific named resources.

--pubsub-topic is the topic we created earlier; it is the target for this feed.

Use the command below to retrieve the created feed:

gcloud asset feeds describe gce-public-ip-feed --project=PROJECT_ID

Now we are ready to consume the changes and send out notifications.

Note: It may take up to 10 minutes after creation before the feed starts sending messages to the Pub/Sub topic.
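Before wiring up the alerting function, it helps to see the shape of what the feed delivers. Each Pub/Sub message carries a JSON TemporalAsset payload, base64-encoded in the event data. The snippet below is a minimal sketch that decodes such a message the same way the Cloud Function will; the sample values are illustrative, not captured from a real feed:

```python
import base64
import json

# A trimmed, illustrative feed message. Field names follow the
# TemporalAsset format; the resource data here is hypothetical.
sample = {
    "asset": {
        "name": "//compute.googleapis.com/projects/my-project/zones/us-central1-a/instances/test-instance",
        "assetType": "compute.googleapis.com/Instance",
        "resource": {"data": {"name": "test-instance", "status": "RUNNING"}},
    },
    "priorAssetState": "DOES_NOT_EXIST",
}

# Pub/Sub delivers the message body base64-encoded, which is what the
# Cloud Function receives in event['data'].
event = {"data": base64.b64encode(json.dumps(sample).encode("utf-8"))}

# Decode exactly as the function will.
decoded = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
print(decoded["asset"]["assetType"])   # compute.googleapis.com/Instance
print(decoded["priorAssetState"])      # DOES_NOT_EXIST
```

The priorAssetState of DOES_NOT_EXIST is what our --condition-expression matches on: it marks a newly created asset.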

4. Setup Alerts

To send notifications to a Slack channel, do the following:

  1. Create a new Slack app with sufficient privileges to post messages to a public Slack channel. Use the instructions here to create the app and generate a TOKEN, which the Cloud Function will use to post messages to a channel.
  2. Create and deploy a Cloud Function that posts chat messages to Slack when notifications are received from Pub/Sub.

In this section, we’ll set up a Cloud Function that reads messages from our Pub/Sub topic and sends out a Slack message when an instance is created with a public IP.

First, let’s create a directory for our Cloud Function code:

mkdir -p alert_function && cd alert_function

Now create two files: requirements.txt and main.py.

requirements.txt

requests==2.26.0

main.py

import base64
import json
import requests


def filter_rule(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.

    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    TOKEN = "xoxb-XXXXXXXXXXX-YYYYYYYY-ZZZZZZZ"
    URL = "https://slack.com/api/chat.postMessage"

    # Decode the base64-encoded Pub/Sub message into the feed's JSON payload.
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    message_json = json.loads(pubsub_message)

    # Collect every "type" value in the payload; instances with a public IP
    # carry an access config of type "ONE_TO_ONE_NAT".
    natIP = json_extract(message_json, "type")

    if 'ONE_TO_ONE_NAT' in natIP:
        payload = get_data(message_json)
        name = payload["name"]
        requests.post(URL, data={
            "token": TOKEN,
            "channel": "#general",
            "text": f"Instance *{name}* was created with external IP!"
        })


def get_data(message_json):
    """Return the resource data, falling back to the prior asset state."""
    try:
        return message_json["asset"]["resource"]["data"]
    except KeyError:
        return message_json["priorAsset"]["resource"]["data"]


def json_extract(obj, key):
    """Recursively fetch values from nested JSON."""
    arr = []

    def extract(obj, arr, key):
        """Recursively search for values of key in JSON tree."""
        if isinstance(obj, dict):
            for k, v in obj.items():
                if isinstance(v, (dict, list)):
                    extract(v, arr, key)
                elif k == key:
                    arr.append(v)
        elif isinstance(obj, list):
            for item in obj:
                extract(item, arr, key)
        return arr

    values = extract(obj, arr, key)
    return values

Replace the TOKEN variable with your own Slack app bot auth token, as created in step 1 above.
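To sanity-check the filtering logic locally before deploying, you can run json_extract against a hand-built payload. The resource data below is hypothetical, but it mirrors how GCE reports a public IP: an access config of type ONE_TO_ONE_NAT under networkInterfaces:

```python
def json_extract(obj, key):
    """Recursively fetch values of `key` from nested JSON (as in main.py)."""
    arr = []

    def extract(obj, arr, key):
        if isinstance(obj, dict):
            for k, v in obj.items():
                if isinstance(v, (dict, list)):
                    extract(v, arr, key)
                elif k == key:
                    arr.append(v)
        elif isinstance(obj, list):
            for item in obj:
                extract(item, arr, key)
        return arr

    return extract(obj, arr, key)


# Hypothetical feed payload for an instance that has an external IP.
message_json = {
    "asset": {"resource": {"data": {
        "name": "test-instance",
        "networkInterfaces": [{
            "accessConfigs": [
                {"type": "ONE_TO_ONE_NAT", "natIP": "203.0.113.7"}
            ]
        }]
    }}}
}

types = json_extract(message_json, "type")
print(types)                         # ['ONE_TO_ONE_NAT']
print("ONE_TO_ONE_NAT" in types)     # True -> a Slack alert would be sent
```

An instance created without an external IP has no access config of this type, so the check fails and no message is posted.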

Now deploy this Cloud Function:

gcloud functions deploy gce-public-ip-alert --region=us-central1 --entry-point=filter_rule --runtime=python39 --trigger-topic=gce-public-ip-feed-topic

Note: You will be prompted to enable the Cloud Build service if it’s not already enabled.

Once your function is in the Active state, we are ready to test and verify our setup.

5. Test & Verify

Let’s create a Compute Engine instance with a public IP in our project for testing purposes:

gcloud compute instances create test-instance --project=PROJECT_ID --network=NETWORK_ID --subnet=SUBNET_ID --zone=us-central1-a

If everything is set up correctly, you’ll see a message on your Slack channel.

Notification on Slack

6. Clean Up

If you used a new project for this demo, you can simply delete the project to clean up the resources. Otherwise, follow the steps below to delete the resources created:

Delete Instance

gcloud compute instances delete test-instance --project=PROJECT_ID

Delete Cloud Function

gcloud functions delete gce-public-ip-alert --project=PROJECT_ID

Delete Asset Feed

gcloud asset feeds delete gce-public-ip-feed --project=PROJECT_ID

Delete Pub/Sub Topic

gcloud pubsub topics delete gce-public-ip-feed-topic --project=PROJECT_ID

7. Conclusion

Google Cloud Asset Inventory is a powerful tool for organizations to keep track of all their resources in the cloud. With asset feeds, they can get real-time alerts on changes happening in their environment and take appropriate action using Cloud Functions.

Footnotes:

There are a couple of other options to achieve similar functionality:

  1. Create logs-based metrics in Cloud Logging and create alerts on them.
  2. Use organization policy constraints to disable public IPs on GCE VM instances by default, and allow them only on certain VMs. More details can be found here.
