Automating PSC Publisher Subnet Utilization Monitoring on GCP

Utkarsh Sharma
Google Cloud - Community
3 min read · Jun 14, 2024

As organizations scale their GCP infrastructure, monitoring and managing IP address utilization becomes critical. This blog outlines an automated process to monitor the Private Service Connect (PSC) Publisher subnet utilization, ensuring that your subnets do not run out of IPs. We’ll walk through the steps to gather necessary data, create reports, ingest logs into GCP Log Explorer, and set up alerts for high utilization.

High-Level Steps

  1. List all PSC Publisher Services across all folders/projects in the organization.
  2. Identify the attached subnets and their IP ranges.
  3. Calculate the total number of IPs in each PSC subnet.
  4. List all the connected forwarding rules to the PSC Publisher.
  5. Subtract the used NAT IPs and reserved IPs from the total to get the available IPs.
  6. Generate the percentage of available IPs and send alerts.
  7. Send the output JSON as logs to GCP Log Explorer.
  8. Trigger alerts when subnet utilization reaches 80%, 90%, and 100%.
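Steps 3 through 6 boil down to simple CIDR arithmetic. A minimal sketch with the stdlib ipaddress module (the function name and sample numbers are mine, not from the repo; GCP reserves four addresses in every subnet, which is what the "reserved IPs" in step 5 covers):

```python
import ipaddress

def subnet_utilization(cidr, used_ips, reserved_ips=4):
    """Return (total, available, utilized_percent) for one PSC NAT subnet.

    GCP reserves four addresses in every subnet (network, gateway,
    second-to-last, and broadcast), so they are excluded from the usable pool.
    """
    total = ipaddress.ip_network(cidr).num_addresses
    usable = total - reserved_ips
    available = usable - used_ips
    return total, available, round(used_ips / usable * 100, 2)

# A /28 NAT subnet with 10 connected forwarding rules.
print(subnet_utilization("10.10.0.0/28", used_ips=10))  # → (16, 2, 83.33)
```

The percentage is taken against the usable pool rather than the raw CIDR size, so an alert at 80% reflects addresses that could actually be handed to forwarding rules.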

Execution Phases

Step 1: GCP Service Attachment Audit

This step involves pulling PSC Publisher information, creating a CSV and JSON report, and uploading it to Google Cloud Storage (GCS). The script provides a comprehensive overview of service attachments, associated NAT subnets, and forwarding rule usage within your GCP projects.

Scripts:

This GitHub repo contains the scripts.

Features:

  • Recursive project discovery
  • Detailed service attachment retrieval
  • NAT subnet analysis
  • Forwarding rule count
  • CSV output generation
  • Robust error handling

Prerequisites:

  1. Python Environment: Python 3.7 or higher installed.
  2. Google Cloud SDK: Install the Google Cloud SDK (gcloud).
  3. Google Cloud Project Setup: Enable Cloud Resource Manager API, Cloud Asset API, and Compute Engine API.
  4. Service Account: Create a service account with the roles: Browser, Cloud Asset Viewer, and Compute Network Viewer.
  5. Environment Variable: Set GOOGLE_APPLICATION_CREDENTIALS to the path of your service account's JSON key file.
  6. Python Libraries: Install the required libraries (ipaddress is part of the Python standard library since 3.3 and does not need a separate install):
pip3 install google-cloud-resourcemanager google-cloud-asset google-cloud-compute

Script Usage:

  1. Edit the script to specify parent_folder_ids with the actual IDs of your parent folders or organization node.
  2. Run the script:
python3 psc_service_attachments_subnet_monitor.py

The script discovers all projects, retrieves service attachment details, NAT subnet data, forwarding rule counts, and generates a CSV report.
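The NAT subnet analysis hinges on the subnet self-links each service attachment carries. A small stdlib-only helper (the function name is mine) shows how such a link splits into the project, region, and subnet name used in the report:

```python
def parse_subnet_self_link(self_link):
    """Split a subnet self-link into (project, region, subnet name).

    Service attachments reference their NAT subnets by URL, e.g.
    https://www.googleapis.com/compute/v1/projects/P/regions/R/subnetworks/S
    """
    parts = self_link.rstrip("/").split("/")
    return (parts[parts.index("projects") + 1],
            parts[parts.index("regions") + 1],
            parts[-1])

print(parse_subnet_self_link(
    "https://www.googleapis.com/compute/v1/projects/demo-proj"
    "/regions/us-central1/subnetworks/psc-nat-subnet"))
# → ('demo-proj', 'us-central1', 'psc-nat-subnet')
```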

CSV Output Columns:

  • Folder
  • Project
  • ServiceAttachment
  • NATSubnets
  • NATSubnetRanges
  • NATSubnetIPCount
  • ForwardingRuleCount
  • AvailableIPs
  • AvgUtilization(%)
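The report itself can be emitted with the stdlib csv module using exactly these columns; the row values below are purely illustrative:

```python
import csv
import io

COLUMNS = ["Folder", "Project", "ServiceAttachment", "NATSubnets",
           "NATSubnetRanges", "NATSubnetIPCount", "ForwardingRuleCount",
           "AvailableIPs", "AvgUtilization(%)"]

# One illustrative report row (values are made up for the example).
row = {
    "Folder": "folders/111111111111",
    "Project": "demo-proj",
    "ServiceAttachment": "psc-publisher-sa",
    "NATSubnets": "psc-nat-subnet",
    "NATSubnetRanges": "10.10.0.0/28",
    "NATSubnetIPCount": 16,
    "ForwardingRuleCount": 10,
    "AvailableIPs": 2,
    "AvgUtilization(%)": 83.33,
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```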

Step 2: PSC Subnet Monitoring Log Ingestion

This script ingests log data about PSC subnet utilization into Google Cloud Logging.

Prerequisites:

  1. Google Cloud Project: Active project with Cloud Logging enabled.
  2. Service Account: A service account with the “Cloud Logging Log Writer” role.
  3. Authentication: Set GOOGLE_APPLICATION_CREDENTIALS to the path of your service account's JSON key file.
  4. Log File: The script expects a JSON log file named service_attachments.json in the same directory.

Script Usage:

  1. Install dependencies:
pip3 install google-cloud-logging

  2. Prepare the log file service_attachments.json with properly formatted entries.
  3. Run the script:
python3 ship-logs-gcp.py
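The shipping step has two parts: shaping each report row into the jsonPayload fields that the step 3 alert filter extracts, and the Cloud Logging write itself. A sketch (the function names are mine; the field mapping is inferred from the label extractors in the alert policy):

```python
def to_log_entry(row):
    """Shape one step 1 report row into the jsonPayload whose fields the
    alert policy's label extractors reference."""
    return {
        "folder_path": row["Folder"],
        "project_id": row["Project"],
        "self_link": row["ServiceAttachment"],
        "subnet_name": row["NATSubnets"],
        "ip_range": row["NATSubnetRanges"],
        "utilized_percent": row["AvgUtilization(%)"],
    }

def ship_logs(entries, log_name="psc-subnet-monitoring"):
    """Write each entry to Cloud Logging as a structured (jsonPayload) record."""
    from google.cloud import logging as gcl  # requires google-cloud-logging
    logger = gcl.Client().logger(log_name)
    for entry in entries:
        logger.log_struct(entry, severity="INFO")
```

Writing via log_struct is what makes the entries land as jsonPayload, so the alert filter can match on jsonPayload fields rather than parsing text.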

Step 3: Google Cloud Monitoring Alert Policy Creation

This script automates the creation of alert policies in Google Cloud Monitoring using JSON configuration files.

Prerequisites:

  1. Google Cloud Project: Active project with Cloud Monitoring enabled.
  2. Service Account: A service account with the roles/monitoring.admin role.
  3. Authentication: Set GOOGLE_APPLICATION_CREDENTIALS to the path of your service account's JSON key file.
  4. Python and Libraries: Install the Google Cloud Monitoring library:
pip3 install google-cloud-monitoring

JSON Configuration:

Create a JSON file alert_policy_data.json adhering to the required structure.

Sample alert_policy_data.json:

{
  "name": "projects/<project-abc>/alertPolicies/123456",
  "displayName": "PSC Publisher Subnet utilization is above 80%",
  "documentation": {
    "content": "psc-subnet-monitor-alert \n\nThe PSC Subnet utilization is above 80%; please take a look.",
    "mimeType": "text/markdown"
  },
  "conditions": [
    {
      "name": "projects/<project-abc>/alertPolicies/123456/conditions/1234567",
      "displayName": "Log match condition",
      "conditionMatchedLog": {
        "filter": "logName=\"projects/<project-abc>/logs/psc-subnet-monitoring\" AND jsonPayload.jsonPayload.utilized_percent > 80",
        "labelExtractors": {
          "utilizedpercent": "EXTRACT(jsonPayload.jsonPayload.utilized_percent)",
          "iprange": "EXTRACT(jsonPayload.jsonPayload.ip_range)",
          "psc": "EXTRACT(jsonPayload.jsonPayload.self_link)",
          "folder": "EXTRACT(jsonPayload.jsonPayload.folder_path)",
          "subnet": "EXTRACT(jsonPayload.jsonPayload.subnet_name)",
          "project": "EXTRACT(jsonPayload.jsonPayload.project_id)"
        }
      }
    }
  ],
  "alertStrategy": {
    "notificationRateLimit": {
      "period": "300s"
    },
    "autoClose": "259200s"
  },
  "combiner": "OR",
  "enabled": true,
  "notificationChannels": [
    "projects/<project-abc>/notificationChannels/987654321"
  ]
}

Script Usage:

  1. Configure the script with your project ID.
  2. Prepare alert_policy_data.json with the correct structure and values.
  3. Run the script:
python3 setup-alert.py
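The repo's setup-alert.py is not reproduced here, but a minimal sketch of what such a script does (function names and the validation step are mine; the create call assumes the google-cloud-monitoring client and proto-plus's from_json helper, which accepts the API's camelCase JSON):

```python
import json

REQUIRED_KEYS = {"displayName", "conditions", "combiner", "notificationChannels"}

def validate_policy(policy):
    """Cheap local sanity check on alert_policy_data.json before any API call."""
    missing = REQUIRED_KEYS - policy.keys()
    if missing:
        raise ValueError(f"alert_policy_data.json is missing keys: {sorted(missing)}")
    return True

def create_policy(project_id, policy):
    """Create the alert policy in Cloud Monitoring from the JSON dict."""
    from google.cloud import monitoring_v3  # requires google-cloud-monitoring
    policy = json.loads(json.dumps(policy))  # deep copy before stripping fields
    policy.pop("name", None)                 # server assigns names on create
    for cond in policy.get("conditions", []):
        cond.pop("name", None)
    client = monitoring_v3.AlertPolicyServiceClient()
    return client.create_alert_policy(
        name=f"projects/{project_id}",
        alert_policy=monitoring_v3.AlertPolicy.from_json(json.dumps(policy)),
    )
```

Stripping the "name" fields matters because the sample JSON above was exported from an existing policy; on create, the API assigns policy and condition names itself.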

Sample Alert

(Screenshot: the resulting alert as it appears in Cloud Monitoring.)

Conclusion

By following these steps, you can automate the monitoring of PSC Publisher subnet utilization on GCP. Your subnets are kept from silently running out of IP addresses, and alerts fire at the 80%, 90%, and 100% utilization thresholds, enabling proactive management of your GCP resources.
