A Complete Guide to GCP Cloud Build, Cloud Deployment Manager, Operations, Cloud Source Repositories, Secure Source Manager, Artifact Registry, Cloud Tasks, and Pub/Sub

Warley's CatOps
25 min read · Jul 12, 2024


Introduction to GCP DevOps Tools

Google Cloud Platform (GCP) offers a comprehensive suite of DevOps tools designed to streamline the development, deployment, and management of applications. These tools facilitate continuous integration, continuous delivery, infrastructure management, monitoring, and more. In this chapter, we’ll introduce the key DevOps tools available on GCP and discuss the benefits of using them for your DevOps needs.

Key GCP DevOps Tools

1. Cloud Build
— Purpose: Automates the process of building, testing, and deploying code.
— Features: Supports multiple languages and build tools, integrates with GitHub, Bitbucket, and Cloud Source Repositories, offers custom build steps and triggers.

2. Cloud Deployment Manager
— Purpose: Automates the creation and management of GCP resources using configuration files.
— Features: Declarative syntax, supports YAML and Jinja2 templates, allows for reusable configurations, integrates with other GCP services.

3. Operations (formerly Stackdriver)
— Purpose: Provides monitoring, logging, tracing, and error reporting for applications running on GCP and other platforms.
— Features: Real-time monitoring, centralized logging, distributed tracing, and integrated error reporting.

4. Cloud Source Repositories
— Purpose: Provides private Git repositories for storing and managing source code.
— Features: Integrates with Cloud Build, supports code search and review, offers mirroring of GitHub and Bitbucket repositories.

5. Artifact Registry
— Purpose: Manages and secures build artifacts and dependencies.
— Features: Supports Docker images, Maven artifacts, npm packages, and more, integrates with Cloud Build for seamless artifact management.

6. Cloud Tasks
— Purpose: Manages the execution of distributed tasks in a reliable and scalable manner.
— Features: Asynchronous task execution, supports HTTP and App Engine targets, integrates with other GCP services for event-driven architectures.

7. Pub/Sub
— Purpose: Provides messaging services for asynchronous communication between services.
— Features: Scalable and reliable message delivery, supports push and pull messaging, integrates with Cloud Functions, Dataflow, and more.

Benefits of Using GCP for DevOps
1. Scalability
— GCP’s infrastructure scales seamlessly with your needs, ensuring that your applications can handle increased load without performance degradation.

2. Reliability
— GCP provides highly reliable services with built-in redundancy and failover mechanisms, ensuring high availability for your applications.

3. Security
— GCP offers robust security features, including IAM, encryption, and compliance with industry standards, to protect your applications and data.

4. Integration
— GCP’s DevOps tools are designed to integrate seamlessly with each other and with third-party tools, providing a cohesive and efficient development environment.

5. Cost-Effectiveness
— GCP offers flexible pricing models and cost management tools, allowing you to optimize your spending and get the most value from your cloud investment.

Conclusion

This introduction has provided a high-level overview of the key DevOps tools available on GCP and the benefits of using them for your development and operations needs. In the following chapters, we’ll dive deeper into each of these tools, starting with Cloud Build.

Cloud Build — Automating Your CI/CD Pipelines

In this chapter, we’ll explore Cloud Build, Google Cloud Platform’s powerful tool for automating continuous integration and continuous delivery (CI/CD) pipelines. We’ll cover the basics of setting up your first build pipeline, discuss advanced configurations and build triggers, and provide coding examples for both beginners and professionals.

What is Cloud Build?
Cloud Build is a fully managed service that lets you build, test, and deploy software quickly, at scale. It can import source code from various repositories, execute builds using Docker containers, and produce artifacts that can be deployed to different environments.

Key Features
- Multiple Source Repositories: Supports GitHub, Bitbucket, Cloud Source Repositories, and local files.
- Custom Build Steps: Define custom build steps using Docker containers.
- Build Triggers: Automatically trigger builds based on repository events.
- Parallel Builds: Run multiple builds simultaneously to save time.
- Integration: Seamlessly integrates with other GCP services like Artifact Registry, App Engine, and Kubernetes Engine.

Setting Up Your First Cloud Build Pipeline
Step 1: Prerequisites
- A GCP project with billing enabled.
- Cloud Build API enabled.
- Source code in a supported repository (e.g., GitHub, Cloud Source Repositories).

Step 2: Creating a Build Configuration File
Cloud Build uses a configuration file (`cloudbuild.yaml` or `cloudbuild.json`) to define the build steps. Here’s a basic example:

steps:
- name: 'gcr.io/cloud-builders/git'
  args: ['clone', 'https://github.com/GoogleCloudPlatform/nodejs-docs-samples']

- name: 'gcr.io/cloud-builders/npm'
  args: ['install']
  dir: 'nodejs-docs-samples/appengine/hello-world'

- name: 'gcr.io/cloud-builders/npm'
  args: ['test']
  dir: 'nodejs-docs-samples/appengine/hello-world'

Step 3: Running Your First Build
Use the following command to start a build using the configuration file:

gcloud builds submit --config=cloudbuild.yaml .

This command uploads your source code to Cloud Build and executes the build steps defined in the configuration file.
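
Builds can also be parameterized with user-defined substitutions; `_SERVICE_NAME` below is an arbitrary example variable that the build configuration would reference as `${_SERVICE_NAME}`:

gcloud builds submit --config=cloudbuild.yaml --substitutions=_SERVICE_NAME=my-app .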

Advanced Cloud Build Configurations
Custom Build Steps
You can create custom build steps using any Docker container. For example, to use a custom Python container:

steps:
- name: 'python:3.8-slim'
  args: ['python', 'scripts/test.py']

Using Build Triggers
Build triggers allow you to automatically start builds based on events in your source repository. Here’s how to create a build trigger:

1. Go to the Cloud Build Triggers page in the GCP Console.
2. Click “Create Trigger”.
3. Select your source repository and configure the trigger (e.g., on push to the `main` branch).
4. Choose the build configuration file (`cloudbuild.yaml`).
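
Triggers can also be created non-interactively with gcloud; a minimal sketch for a GitHub-connected repository (the owner and repository names are placeholders):

gcloud builds triggers create github \
  --repo-owner=my-github-org \
  --repo-name=my-repo \
  --branch-pattern='^main$' \
  --build-config=cloudbuild.yaml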

Managing Secrets and Environment Variables
Cloud Build allows you to manage secrets and environment variables securely. Here’s how to use Secret Manager to pass a secret to your build:

1. Store the secret in Secret Manager:

echo -n 'my-secret-value' | gcloud secrets create my-secret --data-file=-

2. Reference the secret in your build configuration using the `availableSecrets` field (the older `secrets` block with `kmsKeyName` is a legacy, KMS-based mechanism and does not take Secret Manager resource names):

steps:
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args: ['-c', 'docker build --build-arg MY_SECRET=$$MY_SECRET -t gcr.io/my-project/my-app .']
  secretEnv: ['MY_SECRET']

availableSecrets:
  secretManager:
  - versionName: projects/my-project/secrets/my-secret/versions/latest
    env: 'MY_SECRET'

Coding Examples
Beginner Example: Simple Node.js Application

steps:
- name: 'gcr.io/cloud-builders/npm'
  args: ['install']
- name: 'gcr.io/cloud-builders/npm'
  args: ['test']

Professional Example: Multi-Stage Docker Build

The multi-stage logic lives in the Dockerfile itself; the build configuration simply builds the image and lists it under `images` so Cloud Build pushes and records it as a build artifact:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/my-project/my-app:latest', '.']
images:
- 'gcr.io/my-project/my-app:latest'

Conclusion

Cloud Build is a versatile and powerful tool for automating your CI/CD pipelines on GCP. By defining build steps in a configuration file and using build triggers, you can streamline your development workflow and ensure consistent, reliable deployments.

Cloud Deployment Manager — Infrastructure as Code

In this chapter, we’ll explore Cloud Deployment Manager, Google Cloud Platform’s service for managing your infrastructure using configuration files. We’ll discuss how to create and manage deployments, dive into advanced templating and configuration management, and provide coding examples for both beginners and professionals.

What is Cloud Deployment Manager?
Cloud Deployment Manager is an infrastructure management service that allows you to define and deploy resources on GCP using declarative configuration files. This enables you to automate the creation and management of your cloud resources, ensuring consistency and repeatability.

Key Features
- Declarative Syntax: Define your infrastructure in simple YAML or Jinja2 templates.
- Reusable Configurations: Use templates and modules to create reusable configurations.
- Integration: Seamlessly integrates with other GCP services.
- Version Control: Store and version your configurations in a source repository.

Creating and Managing Deployments
Step 1: Prerequisites
- A GCP project with billing enabled.
- Cloud Deployment Manager API enabled.

Step 2: Creating a Basic Configuration File
Deployment Manager uses configuration files written in YAML or Jinja2 to define resources. Here’s a basic YAML example:

resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/f1-micro
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/default

Step 3: Deploying the Configuration
Use the following command to deploy your configuration file:

gcloud deployment-manager deployments create my-deployment --config config.yaml

This command creates a new deployment named `my-deployment` using the configuration defined in `config.yaml`.

Step 4: Updating and Deleting Deployments
To update an existing deployment with a new configuration:

gcloud deployment-manager deployments update my-deployment --config new-config.yaml

To delete a deployment:

gcloud deployment-manager deployments delete my-deployment
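
Updates can also be staged as a preview, letting you inspect the planned changes before committing them:

gcloud deployment-manager deployments update my-deployment --config new-config.yaml --preview
gcloud deployment-manager deployments update my-deployment

The second command, run without a config, applies the previewed changes.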

Advanced Templating and Configuration Management

Using Jinja2 Templates
Jinja2 templates allow you to create more dynamic and reusable configurations. Here’s an example (`project` is one of Deployment Manager’s built-in `env` variables, while `zone` must be supplied as a template property):

{% set project = env['project'] %}
{% set zone = properties['zone'] %}

resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: {{ zone }}
    machineType: zones/{{ zone }}/machineTypes/f1-micro
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/default

Creating and Using Modules
Modules allow you to encapsulate common configurations and reuse them across different deployments. Here’s an example of a module for a VM instance:

vm-template.jinja:

resources:
- name: {{ properties['name'] }}
  type: compute.v1.instance
  properties:
    zone: {{ properties['zone'] }}
    machineType: zones/{{ properties['zone'] }}/machineTypes/{{ properties['machineType'] }}
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/default

config.yaml:

imports:
- path: vm-template.jinja

resources:
- name: my-vm
  type: vm-template.jinja
  properties:
    name: my-vm
    zone: us-central1-a
    machineType: f1-micro

Coding Examples
Beginner Example: Simple VM Deployment

config.yaml:

resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/f1-micro
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/default

Professional Example: Multi-Resource Deployment with Jinja2 Templates
network-template.jinja:

resources:
- name: {{ properties['networkName'] }}
  type: compute.v1.network
  properties:
    autoCreateSubnetworks: false

vm-template.jinja:

resources:
- name: {{ properties['vmName'] }}
  type: compute.v1.instance
  properties:
    zone: {{ properties['zone'] }}
    machineType: zones/{{ properties['zone'] }}/machineTypes/{{ properties['machineType'] }}
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/{{ properties['networkName'] }}

config.yaml:

imports:
- path: network-template.jinja
- path: vm-template.jinja

resources:
- name: my-network
  type: network-template.jinja
  properties:
    networkName: my-network

- name: my-vm
  type: vm-template.jinja
  properties:
    vmName: my-vm
    zone: us-central1-a
    machineType: f1-micro
    networkName: my-network

Conclusion

Cloud Deployment Manager provides a powerful and flexible way to manage your GCP resources using Infrastructure as Code (IaC). By using declarative configuration files and templates, you can automate and simplify the deployment and management of your cloud infrastructure.

Operations (formerly Stackdriver) — Monitoring and Logging

In this chapter, we’ll delve into Google Cloud’s operations suite, formerly known as Stackdriver, which provides comprehensive monitoring, logging, tracing, and error reporting for your applications. We’ll cover setting up monitoring and logging, using Cloud Trace and Cloud Debugger, and provide examples for both beginners and professionals.

What is GCP Operations?
GCP Operations is a suite of tools designed to provide visibility into the health, performance, and reliability of your applications and infrastructure. It includes Cloud Monitoring, Cloud Logging, Cloud Trace, Cloud Debugger, and Cloud Error Reporting.

Key Features
- Cloud Monitoring: Provides insights into the performance, uptime, and overall health of your applications and infrastructure.
- Cloud Logging: Collects and analyzes logs from your applications and GCP services.
- Cloud Trace: Tracks the latency of your applications to identify performance bottlenecks.
- Cloud Debugger: Allows you to inspect the state of your application in real-time without stopping it.
- Cloud Error Reporting: Automatically detects and reports errors in your applications.

Setting Up Monitoring and Logging
Step 1: Enabling APIs
Before using the operations suite, you need to enable the required APIs in your GCP project:

gcloud services enable monitoring.googleapis.com logging.googleapis.com

Step 2: Setting Up Cloud Monitoring
1. Create Monitoring Workspaces:

— Go to the Cloud Monitoring page in the GCP Console.
— Follow the prompts to create a new monitoring workspace if you don’t already have one.

2. Create Dashboards:
— In the Monitoring Console, navigate to “Dashboards” and create a new dashboard.
— Add widgets to monitor various metrics such as CPU usage, memory usage, and custom metrics.

3. Set Up Alerting Policies:
— Navigate to “Alerting” and create alerting policies.
— Define conditions that trigger alerts, such as high CPU usage or low disk space.
— Configure notification channels to receive alerts via email, SMS, or other methods.
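
Alerting policies can also be created from the command line by supplying the policy as a file; note that the policies group has historically lived under the alpha/beta gcloud components, so verify against your gcloud version:

gcloud alpha monitoring policies create --policy-from-file=policy.yaml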

Step 3: Setting Up Cloud Logging
1. Configure Log Sinks:

— Navigate to the Cloud Logging page in the GCP Console.
— Set up log sinks to export logs to destinations like BigQuery, Cloud Storage, or Pub/Sub for further analysis.

2. Create Log-Based Metrics:
— In Cloud Logging, create log-based metrics to generate custom metrics from your log data.
— Use these metrics to monitor specific events or conditions in your logs.

3. View and Analyze Logs:
— Use the Logs Explorer to search, filter, and analyze your logs.
— Create custom queries to find specific log entries or patterns.
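
Both sinks and log-based metrics can also be created from the command line; a minimal sketch, assuming a pre-existing BigQuery dataset named `my_dataset`:

gcloud logging sinks create my-sink \
  bigquery.googleapis.com/projects/my-project/datasets/my_dataset \
  --log-filter='severity>=ERROR'

gcloud logging metrics create error-count \
  --description='Count of error-level log entries' \
  --log-filter='severity>=ERROR'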

Using Cloud Trace and Cloud Debugger
Cloud Trace
Cloud Trace helps you understand the latency and performance of your applications by tracking requests as they travel through your services.

1. Enable Cloud Trace API:

gcloud services enable cloudtrace.googleapis.com

2. Instrument Your Application:
— Use the Cloud Trace agent library (`@google-cloud/trace-agent`) to instrument your application. For example, in a Node.js application, the agent must be loaded before any other code:

// Must be the first statement executed in the application
const tracer = require('@google-cloud/trace-agent').start();

3. View Traces:
— Navigate to the Cloud Trace page in the GCP Console to view traces and analyze latency data.

Cloud Debugger
Cloud Debugger allows you to inspect the state of your application in real time without stopping or slowing it down. Note that Google has since deprecated Cloud Debugger (it was shut down in May 2023) in favor of the open-source Snapshot Debugger, so treat the steps below as historical.

1. Enable Cloud Debugger API:

gcloud services enable clouddebugger.googleapis.com

2. Instrument Your Application:
— For Java, the debugger is attached as an agent when the JVM starts rather than from application code. For example:

java -agentpath:/opt/cdbg/cdbg_java_agent.so -jar my-app.jar

3. Set Breakpoints:
— Navigate to the Cloud Debugger page in the GCP Console to set breakpoints and view application state.

Coding Examples
Beginner Example: Basic Monitoring Setup
monitoring.yaml (a simplified alerting-policy definition; notification channels are created separately and referenced by their IDs):

displayName: "Basic Monitoring Policy"
combiner: "OR"
conditions:
- displayName: "High CPU Usage"
  conditionThreshold:
    filter: "metric.type=\"compute.googleapis.com/instance/cpu/utilization\" AND resource.type=\"gce_instance\""
    comparison: "COMPARISON_GT"
    thresholdValue: 0.8
    duration: "60s"
notificationChannels:
- "projects/my-project/notificationChannels/CHANNEL_ID"

Professional Example: Advanced Logging and Trace
app.js (Node.js Example):

// The trace agent must start before any other imports
const tracer = require('@google-cloud/trace-agent').start();

const { Logging } = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('my-log');
const metadata = { resource: { type: 'global' } };

const entry = log.entry(metadata, { message: 'Hello, world!' });
log.write(entry).catch(console.error);

// Create and immediately close a custom trace span
const span = tracer.createChildSpan({ name: 'my-span' });
span.endSpan();

Conclusion

GCP Operations provides a robust set of tools for monitoring, logging, tracing, and debugging your applications. By setting up Cloud Monitoring and Cloud Logging, and using Cloud Trace and Cloud Debugger, you can gain deep insights into your application’s performance and reliability.

Cloud Source Repositories — Version Control

In this chapter, we’ll explore Cloud Source Repositories, Google Cloud Platform’s service for managing Git repositories. We’ll discuss how to set up your first repository, integrate it with Cloud Build, and provide coding examples for both beginners and professionals. We will also touch upon the upcoming deprecation of Cloud Source Repositories and the new product that will take its place.

What are Cloud Source Repositories?
Cloud Source Repositories provide fully-featured, scalable, and private Git repositories hosted on GCP. They allow you to store, manage, and version your source code while integrating seamlessly with other GCP services. However, it’s important to note that Cloud Source Repositories will be deprecated and replaced by Secure Source Manager.

Key Features
- Fully Managed: Managed Git repositories with high availability and scalability.
- Integration: Integrates with Cloud Build for CI/CD, and other GCP services.
- Code Search: Powerful search capabilities to quickly find code snippets across repositories.
- Security: Fine-grained access control and auditing.

Deprecation Notice
As per the official documentation, Cloud Source Repositories will be deprecated. The new service, Secure Source Manager, will take its place. Secure Source Manager aims to provide enhanced security and management features for your source code.

Setting Up Your First Repository
Step 1: Enabling APIs
Before using Cloud Source Repositories, you need to enable the required APIs in your GCP project:

gcloud services enable sourcerepo.googleapis.com

Step 2: Creating a Repository
1. Via GCP Console:

— Navigate to the Cloud Source Repositories page in the GCP Console.
— Click “Add Repository” and follow the prompts to create a new repository.

2. Via Command Line:
— Use the following command to create a repository:

gcloud source repos create my-repo

Step 3: Cloning the Repository
Clone the repository to your local machine:

gcloud source repos clone my-repo

This command creates a local copy of the repository in a directory named `my-repo`.

Step 4: Pushing Code to the Repository
1. Navigate to the cloned repository directory:

cd my-repo

2. Add your source code and commit it:

echo "print('Hello, world!')" > hello.py
git add hello.py
git commit -m "Initial commit"

3. Push the changes to Cloud Source Repositories:

git push origin master

Integrating with Cloud Build
Integrating Cloud Source Repositories with Cloud Build allows you to automate your CI/CD pipelines. Here’s how to set up a build trigger:

1. Create a Build Trigger:
— Go to the Cloud Build Triggers page in the GCP Console.
— Click “Create Trigger”.
— Select your Cloud Source Repository and configure the trigger (e.g., on push to the `master` branch).
— Choose the build configuration file (`cloudbuild.yaml`).

2. Example Build Configuration:
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
images:
- 'gcr.io/$PROJECT_ID/my-app'

3. Push Changes to Trigger the Build:
Push a new commit to the repository to trigger the build:

echo "print('Hello, Cloud Build!')" > hello.py
git add hello.py
git commit -m "Update hello.py"
git push origin master
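
For reference, the same trigger can be created non-interactively with the dedicated Cloud Source Repositories trigger type:

gcloud builds triggers create cloud-source-repositories \
  --repo=my-repo \
  --branch-pattern='^master$' \
  --build-config=cloudbuild.yaml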

Coding Examples
Beginner Example: Simple Python Application
1. Create a new repository and clone it:

gcloud source repos create simple-python-app
gcloud source repos clone simple-python-app
cd simple-python-app

2. Add a simple Python script:

echo "print('Hello, Cloud Source Repositories!')" > main.py
git add main.py
git commit -m "Initial commit"
git push origin master

Professional Example: Multi-Module Project
1. Create a new repository and clone it:

gcloud source repos create multi-module-project
gcloud source repos clone multi-module-project
cd multi-module-project

2. Add a complex project structure:

mkdir -p src/module1 src/module2
echo "def func1(): return 'Module 1'" > src/module1/module1.py
echo "def func2(): return 'Module 2'" > src/module2/module2.py
printf "from src.module1.module1 import func1\nfrom src.module2.module2 import func2\n\nprint(func1())\nprint(func2())\n" > main.py

3. Commit and push the project:

git add .
git commit -m "Initial commit for multi-module project"
git push origin master

Conclusion

Cloud Source Repositories provide a robust, fully managed solution for version control on GCP. However, with the upcoming deprecation, it is advisable to start exploring Secure Source Manager for enhanced security and management features. By integrating with Cloud Build, you can automate your CI/CD pipelines and streamline your development workflow.

Secure Source Manager — Enhanced Source Code Security and Management

In this chapter, we’ll explore Secure Source Manager, Google Cloud Platform’s new service designed to replace Cloud Source Repositories. Secure Source Manager offers enhanced security features and advanced management capabilities for your source code. We’ll discuss how to set up and use Secure Source Manager, integrate it with Cloud Build, and provide coding examples for both beginners and professionals.

What is Secure Source Manager?
Secure Source Manager is a fully managed service that provides secure and scalable source code management with advanced security features and integration capabilities. It is designed to meet the needs of modern development teams by offering enhanced security, compliance, and seamless integration with other GCP services.

Key Features
- Enhanced Security: Fine-grained access control, encryption, and compliance with industry standards.
- Integration: Seamless integration with Cloud Build, Artifact Registry, and other GCP services.
- Scalability: High availability and scalability for managing large codebases.
- Advanced Management: Comprehensive auditing, monitoring, and version control capabilities.

Setting Up Secure Source Manager
Step 1: Enabling APIs
Before using Secure Source Manager, you need to enable the required API in your GCP project. Secure Source Manager is a newer product and its command-line surface is still evolving, so treat the gcloud commands in this chapter as illustrative and consult the official documentation for the current syntax:

gcloud services enable securesourcemanager.googleapis.com

Step 2: Creating a Secure Source Manager Repository
1. Via GCP Console:

— Navigate to the Secure Source Manager page in the GCP Console.
— Click “Create Repository” and follow the prompts to create a new repository.

2. Via Command Line:
— An illustrative repository-creation command (verify the exact command group and flags in the current documentation):

gcloud source-manager repos create my-secure-repo

Step 3: Cloning the Repository
Secure Source Manager repositories are cloned with standard Git over HTTPS; the exact clone URL is shown on the repository’s page in the console:

git clone https://<your-instance-git-host>/my-project/my-secure-repo

This command creates a local copy of the repository in a directory named `my-secure-repo`.

Step 4: Pushing Code to the Repository
1. Navigate to the cloned repository directory:

cd my-secure-repo

2. Add your source code and commit it:

echo "print('Hello, Secure Source Manager!')" > hello.py
git add hello.py
git commit -m "Initial commit"

3. Push the changes to Secure Source Manager:

git push origin master

Integrating with Cloud Build
Integrating Secure Source Manager with Cloud Build allows you to automate your CI/CD pipelines. Here’s how to set up a build trigger:
1. Create a Build Trigger:
— Go to the Cloud Build Triggers page in the GCP Console.
— Click “Create Trigger”.
— Select your Secure Source Manager repository and configure the trigger (e.g., on push to the `master` branch).
— Choose the build configuration file (`cloudbuild.yaml`).
2. Example Build Configuration:
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
images:
- 'gcr.io/$PROJECT_ID/my-app'

3. Push Changes to Trigger the Build:
Push a new commit to the repository to trigger the build:

echo "print('Hello, Secure Source Manager and Cloud Build!')" > hello.py
git add hello.py
git commit -m "Update hello.py"
git push origin master

Coding Examples
Beginner Example: Simple Python Application
1. Create a new repository and clone it:

gcloud source-manager repos create simple-python-app   # illustrative; verify the current CLI syntax
git clone https://<your-instance-git-host>/my-project/simple-python-app
cd simple-python-app

2. Add a simple Python script:

echo "print('Hello, Secure Source Manager!')" > main.py
git add main.py
git commit -m "Initial commit"
git push origin master

Professional Example: Multi-Module Project
1. Create a new repository and clone it:

gcloud source-manager repos create multi-module-project   # illustrative; verify the current CLI syntax
git clone https://<your-instance-git-host>/my-project/multi-module-project
cd multi-module-project

2. Add a complex project structure:

mkdir -p src/module1 src/module2
echo "def func1(): return 'Module 1'" > src/module1/module1.py
echo "def func2(): return 'Module 2'" > src/module2/module2.py
printf "from src.module1.module1 import func1\nfrom src.module2.module2 import func2\n\nprint(func1())\nprint(func2())\n" > main.py

3. Commit and push the project:

git add .
git commit -m "Initial commit for multi-module project"
git push origin master

Conclusion
Secure Source Manager offers a secure, scalable, and integrated solution for managing your source code on GCP. By leveraging its enhanced security features and seamless integration with Cloud Build, you can streamline your CI/CD pipelines and ensure the safety and reliability of your code.

Artifact Registry — Managing Your Build Artifacts

In this chapter, we’ll explore Artifact Registry, Google Cloud Platform’s service for storing and managing build artifacts. We’ll discuss how to set up and manage repositories, integrate Artifact Registry with Cloud Build, and provide coding examples for both beginners and professionals.

What is Artifact Registry?
Artifact Registry is a universal repository manager for securely storing and managing your build artifacts. It supports various artifact formats, including Docker images, Maven artifacts, npm packages, and more. Artifact Registry provides a single place for your organization to manage and secure your artifacts across all environments.

Key Features
- Multi-Format Support: Manage Docker images, Maven artifacts, npm packages, and other artifact types.
- Security: Fine-grained access control, vulnerability scanning, and encryption.
- Integration: Seamlessly integrates with Cloud Build and other CI/CD tools.
- Scalability: Highly scalable storage for your artifacts, with low-latency access.

Setting Up and Managing Repositories
Step 1: Enabling APIs
Before using Artifact Registry, you need to enable the required APIs in your GCP project:

gcloud services enable artifactregistry.googleapis.com

Step 2: Creating a Repository
1. Via GCP Console:

— Navigate to the Artifact Registry page in the GCP Console.
— Click “Create Repository” and follow the prompts to create a new repository.
— Choose the format (e.g., Docker, Maven, npm) and set the repository location and permissions.

2. Via Command Line:
— Use the following command to create a Docker repository:

gcloud artifacts repositories create my-docker-repo --repository-format=docker --location=us-central1

Step 3: Pushing Artifacts to the Repository
1. For Docker Images:

— Tag your Docker image:

docker tag my-image us-central1-docker.pkg.dev/my-project/my-docker-repo/my-image:tag

— Push the Docker image to Artifact Registry:

docker push us-central1-docker.pkg.dev/my-project/my-docker-repo/my-image:tag

2. For Maven Artifacts:
— Configure your `pom.xml` to use Artifact Registry as a repository:

<distributionManagement>
<repository>
<id>artifact-registry</id>
<url>https://us-central1-maven.pkg.dev/my-project/my-maven-repo</url>
</repository>
</distributionManagement>

— Deploy your Maven artifacts:

mvn deploy
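
Both flows assume the client can authenticate to Artifact Registry. For Docker, gcloud can install a credential helper; for Maven, Google publishes the artifactregistry-maven-wagon build extension (the version below is an assumption to verify against current docs):

gcloud auth configure-docker us-central1-docker.pkg.dev

and in `pom.xml`:

<build>
  <extensions>
    <extension>
      <groupId>com.google.cloud.artifactregistry</groupId>
      <artifactId>artifactregistry-maven-wagon</artifactId>
      <version>2.2.0</version>  <!-- assumed version; check for the latest -->
    </extension>
  </extensions>
</build>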

Step 4: Managing Repositories
1. Viewing Repositories:

— Navigate to the Artifact Registry page in the GCP Console to view and manage your repositories.

2. Setting Permissions:

— Use IAM to control access to your repositories, granting roles like `artifactregistry.reader` and `artifactregistry.writer` to users and service accounts.
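
For example, to let a CI service account push images (the service-account email is a placeholder):

gcloud artifacts repositories add-iam-policy-binding my-docker-repo \
  --location=us-central1 \
  --member='serviceAccount:ci-builder@my-project.iam.gserviceaccount.com' \
  --role='roles/artifactregistry.writer'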

Integrating with Cloud Build
Integrating Artifact Registry with Cloud Build allows you to automate the building and storing of your artifacts. Here’s how to set up a build that pushes a Docker image to Artifact Registry:

Example Build Configuration:
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:tag', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:tag']
images:
- 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:tag'

Triggering the Build:
1. Commit your `cloudbuild.yaml` to your source repository.
2. Push a new commit to trigger the build:

git add cloudbuild.yaml
git commit -m "Add Cloud Build configuration for Docker image"
git push origin master

Coding Examples
Beginner Example: Pushing a Simple Docker Image

1. Create a Dockerfile:

FROM python:3.8-slim
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
CMD ["python", "app.py"]

2. Create a build configuration:
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:latest', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:latest']
images:
- 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-image:latest'

3. Push the configuration and Dockerfile:

git add Dockerfile cloudbuild.yaml
git commit -m "Add Dockerfile and Cloud Build configuration"
git push origin master

Professional Example: Managing Maven Artifacts
1. Configure `pom.xml` for Artifact Registry:

<distributionManagement>
  <repository>
    <id>artifact-registry</id>
    <url>https://us-central1-maven.pkg.dev/my-project/my-maven-repo</url>
  </repository>
</distributionManagement>

2. Deploy Maven artifacts:

mvn deploy

Conclusion

Artifact Registry provides a secure, scalable, and integrated solution for managing your build artifacts on GCP. By setting up repositories and integrating with Cloud Build, you can streamline your CI/CD pipelines and ensure consistent artifact management.

Cloud Tasks and Pub/Sub — Asynchronous Processing

In this chapter, we’ll explore Cloud Tasks and Pub/Sub, two powerful services on Google Cloud Platform (GCP) for handling asynchronous processing and messaging. We’ll discuss how to set up and use these services, integrate them with other GCP services, and provide coding examples for both beginners and professionals.

What are Cloud Tasks and Pub/Sub?

Cloud Tasks is a fully managed service that allows you to manage the execution of distributed tasks. It enables you to execute work asynchronously, outside of user requests, by creating tasks that run in the background.

Pub/Sub (Publish/Subscribe) is a messaging service that allows you to decouple services and applications. It provides reliable, many-to-many, asynchronous messaging between applications.

Key Features
- Cloud Tasks:
— Asynchronous Task Execution: Offload work to background tasks.
— Task Queuing: Schedule tasks for future execution.
— Reliability: Ensure task execution with retries and dead-letter queues.
— Integration: Works seamlessly with Cloud Functions, App Engine, and more.

- Pub/Sub:
— Asynchronous Messaging: Decouple services with message passing.
— Scalability: Handle high-throughput, low-latency messaging.
— Reliability: Durable message storage and at-least-once delivery.
— Integration: Integrates with Cloud Functions, Dataflow, and more.

Setting Up Cloud Tasks
Step 1: Enabling APIs
Before using Cloud Tasks, you need to enable the required APIs in your GCP project:

gcloud services enable cloudtasks.googleapis.com

Step 2: Creating a Task Queue
1. Via GCP Console:

— Navigate to the Cloud Tasks page in the GCP Console.
— Click “Create Queue” and follow the prompts to create a new task queue.

2. Via Command Line:
— Use the following command to create a task queue:

gcloud tasks queues create my-queue

Step 3: Creating and Adding Tasks
You can add tasks to your queue using the `gcloud` command or by using the Cloud Tasks client library. Here’s an example using Python:

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2
import datetime

# Create a client
client = tasks_v2.CloudTasksClient()

# Build the fully qualified queue name
project = 'my-project-id'
queue = 'my-queue'
location = 'us-central1'
parent = client.queue_path(project, location, queue)

# Create a task that targets an App Engine handler
task = {
    'app_engine_http_request': {  # Specify the type of request
        'http_method': 'POST',
        'relative_uri': '/task_handler'
    }
}

# Set task schedule time to 2 minutes in the future
d = datetime.datetime.utcnow() + datetime.timedelta(minutes=2)
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)
task['schedule_time'] = timestamp

# Add the task to the queue
response = client.create_task(parent=parent, task=task)
print('Created task {}'.format(response.name))
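
Cloud Tasks can also dispatch to arbitrary HTTP endpoints instead of App Engine services. A minimal sketch, assuming a hypothetical handler at https://example.com/task_handler:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path('my-project-id', 'us-central1', 'my-queue')

# Target any reachable HTTPS endpoint rather than an App Engine service
task = {
    'http_request': {
        'http_method': 'POST',
        'url': 'https://example.com/task_handler',  # hypothetical endpoint
        'headers': {'Content-Type': 'application/json'},
        'body': b'{"message": "Hello, HTTP task!"}'
    }
}

response = client.create_task(parent=parent, task=task)
print('Created task {}'.format(response.name))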

Setting Up Pub/Sub
Step 1: Enabling APIs
Before using Pub/Sub, you need to enable the required APIs in your GCP project:

gcloud services enable pubsub.googleapis.com

Step 2: Creating Topics and Subscriptions
1. Creating a Topic:
— Via GCP Console:

— Navigate to the Pub/Sub page in the GCP Console.
— Click “Create Topic” and follow the prompts to create a new topic.
— Via Command Line:

gcloud pubsub topics create my-topic

2. Creating a Subscription:
— Via GCP Console:
— Navigate to the Pub/Sub page in the GCP Console.
— Click “Create Subscription” and follow the prompts to create a new subscription.
— Via Command Line:

gcloud pubsub subscriptions create my-subscription --topic=my-topic

Step 3: Publishing and Receiving Messages
You can publish messages to a topic and receive messages from a subscription using the Pub/Sub client library. Here’s an example using Python:

Publishing Messages:

from google.cloud import pubsub_v1

# Create a publisher client
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path('my-project-id', 'my-topic')

# Publish a message
data = 'Hello, Pub/Sub!'.encode('utf-8')
future = publisher.publish(topic_path, data)
print(future.result())

Receiving Messages:

from google.cloud import pubsub_v1
import time

# Create a subscriber client
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path('my-project-id', 'my-subscription')

def callback(message):
    print('Received message: {}'.format(message.data))
    message.ack()

# Subscribe; messages are delivered to the callback on a background thread
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

# Keep the main thread alive to receive messages
while True:
    time.sleep(60)

Integrating with Other GCP Services
Cloud Tasks with Cloud Functions:
Create a Cloud Function to handle tasks:

def task_handler(request):
    print('Task received: {}'.format(request.data))
    return 'OK', 200

Deploy the function and set it as the task handler in Cloud Tasks.
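
A matching deployment might look like this; the function is HTTP-triggered so that Cloud Tasks can call its URL:

gcloud functions deploy task_handler --runtime python39 --trigger-http --allow-unauthenticated

The `--allow-unauthenticated` flag keeps the example simple; in production you would normally require authentication and attach an OIDC token to the task instead.
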
Pub/Sub with Cloud Functions:
Create a Cloud Function to process Pub/Sub messages:

import base64

def pubsub_handler(event, context):
    # The Pub/Sub message payload arrives base64-encoded in event['data']
    data = base64.b64decode(event['data']).decode('utf-8')
    print('Event data: {}'.format(data))

Deploy the function and configure the Pub/Sub topic as the trigger.

Coding Examples
Beginner Example: Simple Task Queue
1. Create a task queue:

gcloud tasks queues create simple-queue

2. Add a task to the queue using Python:

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2
import datetime

client = tasks_v2.CloudTasksClient()
project = 'my-project-id'
queue = 'simple-queue'
location = 'us-central1'
parent = client.queue_path(project, location, queue)

task = {
    'app_engine_http_request': {
        'http_method': 'POST',
        'relative_uri': '/task_handler'
    }
}

d = datetime.datetime.utcnow() + datetime.timedelta(minutes=2)
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)
task['schedule_time'] = timestamp

response = client.create_task(parent=parent, task=task)
print('Created task {}'.format(response.name))

Professional Example: Pub/Sub with Cloud Functions
1. Create a Pub/Sub topic and subscription:

gcloud pubsub topics create my-topic
gcloud pubsub subscriptions create my-subscription --topic=my-topic

2. Deploy a Cloud Function to process Pub/Sub messages:
main.py:

import base64

def pubsub_handler(event, context):
    # The Pub/Sub message payload arrives base64-encoded in event['data']
    data = base64.b64decode(event['data']).decode('utf-8')
    print('Event data: {}'.format(data))

requirements.txt:

# List any third-party dependencies here

Deploy the function:

gcloud functions deploy pubsub_handler --runtime python39 --trigger-topic my-topic

Conclusion

Cloud Tasks and Pub/Sub provide robust solutions for managing asynchronous processing and messaging on GCP. By leveraging these services, you can decouple your applications, handle background tasks efficiently, and build scalable, reliable systems.

Real-World Examples and Templates

In this chapter, we’ll explore real-world examples and templates to help you get started with Google Cloud Platform (GCP) services discussed in this guide. These examples are designed to cater to both beginners and professionals, providing practical insights and ready-to-use code snippets.

Beginner Examples
1. Simple CI/CD Pipeline with Cloud Build
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/npm'
  args: ['install']
- name: 'gcr.io/cloud-builders/npm'
  args: ['test']
- name: 'gcr.io/cloud-builders/npm'
  args: ['run', 'build']
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['app', 'deploy']

Setup:
1. Create a `cloudbuild.yaml` file in the root of your project.
2. Add the build steps for installing dependencies, running tests, building the project, and deploying it to App Engine.
3. Push the code to your Cloud Source Repository and configure a build trigger in Cloud Build to automate the process.

2. Basic VM Deployment with Cloud Deployment Manager
config.yaml:

resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/n1-standard-1
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/default

Setup:
1. Create a `config.yaml` file with the desired VM configuration.
2. Deploy the VM using the following command:

gcloud deployment-manager deployments create my-deployment --config config.yaml

3. Simple Logging with Cloud Logging
main.py:

import logging
from google.cloud import logging as cloud_logging

# Instantiates a client
client = cloud_logging.Client()

# Connects the logger to the default handler
client.setup_logging()

logging.info('This is a log message!')

Setup:
1. Install the Google Cloud Logging library:

pip install google-cloud-logging

2. Create a Python script and include the above code to log messages to Cloud Logging.
3. Run the script and view logs in the Cloud Logging console.
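
You can also confirm the entries from the command line:

gcloud logging read 'severity>=DEFAULT' --limit=10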

Professional Examples
1. Advanced CI/CD Pipeline with Artifact Registry and Cloud Build
cloudbuild.yaml:

steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-app:latest', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-app:latest']
images:
- 'us-central1-docker.pkg.dev/$PROJECT_ID/my-docker-repo/my-app:latest'

Setup:
1. Create a Dockerfile for your application.
2. Create a `cloudbuild.yaml` file to build and push the Docker image to Artifact Registry.
3. Configure a build trigger in Cloud Build to automate the process.

2. Multi-Resource Deployment with Cloud Deployment Manager
network-template.jinja:

resources:
- name: {{ properties['networkName'] }}
  type: compute.v1.network
  properties:
    autoCreateSubnetworks: false

vm-template.jinja:

resources:
- name: {{ properties['vmName'] }}
  type: compute.v1.instance
  properties:
    zone: {{ properties['zone'] }}
    machineType: zones/{{ properties['zone'] }}/machineTypes/{{ properties['machineType'] }}
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-10
    networkInterfaces:
    - network: global/networks/{{ properties['networkName'] }}

config.yaml:

imports:
- path: network-template.jinja
- path: vm-template.jinja

resources:
- name: my-network
  type: network-template.jinja
  properties:
    networkName: my-network

- name: my-vm
  type: vm-template.jinja
  properties:
    vmName: my-vm
    zone: us-central1-a
    machineType: n1-standard-1
    networkName: my-network

Setup:
1. Create Jinja templates for network and VM resources.
2. Create a `config.yaml` file to use the templates and define resource properties.
3. Deploy the resources using the following command:

gcloud deployment-manager deployments create my-deployment --config config.yaml

3. Asynchronous Processing with Cloud Tasks and Pub/Sub
task_handler.py (Cloud Function):

def task_handler(request):
    data = request.get_json()
    print(f'Task received: {data}')
    return 'OK', 200

main.py (Publishing Tasks):

from google.cloud import tasks_v2
from google.cloud import pubsub_v1
from google.protobuf import timestamp_pb2
import datetime

# Cloud Tasks setup
tasks_client = tasks_v2.CloudTasksClient()
project = 'my-project-id'
queue = 'my-queue'
location = 'us-central1'
parent = tasks_client.queue_path(project, location, queue)

# Pub/Sub setup
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path('my-project-id', 'my-topic')

def create_task(payload):
    task = {
        'app_engine_http_request': {
            'http_method': 'POST',
            'relative_uri': '/task_handler',
            'body': payload.encode()
        }
    }
    d = datetime.datetime.utcnow() + datetime.timedelta(minutes=2)
    timestamp = timestamp_pb2.Timestamp()
    timestamp.FromDatetime(d)
    task['schedule_time'] = timestamp

    response = tasks_client.create_task(parent=parent, task=task)
    print(f'Created task: {response.name}')

def publish_message(message):
    future = publisher.publish(topic_path, message.encode('utf-8'))
    print(f'Published message: {future.result()}')

# Example usage
create_task('{"message": "Hello, Cloud Tasks!"}')
publish_message('Hello, Pub/Sub!')

Setup:
1. Create a Cloud Function (`task_handler.py`) to handle tasks.
2. Create a Python script (`main.py`) to publish tasks to Cloud Tasks and messages to Pub/Sub.
3. Deploy the Cloud Function and configure the necessary services in GCP.

Conclusion

These real-world examples and templates provide a practical starting point for leveraging GCP services in your projects. By using these examples, you can build robust, scalable, and efficient systems on Google Cloud Platform.

Conclusion and Next Steps

In this concluding chapter, we’ll recap the key points discussed in this guide, highlight the benefits of using Google Cloud Platform (GCP) for your DevOps needs, and provide further resources and learning paths to continue your journey.

Recap of Key Points
Throughout this guide, we’ve covered a range of GCP services that are essential for modern DevOps practices. Here’s a brief recap of each chapter:

1. Introduction to GCP DevOps Tools:
— Overview of key DevOps tools on GCP and their benefits.

2. Cloud Build — Automating Your CI/CD Pipelines:
— Setting up Cloud Build, creating build configurations, and automating CI/CD pipelines.

3. Cloud Deployment Manager — Infrastructure as Code:
— Using Cloud Deployment Manager for managing infrastructure with configuration files and templates.

4. Operations (formerly Stackdriver) — Monitoring and Logging:
— Setting up monitoring, logging, tracing, and debugging using GCP Operations.

5. Cloud Source Repositories — Version Control:
— Managing source code with Cloud Source Repositories and understanding its deprecation and replacement by Secure Source Manager.

6. Secure Source Manager — Enhanced Source Code Security and Management:
— Using Secure Source Manager for secure and scalable source code management.

7. Artifact Registry — Managing Your Build Artifacts:
— Setting up and managing repositories for various artifact formats with Artifact Registry.

8. Cloud Tasks and Pub/Sub — Asynchronous Processing:
— Handling asynchronous processing and messaging using Cloud Tasks and Pub/Sub.

9. Real-World Examples and Templates:
— Practical examples and templates for implementing GCP services in real-world scenarios.

Benefits of Using GCP for DevOps
Using GCP for your DevOps needs offers several benefits:

- Scalability: GCP’s infrastructure scales seamlessly to handle growing workloads and user demands.
- Reliability: High availability and redundancy ensure that your applications and services remain operational.
- Security: Robust security features, including encryption, IAM, and compliance with industry standards, protect your data and applications.
- Integration: Seamless integration between GCP services allows for efficient workflows and streamlined operations.
- Cost-Effectiveness: Flexible pricing models and cost management tools help you optimize spending and maximize value.

Further Resources and Learning Paths
To continue your journey with GCP and DevOps, here are some valuable resources and learning paths:

1. Google Cloud Documentation:
— Explore detailed documentation for all GCP services.
Google Cloud Documentation

2. Google Cloud Training and Certification:
— Enroll in training courses and pursue certifications to validate your skills.
Google Cloud Training

3. Qwiklabs:
— Hands-on labs and quests for practical experience with GCP services.
Qwiklabs

4. Google Cloud Community:
— Join the community to connect with other GCP users, share knowledge, and seek support.
Google Cloud Community

5. DevOps Books and Articles:
— Read books and articles on DevOps practices and principles to deepen your understanding.
— Some recommendations:
— “The Phoenix Project” by Gene Kim, Kevin Behr, and George Spafford
— “The DevOps Handbook” by Gene Kim, Patrick Debois, John Willis, and Jez Humble
— “Site Reliability Engineering” by Niall Richard Murphy, Betsy Beyer, Chris Jones, and Jennifer Petoff

Final Thoughts and Recommendations
Adopting DevOps practices and leveraging GCP services can significantly enhance your development and operations workflows. By automating processes, ensuring scalability and reliability, and integrating various tools, you can achieve faster and more efficient software delivery.

As you continue your journey, keep exploring new GCP services, stay updated with the latest trends in DevOps, and continuously improve your workflows. The resources provided in this guide will help you build a solid foundation and expand your knowledge.

Thank you for following this guide. Good luck with your DevOps journey on Google Cloud Platform!
