Pulumi AI — Leveraging LLMs for IaC with GCP

Jasbir Singh
Google Cloud - Community
4 min read · Jun 2, 2023

Why write your Pulumi programs yourself when you can have AI do it for you? Pulumi AI leverages large language models (LLMs) to author infrastructure as code for any architecture for any cloud in any language.

Pulumi recently released Pulumi AI, a purpose-built AI assistant that can create cloud infrastructure using Pulumi. It builds on the power of large language models (LLMs) such as GPT to dramatically reduce the time it takes to discover, learn, and use new cloud infrastructure APIs.

In this blog, we’ll dive deeper into this new technology that so many Pulumi users have been excited about.

Why Pulumi AI?

It’s often hard to appreciate just how large and complex modern cloud platforms have become. Pulumi itself supports 130 cloud providers, with many individual providers having over 1,000 resources, and many of those resources have hundreds of properties. This adds up to millions of configurable properties across the entire surface area!

One of the constant challenges for teams working in the cloud is finding specific examples that cover the set of cloud infrastructure resources and configurations they require. Often we can point them to one of the thousands of examples listed in our Pulumi Examples repository or enumerated in the Pulumi Registry, but there are also times when the specific combination of technologies a user needs has not yet been captured and stored.

Pulumi AI addresses this need in a direct way. Instead of having to create every possible example by hand, users can just ask Pulumi AI to generate an example that suits their unique needs, no matter how specific and bespoke the request might be.

Intelligence for Infrastructure as Code

Pulumi Insights is Intelligence for Infrastructure as Code. It adds advanced search, analytics, and AI to infrastructure as code. With Insights, you can gain unique insights into your company’s cloud usage and trends and boost engineering productivity.

Code faster with the power of AI

Author infrastructure as code programs in your favorite programming language using large language models, and automate infrastructure creation through AI. You can tap into the Pulumi platform’s deep understanding of usage patterns to leverage recommendations and generate infrastructure as code automatically.

Try it for yourself at https://www.pulumi.com/ai/.

Find anything in any cloud

Ask any questions about your infrastructure across more than 100 clouds, using either structured search queries or natural language prompts. Search helps you find that needle in the haystack — locating a single resource across many clouds and environments — as well as running sophisticated queries such as tracking down untagged or expensive resources across the whole organization.

Gain deeper insights into infrastructure as code

Identify anomalies or trends in resource usage and dig into cost, security, and compliance concerns. You can programmatically query a REST API to add automation around search results or to integrate with internal platforms and dashboards. You can also export data to data warehouses, including Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse, and built-in dashboards give you key insights into cloud usage.

Discovering, Learning, and Building with Pulumi AI

We can leverage Pulumi to provision infrastructure resources on Google Cloud Platform in any programming language of our choice.

First, navigate to pulumi.com/ai.

Next, you will use Pulumi AI to generate a program that provisions cloud infrastructure. In this example, we asked Pulumi AI to provision a GKE cluster with Anthos Service Mesh. Using the chat widget on the Pulumi AI page, describe the infrastructure you would like to build. In our case, we supplied: “I want to setup GKE cluster on Google cloud with Anthos Service Mesh.”

Pulumi AI returned to us the code below in TypeScript:

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";
import * as kubernetes from "@pulumi/kubernetes";

const projectName = pulumi.getProject();
const stackName = pulumi.getStack();

// Create a GKE cluster
const cluster = new gcp.container.Cluster(`${projectName}-${stackName}`, {
    initialNodeCount: 1,
    minMasterVersion: "latest",
    nodeVersion: "latest",
    location: gcp.config.region,
});

// Export the cluster's kubeconfig. GKE does not return a ready-made
// kubeconfig, so we assemble one from the cluster's endpoint and CA cert.
export const kubeconfig = pulumi
    .all([cluster.name, cluster.endpoint, cluster.masterAuth])
    .apply(([name, endpoint, auth]) => `apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: ${auth.clusterCaCertificate}
    server: https://${endpoint}
  name: ${name}
contexts:
- context:
    cluster: ${name}
    user: ${name}
  name: ${name}
current-context: ${name}
kind: Config
users:
- name: ${name}
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: gke-gcloud-auth-plugin
`);

// A Kubernetes provider that targets the new cluster
const k8sProvider = new kubernetes.Provider("gke-k8s", { kubeconfig });

// Create the Anthos Service Mesh installation.
// Note: this assumes an "asm" Helm repository is available; adjust the
// repo, chart, and version to match how ASM is installed in your environment.
export const asmNamespace = "asm-system";
const asmInstallation = new kubernetes.helm.v3.Chart(`${projectName}-${stackName}`, {
    repo: "asm",
    chart: "asm",
    version: "latest",
    namespace: asmNamespace,
    values: {
        istio: {
            enabled: true,
            useIstioNamespace: true,
        },
    },
}, { provider: k8sProvider });

// Export the kubeconfig and Anthos Service Mesh namespace
export const outputs = {
    kubeconfig,
    asmNamespace,
};

In this example, we use the @pulumi/gcp and @pulumi/kubernetes libraries to interact with GCP and Kubernetes resources. The program creates a GKE cluster using the gcp.container.Cluster resource and sets the initial node count, master version, and node version.

The cluster’s kubeconfig is exported so that you can use it to interact with the cluster from the command line or other tools.
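For example, the exported kubeconfig can be pulled from the stack outputs and handed to kubectl:

```shell
# Save the stack's kubeconfig output to a file (it may be marked secret)
pulumi stack output kubeconfig --show-secrets > kubeconfig.yaml

# Point kubectl at the new cluster
KUBECONFIG=./kubeconfig.yaml kubectl get nodes
```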

Next, the program deploys Anthos Service Mesh using the kubernetes.helm.v3.Chart resource. The Helm chart used for Anthos Service Mesh is specified with the repo, chart, and version properties. We set the namespace to asm-system, but you can adjust it to your preference. The values property is used to configure Istio within Anthos Service Mesh, enabling it and specifying that it should use its own dedicated namespace (useIstioNamespace: true).

Finally, we export the asmNamespace property, which contains the namespace where Anthos Service Mesh is installed. This information can be useful for deploying applications with sidecar proxies managed by Anthos Service Mesh.
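For instance, a workload namespace can opt in to sidecar injection by labeling it. This sketch uses the classic `istio-injection: enabled` label; managed ASM installations often use a revision label (`istio.io/rev: <revision>`) instead, so adjust to your installation.

```typescript
import * as kubernetes from "@pulumi/kubernetes";

// Label a namespace so pods deployed into it get sidecar proxies injected.
// Assumption: classic Istio injection label; ASM revisions may require
// an istio.io/rev label instead.
const appsNamespace = new kubernetes.core.v1.Namespace("apps", {
    metadata: {
        name: "apps",
        labels: {
            "istio-injection": "enabled",
        },
    },
});
```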

To deploy this Pulumi program, you’ll need to initialize a new Pulumi project, install the required dependencies (using npm install), and run pulumi up to create the resources on GCP. Remember to configure your GCP credentials (for example, with gcloud auth application-default login) before running pulumi up.
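Concretely, those steps look something like the following; the project ID and region values are placeholders:

```shell
# Create a new Pulumi TypeScript project for GCP and install the providers
pulumi new gcp-typescript
npm install @pulumi/gcp @pulumi/kubernetes

# Authenticate and point the GCP provider at your project and region
gcloud auth application-default login
pulumi config set gcp:project my-gcp-project
pulumi config set gcp:region us-central1

# Preview and deploy the stack
pulumi up
```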

Pulumi AI has helped us generate a program to provision GKE Cluster with Anthos Service Mesh.

Clean up

If you’d like to tear down all of these resources and delete your stack, run pulumi destroy -rf --remove.


Jasbir Singh

Consulting Cloud Architect, Public Cloud@Rackspace Technology