Exporting GCP Projects to Terraform

Rohan Paithankar
Google Cloud - Community
4 min read · Nov 29, 2023

Have you found yourself in a situation where you kept creating resources in a project through ClickOps, and that project eventually became your production environment? If yes, this article is for you!

In my consulting journey, I have come across some unique challenges that organisations face. One such challenge was that of a client that had built a ‘production environment’ through ClickOps and found themselves in a pickle trying to maintain those resources. In an environment like this, engineers often fear touching anything, because one wrong change could mean their last day on the job. Added to this were rising concerns about the costs incurred by the many unknown resources running somewhere in the project.

You see, psychological safety is often under-appreciated but it is an important ingredient for building high performing teams.

Watching the team’s turmoil with this so-called ‘production environment’ forced me to explore ways to bring the situation under control. And voila! Terraform-ing the resources was the most favourable solution. Why?

  • Infrastructure-as-Code (IaC) would first help us audit the resources that had been deployed and thereby monitor the costs incurred.
  • Infrastructure-as-Code (IaC) would help us rebuild the infrastructure if needed.
  • Infrastructure-as-Code (IaC) would help the team deploy new resources and, most importantly, move away from ClickOps.

Having said that, I was now left with only half the answer. The second part of this challenge was to export the existing resources into Terraform. Until then, I had only seen the unidirectional flow of writing Terraform code first and applying it to create GCP resources. What I wanted was the reverse: starting from the resources already running in GCP and ending up with Terraform code.

How did I achieve it?

Note: This process only works on Linux/macOS. The user exporting the resources requires elevated permissions in the project. Alternatively, service account impersonation can be used.

Here are the steps:

  • Set up a new git repository and clone it to your local machine.
  • In the repository, set up a Terraform backend file backend.tf that stores the remote state file in a GCS bucket.
terraform {
  backend "gcs" {
    bucket = "BUCKET_NAME"
    prefix = "PATH/TO/STATE/FILE"
  }
}
  • Authenticate with GCP and set the target project. Alternatively, service account impersonation can be used (a sketch follows these steps).
gcloud auth login

gcloud config set project PROJECT_ID
  • As per the steps mentioned in the GCP documentation, install GCP’s config-connector component and run the gcloud CLI command to export the required resources as Terraform scripts.
gcloud components install config-connector

gcloud beta resource-config bulk-export \
  --path=OUTPUT_DIRECTORY \
  --project=PROJECT_ID \
  --resource-format=terraform
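
For the service account impersonation mentioned above, a minimal sketch (SA_EMAIL is a placeholder for a service account with sufficient permissions on the project):

gcloud config set auth/impersonate_service_account SA_EMAIL
# ...run the export commands above as that service account, then stop impersonating:
gcloud config unset auth/impersonate_service_account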

The export is arranged in the format:

OUTPUT_DIRECTORY/projects/PROJECT_ID/RESOURCE_TYPE

Along with the bulk export of resources, the command also prints the ‘terraform import’ commands needed to bring those resources into the state file. Once the scripts are exported, run ‘terraform init’ followed by the ‘terraform import’ commands.
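
For example, the post-export workflow might look like this (the resource address and ID below are purely illustrative; use the exact import commands printed by the export):

terraform init

# illustrative import; the bulk-export prints the real commands for your resources
terraform import google_storage_bucket.my_bucket my-bucket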

The Catch-22

The time taken to export grows roughly linearly with the number of resources running in the project. In my case, I had a ‘huge’ number of BigQuery Datasets and Tables that took ages to export when combined with all the other resources. To solve this challenge, I first listed all the resource types supported for the export using:

gcloud beta resource-config list-resource-types >> resources.txt

I then created two separate lists, one for BigQuery resources and another for non-BigQuery resources.

BigQuery and Bigtable resources: (resourceTypes_BQ.txt)

BigQueryDataset
BigQueryJob
BigQueryTable
BigtableAppProfile
BigtableGCPolicy
BigtableInstance
BigtableTable

Non-BigQuery resources: (resourceTypes_exceptBQ.txt)

AccessContextManagerAccessLevel
AccessContextManagerAccessPolicy
AccessContextManagerServicePerimeter
ArtifactRegistryRepository
CloudBuildTrigger
CloudIdentityGroup
ComputeAddress
ComputeBackendBucket
ComputeBackendService
ComputeDisk
ComputeExternalVPNGateway
ComputeFirewall
ComputeForwardingRule
ComputeHTTPHealthCheck
ComputeHTTPSHealthCheck
ComputeHealthCheck
ComputeImage
ComputeInstance
ComputeInstanceGroup
ComputeInstanceTemplate
ComputeInterconnectAttachment
ComputeNetwork
ComputeNetworkEndpointGroup
ComputeNetworkPeering
ComputeNodeGroup
ComputeNodeTemplate
ComputeProjectMetadata
ComputeRegionNetworkEndpointGroup
ComputeReservation
ComputeResourcePolicy
ComputeRoute
ComputeRouterInterface
ComputeRouterNAT
ComputeRouterPeer
ComputeSSLCertificate
ComputeSSLPolicy
ComputeSecurityPolicy
ComputeSharedVPCHostProject
ComputeSharedVPCServiceProject
ComputeSnapshot
ComputeSubnetwork
ComputeTargetGRPCProxy
ComputeTargetHTTPProxy
ComputeTargetHTTPSProxy
ComputeTargetInstance
ComputeTargetPool
ComputeTargetSSLProxy
ComputeTargetTCPProxy
ComputeTargetVPNGateway
ComputeURLMap
ComputeVPNGateway
ComputeVPNTunnel
ContainerCluster
ContainerNodePool
DataflowFlexTemplateJob
DataflowJob
DNSManagedZone
DNSPolicy
DNSRecordSet
FirestoreIndex
IAMCustomRole
IAMServiceAccount
IAMServiceAccountKey
KMSCryptoKey
KMSKeyRing
LoggingLogSink
MemcacheInstance
MonitoringAlertPolicy
MonitoringNotificationChannel
PubSubSubscription
PubSubTopic
RedisInstance
Folder
Project
ResourceManagerLien
ResourceManagerPolicy
SecretManagerSecret
SecretManagerSecretVersion
ServiceDirectoryEndpoint
ServiceDirectoryNamespace
ServiceDirectoryService
ServiceNetworkingConnection
Service
SourceRepoRepository
SpannerDatabase
SpannerInstance
SQLDatabase
SQLInstance
SQLSSLCert
SQLUser
StorageBucket
StorageBucketAccessControl
StorageDefaultObjectAccessControl
StorageNotification
StorageTransferJob
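
One way to produce the two files above is to filter the list generated earlier; a rough sketch, assuming resources.txt contains one resource type per line:

grep -E '^Big(Query|table)' resources.txt > resourceTypes_BQ.txt
grep -Ev '^Big(Query|table)' resources.txt > resourceTypes_exceptBQ.txt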

I then wrote a simple parameterised bash script to export the BigQuery and non-BigQuery resources.

#!/bin/bash

# Install the config-connector component (Debian/Ubuntu package for the Google Cloud SDK)
sudo apt-get install google-cloud-sdk-config-connector

# $1 = GCP project ID, $2 = optional 'bq' flag for BigQuery/Bigtable resources
if [ -z "$2" ]; then
  input="resourceTypes_exceptBQ.txt"
else
  input="resourceTypes_BQ.txt"
fi

while read -r line
do
  echo "Exporting resource - $line"
  mkdir -p "../$1/$line"
  gcloud beta resource-config bulk-export \
    --resource-types="$line" \
    --path="../$1/$line" \
    --project="$1" \
    --resource-format=terraform
  echo "Deleting resource folder if empty"
  rm -d "../$1/$line"
done < "$input"

The script accepts the GCP Project ID as a command-line argument. It can be executed as follows:

./export_resources [project-id] [Optional argument:bq]

The second argument ‘bq’ is optional. If specified, the script exports the BigQuery resources; if omitted, it exports all non-BigQuery resources.
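
For example (assuming the script is saved as export_resources and made executable, with my-gcp-project as a placeholder project ID):

chmod +x export_resources
./export_resources my-gcp-project      # non-BigQuery resources
./export_resources my-gcp-project bq   # BigQuery resources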
