How to Manage Sensitive Information with Terraform on Google Cloud

Yusuke Enami(Kishishita)
4 min read · Aug 5, 2023
Terraform logo by HashiCorp

Introduction

This article details the management of sensitive information using Terraform on Google Cloud.

Sensitive information, such as API keys and database credentials, should not be stored in a GitHub repository. However, we often need to share these keys within a team.

So, how should we address this concern?

Practical Methods for Managing Sensitive Information

Note: In practice, it’s advisable to use a private repository to ensure the security of your project.

One recommended option for managing secret information is sops.

sops

sops is an encryption tool originally developed by Mozilla.

It supports various formats such as YAML, JSON, ENV, INI, and BINARY and can be used across multiple cloud platforms, including Google Cloud, AWS, and Azure.

sops-provider

For Terraform, a sops-provider has been developed, which automatically decrypts an encrypted file when you plan or apply your Terraform code.

The decrypted value is automatically masked as (sensitive value) in the output of the plan or apply command, reducing the need to manually manage sensitive information.

Note: The automatic masking feature is not available for versions of Terraform below v0.15.

Cloud Key Management Service (Cloud KMS)

The Cloud Key Management Service (Cloud KMS) allows you to create and manage cryptographic keys, which are organized into key rings for use in compatible Google Cloud services and your own applications.

In this demonstration, we’ll use a key from Cloud KMS as an encryption key. To perform encryption and decryption operations, the following role is required:

  • roles/cloudkms.cryptoKeyEncrypterDecrypter

First, let’s add the sops-provider and create the necessary resources.

// providers.tf
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 4.0"
    }
    sops = {
      source  = "carlpett/sops"
      version = "~> 0.7"
    }
  }
}

// kms.tf
resource "google_kms_key_ring" "key_ring" {
  project  = google_project.project_one.project_id
  name     = "test-key-ring"
  location = "global"
}

resource "google_kms_crypto_key" "test_key" {
  name     = "test-key"
  key_ring = google_kms_key_ring.key_ring.id
  purpose  = "ENCRYPT_DECRYPT"
}

data "google_iam_policy" "encrypter_and_decrypter" {
  binding {
    role    = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
    members = ["user:xxx@test.com"]
  }
}

resource "google_kms_key_ring_iam_policy" "key_ring_iam" {
  key_ring_id = google_kms_key_ring.key_ring.id
  policy_data = data.google_iam_policy.encrypter_and_decrypter.policy_data
}
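Note that `google_kms_key_ring_iam_policy` is authoritative: it overwrites any existing IAM bindings on the key ring. If you prefer an additive grant scoped to a single key, a sketch along these lines also works (the resource name and member address are placeholders):

```hcl
# Illustrative alternative: grant the Encrypter/Decrypter role on one
# crypto key only, without touching other bindings on the key ring.
resource "google_kms_crypto_key_iam_member" "encrypter_decrypter" {
  crypto_key_id = google_kms_crypto_key.test_key.id
  role          = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
  member        = "user:xxx@test.com"
}
```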

Next, we’ll create a test API key.

// secrets/test-apikey.json
{
  "test": "aaa"
}

This API key is then encrypted using sops.


sops --input-type json --encrypt --gcp-kms projects/<PROJECT_ID>/locations/global/keyRings/test-key-ring/cryptoKeys/test-key secrets/test-apikey.json > secrets/test-apikey_encrypted.json

With the preparation complete, we can now create a Cloud Functions resource that uses the API key and run terraform plan.

resource "google_cloudfunctions_function" "test-fnc" {
  project     = google_project.project_one.project_id
  name        = "test-fnc"
  region      = "asia-northeast1"
  description = "test function"

  runtime               = "go119"
  timeout               = 120
  entry_point           = "TEST"
  trigger_http          = true
  source_archive_bucket = google_storage_bucket.bucket.name
  source_archive_object = google_storage_bucket_object.archive.name

  environment_variables = {
    APIKEY = data.sops_file.api_key.data["test"]
  }
}

data "sops_file" "api_key" {
  source_file = "./secrets/test-apikey_encrypted.json"
}
Terraform will perform the following actions:

  # google_cloudfunctions_function.test-fnc will be created
  + resource "google_cloudfunctions_function" "test-fnc" {
      + available_memory_mb           = 256
      + description                   = "test function"
      + docker_registry               = (known after apply)
      + entry_point                   = "TEST"
      + environment_variables         = {
          + "APIKEY" = (sensitive value)
        }
      + https_trigger_security_level  = (known after apply)
      + https_trigger_url             = (known after apply)
      + id                            = (known after apply)
      + ingress_settings              = "ALLOW_ALL"
      + max_instances                 = 0
      + name                          = "test-fnc"
      + project                       = "ksst-project-one"
      + region                        = "asia-northeast1"
      + runtime                       = "go119"
      + service_account_email         = (known after apply)
      + source_archive_bucket         = "go-test-function-bucket"
      + source_archive_object         = "test.zip"
      + timeout                       = 120
      + trigger_http                  = true
      + vpc_connector_egress_settings = (known after apply)
    }
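The same masking applies to outputs. As a sketch, if you wanted to pass the decrypted value to another configuration, Terraform (v0.15 and later) requires the output to be explicitly marked `sensitive`, so the value never appears in plan or apply output (the output name here is illustrative):

```hcl
# Illustrative output: Terraform errors out if a sensitive value is
# exposed through an output that is not marked sensitive.
output "api_key" {
  value     = data.sops_file.api_key.data["test"]
  sensitive = true
}
```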

Just as a precaution, it’s a good idea to check the Cloud Function’s environment variable in the console.

Congratulations, you’ve successfully managed sensitive information with Terraform on Google Cloud! 🎉🎉🎉

Summary


Cloud KMS and sops serve as effective tools for managing sensitive information within a team.

It’s critically important to remember not to push raw sensitive information files into a git repository. Setting up an operational rule where encryption is executed outside of the local repository is a good practice.

Moreover, Google Cloud offers Secret Manager as a solution for managing sensitive information, presenting another viable option to ensure the security of your code.
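If you take the Secret Manager route instead, a minimal sketch looks like this (the secret ID is a placeholder, and the secret payload itself should still be added outside of version-controlled code, e.g. via the console or gcloud):

```hcl
# Hypothetical Secret Manager setup: create the secret container with
# Terraform, then read the latest existing version at plan/apply time.
resource "google_secret_manager_secret" "api_key" {
  secret_id = "test-apikey"

  replication {
    automatic = true
  }
}

data "google_secret_manager_secret_version" "api_key" {
  secret = google_secret_manager_secret.api_key.id
}
```

The `secret_data` attribute of the data source is marked sensitive by Terraform, so it is masked in plan output just like the sops-decrypted value.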

Thank you for reading!


Yusuke Enami(Kishishita)

DevOps engineer in Japanese company. I love Google Cloud/Kubernetes/Machine Learning/Raspberry Pi and Workout🏋️‍♂️ https://bigface0202.github.io/portfolio/