Multi-cloud Terraform remote state
In Terraform, the state file tracks the current state of your infrastructure. If you are working in a team, a local state file makes Terraform difficult to use, because each user must make sure they always have the latest state data before running Terraform, and must make sure nobody else runs Terraform at the same time. Because of this, it is important to store the state file in a shared location that teams can easily pull from and push to. This is where remote state becomes important. Remote state allows Terraform to store the state file in:
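As a minimal sketch of what a remote state backend looks like (bucket name, key, and region here are hypothetical placeholders, not from the original):

```hcl
# Store state in S3 instead of a local terraform.tfstate file.
# All names below are example values.
terraform {
  backend "s3" {
    bucket  = "my-terraform-state"          # shared, versioned bucket
    key     = "demo-app/terraform.tfstate"  # path to this stack's state
    region  = "us-east-1"
    encrypt = true                          # encrypt state at rest
  }
}
```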
- AWS S3
- Azure Storage Account
- Google Cloud Storage
Note: Personally, I don't recommend using GitHub for remote state. The problem is that you don't want to push changes after every deployment, as you may still be building the environment. Also, you may not want to pull or rebase after someone else releases or deploys, as you might be working on a different component.
Object storage is the best and recommended method of storing Terraform remote state. I have been using it for a few years now and have never had any issues. It also works with CI/CD without any problems.
Multi-cloud remote state?
How do you design the layout of Terraform remote states in a multi-cloud environment?
- Manage state in Terraform Cloud.
- Consolidate remote state to one cloud provider.
- Manage state per cloud provider.
Manage state in Terraform Cloud:
I haven't had the chance to work with Terraform Cloud, but this should work great.
Consolidate remote state to one cloud provider:
Create a bucket in Google Cloud Storage or S3 and store all Terraform state files there. The concern with this approach is that users who are deploying in AWS may not have permission to Google Cloud Storage, or may not have permission to an Azure Storage Account, depending on where the state is consolidated. As organizations grow, different teams will manage different cloud providers, so plan for the long term.
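To illustrate the consolidated approach, here is a sketch in which even an AWS deployment keeps its state in a single central GCS bucket (the bucket name and prefix are hypothetical):

```hcl
# AWS stack whose state lives in one shared GCS bucket.
terraform {
  backend "gcs" {
    bucket = "org-terraform-state"     # one central bucket for all clouds
    prefix = "aws/demo-team/demo-app"  # namespaced by cloud/team/project
  }
}
```

This works, but every team needs credentials for the central bucket's cloud, which is exactly the permission concern raised above.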
Manage state per cloud provider:
This is simple. Create storage for each cloud provider.
- Google deployments go to Google Storage.
- AWS deployments go to S3.
- Azure deployments go to Azure Storage.
- Create separate buckets for Prod and NonProd environments.
- Ensure buckets are versioned.
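The per-provider layout above can be sketched as one backend block per cloud, each in its own root module, plus a versioned state bucket created up front (all names here are hypothetical):

```hcl
# AWS root module -> S3 (separate buckets for Prod and NonProd)
terraform {
  backend "s3" {
    bucket = "prod-aws-tfstate"
    key    = "demo-app/terraform.tfstate"
    region = "us-east-1"
  }
}

# Azure root module -> Azure Storage:
#   backend "azurerm" {
#     resource_group_name  = "tfstate-rg"
#     storage_account_name = "prodtfstate"
#     container_name       = "tfstate"
#     key                  = "demo-app.terraform.tfstate"
#   }

# GCP root module -> GCS:
#   backend "gcs" {
#     bucket = "prod-gcp-tfstate"
#     prefix = "demo-team/demo-app"
#   }

# Versioned GCS state bucket, created once, outside the stacks that use it:
resource "google_storage_bucket" "tfstate" {
  name     = "prod-gcp-tfstate"
  location = "US"
  versioning {
    enabled = true
  }
}
```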
I created a Python package to manage remote state in a multi-cloud environment: tf, a Python package for managing Terraform remote state for GCP, AWS, and Azure.
pip install tfremote --upgrade
Set the environment variables below, depending on the cloud provider. To see the available options, consult the tool's help output.
To deploy in Google Cloud, set the following environment variables:
export TF_GCLOUD_BUCKET=<remote state bucket>
export TF_GCLOUD_CREDENTIALS=<path to Google service account file>
Deploy for team demo-team project demo-app in workspace demo-workspace:
tf -c=gcloud apply -var='teamid=demo-team' -var='prjid=demo-app' -w=demo-workspace
When you run the deployment, it creates the following structure in Google Cloud Storage: Google bucket -> demo-team -> demo-app -> demo-workspace.
Note: If no -w is provided, all deployments go to the default workspace.
Structure in AWS:
- demo-team -> demo-app -> demo-workspace -> demo-state-key.tfstate
Note: By default, the tfstate file is named terraform.tfstate. In case you want to specify a custom name, use the -s flag. Deploy for team demo-team, project demo-app, in workspace demo-workspace:
tf -c=aws apply -var='teamid=demo-team' -var='prjid=demo-app' -w=demo-workspace -s=demo-state
Structure in Azure:
- foo-team -> bar-app -> demo-workspace -> tryme_key.tfstate