Making Terraform Testing Groovy
Part 1 — Pre-provision Testing
This series is broken into three parts:
- Part 1: Pre-provision Testing (Sentinelesque plan assertions)
- Part 2: Post-provision Testing (Stack and State assertions)
- Part 3: Reusable Test Artifacts
tf-test-groovy is part of the tf-maven project, which includes a plugin that allows Terraform to be executed, and its external dependencies to be managed, via Maven. The project also provides a Java-based command wrapper for executing Terraform commands and capturing their output (more about that in this DZone article). The wrapper also exposes functionality that allows Terraform stacks to be created, tested, and destroyed with any Java-compatible testing framework, and Java testing frameworks have a very mature ecosystem with great IDE integration. tf-test-groovy is a Groovy project that implements a fluent builder API on top of the command wrapper. It provides a concise way of executing Terraform, along with helper objects for interacting with the Terraform plan and state JSON produced by the terraform show command introduced in version 0.12. It also provides helper functions for testing infrastructure, similar to the existing Terratest.
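To ground the discussion, here is a rough, self-contained Groovy sketch of the kind of query those helper objects provide. The JSON is a trimmed-down, hand-written stand-in for real terraform show -json output (following its documented shape), and the filtering mirrors what a helper like getResourcesByType does; the real helper also walks nested child_modules, which this sketch omits.

```groovy
import groovy.json.JsonSlurper

// Trimmed-down, hand-written stand-in for `terraform show -json tfplan` output.
def planJson = '''
{
  "format_version": "0.1",
  "planned_values": {
    "root_module": {
      "resources": [
        { "address": "aws_s3_bucket.bucket1", "type": "aws_s3_bucket",
          "values": { "tags": { "stack_name": "s3_pre_post_demo" } } },
        { "address": "aws_iam_role.role1", "type": "aws_iam_role", "values": {} }
      ]
    }
  }
}
'''

def plan = new JsonSlurper().parseText(planJson)

// Roughly what a helper like getResourcesByType does: filter the planned
// resources in the root module down to a single resource type.
def buckets = plan.planned_values.root_module.resources
        .findAll { it.type == 'aws_s3_bucket' }

assert buckets*.address == ['aws_s3_bucket.bucket1']
```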
Pre-provision testing makes assertions against a Terraform plan. This style of testing has been popularized by Sentinel rules, a feature of the paid Terraform Enterprise and Terraform Cloud offerings. The rules are written in HashiCorp's Sentinel policy language and executed on Terraform Enterprise as part of the provisioning lifecycle. The idea is that rules can prevent Terraform code that violates a corporate policy or security control from being applied. Examples of policies that could be enforced:
- A set of tags is applied to all EC2 instances. Sentinel [example]
- Security groups don't have an outbound rule on 0.0.0.0/0. Sentinel [example]
The TF Maven project recently added support for the terraform show command, which allows for building tests equivalent to Sentinel rules in any testing framework or language supported on the JVM (TestNG, JUnit, Spock, …). In our case we have chosen Spock + Groovy. There are several advantages to this:
- Cost: no need to buy Terraform Enterprise or Cloud
- Shift left: tests can be executed on developer machines
- Skills: leverage the existing Java developers in your organization
- Consistency: Maven can be executed locally or as part of the CI/CD process
- Reusability: versioned test artifacts can be published to Nexus, Artifactory, etc.
- IDE integration: test case reports integrate with standard IDEs
So what does it look like?
The following is an example of a Spock test that utilizes the root module s3_pre_post_demo, which creates two S3 buckets, and verifies that the buckets have versioning enabled and the mandatory tags applied. The code for the examples can be found here. At line 55, in the setupSpec method, the tf-test-groovy fluent API is used to execute the plan and return a Groovy object of type TfPlan. This object is very similar to the Sentinel equivalent import "tfplan/v2" and provides methods like getResourcesByType. With the magic of Spock's where: data providers and @Unroll, we are able to dynamically create a test case for each resource returned and use the address of the resource as the name of the test. This allows for easy mapping of a test back to the code being tested, even when that code is in a nested module hierarchy.
@Unroll("#s3Bucket.address has all the mandatory tags")
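A condensed sketch of what such a specification can look like follows. The MANDATORY_TAGS list is a hypothetical policy for illustration, and the setupSpec body is elided because the actual fluent-API invocation lives in the linked example repository; only TfPlan, getResourcesByType, and the @Unroll naming pattern come from the article.

```groovy
import spock.lang.Shared
import spock.lang.Specification
import spock.lang.Unroll

class S3MandatoryTagsSpec extends Specification {

    // Hypothetical corporate tagging policy, for illustration only.
    static final MANDATORY_TAGS = ['stack_name', 'owner']

    @Shared
    def tfPlan

    def setupSpec() {
        // Elided: the real example runs `terraform plan` and
        // `terraform show -json` through the tf-test-groovy fluent
        // builder API and assigns the resulting TfPlan object here.
    }

    // One dynamically named test case per planned S3 bucket.
    @Unroll("#s3Bucket.address has all the mandatory tags")
    def "mandatory tag check"() {
        expect:
        MANDATORY_TAGS.every { tag -> s3Bucket.values.tags?.containsKey(tag) }

        where:
        s3Bucket << tfPlan.getResourcesByType('aws_s3_bucket')
    }
}
```

The where: block drives the data provider, and @Unroll turns each returned resource into its own named test case in the IDE's test report.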
Mature Java testing frameworks have great integration with existing IDEs like IntelliJ. IntelliJ even understands Spock's @Unroll dynamic test case generation. The dynamic nature of @Unroll allows the test cases to be named with the address of the resource within the context of the .tf Terraform source code. Highlighted in the (left) figure below, the test case is named aws_s3_bucket.bucket2 …, which is the address relative to the root module source. This allows for easy mapping of errors back to the offending Terraform source code. If the resource is nested in a submodule, the naming indicates the address path through the module hierarchy, such as module.s3module.aws_s3_bucket.bucket3
Individual test results in the (right) figure above show a failed test case with a clear indication of why it failed: the resource is missing the required tag "stack_name". Spock allows for very expressive test cases and assertions, which makes for clear test reporting.
Gotcha: Computed Values
A big gotcha with pre-provision testing is that any dynamic behavior in your Terraform modules results in unknown (computed) values in the plan. This is often the result of using data sources to query values that dynamically change the behavior of the modules. You then have to decide whether to skip an assertion when the value it targets is computed; some of the Sentinel examples have code that does exactly that. If there are lots of unknowns in the plan, it's easy to question the value of pre-provision testing. Our recommendation is to build test cases that can be used in the pre-provision stage as a sanity check, providing quick developer feedback, and then reused post-provision.
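One hedged way to express such a guard in Groovy is sketched below. Here s3Bucket stands in for a resource object from the plan helper, and isComputed is a hypothetical predicate, not part of the tf-test-groovy API; in the raw plan JSON, unknown values are reported under each resource change's after_unknown map.

```groovy
// Illustrative guard: only assert on versioning when the value is known
// at plan time. `isComputed` is a hypothetical predicate over the plan's
// `after_unknown` data, not a real tf-test-groovy method.
def versioning = s3Bucket.values.versioning

if (isComputed(s3Bucket, 'versioning')) {
    // Unknown at plan time: defer this check to post-provision testing.
    println "skipping ${s3Bucket.address}: versioning is computed"
} else {
    assert versioning*.enabled == [true]
}
```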
Because of this, we believe the only way to get good coverage of infrastructure is to test at multiple levels, which aligns with the Swiss cheese model of testing. Don't get hung up on high coverage numbers for one particular type of test; take a holistic approach. For infrastructure, post-provision testing of modules with realistic input values provides better value than pre-provision testing, and the highest-value tests are integration tests where the infrastructure and application are tested together. More details about post-provision testing in Part 2.
If you're looking for an alternative to spending lots of money on Terraform Enterprise or Terraform Cloud, want to shift left with your testing, and want to empower your developers with quick feedback, take a look at tf-test-groovy. We welcome any contributions and feedback. We are working to add more helper functions for querying terraform show output, as well as AWS helper functions similar to those found in Terratest. This should help increase the velocity of developing test cases and make the tests more expressive.