A Practical Guide to Surviving AWS SAM

Part 5 — Parameters

Paolo Fusari
BIP xTech

--

In this chapter we will talk about a must-have for a production-ready AWS SAM template: parametrization, the process of defining placeholders for values instead of hardcoding configurations, with the goal of maintaining a single template for all of your environments. Parametrization should be addressed at the beginning of the project, together with the naming convention to be used, but don't be too paranoid about defining everything as a parameter: in the end those values have to be inserted somewhere, so try to find the right balance. As always, all the code can be found on my GitHub page.

CloudFormation, and consequently SAM, already has a built-in Parameters section that is a good starting point. For example, you can easily define an Env parameter with its type, allowed values, and description, and reference it with the usual Ref function to pass an environment variable to a Lambda function.

Simple Env parameter
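As a rough sketch, such a template could look like the following (function name, handler, and runtime are just illustrative):

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Parameters:
  Env:
    Type: String
    Description: Deployment environment
    AllowedValues:
      - dev
      - test
      - prod
    Default: dev

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      CodeUri: src/
      Environment:
        Variables:
          ENV: !Ref Env   # the parameter value is injected as an environment variable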

The parameter value can then be specified at deployment time with the --parameter-overrides option of the CLI, or by following the guided deployment and saving the resulting configuration in the toml file.

sam deploy --parameter-overrides Env=dev

Now that you have defined your first parameter, we can look at some more advanced use cases on the way to the final goal of a single template for every environment. Parameters are often shared among templates and teams, and that's where SSM Parameter Store comes in handy: a key-value store optimized for configuration and password management, fully integrated with SAM. You have two options to use it in your template, but first you need to create an entry in the Parameter Store. For testing purposes you can create it directly from the console, but always keep in mind to define some sort of automation for this task as well.

Example of Parameter Store
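If you prefer a scripted setup over the console, a minimal sketch with the AWS CLI could look like this (names and values are made up):

aws ssm put-parameter \
    --name "/my-app/dev/lambda-subnets" \
    --type "StringList" \
    --value "subnet-11111111,subnet-22222222"

aws ssm put-parameter \
    --name "/my-app/dev/lambda-security-group" \
    --type "String" \
    --value "sg-33333333"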

VPC configuration for a Lambda function can benefit from Parameter Store: creating an entry for the Subnets and Security Group gives you a single source of truth for this information instead of hardcoding it, so a change in one of these values only requires an update of the Parameter Store entry and a new deployment of the application, without touching the template.

This time the type of the parameter must be AWS::SSM::Parameter::Value<Type>, telling CloudFormation that the value passed to this parameter must be interpreted as the name of the Parameter Store entry from which the real value should be retrieved. The string between the angle brackets specifies the type of the value contained in the Parameter Store. For example, for a parameter with type AWS::EC2::SecurityGroup::Id, AWS will check that the Security Group exists in the account and region where the template is being deployed before applying the template.

We can also leverage the Default property of a parameter, giving it a default value to be used when no value is passed during deployment; if needed, it's always possible to pass a new value that overrides the default one without touching the template.

SSM Parameter Store Parameters
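A sketch of how this could look, reusing the Parameter Store names created above (all names are illustrative):

Parameters:
  LambdaSubnets:
    Type: AWS::SSM::Parameter::Value<List<AWS::EC2::Subnet::Id>>
    Default: /my-app/dev/lambda-subnets
  LambdaSecurityGroup:
    Type: AWS::SSM::Parameter::Value<AWS::EC2::SecurityGroup::Id>
    Default: /my-app/dev/lambda-security-group

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      VpcConfig:
        SubnetIds: !Ref LambdaSubnets         # resolved from the Parameter Store entry
        SecurityGroupIds:
          - !Ref LambdaSecurityGroup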

Again, this type of parameter can be used with the usual Ref construct; we will soon see that's not always the case.

Once the stack is deployed, parameter values can be inspected from the CloudFormation console and, for SSM Parameter Store parameters, both the resolved value and the entry name are reported for better visibility and debugging. If you are dealing with sensitive information that you don't want to expose in plain text, a NoEcho property can be added to the parameter so that the console shows an obfuscated value.

Parameters visible in CloudFormation console
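A tiny sketch of a NoEcho parameter (purely illustrative):

Parameters:
  ApiKey:
    Type: String
    NoEcho: true   # the console will show the value as asterisks
    Description: Sensitive value that should not be displayed in plain text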

The other way to use Parameter Store in AWS SAM is to reference it directly in the Resources section, without going through the Parameters section. This time the parameter is embedded in your template, making it less visible and not exposed to users. The other big difference is that here you can also specify the version of the Parameter Store entry to retrieve, while the Parameters section always retrieves the latest available version. Keep in mind that each edit of a value in the Parameter Store creates a new version.

SSM Parameter Store Resources
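A sketch of the dynamic reference syntax, here pinned to version 1 of the entry (the parameter name is made up):

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Environment:
        Variables:
          # resolved at deploy time from Parameter Store, version 1 of the entry
          DB_HOST: '{{resolve:ssm:/my-app/dev/db-host:1}}'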

Let's now look at three ways to define a parameter that specifies the log level to be enabled on a Lambda function. Solution number one is the simplest: a basic parameter with a list of allowed values and a default, passed as a reference to an environment variable.

Simple Log level
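A sketch of this first solution:

Parameters:
  LogLevel:
    Type: String
    AllowedValues:
      - DEBUG
      - INFO
      - WARNING
      - ERROR
    Default: INFO

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Environment:
        Variables:
          LOG_LEVEL: !Ref LogLevel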

You can then instrument your pipeline to pass the value according to the environment, but it may be better to do something clearer directly in the template, and that's where the Mappings section comes in handy: a template section where you can define values that are selected based on another parameter. Let's see it in action to make it clear.

Log level Mappings
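A sketch of the Mappings approach (map name and keys are illustrative, and Env is the parameter defined earlier):

Mappings:
  LogLevelMapping:
    dev:
      level: DEBUG
    test:
      level: DEBUG
    prod:
      level: INFO

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Environment:
        Variables:
          # picks the second-level key 'level' using the Env parameter as first-level key
          LOG_LEVEL: !FindInMap [LogLevelMapping, !Ref Env, level]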

In this way, when the Env parameter is equal to dev or test, the LogLevelMapping value will be DEBUG, while with Env equal to prod it will be INFO. With this kind of value you cannot use the usual Ref operator; you have to use the FindInMap function. Now our template picks the right log level based on the environment on its own, without instrumenting the pipeline to do it.

You can also use the Conditions section to parametrize the log level. The idea behind Conditions is similar to Mappings, but this time you can also use logical operators like And, Or, and Not to create more complex rules.

Log level Conditions
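A sketch of the Conditions approach:

Conditions:
  HasDebugEnabled: !Not [!Equals [!Ref Env, prod]]

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Environment:
        Variables:
          # DEBUG everywhere except prod, where INFO is used
          LOG_LEVEL: !If [HasDebugEnabled, DEBUG, INFO]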

In this case, we defined a condition HasDebugEnabled that is true only when the Env parameter is different from prod; you can now use the If function to pick the DEBUG or INFO value based on that condition. It's up to you to combine and pick the parameter specification that best fits your use case: one solution may give you more flexibility while another gives you readability, a property that, when using yaml, I think should always be taken into account.

Until now, all the parameters we have used are resolved statically at deploy time, so a change in one parameter requires a re-deploy of the whole template. That may not be what you want for parameters that need to change at runtime.

One way is to retrieve parameters directly from code using the AWS SDK, but I encourage you to take a look at AWS Lambda Powertools or ssm-cache, which also expose the capability to cache parameters. Pay attention to the fact that the Parameter Store API has a default throughput limit of 40 transactions per second. This limit can be increased up to 1000 transactions per second, but you will incur additional costs. If you have particular security requirements this approach may also be preferable, since parameters are retrieved only when needed.

For example, the following Python code retrieves a parameter from Parameter Store with the libraries mentioned above.

SSM Parameter store retrieve from code
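A minimal sketch with the Lambda Powertools parameters utility (the parameter name is made up; max_age is the cache duration in seconds):

from aws_lambda_powertools.utilities import parameters


def lambda_handler(event, context):
    # retrieved from Parameter Store at runtime and cached for 5 minutes
    feature_flag = parameters.get_parameter("/my-app/dev/feature-flag", max_age=300)
    return {"feature_flag": feature_flag}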

This time no parameter needs to be added to the template, but we do need to add permissions to retrieve Parameter Store values. For this task the predefined policy templates managed by SAM come in handy, a great simplification in role management.

SSM Parameter Store Lambda role permission
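A sketch with the SSMParameterReadPolicy policy template (the parameter name must match the entry read by the code):

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Policies:
        - SSMParameterReadPolicy:
            ParameterName: my-app/dev/feature-flag   # this policy template expects the name without the leading slash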

Caching parameters helps improve performance and keeps API throughput within limits, but choose the cache expiration carefully: if you need to reset a value quickly you will still need to perform a new deployment, forcing all your Lambda containers to be destroyed and cutting the natural cache drain time to a minimum.

Another useful SAM section is Globals, a section where you can put common configuration for certain resource types; for example, you can specify a default timeout for Lambda functions. Each resource of type AWS::Serverless::Function will inherit this default, but it's always possible to override it on the single resource. Take some time to check the documentation about the override behaviour: for example, lists are additive while maps are merged.

SAM Globals
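A sketch of a Globals section with a few common function settings:

Globals:
  Function:
    Runtime: python3.9
    Timeout: 10
    MemorySize: 256

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Timeout: 30   # overrides the global default for this single function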

Also check the documentation regarding which properties can be added to the Globals section. For example, you may have noticed that there is no way to define Policies in Globals. At first this seems like a limitation forcing you to duplicate a lot of code, but if you think about it, a global policy could easily make functions over-permissive, going against the least-privilege principle. See it as a sort of strict helper enforcing only the needed permissions for each function, and if you really do have the same permissions for every function, you can still create a role and attach it to each of them.

If you have experience with CloudFormation, when I explained the VPC parameters you may have thought: why not use the import/export functionality? And you are certainly right. It is a valid option, but it has the prerequisite that your network was created with CloudFormation; using Parameter Store you can decouple the two templates. For the sake of completeness, here is the equivalent setup with the Export functionality. Here you can also find a complete example of a VPC configuration in CloudFormation.

Export VPC configuration
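A sketch of the Outputs section in the network stack, assuming it defines PrivateSubnetA, PrivateSubnetB, and LambdaSecurityGroup resources (export names are illustrative):

Outputs:
  LambdaSubnets:
    Description: Subnets for the Lambda functions
    Value: !Join [",", [!Ref PrivateSubnetA, !Ref PrivateSubnetB]]
    Export:
      Name: network-lambda-subnets
  LambdaSecurityGroup:
    Description: Security group for the Lambda functions
    Value: !Ref LambdaSecurityGroup
    Export:
      Name: network-lambda-security-group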

The Export functionality leverages the Outputs section, the place where you put information that you want to expose about created resources. In particular, the Export section not only exposes this information within the stack but also makes it available to other CloudFormation stacks, giving them the ability to import these values. You can check the CloudFormation console under the Exports section to review the values.

CloudFormation Export console

Now, with the ImportValue function, values can be referenced directly in the template:

Import VPC configuration
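A sketch of the import side, splitting the exported comma-separated subnet list back into a list:

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      VpcConfig:
        SubnetIds: !Split [",", !ImportValue network-lambda-subnets]
        SecurityGroupIds:
          - !ImportValue network-lambda-security-group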

On the other hand, if you have experience with Parameter Store you may know that entries can be defined as secure string parameters, but this type of parameter is not supported by SAM directly, or rather only a few resources support it. The correct way to use secrets in CloudFormation is AWS Secrets Manager, which, as the name suggests, is the service meant for storing secrets. You can still read a secure string parameter inside your Lambda function code if you want.

Secret Manager
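A sketch of the Secrets Manager dynamic reference (the secret name and JSON key are made up):

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      Environment:
        Variables:
          # resolves the 'password' key of the secret's JSON value at deploy time
          DB_PASSWORD: '{{resolve:secretsmanager:my-app/dev/db-credentials:SecretString:password}}'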

As you can see, it is very similar to the Parameter Store management, using the resolve dynamic reference.

Another way to manage parameters, and I promise this will be the last one and a quick one, is to use SAM's toml configuration file to store named environments. With this method you can use the configuration environment prompt of the guided deployment to create multiple versions of your parameter values. It's more related to environment management, but it's definitely worth keeping in mind that this option exists too.

SAM toml
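A sketch of a samconfig.toml holding the two environments (stack names and values are illustrative):

version = 0.1

[default.deploy.parameters]
stack_name = "my-app-dev"
parameter_overrides = "Env=\"dev\""

[test.deploy.parameters]
stack_name = "my-app-test"
parameter_overrides = "Env=\"test\""

The test environment can then be deployed with sam deploy --config-env test.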

As you can see, the toml file holds the value of the Env parameter for two environments: dev, which in our case matches the SAM default environment, and test.


Sorry for bothering you with all this yaml code this time, but I wanted to highlight most (yes, there are more) of the ways to use parameters in an AWS SAM template. I know there are infinite ways to do the same thing and it's really hard to pick one, but the beauty of Serverless and AWS is that most of these changes won't be too disruptive. See you in the next chapter.

More content at bip.xTech

--