AWS Proton — 101 and detailed code example

Rostyslav Myronenko
11 min read · Jul 16, 2021


Hi everyone,

If you work with AWS, you know about the services that allow defining infrastructure as code. CloudFormation is the main one and is considered the baseline for other tools that simplify the definition of infrastructure (AWS SAM, the serverless.com framework, etc.). In this post, I would like to introduce another AWS service that is still pretty new: AWS Proton.

What is AWS Proton

Proton is another service aimed at infrastructure as code. It uses CloudFormation (SAM works as well), but its major difference from other services is the decoupling of infrastructure and code. Proton introduces two entities in its setup:

  1. Service. An AWS resource or set of resources that is usually focused on business logic and contains custom developer code. Any repeatable item in an AWS application can be considered a Proton service instance (e.g., a Lambda handler in a multi-Lambda environment, or Fargate containers in a microservices architecture). In addition to infrastructure, a Proton service instance may have its own CodePipeline pipeline and webhook established during the creation of the service instance, so custom code is automatically built and deployed to it. Proton service instances can be created only inside a Proton environment.
  2. Environment. A set of shared resources and policies that are shared across the service instances created in this environment (a common DB, API Gateway, VPC, etc.).

Both Service and Environment are created from Proton templates — a Proton-specific file structure that uses CloudFormation to define resources for an environment (environment template), a service (service template), and, optionally, a code pipeline for the service (included in the service template).

Both Service and Environment can consume arguments from custom parameters defined in a template (similar to CloudFormation parameters). This is the approach for creating similar services, with a few differences, from a single template.

The official Amazon documentation for Proton is available at https://docs.aws.amazon.com/proton/

As of now (July 2021), AWS Proton is so new that it is not part of the AWS SDK and is not present in older versions of the AWS CLI (version 1); in particular, it is absent from the AWS CLI that comes with the Amazon Linux 2 AMI. I have spent some time establishing a working example, which I would like to share with you.

Architecture example — item CRUD API

If you need to proceed immediately to the code example, the implementation of this scenario is given in the hands-on section below.

Let’s assume we have a typical serverless application, as given in the following diagram. It is a CRUD API whose Lambda handlers can be written in any programming language.

If we translate this simple architecture to AWS Proton terms, we can define the following Proton items:

  1. Proton environment — API Gateway and DynamoDB.
  2. Proton services — Lambda functions.
  3. Proton service pipelines — CI/CD for the Lambda functions.

Proton environment

Let’s start with the definition of our Proton Environment.

The Proton file structure for an environment template is given in the picture below:

The schema.yaml file defines the input parameters for the environment. The values for these parameters should be specified during environment creation from this template. As an example, I introduced the following environment-specific parameters. Since CloudFormation does not allow creating a blank API Gateway without any resources or methods, I have also defined a health-check Lambda function for each environment — this seemed to be the most rational and useful option.

  1. DynamoDB table name
  2. Stage name for the API Gateway stage that the Lambda functions will be attached to
  3. Lambda alias name — to point API Gateway methods to this alias

The schema.yaml file in the picture below illustrates how the input parameters may look:
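A minimal sketch of such a schema.yaml is given below. The OpenAPI-based schema format with environment_input_type follows the Proton template-bundle documentation; the parameter names table_name, stage_name, and lambda_alias are assumptions (table_name matches the environment.inputs snippet used later in this post):

```yaml
schema:
  format:
    openapi: "3.0.0"
  environment_input_type: "EnvironmentInput"
  types:
    EnvironmentInput:
      type: object
      description: "Input properties for the environment"
      properties:
        table_name:
          type: string
          description: "Name of the shared DynamoDB table"
          default: "books"
        stage_name:
          type: string
          description: "API Gateway stage the Lambda functions are attached to"
          default: "dev"
        lambda_alias:
          type: string
          description: "Lambda alias that API Gateway methods point to"
          default: "live"
      required:
        - table_name
```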

manifest.yaml specifies the infrastructure template files and the rendering engine:
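For an environment template, a minimal manifest.yaml following the format from the Proton template-bundle documentation looks like this:

```yaml
infrastructure:
  templates:
    - file: "cloudformation.yaml"
      rendering_engine: jinja
      template_language: cloudformation
```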

cloudformation.yaml is a CloudFormation template that describes the infrastructure for our Proton environment. To consume the arguments provided during environment creation, the environment.inputs object is used. For example, the name of the DynamoDB table is defined as:

TableName: "{{ environment.inputs.table_name }}"

To connect the Lambda services with these environment resources, the table name and some API Gateway properties were specified in the Outputs:
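As a sketch, such an Outputs section might look like the following. The output name EnvironmentTable matches the service-template snippet later in this post; the API Gateway outputs and the Api logical resource name are assumptions for illustration:

```yaml
Outputs:
  EnvironmentTable:
    Description: Name of the shared DynamoDB table
    Value: "{{ environment.inputs.table_name }}"
  ApiId:
    Description: ID of the shared REST API
    Value: !Ref Api
  ApiRootResourceId:
    Description: Root resource ID of the shared REST API
    Value: !GetAtt Api.RootResourceId
```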

Please read more about the structure of the template bundle at https://docs.aws.amazon.com/proton/latest/adminguide/ag-template-bundles.html

To create an environment template, the file tree should be archived as a tar.gz archive (yet another mandatory requirement):

tar -czvf environment-template.tar.gz environment/

The created archive should be uploaded to S3 and used to register the environment template. Please see https://docs.aws.amazon.com/proton/latest/adminguide/template-create.html

Proton Service

The Proton file structure for a service template is given in the picture below:

The schema.yaml file is a definition of the input parameters for the service instance and the code pipeline, which should be specified during service instance creation from this template. For the example, I have specified the following parameters to configure the main properties of the Lambda functions:

  1. Lambda function runtime
  2. Lambda function handler
  3. Lambda function timeout
  4. Lambda function memory
  5. Lambda function code URI
  6. API path (the resource to be created on the shared API Gateway that comes from the environment template)
  7. API HTTP method (method to call the resource)

The top of this file:
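A sketch of how the top of this schema.yaml might look; lambda_runtime matches the snippet used below, while the other property names and defaults are assumptions:

```yaml
schema:
  format:
    openapi: "3.0.0"
  service_input_type: "LambdaServiceInput"
  pipeline_input_type: "PipelineInput"
  types:
    LambdaServiceInput:
      type: object
      description: "Input properties for a Lambda service instance"
      properties:
        lambda_runtime:
          type: string
          description: "Lambda function runtime"
          default: "java11"
        lambda_handler:
          type: string
          description: "Lambda function handler"
        lambda_timeout:
          type: number
          description: "Lambda function timeout in seconds"
          default: 30
        lambda_memory:
          type: number
          description: "Lambda function memory in MB"
          default: 512
        code_uri:
          type: string
          description: "S3 URI of the initial Lambda package"
        api_path:
          type: string
          description: "Resource path on the shared API Gateway"
        api_method:
          type: string
          description: "HTTP method for the resource"
          default: "GET"
```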

To configure the code pipeline for this Lambda service, I have specified the following parameters:

  1. Unit test command (command to run unit tests)
  2. Package command (command to create a valid package with Lambda function — ZIP or JAR file)
  3. Package name (name of the created package after execution of the package command)
  4. Package folder (relative folder from a source root where the package is created after execution of the package command)

A part of the schema.yaml file in the picture below illustrates how the input parameters for a pipeline may look:
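A sketch of the pipeline part of schema.yaml; packaging_command matches the snippet used below, while the other names and defaults are assumptions (the Maven defaults assume a Java Lambda):

```yaml
    PipelineInput:
      type: object
      description: "Input properties for the service pipeline"
      properties:
        unit_test_command:
          type: string
          description: "Command to run unit tests"
          default: "mvn test"
        packaging_command:
          type: string
          description: "Command to build the Lambda package (ZIP or JAR)"
          default: "mvn package"
        package_name:
          type: string
          description: "Name of the package produced by the packaging command"
        package_folder:
          type: string
          description: "Folder, relative to the source root, where the package is created"
          default: "target"
```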

The purpose of instance_infrastructure/manifest.yaml and instance_infrastructure/cloudformation.yaml is the same as in the environment template. To consume the arguments provided during service creation, the service_instance.inputs object is used. For example, the Lambda function runtime in the template looks like:

Runtime: "{{ service_instance.inputs.lambda_runtime }}"

To link the Lambda functions with API Gateway and DynamoDB, we use the environment.outputs object. E.g., the table name is added to the Lambda environment variables as:

Environment:
  Variables:
    TABLE: "{{ environment.outputs.EnvironmentTable }}"
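Similarly, a service resource can be attached to the shared API Gateway via environment outputs; the output names ApiId and ApiRootResourceId here are assumptions for illustration:

```yaml
ApiResource:
  Type: AWS::ApiGateway::Resource
  Properties:
    RestApiId: "{{ environment.outputs.ApiId }}"
    ParentId: "{{ environment.outputs.ApiRootResourceId }}"
    PathPart: "{{ service_instance.inputs.api_path }}"
```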

Service pipeline

The code pipeline for a service instance is defined in the files in the pipeline_infrastructure folder.

pipeline_infrastructure/manifest.yaml is an exact copy of the other manifest files.

pipeline_infrastructure/cloudformation.yaml is the CodePipeline definition. This is where the resources and commands for the pipeline are specified. To consume the pipeline arguments provided during service creation, the pipeline.inputs object is used. E.g., the package build command is taken as:

"{{ pipeline.inputs.packaging_command }}"

The pipeline is currently the trickiest part of working with Proton because it uses AWS CLI Proton commands. Please keep in mind that they are absent from the CLI version that ships with the AMI used by CodeBuild, so I had to specify commands to install AWS CLI version 2, as well as some utilities to update the service specification.
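As a sketch, the install phase of a CodeBuild buildspec embedded in the pipeline template could look like this. The download URL is the official AWS CLI v2 installer for Linux x86_64; jq is one example of a utility for updating the rendered service specification, and the pipeline input names are assumptions:

```yaml
version: 0.2
phases:
  install:
    commands:
      # The pre-installed CLI is v1 without the proton commands, so install v2
      - curl -s "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
      - unzip -q awscliv2.zip
      - ./aws/install
      # jq helps to read and update the service specification JSON
      - yum install -y jq
  build:
    commands:
      - "{{ pipeline.inputs.unit_test_command }}"
      - "{{ pipeline.inputs.packaging_command }}"
```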

To create a service template, the file tree should be archived as a tar.gz archive:

tar -czvf service-template.tar.gz lambda-service/

The same approach as for the environment template is to be used: upload it to S3 and use it to register the service template.

A full step-by-step guide to set up Proton

Code examples

Please see working examples at my GitHub:

https://github.com/rimironenko/aws-serverless-proton-project — Proton template files to package and register Environment and Service templates and create a Proton Environment and a Proton Service.

https://github.com/rimironenko/get-item-lambda-service — implementation of getItem function in Java which can be used to create a service instance based on templates above.

For the data model, I have chosen books and defined the following model for DynamoDB:

  • isbn (String) — primary key
  • author (String) — book author
  • name (String) — book name
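Inside the environment template, such a table can be sketched in CloudFormation as follows; only the key attribute is declared because DynamoDB is schemaless for non-key attributes (the BillingMode choice is an assumption):

```yaml
EnvironmentTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: "{{ environment.inputs.table_name }}"
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: isbn
        AttributeType: S
    KeySchema:
      - AttributeName: isbn
        KeyType: HASH
```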

Ensure you are using the AWS region which supports Proton!

I have used us-east-1 (N.Virginia).

1. Create an S3 bucket for Proton templates

Create an S3 bucket to store Proton templates. See https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html

2. Create archives with Proton templates

Checkout repository with Proton templates:

git clone https://github.com/rimironenko/aws-serverless-proton-project.git

Navigate to the project folder and create tar.gz archives with Environment and Service templates:

cd aws-serverless-proton-project
tar -czvf environment-template.tar.gz environment/
tar -czvf service-template.tar.gz lambda-service/

Upload the created tar.gz archives to the S3 bucket.

3. Register templates in Proton service

Open AWS console and go to the Proton service page.

In the left nav, select “Environment templates” and click “Create environment template”.

Fill in the settings for the template from S3, as given below.

Fill in the template name and click “Create environment template”.

To publish the uploaded template version, select the latest template version using the radio button and click “Publish”.

In the left nav, select “Service templates” and click “Create service template”.

Fill in the settings for the template from S3, as given below.

Select the environment template created earlier as the compatible environment template:

Check that the Pipeline option is checked. As it is optional, you can uncheck it, but then the template must NOT contain the pipeline_infrastructure folder. Otherwise, Proton will fail template registration with an error.

Click “Create Service template” and publish the created template version (similar to the Environment template).

4. Create an environment from the template

Go to the “Environments” menu item and click “Create environment”.

Select the environment template which was registered and click “Configure”.

Fill in the environment name, and either let Proton create an IAM role or select an existing one; then click “Next”.

Fill in custom parameters (which were defined in the schema.yaml file). An example of my custom arguments is given below.

Click “Next”, review the settings, and click “Create” to allow Proton to provision the infrastructure. It will be created after some time.

Outputs from the stack are listed under “Outputs”.

5. Create a source code connection

If you do not have a connection to a VCS yet, create one in the Developer Tools console, as described at https://docs.aws.amazon.com/dtconsole/latest/userguide/connections.html

6. Create a service with the pipeline from the template

Go to the “Services” menu item, click “Create service” and select the template which was registered. Click “Configure”.

Specify the service name and repository settings. The service name should be alphanumeric because it is used to create a Deployment under the hood (a CloudFormation naming limitation).

Click “Next” and fill in the custom service inputs that were defined in the instance_infrastructure/schema.yaml file of the service template. As the environment, select the environment that was created earlier. For the code URI, please use any S3 URI pointing to a valid Lambda package (the code itself does not matter because CodePipeline will redeploy it from the VCS repository).

Fill in pipeline inputs that were defined in the pipeline_infrastructure/schema.yaml file in the service template.

Click “Next”, review the settings, and click “Create”. Wait for some time until Proton creates the service and pipeline.

7. Validate service creation

Ensure that service exists with “Active” status.

Under the “Pipeline” tab you can see a link to CodePipeline execution.

Click it and ensure that the pipeline was successful.

Code artifacts are also put into the S3 bucket created from the service template, so you can use them for any custom business flow with the built code.

Proton automatically registers a webhook for you, so your Lambda function code will be built and deployed automatically once a push event happens on the specified branch of the VCS repository.

8. Validate API

Go to the DynamoDB table which was created at the environment level and insert a test item into it.

Go to API Gateway, select a method on the resource which was created with the service instance, and click “Test”.

Provide the “isbn” query parameter with the value from the test item that was just created and click “Test”.

Ensure that the JSON with the test item is returned.

9. Clean up

Go to Proton service and click “Actions” — “Delete”.

Go to Proton environment and click “Actions” — “Delete”.

10. Troubleshooting

In case of any issues with the creation of the environment or the service, find its stack in CloudFormation and look into the events to determine what went wrong.

For example, this was my mistake when I used an incorrect name for the Proton service:

In case of any issues with the code pipeline, look into its output and logs to determine what is wrong with your custom build and deployment.

Conclusion

Advantages of AWS Proton:

  1. The decoupling of infrastructure code and developer code is very good.
  2. It should work well for typical AWS architectures, if you can logically split resources between an environment and services. Once everything is configured, you need only a little Ops support and can give most of your focus to pure development.
  3. It seems better suited to fairly complex enterprise architectures.

Disadvantages of AWS Proton:

  1. In its current state, proper configuration of environments and services involves overhead, and workarounds may be required to make things work.
  2. For small serverless applications, AWS SAM suits better due to its much simpler configuration and deployment.

Constraints of AWS Proton:

  1. It is currently missing from the AWS SDK (inclusion is in progress), so only AWS CLI and AWS console manipulations are available.
  2. Up-to-date commands are present in AWS CLI v2 and the latest versions of AWS CLI v1, so you may need to force an AWS CLI upgrade, which may impact your other AWS resources and operations. Also, please note that the Proton CLI commands are slightly different in v1 and v2.
  3. As it manages infrastructure through CloudFormation stacks, it inherits all CloudFormation issues and limitations, which may not allow you to create your custom infrastructure exactly as designed.

If you have read this article till the end — thank you so much, and I hope it will help you answer the question “can I try AWS Proton for my particular case?”.

Best regards, Rostyslav
