Serverless: Super Simple API Development With Go, Terraform and AWS Lambda
With serverless technology, developers now have more power and control. With infrastructure as code (IaC) using Terraform, creating a new API no longer requires dealing with an Ops team or the headache of designing server topology. In this article we will walk through creating an API step by step using Golang, Lambda, and Terraform.
We are going to create an API that prints "Hello world" when its URL is invoked. In AWS, creating an API on a serverless stack is as easy as attaching a Lambda function to an API Gateway and serving your REST requests; the following super simple high-level diagram explains this setup.
This project uses the following tools; make sure you have them all installed.
- It also requires an AWS environment to complete the AWS CLI setup; you can register a new AWS account for free here.
Setup Golang project
In your terminal type the following command to prepare the environment.
mkdir lambda-go && cd lambda-go && go mod init github.com/lambda-go && code .
The command creates a new folder `lambda-go`, initializes a Go module with the package name `github.com/lambda-go`, and finally opens the project in Visual Studio Code. From this step onward we will use VS Code to work through the remaining steps.
A Go project requires a `main` function as its starting point. Create a file `main.go` with the following contents.
The `main` function is the entry point of the Golang project, and you will notice it calls the lambda package with a Handler. In the AWS Lambda world, everything Lambda processes is considered an event; in our case we use the `APIGatewayProxyRequest` event type. Check out this doc for more detail about the Lambda Golang SDK. At this step your Go project is ready to serve requests in an AWS environment; next we are going to set up the Terraform scripts to deploy the Lambda.
In my development, I use a Makefile in every project; it is very useful as a single point of entry for all the convenience commands, and I include one in this sample project to explain the Terraform commands. Create a new file `Makefile` in your root folder and fill it with the following contents.
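The original Makefile contents are not shown, so here is a hedged sketch based on the targets described below (`build`, `init`, `plan`, `apply`, `destroy`); the `build/bin/app` output path and the `infra` folder name are taken from later sections:

```makefile
.PHONY: build init plan apply destroy

build:
	GOOS=linux GOARCH=amd64 go build -o build/bin/app .

init:
	cd infra && terraform init

plan:
	cd infra && terraform plan

apply:
	cd infra && terraform apply -auto-approve

destroy:
	cd infra && terraform destroy -auto-approve
```

`GOOS=linux GOARCH=amd64` is assumed here because the `go1.x` Lambda runtime expects a Linux binary.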
This command generates the application binary and stores it in `build/bin/app`.
This command runs `terraform init` in the folder named `infra`; it needs to run once to initialize everything Terraform requires to provision your infrastructure.
This command runs `terraform plan` in the folder named `infra`; it makes no changes to your infrastructure yet, but gives you the details of what would happen.
This command runs `terraform apply` to make changes to your infrastructure, auto-approving without human intervention. Make sure to run `make build` first to build your binary before executing this command.
This command runs `terraform destroy` in the folder named `infra`. It removes all the infrastructure components you provisioned previously, so be sure this is what you want to do.
Configure basic Terraform resources
Every section in Terraform is a resource; each resource type does a different thing, some mandatory and some optional. Terraform uses the concept of `state` to record which resources your Terraform scripts manage. To keep the `state` file, you can configure local state, which simply stores it on the machine where your Terraform script was initialized. If you want the `state` file to be shared by several team members, or to run in an actual production environment, you need to consider `remote state`. A detailed discussion of state configuration is out of scope for this section; we will focus on configuring the basic Terraform resources used in this project.
Create a new folder `infra` in your root directory; since the Makefile commands point to this folder, we need to store all our Terraform files in it. Then create a file `main.tf` in the `infra` folder and fill it with the following contents.
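The original `main.tf` contents are not shown, so the following is a hypothetical sketch assembled from the blocks described in this section (provider, two variables, a locals block combining them, an `archive_file` data source, a `random` resource, and the `api_url` output); all names and the region are illustrative, and the output assumes an API Gateway deployment defined later in `apigateway.tf`:

```hcl
provider "aws" {
  region = "us-east-1" # illustrative; adjust to your region
}

variable "app_name" {
  default = "lambda-go"
}

variable "env" {
  default = "dev"
}

locals {
  # Computed value combining the two variables, used as a naming prefix.
  prefix = "${var.app_name}-${var.env}"
}

# Utility resource: generates a random suffix/prefix id.
resource "random_id" "suffix" {
  byte_length = 4
}

# Compresses the Go binary so Lambda can consume it as a zip upload.
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/../build/bin/app"
  output_path = "${path.module}/../build/bin/app.zip"
}

output "api_url" {
  value = aws_api_gateway_deployment.deploy.invoke_url
}
```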
`provider` is the first thing to configure in a Terraform script; it defines where your resources will be provisioned. You can have multiple providers, but you need an `alias` when there is more than one. If no credentials are configured in this section, the local AWS credentials will be used.
`variable` defines an input argument that can be passed in from the console or command line; its `default` attribute defines the initial value. Variables can be used to supply controllable values to other resources. We have two variables in this file, used as the naming prefix for all our resources.
`locals` are local variables that can be used anywhere within the `terraform init` directory; they can serve as computed variables to simplify complex arguments. In our case we use one to create a computed value that combines our two variables.
`data` resources can be used to fetch values from outside Terraform, or from another Terraform script. They also provide some utility functions: the `archive_file` data source compresses files, and in our case we use it to compress `build/bin/app` into `build/bin/app.zip` so that Lambda can use this zip file as its uploaded binary content.
`resource` components are everything else in the Terraform world; they configure what we need to deploy to our environment. Besides creating infrastructure components, resources can also provide utility functions such as random number generation, JSON parsing, and file lookup; in our case we use a `random` resource to generate a random prefix id.
`output` is the export mechanism in Terraform; it defines what information will be exposed or printed in the console, and it can be consumed as `data` by another Terraform script. In our case, we would like to print out our newly created API URL.
Configure lambda resource
In this section, we will configure the Lambda resource, plus the role and policy that define what the Lambda is allowed to do; if you are not sure what roles and policies are, please read here. Create a new file `lambda.tf` in the `infra` folder and fill it with the following content.
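The original `lambda.tf` contents are not shown, so here is a hypothetical sketch matching the description below: an `aws_lambda_function` named `lambda_func` on the `go1.x` runtime, fed by the `archive_file` zip, with a role assumable by the Lambda service principal and the `AWSLambdaBasicExecutionRole` permissions (attached here as a managed policy; the original used an inline policy). All names are illustrative:

```hcl
resource "aws_iam_role" "lambda_role" {
  name = "${local.prefix}-lambda-role"

  # Allow the Lambda service principal to assume this role.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

# Grants CloudWatch Logs write access so console output is captured.
resource "aws_iam_role_policy_attachment" "basic_exec" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}

resource "aws_lambda_function" "lambda_func" {
  function_name    = "${local.prefix}-func"
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256
  handler          = "app" # must match the binary name inside the zip
  runtime          = "go1.x"
  role             = aws_iam_role.lambda_role.arn
}
```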
In this file, we configure one `aws_lambda_function` named `lambda_func`. Looking at the attributes, we set the Lambda's `filename` from our `archive_file` data source, and we configure the Lambda to use the `go1.x` runtime. For the Lambda to be able to do anything, it needs the proper permissions, so we configure it to assume a role with the Lambda service principal, with one inline policy granting `AWSLambdaBasicExecutionRole` permissions; this ensures that console output from the Lambda program is written to CloudWatch Logs.
Configure API Gateway resource
We have a working Lambda from the previous Terraform script; now we need to create an API Gateway to receive API requests from the world. Create a new file `apigateway.tf` in the `infra` folder and fill it with the following script.
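The original script is not shown, so here is a hypothetical sketch matching the resources described below: one `aws_api_gateway_resource` using the greedy `{proxy+}` path, two `aws_api_gateway_method` resources (one for the proxy path, one for the root), two `aws_api_gateway_integration` resources wired to the Lambda, one `aws_api_gateway_deployment`, and an `aws_lambda_permission`. Names and the stage are illustrative:

```hcl
resource "aws_api_gateway_rest_api" "api" {
  name = "${local.prefix}-api"
}

# Greedy path resource: forwards every sub-path to the Lambda.
resource "aws_api_gateway_resource" "proxy" {
  rest_api_id = aws_api_gateway_rest_api.api.id
  parent_id   = aws_api_gateway_rest_api.api.root_resource_id
  path_part   = "{proxy+}"
}

resource "aws_api_gateway_method" "proxy" {
  rest_api_id   = aws_api_gateway_rest_api.api.id
  resource_id   = aws_api_gateway_resource.proxy.id
  http_method   = "ANY"
  authorization = "NONE"
}

resource "aws_api_gateway_method" "root" {
  rest_api_id   = aws_api_gateway_rest_api.api.id
  resource_id   = aws_api_gateway_rest_api.api.root_resource_id
  http_method   = "ANY"
  authorization = "NONE"
}

# AWS_PROXY integrations pass the raw request through to the Lambda.
resource "aws_api_gateway_integration" "proxy" {
  rest_api_id             = aws_api_gateway_rest_api.api.id
  resource_id             = aws_api_gateway_method.proxy.resource_id
  http_method             = aws_api_gateway_method.proxy.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.lambda_func.invoke_arn
}

resource "aws_api_gateway_integration" "root" {
  rest_api_id             = aws_api_gateway_rest_api.api.id
  resource_id             = aws_api_gateway_method.root.resource_id
  http_method             = aws_api_gateway_method.root.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.lambda_func.invoke_arn
}

resource "aws_api_gateway_deployment" "deploy" {
  depends_on  = [aws_api_gateway_integration.proxy, aws_api_gateway_integration.root]
  rest_api_id = aws_api_gateway_rest_api.api.id
  stage_name  = "dev"
}

# Allow API Gateway to invoke the Lambda function.
resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.lambda_func.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.api.execution_arn}/*/*"
}
```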
To create a proxy from API Gateway to Lambda, there are some requirements on the API resource name: it must use the greedy `{proxy+}` path for the proxy setup.
Hence, in the `apigateway.tf` script we create one `aws_api_gateway_resource` and two `aws_api_gateway_method` resources to construct the correct proxy setup. Besides that, we also configure two `aws_api_gateway_integration` resources to connect to our Lambda, one `aws_api_gateway_deployment` for the API deployment, and lastly an `aws_lambda_permission` so the deployment path is allowed to invoke the Lambda. With that, our API setup is done.
Let's Deploy your API
Run the following commands in order to deploy the API.
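Assuming the Makefile targets described earlier, the deployment steps would look like this:

```shell
make build   # compile the Go binary into build/bin/app
make init    # one-time terraform init in ./infra
make plan    # review what will be created
make apply   # provision everything and print the api_url output
```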
After we have the `api_url`, just hit the API and you will see the response.
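For example, with a placeholder in place of your actual `api_url` value, the call could look like this (the response body should be the handler's "Hello world" message):

```shell
curl https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/hello
```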
Run the following command to clear all your resources.
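Assuming the Makefile target described earlier, teardown is a single command:

```shell
make destroy   # terraform destroy -auto-approve in ./infra
```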
Just run it
If you are too lazy to go through it step by step, just clone this repo and run it.
In this article, we went step by step through setting up API Gateway and Lambda with a Golang application. At the time of writing, AWS Lambda supports six runtimes, including Ruby. The serverless stack is becoming more and more popular because it removes the complicated requirements of setting up your servers and the overhead of managing a ton of them; it allows developers to focus more on business logic, drive more business value, and boost innovation. On top of that, serverless computing fully complies with the Green Computing Initiative. With the recent announcement of the Firecracker project, a new virtualization technology that will be applied in AWS Lambda, serverless computing is entering a new era.