Serverless architecture with AWS
Amazon Web Services provides a set of services for building and running applications without managing infrastructure. In a serverless architecture each service still runs on a server, but the provisioning is operated entirely by the cloud provider. This saves a lot of time and lets us concentrate on the implementation. However, it is important to understand that the entire architecture is affected by this choice: traditional backend servers are replaced by cloud functions acting as discrete, single-purpose services (BaaS, FaaS). We therefore have to decide whether a serverless architecture makes sense for our solution.
In this article we will concentrate on the AWS services API Gateway, Lambda, and DynamoDB, and how we can implement an example based on them. We will not go into deep theoretical detail on these services, but to understand them in comparison to a classic architecture, a short description follows:
- API Gateway — Provides a scalable, secured front-end for service APIs.
- Lambda — User-defined functions that perform small operations, where AWS manages provisioning and execution.
- DynamoDB — A NoSQL database that focuses on speed, flexibility, and scalability.
But first let's look at some significant benefits of using cloud services in an application:
- Reduce costs — We can remove the need for the traditional 24/7 server system sitting behind an application. This means we are not obliged to pay for an always-on server, but only for each call.
A little comparison (just comparing EC2 with Lambda):
Let's assume we need an implementation that receives an object with some parameters and puts it into a database. This logic will be invoked 100,000 times per month and each call will take 5 seconds.
- EC2 Instance (Linux on t2.micro / us-east) monthly cost = $8.79
- Lambda (128 MB allocated memory) monthly cost = $1.06
- Reduce administration — The application still runs on servers, but all the server management is done by AWS. It's no longer required to provision, scale, and maintain servers to run an application, databases, etc.
- Increase availability — AWS services are highly available and fault-tolerant out of the box. It is therefore not necessary to back up your code, configurations, etc. yourself in case of data center breakdowns.
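The $1.06 Lambda figure above can be reproduced with a quick back-of-envelope calculation. This is a sketch: the per-GB-second and per-request prices below are the public us-east-1 rates at the time of writing, and the free tier is ignored.

```shell
# Assumed prices (us-east-1, at time of writing; free tier ignored):
#   compute:  $0.00001667 per GB-second
#   requests: $0.20 per million requests

# 100,000 calls x 5 s x 128 MB (= 0.125 GB) of allocated memory
GB_SECONDS=$(awk 'BEGIN { printf "%.0f", 100000 * 5 * (128 / 1024) }')

COMPUTE=$(awk -v gbs="$GB_SECONDS" 'BEGIN { printf "%.4f", gbs * 0.00001667 }')
REQUESTS=$(awk 'BEGIN { printf "%.2f", 100000 * 0.20 / 1000000 }')
TOTAL=$(awk -v c="$COMPUTE" -v r="$REQUESTS" 'BEGIN { printf "%.2f", c + r }')

echo "$GB_SECONDS GB-s -> \$$COMPUTE compute + \$$REQUESTS requests = \$$TOTAL per month"
```

With these rates the compute share dominates; the request charge for 100,000 calls is only two cents.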
Implementation
To demonstrate how easy it is to realize a solution without running a custom server, we will implement the following example:
A user sends a POST request with a JSON body, and the parameters should be put into a DynamoDB table. The API is provided via the API Gateway service, and the resource triggers a Lambda function. The Lambda function receives the parameters and puts them into the database.
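Concretely, the JSON body we will send later in this tutorial looks like this (the field values are just sample data):

```json
{
    "uid": "1",
    "username": "me",
    "email": "me@company.com"
}
```

Each top-level field becomes an attribute of the item written to the user table.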
It is not strictly necessary to place a Lambda function between the API Gateway and DynamoDB, but without one it is not possible to validate or parse the parameters, or to handle errors.
The common way to describe and create an architecture on AWS is to use CloudFormation templates, but here we will use the AWS CLI to create the example stack step by step. If you have not yet installed the CLI tool and configured your account, please read the official Amazon documentation: http://docs.aws.amazon.com/cli/latest/userguide/installing.html. Please use the region us-east-1 (N. Virginia).
During this tutorial we have to keep some output values, such as IDs, for later commands. To make life easier you can export these values as shell variables (e.g. export id=123, echo $id).
1. DynamoDB
First we create a DynamoDB table named user with a primary key called uid.
aws dynamodb create-table --table-name user \
--attribute-definitions AttributeName=uid,AttributeType=S \
--key-schema AttributeName=uid,KeyType=HASH \
--provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5
After successful creation we receive a response containing a TableDescription object.
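An abridged sketch of that response (the exact values and additional fields will differ in your account):

```json
{
    "TableDescription": {
        "TableName": "user",
        "TableStatus": "CREATING",
        "KeySchema": [
            { "AttributeName": "uid", "KeyType": "HASH" }
        ]
    }
}
```

The table briefly stays in the CREATING status before it becomes ACTIVE and can accept writes.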
2. Lambda function
Before we can create the Lambda function, we have to define an IAM Lambda service role and attach rights for execution and DynamoDB access.
We create a new role called lambda-dynamodb-execution-role. To save time we can use our predefined lambda service policy JSON.
aws iam create-role --role-name lambda-dynamodb-execution-role --assume-role-policy-document https://s3.amazonaws.com/byteagenten-public/blog/aws-serverless/iam-role-lambda-policy.json
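For reference, the linked document should contain the standard Lambda trust policy, which allows the Lambda service to assume the role. If your CLI version does not load parameter values from URLs, you can save this JSON locally and pass it as file://iam-role-lambda-policy.json instead:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "lambda.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}
```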
After that we attach policies for Lambda function execution and DynamoDB access.
aws iam attach-role-policy --role-name lambda-dynamodb-execution-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam attach-role-policy --role-name lambda-dynamodb-execution-role --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
Now we can create the Lambda function putitem with the previously defined role. To save time we can use our ready-to-use Node.js implementation (you can find the code at the end of the article).
aws lambda create-function \
--function-name putitem \
--code S3Bucket=byteagenten-public,S3Key=blog/aws-serverless/putitem.zip \
--role arn:aws:iam::358896327293:role/lambda-dynamodb-execution-role \
--handler putitem.handler \
--runtime nodejs6.10
From the response we have to keep the value FunctionArn for later use.
3. API Gateway
Next we create the REST API that will trigger the previously created Lambda function. Unfortunately, several individual commands are required to realize this via the CLI. If you are familiar with the AWS web console you can use that instead; the console performs some steps automatically, so your API is ready to use after a few clicks. But if you want to understand each required step in detail, the CLI is the proper instrument.
First we create a new API called serverless-backend.
aws apigateway create-rest-api \
--name serverless-backend
From the response we have to keep the value id (api-id) for later use.
Next we need to request the root resource id of the API. Please replace api-id in the following command.
aws apigateway get-resources \
--rest-api-id api-id
From the response we have to keep the value id (resource-id) for later use.
Next we add a POST method to the existing root resource. For this we have to use the previously requested ids. Please replace api-id and resource-id in the following command.
aws apigateway put-method \
--rest-api-id api-id \
--resource-id resource-id \
--http-method POST \
--authorization-type NONE
Now we connect the POST method to the Lambda function. For this command we also need to replace the api-id and resource-id parameters. In addition we need to build the uri parameter from the FunctionArn value of the lambda create-function response.
aws apigateway put-integration \
--rest-api-id api-id \
--resource-id resource-id \
--http-method POST \
--type AWS \
--integration-http-method POST \
--uri arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/{FunctionArn}/invocations
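The uri substitution can be sketched in the shell. The FunctionArn below is a made-up example value; use the one you kept from your create-function response.

```shell
# Hypothetical FunctionArn kept from the create-function response
FUNCTION_ARN="arn:aws:lambda:us-east-1:123456789012:function:putitem"

# API Gateway expects the Lambda invocation URI in this fixed path format
URI="arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/${FUNCTION_ARN}/invocations"
echo "$URI"
```

The resulting string can then be passed directly as the --uri parameter.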
To pass the Lambda function's callback result through as the response to the API caller, we also have to add a method response and an integration response to the API. Please replace the ids once again.
aws apigateway put-method-response \
--rest-api-id api-id \
--resource-id resource-id \
--http-method POST \
--status-code 200 \
--response-models "{\"application/json\": \"Empty\"}"

aws apigateway put-integration-response \
--rest-api-id api-id \
--resource-id resource-id \
--http-method POST \
--status-code 200 \
--response-templates "{\"application/json\": \"\"}"
Next we need to deploy the created API to a stage. Please replace the api-id.
aws apigateway create-deployment \
--rest-api-id api-id \
--stage-name prod
Last but not least we have to grant API Gateway permission to invoke the Lambda function. Please replace aws-account-id and api-id.
aws lambda add-permission \
--function-name putitem \
--statement-id apigateway-prod \
--action lambda:InvokeFunction \
--principal apigateway.amazonaws.com \
--source-arn "arn:aws:execute-api:us-east-1:{aws-account-id}:{api-id}/prod/POST/"
Try it out
Congratulations! After that CLI trip we can call the API from anywhere using the URL https://{api-id}.execute-api.us-east-1.amazonaws.com/prod/. Do not forget to replace the api-id.
Let's use curl to send a POST request with some parameters:
curl -X POST -d "{\"uid\":\"1\",\"username\":\"me\",\"email\":\"me@company.com\"}" https://{api-id}.execute-api.us-east-1.amazonaws.com/prod/
After a successful call we receive a response message like this:
"Put user me with email me@company.com to DynamoDB."
Now let's scan the DynamoDB table to verify the data items:
aws dynamodb scan --table-name user
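The scan response should contain the item we just posted, in DynamoDB's typed attribute format (a sketch of the expected shape, abridged):

```json
{
    "Items": [
        {
            "uid": { "S": "1" },
            "username": { "S": "me" },
            "email": { "S": "me@company.com" }
        }
    ],
    "Count": 1,
    "ScannedCount": 1
}
```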
With this we now have a working example of a serverless architecture.
From here you can adapt the implementation to your specific requirements without repeating each of the described steps from scratch: add GET requests for database items, secure the API with authentication, refine the API's JSON object mappings, set up your domain with Route 53 for a nicer URL, etc.
Node.js 6.10 implementation
var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();

exports.handler = function(event, context, callback) {
    // With the AWS integration type, the JSON request body is passed
    // through as the event object
    var uid = event.uid;
    var username = event.username;
    var email = event.email;
    console.log(uid + "," + username + "," + email);

    var tableName = "user";
    var item = {
        "uid": uid,
        "username": username,
        "email": email
    };
    var params = {
        TableName: tableName,
        Item: item
    };
    console.log(params);

    // Write the item and report success or failure through the callback
    db.putItem(params, function(err, data) {
        if (err) {
            console.log(err);
            callback(err);
        } else {
            console.log(data);
            callback(null, "Put user " + username + " with email " + email + " to DynamoDB.");
        }
    });
};