The LAN Stack: Going 100% Serverless

Chas DeVeas
The Storyblocks Tech Blog
4 min read · Aug 9, 2018

With the release of Amazon Aurora Serverless, which recently went into public beta, you can now build a standard web app that is 100% serverless. This post walks through a basic serverless implementation using Lambda, Aurora, and Node.js.

Lambda Aurora Node… LAN Stack.

Overview

The high-level view of our LAN application is a REST API written in Node.js and deployed to Amazon API Gateway + Lambda using the wonderful Serverless Framework, with a database hosted on Aurora Serverless. Although not necessary for all web apps, you can easily add a frontend to this setup with a single-page app hosted on S3 and served with CloudFront, putting a nice decoupled cherry on top of this delicious serverless cake.

Serverless Framework

The Serverless Framework is a great tool that makes it dead simple to deploy Node.js code to AWS Lambda. The only steps you need to get up and running are to npm install serverless and then configure serverless.yml in the project root.

A bare-bones serverless.yml configuration may look like this:

# serverless.yml
# @see docs.serverless.com
service: my-serverless-api

provider:
  name: aws
  runtime: nodejs8.10

functions:
  helloWorld:
    handler: src/helloWorld.handler
    events:
      - http: ANY /hello

This deploys the handler function exported from src/helloWorld.js to Lambda and configures API Gateway to invoke it when someone makes an HTTP request (using any method) to https://<generated endpoint>/hello. The endpoint is printed after deploying with serverless deploy.
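
For reference, a bare-bones src/helloWorld.js might look something like the sketch below (this is an assumption on my part, since the handler itself isn't shown; API Gateway's Lambda proxy integration expects a statusCode, headers, and string body in the response):

// src/helloWorld.js
// Minimal handler sketch for the helloWorld function above (assumed, not
// from the original project). Returns the response shape that API
// Gateway's Lambda proxy integration expects.
module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: 'Hello, world!' })
  }
}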

That setup works well enough if you just want to run a simple JavaScript function when you hit an endpoint, but what if you want to create a fully fledged web app? When I’m developing my API or web app, I don’t want to think about functions, gateways, or infrastructure at all; I just want to install the web framework of my choice and get down to writing routes.

The Proxy Function

Several Node.js web frameworks (hapi, for example) support an “inject” method on their server objects, which is the key to wrapping a normal web app in a Lambda function. Lambda functions launched by API Gateway are invoked with an event parameter that contains the relevant HTTP request information. We just map that event to a fake request object, inject it into the server, and translate the response.

// proxy.js
const url = require('url')

const server = require('../server')

// Translate the API Gateway proxy event into a request that the web
// framework's server.inject() method understands, then map the injected
// response back to the shape API Gateway expects. Only the path and query
// string are needed, since inject accepts a relative URL.
module.exports.handler = async (event) => {
  const request = {
    method: event.httpMethod,
    url: url.format({
      pathname: event.path,
      query: event.queryStringParameters
    }),
    payload: event.body,
    headers: event.headers
  }

  const response = await server.inject(request)

  return {
    headers: response.headers,
    statusCode: response.statusCode,
    body: response.payload
  }
}
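
The ../server module is just a normal web server that is created but never told to listen on a port. The post doesn’t show it, but assuming hapi (whose server.inject matches the code above), a minimal sketch could look like this:

// server.js
// A minimal hapi (v17-era) server, exported without calling server.start(),
// so the proxy function can feed it requests via server.inject().
// This sketch is an assumption; any framework with an inject-style API works.
const Hapi = require('hapi')

const server = new Hapi.Server()

server.route({
  method: 'GET',
  path: '/hello',
  handler: () => ({ message: 'Hello, world!' })
})

module.exports = server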

We now just need to map every incoming API Gateway request to this function, which is easy using serverless.yml:

functions:
  proxy:
    handler: src/functions/proxy.handler
    description: Proxies an API Gateway request to the web server.
    events:
      - http: ANY /
      - http: ANY {proxy+}

There are some other important tweaks to serverless.yml, such as adding serverless-offline for local development and serverless-api-cloudfront to set up custom domain names, configuring the function’s VPC settings so it can reach the database, and adding environment variables.
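
As a rough sketch of what those tweaks look like (the plugin names come from above, but the environment variable names, subnet IDs, and security group IDs are placeholders):

# serverless.yml (additional configuration)
plugins:
  - serverless-offline
  - serverless-api-cloudfront

provider:
  name: aws
  runtime: nodejs8.10
  environment:
    DB_HOST: <aurora cluster endpoint>
    DB_USER: ${env:DB_USER}
    DB_PASSWORD: ${env:DB_PASSWORD}
  vpc:
    securityGroupIds:
      - sg-xxxxxxxx
    subnetIds:
      - subnet-xxxxxxxx
      - subnet-yyyyyyyy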

The full configuration I used comes from a side project that is still in the works.

The Database

Aurora Serverless isn’t really “serverless”; it’s just Amazon’s way of abstracting server management away from us developers. That abstraction has been a popular trend for cloud compute and storage, but few (if any?) cloud providers have offered it for a stateful database. Their documentation sums up the service well:

With Aurora Serverless, you simply create a database endpoint, optionally specify the desired database capacity range, and connect your applications. You pay on a per-second basis for the database capacity you use when the database is active.

For side projects, Aurora Serverless can be a convenient way to deploy a production-ready environment without worrying about shutting down servers when you go on vacation. As of August 2018, one catch is that you can’t connect directly to the endpoint from outside the VPC. To connect from my computer, I just configured my ~/.ssh/config file to do port forwarding through an EC2 jump box in the same VPC:

Host aurora-serverless-tunnel
  IdentityFile ~/.ssh/my-pem.pem
  User ubuntu
  Hostname <EC2 IP Address>
  StrictHostKeyChecking no
  LocalForward 3306 <aurora cluster endpoint>:3306

When connecting from an application deployed in the same VPC, it is functionally the same as a normal RDS/Aurora setup.
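
Inside the application, querying Aurora Serverless is ordinary MySQL access. A minimal sketch using the mysql2 package (the package choice and environment variable names are my assumptions, not from the original project) might look like this; DB_HOST is the Aurora cluster endpoint in Lambda, or 127.0.0.1 when going through the SSH tunnel locally:

// db.js
// Connect to Aurora Serverless exactly as you would to any MySQL host.
// Assumed sketch: mysql2 and the DB_* environment variables are placeholders.
const mysql = require('mysql2/promise')

const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  connectionLimit: 5
})

module.exports = pool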

Conclusion

The best parts of this setup are that it can be written and configured almost exactly as you would write a normal web app or API, and that it costs pennies on the dollar for a side project. I haven’t used Aurora Serverless at scale, so I can’t yet recommend it for a production application, but the proxy function approach can easily be used in production. In fact, at Storyblocks Engineering we have started using this approach for our production API. I look forward to seeing technologies like this proliferate in the coming years, as they make web apps extremely simple to deploy and scale.
