Say Hello to the AWS Architect

Warren Parad
Sep 1, 2018 · 6 min read

TL;DR if you use Node.js in AWS, you should be using the AWS-Architect npm package.

Now for the longer version: a journey through the development of a minimal architecture for deploying and running a service on API Gateway and AWS Lambda. AWS is notorious for its lack of attention to solutions, so developers using AWS are required to stitch together disparate pieces of technology to actually utilize AWS offerings. In the last couple of years AWS has gotten better, but it still has not fully provided solution-level support.

In this story we are talking about the rapid prototyping of a microservice using AWS. I specifically call out prototyping, because there are many additional levels of maturity for a microservice, and each can require any number of changes. The most important features are:

  • Easy stand-up of a new service aka Runtime Framework
  • Full working deployment pipeline aka Deployment Framework
  • Ease of scale for additional features aka Growth and Scalability

From a Runtime Framework, you expect a single-line command to create a new service, requiring at most a one-line change and a stub method to populate. The framework should provide a seamless experience for handling HTTP requests, with access to API Gateway as well as Lambda when necessary, and out-of-the-box support for authentication and response handling. Additionally, you should be able to test and run locally, either without needing a full AWS environment or by integrating directly with AWS resources, so that non-intrusive development can be done.
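
For context, this is the raw handler shape that API Gateway's Lambda proxy integration expects, and exactly the boilerplate a good runtime framework should absorb for you (a generic sketch, not the code aws-architect generates):

    // handler.js: the raw Lambda proxy-integration shape a runtime framework should hide.
    // With the proxy integration, every response must provide a statusCode and a
    // stringified body; a malformed response surfaces to the caller as a 502.
    exports.handler = async (event) => {
      return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ path: event.path, method: event.httpMethod })
      };
    };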

From a Deployment Framework, full-service CI/CD must be supported. This means a set of one-line commands, executable via library calls, to create a non-production environment per merge request, redeploy an existing environment, and tear down an environment.

For growth and scalability, any framework must provide an out-of-the-box direct interface to AWS resources, and the framework itself must have minimal impact on performance. Frameworks which add more than 100ms of processing time are disqualified; if I want my requests to be fast and to scale, sacrificing time to the framework is a non-starter.
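
If you want to check that overhead claim yourself, a rough measurement is enough: time the framework-wrapped handler against a bare one. A minimal sketch, where the handler and sample event are placeholders for whichever framework you are evaluating:

    // measure-overhead.js: a rough local benchmark of a framework's dispatch cost.
    // `handler` is whichever framework-wrapped handler you are evaluating, and
    // `sampleEvent` is a captured API Gateway event; both are placeholders here.
    const { performance } = require('perf_hooks');

    async function measureAverageMs(handler, sampleEvent, iterations = 1000) {
      const start = performance.now();
      for (let i = 0; i < iterations; i += 1) {
        await handler(sampleEvent, {});
      }
      return (performance.now() - start) / iterations;
    }

    module.exports = { measureAverageMs };

It is only a local approximation (it ignores cold starts and network time), but it is enough to rule out a framework that burns tens of milliseconds per request on its own dispatching.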

The list of frameworks goes on and on.

While you can pick whichever one fits the fad of the day, looking at the source code and running some performance tests lets us immediately remove some of them from the mix. We should ask ourselves: what is most important in a library we are choosing? Should it be the number of stars on GitHub, or the number of times the package is downloaded? While these numbers may tell you something, relying on them is like paying a software developer based on the number of lines of code they write.

You may find still more packages which provide this functionality, but I'll share the one that I use. What I really like is no dependencies, but also no complex code: direct access to the resources the cloud provider has, not wrappers which have to be continually updated. Take a quick look at the list of closed JAWS issues; for most of them I'm like, huh? What does any of that have to do with a serverless framework?

I find still others attempt to manage the resources directly in AWS instead of providing a simple interface. For instance, I still can't figure out which commands I need to run to create my AWS resources using the Claudia.js command-line tool. And it turns out many of the tools attempt to wrap others, while offering only minimal documentation on how to manage your service.

To combat these wild issues (and I know I'm a revolutionary for trying to create a library which does things more simply), I'll posit the list of things that I want to do, and the things which should be done automatically by your deployment framework. While some of the packages get the runtime framework right, i.e. no dependencies, an express-like syntax, and easy integration with other libraries, few get deployment correct:

  • npm install -g Library — there should be a simple installation
  • Library init — a straightforward way to create a new project: no magic configuration options, just one sensible default, ready to go out of the box.
  • Set your service name — There is a fine line between configuration hell and magic configuration; the sweet spot is visible default configuration with a single parameter to set. Additionally, you'll want to be able to set the DNS name of your service.
  • Backup and storage configuration — While it breaks the norm, due to how AWS works it can't be avoided: you need an S3 bucket to store your code temporarily before it is uploaded to the Lambda function. While some hoops can be jumped through and windows broken, I have yet to see a library successfully avoid the need to do this.
  • build.sh — or the like, which contains the commands necessary to actually manage your service (see the sketch after this list)
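
Tied together, that build script is just a thin dispatcher over the one-line commands above. The sketch below is deliberately generic: deployStage and removeStage are hypothetical placeholders standing in for whatever deployment calls your chosen framework exposes, not aws-architect's actual API.

    // make.js: an illustrative dispatcher over the one-line deployment commands.
    // deployStage and removeStage are hypothetical stand-ins for whatever
    // deployment calls your chosen framework actually exposes.
    const deployStage = async (stage) => { /* create or update the environment named `stage` */ };
    const removeStage = async (stage) => { /* tear the environment back down */ };

    const stage = process.env.CI_COMMIT_REF_SLUG || 'local';
    const command = process.argv[2];

    async function run() {
      if (command === 'deploy') {
        await deployStage(stage);
      } else if (command === 'destroy') {
        await removeStage(stage);
      } else {
        console.log('usage: node make.js <deploy|destroy>');
      }
    }

    run().catch(error => { console.error(error); process.exit(1); });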

When all is said and done, the result should be a simple output which contains the following files and commands:

  • build_and_deploy STAGE_NAME => a one-line command to create or update an environment, including the API Gateway stage and Lambda function alias (essentially the running service), and keeping it distinct from production
  • a package.json file whose values are used by the framework, where only the dependencies (and not the devDependencies) are actually deployed (see the sketch after this list)
  • run => a one-line command to run the service locally using the exact files and resources that would be there if the environment were set to something other than production. I want to choose which resources my local version should use: production ones or a different version
  • destroy STAGE_NAME => a one-line command to remove an environment and its associated resources
  • a CloudFormation template => AWS already provides the best way to construct and manage resources, and I want these resources all to be part of the same CloudFormation template. When I want to update or include additional resources, the template is how I want to do that. Compatibility with CF stacks is a MUST; it removes the burden of understanding a complex DSL that some random dev picked and lets me reuse my knowledge of how AWS works. Reusability is a fundamental paradigm of software development, so let's reuse the CF stack concepts.
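
To make the package.json point concrete, this is the shape I have in mind; the name, scripts, and dependency entries are illustrative placeholders, but the split is the point: only dependencies ship inside the Lambda package, while devDependencies stay on the build machine.

    {
      "name": "my-service",
      "version": "1.0.0",
      "scripts": {
        "deploy": "node make.js deploy",
        "destroy": "node make.js destroy",
        "start": "node make.js run"
      },
      "dependencies": {
        "openapi-factory": "*"
      },
      "devDependencies": {
        "aws-architect": "*",
        "mocha": "*"
      }
    }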

Try taking this challenge with any package you like, but I don’t think you will get as far as with AWS-Architect.js. So let’s try it out:

  • npm install -g aws-architect
  • aws-architect init
  • modify the package.json package name to be your service name
  • specify the S3 bucket used to store your artifacts in the make.js file, as well as the DNS name of your public service.
  • npm run deploy (wait the obligatory minute for the AWS resources to be created; if you are curious about the process, since almost 100% of it uses CloudFormation you can check the progress directly in AWS, no hidden logging)

Congrats, you have a running service at your DNS name. Done!

For the next part of the challenge, let’s add some resources:

  • SQS to trigger the lambda function
  • DynamoDB table
  • and create an MR with a code change

The first two are directly solved by editing the cloudFormationServerlessTemplate.json file found in the output; the third is solved by simply deploying the code again, passing in a different environment name. Personally I use GitLab, and if you do too, you are already done: the new environment names come directly from CI_COMMIT_REF_SLUG, so there is no additional configuration required.
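
For reference, both additions are plain CloudFormation resources dropped into the template's Resources section. A minimal sketch follows; the logical names, the table key, and especially the ServiceLambdaFunction reference are assumptions, so match them to whatever the generated template actually calls your function (and remember the function's execution role needs SQS receive/delete permissions, declared in the same template):

    {
      "WorkQueue": { "Type": "AWS::SQS::Queue" },
      "DataTable": {
        "Type": "AWS::DynamoDB::Table",
        "Properties": {
          "AttributeDefinitions": [{ "AttributeName": "id", "AttributeType": "S" }],
          "KeySchema": [{ "AttributeName": "id", "KeyType": "HASH" }],
          "ProvisionedThroughput": { "ReadCapacityUnits": 1, "WriteCapacityUnits": 1 }
        }
      },
      "QueueTrigger": {
        "Type": "AWS::Lambda::EventSourceMapping",
        "Properties": {
          "BatchSize": 10,
          "EventSourceArn": { "Fn::GetAtt": ["WorkQueue", "Arn"] },
          "FunctionName": { "Ref": "ServiceLambdaFunction" }
        }
      }
    }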

For the runtime framework, while you can pick anything you like, I opt for the 196 lines of code in openapi-factory, which are all that is required to make a fully featured wrapper around API Gateway and Lambda. Find me a smaller, more performant library, and I'll switch today. It only does one thing, and the rest of the complexity is reserved for your favorite libraries: it ensures that every response is logged or returned. Lambda/API Gateway integrations can be fickle, and getting the error handling right is a difficult task, not one you want to do twice.
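
In practice that express-like style looks roughly like the sketch below; treat it as my recollection of the registration pattern rather than a verbatim copy, and check the openapi-factory README for the exact request and response shapes:

    // index.js: an express-like route registered with openapi-factory (sketch only;
    // the route, the response body, and the exact request fields are illustrative).
    const Api = require('openapi-factory');
    const api = new Api();
    module.exports = api;

    api.get('/items/{itemId}', request => {
      // The library converts this plain object into a valid API Gateway proxy
      // response, and makes sure failures are logged instead of swallowed.
      return { statusCode: 200, body: { itemId: request.pathParameters.itemId } };
    });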

Check them out and start using them today:

Written by Warren Parad

CTO and Founder at Rhosys, where we help AI understand humans. I share how to continually innovate and stay ahead in the technology domain.
