For some time now, we’ve had a small project running in our office called “The DevOps DJ”. From within Slack, we can issue commands to a bot, which in turn communicates with a client running on some hardware in the office with a Bluetooth speaker connected. Every Friday is music-in-the-office day, when anyone can request songs via a Slack channel. Fun!
When we originally created this, I built the Slack bot side of the solution in PHP and ran it on Heroku. I’d never used Heroku before, and it was a chance to get some experience with something new. Having written many AWS Lambda functions in the past, I wanted to take a look at the AWS Serverless Application Model (SAM), and this seemed like the perfect project to see if I could port the Slack bot from Heroku/PHP to AWS API Gateway/Lambda/Python. It turns out it was!
Green Eggs and SAM
So what is SAM? I’d written quite a few Lambda functions in the past and packaged them with CloudFormation, but there are quite a few moving parts you need to manage explicitly: function code into S3, roles, role policies, API events, the API gateway itself. Essentially, SAM abstracts a lot of this and does the heavy lifting for you, leaving more time to just write the code. In addition, the SAM CLI (which is BRILLIANT!) lets you run your functions AND the API gateway locally, meaning development is much faster than deploying to AWS for every change. It also gives you a quick and easy way to package up your API/functions and deploy them to AWS (under the covers, it’s all CloudFormation).
For this mission, I needed to map out what the old solution did into the SAM world. I started off using the SAM “hello world” as a basis:
sam init --runtime python3.6
From here, I then modeled how the existing application on Heroku worked so I could map it onto the new architecture.
In SAM, I then had to model two API endpoints: one to receive inbound requests from Slack, and one for the “on premises” controller to call to pick up those commands. We couldn’t have Slack call directly into the on-prem component because of firewalls, so the /heydj endpoint just captures the commands and requests from Slack and puts them on an SQS FIFO queue. When the controller calls the /controller endpoint to fetch commands, we pull them from the queue and send them back in order. The FIFO queue gives us the ordering for free!
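In template terms, that works out to two serverless functions, each wired to its own API event, sharing a FIFO queue. A rough sketch of what the SAM template looks like (the handler paths, CodeUri values and environment variable names here are my guesses, not the exact template from the project):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Parameters:
  SlackToken:
    Type: String
  ControllerToken:
    Type: String

Resources:
  # FIFO queue holding commands between Slack and the on-prem controller.
  # Content-based dedup means we don't have to supply MessageDeduplicationIds.
  CommandQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: heydj-commands.fifo
      FifoQueue: true
      ContentBasedDeduplication: true

  # Receives slash commands from Slack and enqueues them.
  HeyDJFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: heydj/
      Handler: app.handler
      Runtime: python3.6
      Policies:
        - SQSSendMessagePolicy:
            QueueName: !GetAtt CommandQueue.QueueName
      Environment:
        Variables:
          SLACK_TOKEN: !Ref SlackToken
          QUEUE_URL: !Ref CommandQueue
      Events:
        HeyDJ:
          Type: Api
          Properties:
            Path: /heydj
            Method: post

  # Polled by the on-prem controller to drain queued commands in order.
  ControllerFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: controller/
      Handler: app.handler
      Runtime: python3.6
      Policies:
        - SQSPollerPolicy:
            QueueName: !GetAtt CommandQueue.QueueName
      Environment:
        Variables:
          CONTROLLER_TOKEN: !Ref ControllerToken
          QUEUE_URL: !Ref CommandQueue
      Events:
        Controller:
          Type: Api
          Properties:
            Path: /controller
            Method: get
```

The two `AWS::Serverless::Function` resources are all SAM needs to create the Lambda functions, the API Gateway stages and the IAM roles; the `Policies` entries use SAM’s built-in policy templates to grant each function just the queue access it needs.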
Local Dev and Testing
This is where I found SAM and the SAM CLI really incredibly helpful. Using just my trusty Atom editor and the SAM CLI, I was able to get everything working locally before deploying to AWS. The only thing I created up front for testing was the SQS queue I needed in AWS (I’m sure this could have been mocked out though…). Following the SAM guide on passing environment variables locally, I created a small JSON file with the variables and then invoked the SAM local API runner (NOTE: you need Docker installed to be able to do this):
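The env.json file is keyed by the logical function names from the template, with each function getting its own variables. Mine looked something like this (variable names and placeholder values here are illustrative, not the real secrets):

```json
{
  "HeyDJFunction": {
    "SLACK_TOKEN": "your-slack-verification-token",
    "QUEUE_URL": "https://sqs.eu-west-1.amazonaws.com/123456789012/heydj-commands.fifo"
  },
  "ControllerFunction": {
    "CONTROLLER_TOKEN": "your-controller-token",
    "QUEUE_URL": "https://sqs.eu-west-1.amazonaws.com/123456789012/heydj-commands.fifo"
  }
}
```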
sam local start-api --env-vars env.json
At startup, SAM told me:
2018-11-01 14:12:57 Mounting HeyDJFunction at http://127.0.0.1:3000/heydj [POST]
2018-11-01 14:12:57 Mounting ControllerFunction at http://127.0.0.1:3000/controller [GET]
2018-11-01 14:12:57 You can now browse to the above endpoints to invoke your functions. You do not need to restart/reload SAM CLI while working on your functions, changes will be reflected instantly/automatically. You only need to restart SAM CLI if you update your AWS SAM template
I was able to test the endpoints very quickly using simple curl commands. As you’ll also note, the SAM local API runner doesn’t need to be restarted as you change code. That’s because each time you call an endpoint, it fires up a fresh Docker container that closely mirrors the AWS Lambda runtime. It meant I could develop and test very quickly.
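The code under test is just ordinary Lambda handlers. As a minimal sketch of the kind of thing the /heydj function does (the env variable names are my assumptions, and the real version would actually send the built message to the FIFO queue via boto3), Slack slash commands arrive as a form-encoded POST body with fields like `token`, `text` and `user_name`:

```python
import json
import os
from urllib.parse import parse_qs


def heydj_handler(event, context):
    """Handle a Slack slash-command POST delivered via API Gateway."""
    # Slack sends slash commands form-encoded; API Gateway passes the raw
    # body through in event["body"]. parse_qs gives lists, so unwrap them.
    params = {k: v[0] for k, v in parse_qs(event.get("body") or "").items()}

    # Reject anything that doesn't carry our Slack verification token.
    if params.get("token") != os.environ.get("SLACK_TOKEN"):
        return {"statusCode": 403, "body": "invalid token"}

    # Build the send_message arguments for the FIFO queue. A single
    # MessageGroupId keeps strict ordering across all requests; the real
    # handler would call boto3's sqs.send_message(**message) here.
    message = {
        "QueueUrl": os.environ.get("QUEUE_URL"),
        "MessageGroupId": "heydj",
        "MessageBody": json.dumps({
            "command": params.get("text"),
            "user": params.get("user_name"),
        }),
    }

    # Whatever we return in the body is what Slack shows in the channel.
    return {
        "statusCode": 200,
        "body": json.dumps({"text": "Queued: %s" % params.get("text", "")}),
    }
```

Locally, that endpoint can be poked with something as simple as `curl -d "token=...&text=play+something&user_name=alice" http://127.0.0.1:3000/heydj`.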
Building and Deploying
So SAM is awesome for local dev/test. What about deploying? Well, it’s a bonus here too. SAM can take your code, upload it to an S3 bucket, and then rewrite the SAM (CloudFormation) template to point at the S3 location of your code. This is all done with a single command:
sam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket my-s3-bucket-for-lambda-code
Once you have your packaged template and the code uploaded to S3, you just need to deploy it. Again, SAM gives you a simple command for this:
sam deploy \
--template-file packaged.yaml \
--parameter-overrides SlackToken=mySlackToken ControllerToken=myControllerToken \
--stack-name slack-music-controller \
--capabilities CAPABILITY_IAM
That’s it! After running this, I had a full API Gateway solution backed by my two Lambda functions running in AWS. And I was pretty confident it would just work, because everything I’d run locally used the same code and much the same environment as it would in Lambda.
One feature I didn’t play with is SAM’s ability to generate simulated events and send them to functions. Having had to hand-craft many test events when developing Lambda functions in the console, this seems like an incredibly valuable feature which I will be using in the future.