Serverless computing: terrible name but brilliant service

Paul Harrison
Feb 23, 2018 · 8 min read

Serverless computing, or function-as-a-service, is a cloud-based computing model where you only need to worry about code. No servers. No operating systems. Just code. It’s a super convenient way of running simple functions, and even building whole microservice architectures, with minimal fuss. The name, however, is very misleading. Of course your functions are running on a server (or rather, in a container on a server); it’s just not yours, you have no way of accessing it, and if it breaks you’re relying on the provider to fix it in a timely manner. But assuming it doesn’t break, it’s amazing! You write your code, drop it in a function, and boom, you’re done.

Since serverless seems to be all the rage at the moment I thought I would test it out, so I set myself a little project: write a service to query daily exchange rates from fixer.io’s open API and store the results. I decided to go with AWS Lambda as my serverless provider — primarily since I’ve been using AWS for years and they have a generous free tier — though all the big players have equivalent offerings (e.g. Google Cloud Functions, Microsoft Azure Functions, and IBM Cloud Functions). I used DynamoDB for storing the data. For those who want to get straight to the meat, you can find the code here.

The getting started tutorials are straightforward, but they’re not very helpful if you’re planning on going serverless in a real project and leave many open questions. How do you manage dependencies? How do you deploy to multiple environments? How do you implement a continuous integration (CI) process? For this project I chose to use Apex, a neat little tool for managing the entire workflow of AWS Lambda functions, including building, testing, and deploying to multiple environments. So without further ado, let’s get into the code.

Getting set up

After installing Apex and ensuring your AWS credentials are properly configured with the correct IAM policy, initialising an Apex project is as simple as running

apex init

Enter your project name (in my case exchange-rates) and a description, and the bones of your project will be created. You’re given a project.json file, which defines things like your functions’ default memory allocation and runtime environment, and a folder containing a hello world lambda function. If you like, you can go straight ahead and deploy the hello world function with

apex deploy

and test it with

apex invoke hello

though it won’t do a lot.
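For reference, the project.json for this project looks something like the following sketch (the role ARN is a placeholder, and your memory, timeout, and runtime defaults may differ):

{
  "name": "exchange-rates",
  "description": "Query daily exchange rates from fixer.io and store them in DynamoDB",
  "memory": 128,
  "timeout": 10,
  "runtime": "python3.6",
  "role": "arn:aws:iam::<account-id>:role/<lambda-role>"
}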

Writing the Lambda function

Now that we have a project structure, let’s get cracking on the function itself. There are actually two functions we’re going to write: the main handler function and a function to turn the JSON response into the correct format for DynamoDB. I removed the hello world function entirely and created a new one called get_rates in the functions folder, with a main.py file containing the code. Our first function will query Fixer’s API using Python’s requests library. Fortunately, Fixer’s API is open, so there’s no need for the OAuth dance.


Before writing the code, let’s take a look at a typical response from Fixer by running the curl command below. Windows users can paste the URL into a browser or find further assistance running curl commands here.

curl "http://api.fixer.io/latest?base=GBP"

yields the following JSON:

{
  "base": "GBP",
  "date": "2017-10-26",
  "rates": {
    "AUD": 1.7131,
    "BGN": 2.1973,
    "BRL": 4.2726,
    "CAD": 1.6898,
    "CHF": 1.312,
    "CNY": 8.7634,
    "CZK": 28.748,
    "DKK": 8.3622,
    "HKD": 10.302,
    "HRK": 8.4434,
    "HUF": 348.63,
    "IDR": 17955,
    "ILS": 4.6448,
    "INR": 85.642,
    "JPY": 150.26,
    "KRW": 1483.4,
    "MXN": 25.13,
    "MYR": 5.5906,
    "NOK": 10.658,
    "NZD": 1.9232,
    "PHP": 68.463,
    "PLN": 4.7579,
    "RON": 5.166,
    "RUB": 76.126,
    "SEK": 10.922,
    "SGD": 1.7987,
    "THB": 43.785,
    "TRY": 4.9812,
    "USD": 1.3204,
    "ZAR": 18.806,
    "EUR": 1.1235
  }
}

The response is simple JSON, and our task is to store each field (date, base, and each exchange rate) in a new DynamoDB row. Let’s dig into the code and look at the first function, handle:
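(What follows is a sketch of the function; the full version is on GitHub. FIXER_URL and TABLE_NAME are constant names I’ve assumed here, and the table itself is created further down.)

import boto3
import requests

FIXER_URL = "http://api.fixer.io/latest?base=GBP"  # assumed constant name
TABLE_NAME = "exchange-rates-table"  # the table created below

dynamodb = boto3.client("dynamodb")


def handle(event, context):
    # Fetch the latest exchange rates from Fixer, with GBP as the base.
    response = requests.get(FIXER_URL)
    # Reformat the JSON for DynamoDB and store it as a new row.
    dynamodb.put_item(TableName=TABLE_NAME, Item=reformat_json(response.json()))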

event and context are the required inputs to a Lambda function, though we’re not actually using them in this case. However, Lambda threw a hissy fit when I removed them, so I left them in. event contains any input data to the function. For example, AWS’s getting started tutorial gives the event template below.

{
  "key3": "value3",
  "key2": "value2",
  "key1": "value1"
}

context gives metadata about the function, such as the function’s timeout or its associated CloudWatch log group.

The other function, reformat_json, reformats the response from Fixer into the correct format for DynamoDB. It took some digging to find this format, but I finally found it here. The key point to note is that the API requires you to specify the data type of each attribute, but each attribute value needs to be given as a string. For example, to store a number {"my_number": {"N": "10"}} is correct, whereas {"my_number": {"N": 10}} is not. The code, then, is below.
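(Sketched here; the exact version is on GitHub. It assumes the response structure shown above.)

def reformat_json(data):
    # DynamoDB wants each attribute wrapped in a type descriptor ("S" for
    # string, "N" for number), with the value itself given as a string.
    item = {
        "date": {"S": data["date"]},
        "base": {"S": data["base"]},
    }
    for currency, rate in data["rates"].items():
        item[currency] = {"N": str(rate)}
    return item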

Before actually deploying anything, we need to create the DynamoDB table. Apex can manage your other infrastructure by wrapping around Terraform, but I’ve yet to test this out, and it’s probably overkill for this application anyway. Create a table by following the example in the docs; I cleverly called my table exchange-rates-table, and made a string field called date the primary partition key.
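For reference, the equivalent AWS CLI command would be something like this (the throughput values are placeholders; pick whatever suits your budget):

aws dynamodb create-table \
    --table-name exchange-rates-table \
    --attribute-definitions AttributeName=date,AttributeType=S \
    --key-schema AttributeName=date,KeyType=HASH \
    --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1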

If it ain’t tested, it’s broken

Of course, until we have some tests our code is broken, so let’s write some. I’m not going to go through the nitty-gritty, as it’s not the point of this post and the code is available on GitHub; however, there are a couple of pain points worth noting. I’ve always found mocking external calls a bit fiddly, and for this function there were several to DynamoDB and Fixer’s API. Mocking Python’s requests library was straightforward thanks to this handy StackOverflow post, which I implemented pretty much as-is:
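(Roughly, the pattern is as follows; TEST_RESPONSE is one of the test fixtures described below.)

class MockResponse:
    # A stand-in for requests.Response, exposing only what our function uses.
    def __init__(self, json_data, status_code):
        self.json_data = json_data
        self.status_code = status_code

    def json(self):
        return self.json_data


def mocked_requests_get(*args, **kwargs):
    # Return the canned Fixer response regardless of the URL requested.
    return MockResponse(TEST_RESPONSE, 200)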

Mocking DynamoDB, on the other hand, took some digging. I faffed around with stubbing the put_item function I was using, but couldn’t get it working correctly. A couple of commutes later (I code on the train, because I’m cool like that) I was saved by a blog post from Kathryn Inez, who illustrated how to mock the underlying API call. She only seems to have the one post on her blog, but it’s a good’un.


My test class then ended up looking like this:
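(A sketch; main is the Lambda function’s module, and the botocore patch mocks the underlying API call as per Kathryn’s post.)

import unittest
from unittest import mock

import main


class TestGetRates(unittest.TestCase):

    def test_reformat_json(self):
        # The reformatted fixture is exactly what DynamoDB should receive.
        self.assertEqual(main.reformat_json(TEST_RESPONSE),
                         TEST_RESPONSE_REFORMATTED)

    @mock.patch("main.requests.get", side_effect=mocked_requests_get)
    @mock.patch("botocore.client.BaseClient._make_api_call")
    def test_handle(self, mock_api_call, mock_requests_get):
        # Patches are passed in bottom-up, so the botocore mock comes first.
        main.handle(None, None)
        mock_api_call.assert_called_with("PutItem", {
            "TableName": "exchange-rates-table",
            "Item": TEST_RESPONSE_REFORMATTED,
        })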

TEST_RESPONSE and TEST_RESPONSE_REFORMATTED are test fixtures containing canned responses in JSON/dict format.

Deployment

Deployment is more involved now that we have something more complicated than a hello world function: there are dependencies and tests to worry about. Fortunately, Apex makes this process simple through the use of function hooks. To deal with dependencies we will use the build hook — a command run prior to building the zip file for deployment. We must first create a requirements.txt file for our function in the function’s folder (not the root directory, as with standard Python projects). The hook command we want to run is

pip install -r requirements.txt -t .

This installs the dependencies into the function directory, ready to be packaged up with your function. Next, we want to add a deploy hook — run prior to deployment — to run our tests. Since I’m using nose to run the tests in this example, the hook command is simply nosetests. Finally, we want to clean up any build artefacts — in our case the dependencies installed into the function’s directory — using the clean hook. To keep the hook command tidy we can write a bash script to remove the artefacts, and run it with the hook command

sh clean_up.sh
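One way to write clean_up.sh is sketched below; it assumes the function directory contains only the files listed, so adjust the exclusions to match your own (test files included):

#!/bin/sh
# Delete everything the build hook installed, keeping only our own files.
find . -mindepth 1 -maxdepth 1 \
    ! -name 'main.py' \
    ! -name 'test_*.py' \
    ! -name 'requirements.txt' \
    ! -name 'function.json' \
    ! -name 'clean_up.sh' \
    -exec rm -rf {} +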

Putting this all together, we add these commands to the function.json file, also in the function’s directory. Our function.json is then as follows:

{
  "hooks": {
    "build": "pip install -r requirements.txt -t .",
    "deploy": "nosetests",
    "clean": "sh clean_up.sh"
  }
}

And there we have it: a function that requests the day’s exchange rates from Fixer and stores them in a DynamoDB table, which we can now deploy to AWS Lambda with apex deploy. Assuming everything works, you can head over to your AWS Lambda console to view the function.

Lambda function dashboard

Go ahead and test it if you like; view the function and click on “Test” in the top right (it doesn’t matter what input you give it). All being well it will run successfully and a new row should appear in DynamoDB.

Scheduled execution

Finally, we need to configure the function to run each weekday. For this we can set up a schedule using CloudWatch Events (as described in the docs here) from the Lambda function’s triggers tab. We can use a cron expression to define the schedule; in our case we will use cron(0 18 ? * MON-FRI *), which runs every weekday at 6 pm UTC.

Lambda schedule

Serverless FTW!

It works! We can view the data on the DynamoDB console, which displays it in no particular order, or download it for further analysis.

DynamoDB exchange rates table

I didn’t use Apex to its fullest extent; for example, I didn’t touch on how one can use it to manage other infrastructure or multiple environments. However, I hope this post has highlighted the usefulness of going serverless: I probably only spent a few hours building and deploying this function, and it has been running reliably for over a month at the time of writing. My only question is how to incorporate this workflow into a continuous integration pipeline; Apex is a great tool for lone rangers, but I’m unsure how well it will scale to large microservice architectures maintained by a larger team of developers. It also seems quite easy to accidentally deploy to production! If anyone has any thoughts on this, or comments in general, please leave them below.

