Serverless Crystal

TPei
Dec 5, 2018


Crystal has yet to see support from major serverless vendors, but you can already run Crystal functions today!

A Crystal Function running on OpenFaaS

While Crystal is not yet officially supported by any of the major cloud vendors, nor by any of the open source serverless frameworks, many of the open source solutions allow you to run arbitrary programming languages using Docker. Let’s have a look at how we can run Crystal today!

So what is Serverless? Serverless, or Function as a Service (FaaS), lets you seamlessly deploy small, stateless bits of code without having to worry about provisioning or managing the underlying infrastructure (in theory). For more info on serverless I recommend this in-depth explanation.

In my opinion, the most promising approach is how OpenFaaS implements new languages on its platform. There are multiple reasons for this: for one, they have a specified template format for adding new languages; for another, this makes a new language feel like a first-class citizen in their application.

So how do you usually work with OpenFaaS?

Using the faas CLI, you first create a new function, specifying the runtime:
faas new my-function --lang ruby
This will do a few things for you:
- it will download the Ruby template from GitHub
- it will create a folder with the template code for you

➜ ~/Code/functions ᐅ tree
.
├── my-function
│ ├── Gemfile
│ └── handler.rb
├── my-function.yml

The generated code includes a YAML deployment and configuration file (Infrastructure as Code, yay!), and then you have a function file as well as a Gemfile for your dependencies.
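
For illustration, the my-function.yml might look roughly like this (a sketch; the gateway address and image name depend on your setup):

provider:
  name: openfaas
  gateway: http://127.0.0.1:8080
functions:
  my-function:
    lang: ruby
    handler: ./my-function
    image: my-function:latest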

Now, OpenFaaS allows you to write new templates using Dockerfiles. This is cool insofar as all templates are already Docker-based, so you’re not building some second-class language template, but something that is on the same level as all the other supported languages.
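
To give a feel for what such a template involves, here is a rough sketch of what its Dockerfile could look like. This is not the actual file from the repository, just the general classic-watchdog pattern, where the base image, the watchdog binary and the fprocess setting are the moving parts:

FROM crystallang/crystal:latest

# grab the OpenFaaS classic watchdog from the openfaas/faas GitHub releases
# (the real template pins a concrete version instead of <version>)
ADD https://github.com/openfaas/faas/releases/download/<version>/fwatchdog /usr/bin/fwatchdog
RUN chmod +x /usr/bin/fwatchdog

WORKDIR /home/app
COPY . .

# compile the handler when the image is built
RUN crystal build handler.cr --release -o handler

# the watchdog forks this process for every request,
# piping the request body to STDIN and reading the response from STDOUT
ENV fprocess="./handler"
CMD ["fwatchdog"]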

Now, once our language template is finished (like https://github.com/tpei/crystal_openfaas), we can simply use the faas CLI to first pull it:
faas template pull https://github.com/tpei/crystal_openfaas
and then use it to create a new function:
faas new crystal-function --lang crystal

Now, the crystal template includes pretty much what you’d expect:

➜ ~/Code/functions ᐅ tree
.
├── crystal-function
│ ├── handler.cr
│ └── shard.yml
├── crystal-function.yml

Instead of a Gemfile, we have a shard.yml and the rest is basically the same.
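
A freshly generated shard.yml is just Crystal’s usual dependency manifest, roughly along these lines (names are illustrative):

name: handler
version: 0.1.0

# dependencies go under a dependencies: key, just like gems in a Gemfile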

Now we can simply write our Crystal code in handler.cr and deploy the whole thing using faas up.
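
As an illustration, here is roughly what a minimal handler could look like. The exact interface is defined by the template, but with the classic watchdog the handler process reads the request body from STDIN and writes its response to STDOUT:

# handler.cr: read the request body, interpolate a greeting, respond
name = STDIN.gets_to_end.strip
name = "world" if name.empty?
puts "Hello, #{name}!"

After a faas up, the function can then be called through the gateway, e.g. with curl -d "Crystal" http://localhost:8080/function/crystal-function, or with echo Crystal | faas invoke crystal-function.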

Nice! 🎉

And, as we expect from Crystal, the performance is pretty great! Here are some numbers that I gathered on identical functions written in JavaScript, Ruby, Go and Crystal on OpenFaaS (benchmark setup in the note at the end): 📈

Now, what’s even cooler is that, with the recent addition of the OpenFaaS template store, we can submit our template as a community template and have it show up when users list the available templates:

➜ ~/Code/functions ᐅ faas template store list
NAME       SOURCE              DESCRIPTION
go         openfaas            Official Golang template
ruby       openfaas            Official Ruby 2.5 template
ruby-http  openfaas-incubator  Ruby 2.4 HTTP template
...
crystal    tpei                Crystal template

This then allows users to pull that template directly (instead of using a GitHub link to the repository):

➜ ~/Code/functions ᐅ faas template store pull crystal [7:04:50]
Fetch templates from repository: https://github.com/tpei/crystal_openfaas
2018/12/05 07:05:02 Attempting to expand templates from https://github.com/tpei/crystal_openfaas
2018/12/05 07:05:04 Fetched 1 template(s) : [crystal] from https://github.com/tpei/crystal_openfaas

Oh yeah! 🎉

Now, off you go setting up OpenFaaS (it’s really easy) and developing your first serverless Crystal functions 💻

For those interested, I also created a Docker template for Crystal on OpenWhisk. Unfortunately, OpenWhisk handles Docker actions fundamentally differently (by default wrapping them in a Flask process), so the performance is not all that great :/

Note:
For benchmarking I used my local OpenFaaS setup:
- ThinkPad T480
- 32GB RAM
- Intel® Core™ i5-8250U CPU @ 1.60GHz × 8

I used ApacheBench (ab) with 1000 requests each and a concurrency of 1:

ab -n 1000 -c 1 http://localhost:8080/function/javascript-benchmark
ab -n 1000 -c 1 http://localhost:8080/function/ruby-benchmark
ab -n 1000 -c 1 http://localhost:8080/function/go-benchmark
ab -n 1000 -c 1 http://localhost:8080/function/crystal-benchmark

The functions used in sequence were:
- 2x a hello world function
- 2x a hello #{your_name} function
- 2x interpolating longer sentences
- an echo function
in each case orchestrated using the Faas-flow library, which itself is written in Go.

Disclaimer: The goal was not to compare the different programming languages on complex algorithmic problems; I simply wanted to get a baseline for invoking functions with some request parsing, string interpolation and response object creation.
