Hello Cloud: a Serverless journey

An alternative way to waste some time and discover how to build a fun Serverless Hello World with 4 languages and 4 public cloud providers

Serverless (or FaaS) is the buzzword of the moment in the IT world, so after reading various articles I started writing examples on different public cloud platforms. Each of these offers a free tier where you can play with many services and, of course, each includes some sort of Serverless engine.

I don’t want to explain what Serverless is, where it can be used, or what the benefits of this approach might be; I simply want to see how different types of Serverless engines, with several languages and trigger mechanisms, can be used together.

The architecture of Hello Cloud

The starting point of this example is a Git repository, monitored by an IBM Cloud Functions trigger that receives every push event. The trigger fires a Python action, from which we make an HTTP POST request to a Google Cloud Function developed in Node.JS.

This script receives the information carried by the HTTP request and creates a new JSON file in an S3 bucket. The creation of a new file in S3 triggers an event, captured by an AWS Lambda Java function.

The content of this file is used to create a new document in a CosmosDB collection, and the final step is a C# Azure Function, triggered by a timer every five minutes, which checks the items in the collection and sends an email using the SendGrid service.
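
In short, the flow is:

GitHub push → IBM Cloud Functions trigger + Python action → HTTP POST → Google Cloud Function (Node.JS) → JSON file in S3 → AWS Lambda (Java) → document in CosmosDB → timer-driven Azure Function (C#) → email via SendGrid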

Hello Cloud architecture

It’s a bit of a crazy flow, but it’s useful for discovering the different event integrations available on the most popular public cloud providers. So let’s see how deep the rabbit hole goes…

IBM Cloud Functions

IBM Cloud Functions is based on the Apache OpenWhisk project and provides an interesting framework built around triggers, events and actions. The action is where we insert our code, and on this platform we can use JavaScript, Swift, Python, Java, PHP, Go, Docker images and arbitrary executables to create actions (for more details read this).

There is already a system package available, whisk.system/github, that we can use to capture GitHub events. From the bx command-line utility, we create a binding of this package with our settings:

bx wsk package bind /whisk.system/github myGitCloud \
  --param username GITHUB_USERNAME \
  --param repository GITHUB_REPO \
  --param accessToken GITHUB_ACCESS_TOKEN

After creating the package binding, we can initialize a new trigger that will use the push events coming from this feed:

bx wsk trigger create myGitTriggerCloud \
  --feed myGitCloud/webhook \
  --param events push

It’s time to start coding, so let’s see a simple Python function that receives the push event and makes an HTTP POST request to a Google Cloud Function, using some data from the triggered event:
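
A minimal sketch of this action, assuming the requests module and a hard-coded Google Cloud Function URL (the real endpoint and payload fields live in the repository), could look like this:

import requests

# Endpoint of the Google Cloud Function (placeholder value, set your own)
GCF_URL = "https://us-central1-my-project.cloudfunctions.net/helloCloud"

def main(params):
    # The trigger passes the whole GitHub push payload as parameters;
    # we extract just a few fields (names follow the GitHub webhook format)
    payload = {
        "repository": params.get("repository", {}).get("full_name"),
        "pusher": params.get("pusher", {}).get("name"),
        "message": params.get("head_commit", {}).get("message")
    }
    # Forward the data to the Google Cloud Function with an HTTP POST
    response = requests.post(GCF_URL, json=payload)
    return {"status": response.status_code}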

We can now upload this code to Bluemix, creating an action in our namespace and a rule that fires this action when myGitTriggerCloud is activated:

bx wsk action create helloCloud step1.py
bx wsk rule create gitPushCloud myGitTriggerCloud helloCloud

Google Cloud Functions

The Serverless solution provided by Google can be used to offer a service triggered by a simple HTTP request. We will use Node.JS because it is the only runtime available on this platform (a few more runtimes would be welcome to have more choice). The code receives the request from the IBM Cloud Functions action with the GitHub event data and uploads a new JSON file into an S3 bucket:
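
A minimal sketch of this function, with the bucket name, credentials handling and file naming as assumptions, could look like this:

const S3FS = require('s3fs');

// Bucket name and credentials are placeholders: in real code they should
// come from the function configuration, not be hard-coded
const s3 = new S3FS('hello-cloud-bucket', {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: 'eu-west-1'
});

exports.helloCloud = (req, res) => {
  // The IBM Cloud Functions action sends the push data as a JSON body
  const fileName = `push-${Date.now()}.json`;
  s3.writeFile(fileName, JSON.stringify(req.body))
    .then(() => res.status(200).send('OK'))             // everything went well
    .catch((err) => res.status(500).send(err.message)); // let the caller know it failed
};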

The function is simple and uses the s3fs module to upload the new file. At the end, we return the HTTP 200 status code so the caller can check that everything went well. To upload this function to the Google platform, we can use the gcloud command-line tool from the directory containing our Node.JS module:

gcloud beta functions deploy helloCloud --trigger-http

And with that, the Google part of this journey is complete too.

AWS Lambda

AWS Lambda is the most mature Serverless platform available, mainly because it was launched back in 2014, while the other platforms are younger. The supported languages are Java, C#, Node.JS, Python and Go; in this example we use Java to catch the S3 event triggered by the new file in the bucket and then create a new document in the CosmosDB collection, using the content of the JSON file.

We first receive the S3 event, which contains only the bucket name and the key of the object. Using the AWS S3 client we retrieve the related JSON and then create the new document on CosmosDB. It’s bad practice to hard-code values like credentials or account details, so here we use the environment variables defined in the AWS Lambda console.
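
A trimmed-down sketch of the handler, with the environment variable names, database and collection names as assumptions, could look like this:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.microsoft.azure.documentdb.ConnectionPolicy;
import com.microsoft.azure.documentdb.ConsistencyLevel;
import com.microsoft.azure.documentdb.Document;
import com.microsoft.azure.documentdb.DocumentClient;

public class HelloCloudHandler implements RequestHandler<S3Event, String> {

    @Override
    public String handleRequest(S3Event event, Context context) {
        // The S3 event only carries the bucket name and the object key
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        // Retrieve the JSON content of the new file
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        String json = s3.getObjectAsString(bucket, key);

        // Connection details come from environment variables set in the Lambda
        // console (variable, database and collection names here are placeholders)
        DocumentClient client = new DocumentClient(
                System.getenv("COSMOS_ENDPOINT"),
                System.getenv("COSMOS_KEY"),
                ConnectionPolicy.GetDefault(),
                ConsistencyLevel.Session);
        try {
            client.createDocument("dbs/hellocloud/colls/pushes", new Document(json), null, false);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return "OK";
    }
}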

Azure Functions

The last step of our journey is Azure Functions, Microsoft’s Serverless offering integrated into the Azure platform, which supports C#, JavaScript, F# and Java (Python, PHP, TypeScript, Batch, Bash and PowerShell are experimental). For this final step we use a C# function that is periodically called by a timer. Using the concept of bindings we have two integrations:

  1. a CosmosDB input binding that provides the list of documents in a specific collection
  2. a SendGrid output binding so we can use this external platform to send emails from Azure

We can find the bindings in the function.json file
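
A sketch of what this file could contain, with database, collection, app setting names and addresses as placeholders:

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    },
    {
      "name": "documents",
      "type": "documentDB",
      "direction": "in",
      "databaseName": "hellocloud",
      "collectionName": "pushes",
      "connection": "CosmosDBConnection"
    },
    {
      "name": "message",
      "type": "sendGrid",
      "direction": "out",
      "apiKey": "SendGridApiKey",
      "from": "hello@example.com",
      "to": "me@example.com"
    }
  ],
  "disabled": false
}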

The binding file is also used to provide a DocumentDB client to the C# code, in order to delete the documents once they have been processed. The function, called every five minutes, simply reads the documents passed as an argument and uses them to create the content of an email.
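
A condensed sketch of the C# script, with setting names and the email subject as assumptions (and the DocumentClient built explicitly from app settings instead of coming from the binding, to keep the sketch self-contained), could look like this:

#r "Microsoft.Azure.Documents.Client"
#r "SendGrid"

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using SendGrid.Helpers.Mail;

public static void Run(TimerInfo myTimer, IEnumerable<Document> documents, out Mail message, TraceWriter log)
{
    // In the real function the client is provided by the binding; here it is
    // created from app settings (names are placeholders)
    var client = new DocumentClient(
        new Uri(Environment.GetEnvironmentVariable("CosmosDBEndpoint")),
        Environment.GetEnvironmentVariable("CosmosDBKey"));

    var body = new StringBuilder();
    foreach (var doc in documents)
    {
        // Collect the push details stored by the AWS Lambda function
        body.AppendLine(doc.ToString());
        // Remove the document so it is not reported again in five minutes
        client.DeleteDocumentAsync(doc.SelfLink).Wait();
    }

    // The SendGrid output binding delivers whatever we assign to 'message'
    message = new Mail { Subject = "Hello Cloud: new push events" };
    message.AddContent(new Content { Type = "text/plain", Value = body.ToString() });
}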

Conclusion

Finally, we receive an email with the details of every push. This journey was a bit of a joke, but I used it to explore and show how interesting Serverless architectures are, especially for integrating events inside a complex cloud platform. You can find the code samples in this Git repository.