First steps in #serverless with fnproject.io

Ralf Mueller
Sep 10, 2018 · 11 min read


Disclaimer: I work as an Architect of Oracle Integration Cloud in Product Development at Oracle Corporation. This article and the opinions expressed here are my own and do not necessarily reflect those of Oracle.

In recent months I have been reading a lot about FaaS and Serverless as yet another way to build software systems, and I must say I have become quite enthusiastic about it. In fact, much more enthusiastic than I was about microservices when they were considered the greatest thing on earth for Cloud development. Not that Serverless doesn't have its caveats, but it fits the idea of Cloud as a utility much better than keeping tons of microservices up and running for the eventual case that a customer wants to do something. I won't go further into the pros and cons, though; I leave that to the evangelists of the world. Instead, I'd like to write about my own experience and how I started the journey into Serverless using fnproject.io, an open-source serverless infrastructure with Oracle as the main contributor (so far).

Overview

I wanted to get started with something useful, not the trivial "Hello Function" kind of thing. I have a specific set of use cases in mind where Serverless architectures can help in certain integration scenarios involving Systems, People, and Developers. So the first function I'm going to develop does something like this:

  • The function will be invoked with input that conforms to the cloudevents.io standard. I'm a big fan of standards, and this one seems a natural fit for Serverless.
  • The function will extract data from the input and then call into Business Rules for some decision logic.
  • Depending on the outcome of the decision logic, the function will create another cloudevents.io-conformant event and call into some other function (or, for the sake of this first exercise, print the output to the console).

That's it for the moment, and with a bit of creativity you might agree that this could be valuable for a variety of use cases in all kinds of areas:

  • Routing data to systems or people depending on decision logic
  • Calculating customer promotions and invoking a SaaS function to give a Promotion to a certain customer
  • etc.

Setup

Let's dive straight in and get started. fnproject.io is hosted on github.com, which, apart from all the bits and pieces required to run Fn, also has tons of examples, tutorials, etc. See the GitHub page on how to install Fn; here I'm just mentioning how to install fn, the CLI for Fn deployment, configuration, etc.

> brew install fn (Mac OSX)
or
> curl -LSs https://raw.githubusercontent.com/fnproject/cli/master/install | sh (Linux)

That's it for now. If you'd like to get started quickly on your own, see the fnproject GitHub page for tutorials. For my development, I wanted a complete environment with everything in the stack, including:

  • Fn Server
  • Fn Flow Server
  • Fn UI
  • Fn Flow UI
  • Prometheus for metrics collection in Fn
  • Grafana for dashboards

Docker is your friend here, more specifically Docker Compose. The fnproject git repo comes with a docker-compose.yml file. Unfortunately, I wasn't able to get it working with the MySQL DB configured in the compose file, so I tweaked the docker-compose.yml file to use Postgres instead and also included Fn UI, Fn Flow, and Fn Flow UI. This should be an easy exercise. Here is what you need to use Postgres instead of MySQL:

db:
  image: "postgres"
  restart: always
  networks:
    - fn-network
  ports:
    - "5432:5432"
  environment:
    - "POSTGRES_PASSWORD=welcome1"
  volumes:
    - ./data/postgres:/var/lib/postgresql/data

In the services config for fnserver, you need to set the following environment variable to have the Fn Server connect to Postgres instead of MySQL:

FN_DB_URL: "postgres://postgres:welcome1@db:5432/funcs?sslmode=disable"
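For orientation, the fnserver service in the compose file might then look roughly like this (the image name and port mapping here are assumptions; keep whatever the original docker-compose.yml uses and only change the database settings):

```yaml
fnserver:
  image: fnproject/fnserver
  depends_on:
    - db
  networks:
    - fn-network
  ports:
    - "8080:8080"
  environment:
    FN_DB_URL: "postgres://postgres:welcome1@db:5432/funcs?sslmode=disable"
```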

It's probably best to make a copy of docker-compose.yml and do the changes there, say in docker-compose-postgres.yml.

What is still missing is the creation of a database that Fn Server can use. For this, start up the environment with:

docker-compose -f docker-compose-postgres.yml up

You will see some error messages complaining that fn_server is unable to connect to the DB. Let's fix this with the following two commands:

docker run -it --rm --link func-postgres:postgres postgres \
psql -h postgres -U postgres -c "CREATE DATABASE funcs;"
docker run -it --rm --link func-postgres:postgres postgres \
psql -h postgres -U postgres -c 'GRANT ALL PRIVILEGES ON DATABASE funcs TO postgres;'

That's it. After this, the error messages should go away, but it is probably safest to restart the environment with docker-compose.

DMN

Before we start the development of our first function, let's say a word about DMN. DMN is an acronym for Decision Model and Notation, an OMG standard for Business Rules. If you're interested, the spec for DMN is hosted on the OMG web site: DMN 1.1 specification. DMN 1.2 is in Beta as of this writing. Oracle implements DMN 1.1 as part of its Oracle Integration Cloud (OIC) PaaS offering. The OIC documentation has a chapter, Creating Decisions, which gives an overview of the capabilities of DMN within OIC. For this exercise, though, I have isolated DMN from OIC and created a Docker container which I run in the same docker-compose setup as my Fn environment. Unfortunately, this is not officially released; it's an ongoing experiment on my end. If you'd like to try this, you'd need a full OIC service provisioned on Oracle Cloud.

I'm not going into the details of DMN here; I just want to explain a few concepts required to understand this article. Typically, Business Rules or Decisions are exposed in DMN via so-called Decision Services. A Decision Service does the following:

  • Accept input
  • Apply Decision Logic
    Decision Logic can be in the form of if-then rules, Decision Tables, simple expressions, lookup tables, etc. For the full set of decision logic, see the DMN spec. Probably the most user-friendly form of decision logic in DMN is the Decision Table, as it offers a spreadsheet-like modeling experience for Business Rules.
  • Produce Output
    The output of a Decision Service is the result of applying decision logic

Decision Services in DMN are exposed via REST (an implementation detail of Oracle's DMN implementation, not part of the standard); the input and output are expected to be in JSON format.
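As an illustration, a request to such a Decision Service and its response might look like this (the exact payload shapes are assumptions for this article; the "Payload" dialog in the DMN UI shows the authoritative format for your model):

Request:
{
  "customerStatus" : "GOLD",
  "salesAmount" : 10000,
  "region" : "US"
}

Response:
{
  "interpretation" : {
    "discount" : 15,
    "expire" : "24-12-2018"
  }
}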

The following Decision Table, for example, calculates a discount and its expiration date depending on customer sales data (sales amount, status, region):

Promotion Decision Table

In this scenario, the situation is as follows:

  • The inputs of the Decision Service are the values for salesAmount (number), customerStatus (string), and region (string).
  • The decision table PromotionDecisions is evaluated with the given input.
  • The output is determined by the last column (in yellow) of the decision table and is either of the form {promotion : true/false} or {discount : <value>, expire : <date>}.

The REST endpoint for the Decision Service PromotionDecisionService can be obtained by clicking the Decision Service hamburger menu and choosing "Payload":

PromotionDecisionService URL and request/response payloads.

In our example, the URL is

http://localhost:8088/bpm/api/4.0/dmn/spaces/My%20Space/decision-models/PromotionDecider/versions/1.0/definition/decision-services/PromotionDecisionService/

This URL is needed later when our function calls into the Decision Service. It will be used as the value of the function config variable DMN_API_URL.

Testing Decisions

Before using the DMN decision model and its Decision Service, it is good practice to test the model and check whether it gives the desired results. The Decision Model can be tested from within the DMN UI itself. For this, press the "play" button at the top right of the UI, and a screen like this appears:

Dialog for testing a Decision Model

Fill in some values for customerStatus, salesAmount, and region, then press "Start Test", which shows the following result:

Result of Decision Model Test

This shows that our Decision Model is working, and we can now focus on the development of the Fn function.

Fn Function

Let's get into the development of the Fn function itself. Since I'm learning the Go programming language at the same time as I'm trying Serverless (please bear with me if my Go code looks quite basic, which it really is; I promise to work hard on my Go skills), we're going to write the function in Go using fdk-go. The Go FDK comes with utility packages to get the function configuration, its headers and data, etc. Another advantage of using Go here is that, at the time of this writing, preliminary support for cloudevents.io is available for Fn in Go; more on this later.

Importing packages

So we start our function by importing a couple of Go packages we need:

package main

import (
    "bytes"
    "context"
    "encoding/json"
    "io"
    "io/ioutil"
    "log"
    "net/http"

    "github.com/google/uuid"

    // Import the Fn Go FDK and cloudevents.io support
    cle "github.com/fnproject/cloudevent"
    fdk "github.com/fnproject/fdk-go"
)

Function Handler

Next, we're going to develop the function itself:

// main calls the fdk.Handle function
func main() {
    fdk.Handle(fdk.HandlerFunc(withError))
}

func withError(ctx context.Context, in io.Reader, out io.Writer) {
    err := myHandler(ctx, in, out)
    if err != nil {
        log.Println("unable to decode STDIN: ", err.Error())
        fdk.WriteStatus(out, http.StatusInternalServerError)
        out.Write([]byte(err.Error()))
        return
    }
}

The idea is that fdk.Handle calls into withError, which in turn calls into myHandler and, in case of an error, creates a message, prints it to the console, and signals the FDK that the request to the function failed. This is all boilerplate, of course, so let's get to the meat of it.

Creating a CloudEvent from the function input

First, we convert the input into a CloudEvent as follows:

func myHandler(ctx context.Context, in io.Reader, out io.Writer) error {
    var ce cle.CloudEvent
    err := json.NewDecoder(in).Decode(&ce)
    if err != nil {
        return err
    }
    var data = ce.Data

Since a CloudEvent is a JSON structure, we can use the JSON decoder here. After the CloudEvent is created, we can get the Data attribute out of it and store it in a Go variable.

Calling into DMN REST Service

Next we call into a Decision Service activated in the DMN microservice, as follows:

// Get the Decision Service URL from the Fn config
url := fdk.Context(ctx).Config["DMN_API_URL"]
// Marshal the data to JSON and prepare the HTTP POST call
jsonData, _ := json.Marshal(data)
req, err := http.NewRequest("POST", url, bytes.NewBuffer(jsonData))
if err != nil {
    return err
}
req.Header.Set("Content-Type", "application/json")
// Call into the DMN microservice and handle errors
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
    return err
}
defer resp.Body.Close()
// Read the result body and signal the FDK that the call was successful
body, _ := ioutil.ReadAll(resp.Body)
fdk.WriteStatus(out, http.StatusOK)
// Unmarshal the response into an interface{}...
var f interface{}
err = json.Unmarshal(body, &f)
if err != nil {
    return err
}
// ... and type-assert to a map
m := f.(map[string]interface{})
// Get the interpretation of the result and type-assert to a map again
imap := m["interpretation"].(map[string]interface{})

The config variable DMN_API_URL can be set either using the fn CLI utility, as follows:

fn config app <app name> <key> <value>

Alternatively, you can set the config in the Fn UI (running at http://localhost:4000/#/ in docker-compose) as shown below. Click on the Fn app and then choose "Edit Route".

Edit Source dialog in Fn UI.

Creating the result CloudEvent

From the response of the REST call to DMN, we can create a new CloudEvent as follows:

// Create a new CloudEvent
var cer cle.CloudEvent
// Create a UUID and assign it to the EventID of the result CloudEvent
uuidb, _ := uuid.New().MarshalText()
cer.EventID = string(uuidb)
// Copy some data from the incoming CloudEvent
cer.CloudEventsVersion = ce.CloudEventsVersion
cer.ContentType = ce.ContentType
cer.EventTime = ce.EventTime
cer.EventType = "com.oracle.oic.myaction"
cer.EventTypeVersion = "1.0"
cer.Extensions = ce.Extensions
// The result Data will be the output of the Decision Service
cer.Data = imap

This completes the example. You might want to use the result to call into some other function, or push the event to an Oracle Event Hub so that it can be picked up by another system, etc. I'll leave that to the creativity of the reader.

The func.yaml file

You need to create a func.yaml file for Fn functions. This file contains some metadata for the function. In my case, it is quite simple:

schema_version: 20180708
name: customerpromotion
version: 0.0.1
runtime: go
entrypoint: ./func
format: json

Putting it all together

We now have everything in place to deploy the function. We can use the fn CLI to deploy it to the Fn Server:

fn --verbose deploy --app customerpromotionapp --registry phx.ocir.io/oicpaas1/ralmuell/fn

This creates a Docker image from your Fn app and pushes it to a registry, in this case my private registry on Oracle Cloud Infrastructure. Next, we need to create a route to the function customerpromotion in app customerpromotionapp:

fn create route customerpromotionapp /customerpromotion customerpromotion:0.0.1

That's it; we have successfully created a function and deployed it to the Fn Server.

Testing

As a final step, we'd like to test our function. For this, let's create a JSON file with cloudevents.io event data, for example (say we name the file ce.json):

{
  "eventType" : "com.oracle.oic.example",
  "eventTypeVersion" : "1.0",
  "cloudEventsVersion" : "1.0",
  "source" : "TBD-Source",
  "eventID" : "265b5c2b-cd0a-460a-9994-1131058946ba",
  "eventTime" : "2018-08-30T14:25:17Z",
  "contentType" : "application/json",
  "extensions" : {
    "destination" : {
      "function" : "customerpromotionapp/customerpromotion"
    }
  },
  "data" : {
    "customerStatus" : "GOLD",
    "salesAmount" : 10000,
    "region" : "US"
  }
}

To test whether the function works, we again use the fn CLI:

> cat ce.json | fn call customerpromotionapp /customerpromotion

If all goes well, this prints out another JSON document in cloudevents.io format:

{
  "eventType" : "com.oracle.oic.myaction",
  "eventTypeVersion" : "1.0",
  "cloudEventsVersion" : "1.0",
  "source" : "TBD-Source",
  "eventID" : "a509de1a-d0dd-4a6c-aa8b-4ca1da1b86e4",
  "eventTime" : "2018-08-30T14:25:17Z",
  "contentType" : "application/json",
  "extensions" : {
    "destination" : {
      "function" : "customerpromotionapp/customerpromotion"
    }
  },
  "data" : {
    "discount" : 15,
    "expire" : "24-12-2018"
  }
}

Certainly, one could also use the REST endpoint of the Fn function and do an HTTP POST against it. The URL of the Fn function's REST endpoint can be queried with fn:

> fn list routes customerpromotionapp

which returns something like

PATH                IMAGE                                                     ENDPOINT
/customerpromotion  phx.ocir.io/oicpaas1/ralmuell/fn/customerpromotion:0.0.1  localhost:8080/r/customerpromotionapp/customerpromotion

Monitoring

Fn Server exports metrics to Prometheus, so one can build nice dashboards in Grafana. There is a useful Fn Usage dashboard available for import; see the Grafana examples.

Example Fn usage Grafana Dashboard

What’s Next?

The initial proof of concept was quite promising, and I'll definitely dig deeper into Serverless. I guess my next PoC will be about the use of Machine Learning in Serverless. I have started experimenting with GraphPipe, which leverages Google flatbuffers for data serialization and thus allows ultra-fast scoring of Machine Learning models; GraphPipe was recently open-sourced by Oracle. In the end, I'm envisioning an Fn environment comprising the following:

  • Fn Server infrastructure supporting functions developed in various programming languages
  • Some “Flow” infrastructure to build orchestrations of functions. Fn has Fn Flow for this and I’m going to explore this in one of the next articles.
  • Event Infrastructure centered around cloudevents.io standard (using Event Hub for example)
  • DMN to implement Decision Logic for Event Routing and to externalize decision logic from function code in general.
  • Powerful Transformations for Data Mappings
  • GraphPipe for ultra-fast Machine Learning and AI

The one missing piece in this picture is, of course, the Human. Human involvement seems to be an anti-pattern for the typically short-lived functions in Serverless architectures. But hey, I haven't promised to reveal everything in my first post on Serverless. Stay tuned, and thank you for reading all the way to the end.
