Deploying Machine Learning Models for Elixir applications #2: Getting Excited

Paweł Dawczak
5 min read · Dec 9, 2018


If GenServer was OO

Previously, we set up the environment and got Elixir talking to Python, so we are in a good position to take the next step. Let’s create a new Python script, py_app/second.py, and start with the imports:
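
Something along these lines should do; the exact module paths are an assumption and vary between Pyrlang releases, so check them against the version you installed in part 1:

```python
# Imports for py_app/second.py. The module paths are an assumption and may
# differ slightly between Pyrlang releases.
from pyrlang import Node, Atom, GeventEngine
from pyrlang.gen_server import GenServer
```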

The process we will implement here will have only one responsibility: receive a call with a string and reply with the same string upper-cased.

Let’s implement the GenServer then:
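
A rough sketch; the call name upcase is my label for the single call this process accepts, and the wiring follows the points explained below:

```python
class PyGenServer(GenServer):
    def __init__(self, node, node_name) -> None:
        # Delegate to the GenServer class with node_name and register the
        # accepted calls (the call name "upcase" is an assumption)
        GenServer.__init__(self,
                           node_name=node_name,
                           accepted_calls=["upcase"])
        # Register this process under the "py_process" name
        node.register_name(self, Atom("py_process"))
```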

In the constructor we will do the wiring:

- Delegate to the GenServer class with node_name
- Register which calls it will handle by specifying accepted_calls
- Register self under the "py_process" name

And let’s provide the implementation of the call handler itself; it will return the upper-cased string:
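
Still inside the class, roughly:

```python
    # Handler for the "upcase" call (the name is an assumption, see above).
    # Depending on how Pyrlang decodes the Elixir binary, the argument may
    # arrive as str or bytes; .upper() works for both.
    def upcase(self, string):
        return string.upper()
```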

Lastly, let’s add the main() function:
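
A sketch, assuming the node name and cookie used in part 1; adjust both (and the exact Node constructor arguments) to your setup:

```python
def main():
    # Node name and cookie are assumptions; they must match what the Elixir
    # side will use later.
    engine = GeventEngine()
    node = Node(node_name="py@127.0.0.1", cookie="COOKIE", engine=engine)
    PyGenServer(node, node_name="py@127.0.0.1")
    engine.run_forever()


if __name__ == "__main__":
    main()
```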

Nothing super special here — we’re creating a new instance of PyGenServer. This will complete the setup of a node.

To make starting this node convenient, let’s update the Makefile and add the following entry:
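
Something like this; the target name is my choice, and you may need to activate the pyex_env virtualenv inside the recipe, depending on how the existing targets are set up:

```makefile
# Target name is an assumption; align it with the existing entries.
python_second:
	python py_app/second.py
```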

Let’s give it a try:
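
With the target above:

```shell
$ make python_second
```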

All good so far!

Elixir talks to Python and hears back

Let’s fire up iex session:
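
The node needs a name and the same cookie as the Python node, so something along these lines:

```shell
$ iex --name elixir@127.0.0.1 --cookie COOKIE
```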

and give the following a try:
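
Roughly this; the registered name py_process and the node come from the Python side, while the exact shape of the call message ({:upcase, ...}) is an assumption that has to match how Pyrlang’s GenServer dispatches calls:

```elixir
GenServer.call({:py_process, :"py@127.0.0.1"}, {:upcase, "hello"})
```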

OK, I know what you’re thinking — yes, this is cool, but most certainly not exciting. What’s the deal?

Yes, I agree, this is a small step, but in a good direction…

Next, let’s try to make the Python node a bit smarter…

Python for Machine Learning

The Iris classification problem is the “Hello World” of the Machine Learning world.

It contains measurements of flowers, alongside the species each flower belongs to. The problem is simple enough to let us build our very first, simple classification model.

Let’s try to build such a model, and then we will integrate it with Elixir.

First, let’s ensure you have the pyex_env virtualenv activated:
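
Assuming the virtualenv lives in the project root (adjust the path to wherever you created it in part 1):

```shell
$ source pyex_env/bin/activate
```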

and install the following dependencies:
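
Most likely something along these lines, since the rest of the post relies on pandas, scikit-learn and Jupyter:

```shell
$ pip install pandas scikit-learn jupyter
```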

Let’s download the dataset; for the exercise, I’ve used this link to obtain the CSV file.

Next, let’s start Jupyter:
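
```shell
$ jupyter notebook
```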

After the Jupyter server starts properly, it will open in the web browser:

Main screen of Jupyter notebook

Let’s create a new notebook:

Creating new Notebook

Here, let’s import pandas, load the data and, as a sanity check, output the DataFrame for brief inspection:

Loading data
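
In code, the cell looks roughly like this; the CSV file name is an assumption, so use whatever you saved the download as:

```python
import pandas as pd

# File name is an assumption
df = pd.read_csv("iris.csv")
df
```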

Looks like we have accessed the data properly!

For convenience, let’s assign different “slices” of data for easier access.

By convention, the features that form the input for our Machine Learning model will be called X, and the labels (the values we want our model to predict) will be y.

Slicing data, step 1
Slicing data, step 2
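
Roughly like this; I’m slicing by position, assuming the four measurement columns come first and the species label comes last, as in the usual Iris CSV:

```python
# Features: the four measurement columns (assumed to be the first four)
X = df.iloc[:, :4]

# Labels: the species column (assumed to be the last one)
y = df.iloc[:, 4]
```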

This looks good!

Now that we have our variables set, we can start training our model. Let’s import our first model and fit it to the data; this is when the model gets trained:

Training the model
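
For example, with a decision tree; the exact estimator is a matter of choice, and any simple scikit-learn classifier will do for Iris:

```python
from sklearn.tree import DecisionTreeClassifier

# Fit the classifier on the features and labels prepared above
model = DecisionTreeClassifier()
model.fit(X, y)
```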

And now, this is the most exciting moment: did it learn anything from the data? I’ve checked the CSV and picked two examples. I’ll let the model predict the values for me:

Predicting
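
Along these lines; the two rows below are taken from the classic Iris data set (a setosa and a virginica example):

```python
# Two example rows from the Iris data set
model.predict([[5.1, 3.5, 1.4, 0.2],
               [6.3, 3.3, 6.0, 2.5]])
```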

Yes! These are indeed the labels associated with that data!

Let’s export the trained model from the Jupyter Notebook, so we can load it in our Python node.

For this, we will use pickle format:

Exporting trained model
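
For instance; the file name and location are up to you, the Python node just needs to be able to read the file later:

```python
import pickle

# Serialise the fitted model to disk
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
```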

Loading the Model in Python

Given the model has been serialised to the pickle format, we are able to deserialise it in a “pure” Python script.

Let’s do the following — first, import pickle:
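
```python
import pickle
```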

next, change the constructor:
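
Sketched against the class from the first half of the post; the model file name matches the export step above and is an assumption:

```python
class PyGenServer(GenServer):
    def __init__(self, node, node_name) -> None:
        # 1. Handle the "classify" call, which will use our trained model
        GenServer.__init__(self,
                           node_name=node_name,
                           accepted_calls=["classify"])
        node.register_name(self, Atom("py_process"))

        # 2. Load the serialised model (file name is an assumption)
        with open("model.pkl", "rb") as f:
            self.model = pickle.load(f)
```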

  1. Our GenServer will handle the classify call, which will use our trained model
  2. We are loading the serialised model

Next, let’s implement the classify method.

NOTE: Unfortunately, at the time of writing this post, Pyrlang doesn’t implement full and proper serialisation and deserialisation of ETF (External Term Format), the format used by Erlang to serialise data sent between nodes. To avoid this problem, I’ll serialise my data to JSON, but once the issues in Pyrlang are addressed, this could be reverted.

Firstly, let’s import json:
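
```python
import json
```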

next, the method itself:
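
```python
    # Still inside PyGenServer
    def classify(self, encoded_params_to_classify):
        # 1./2. The parameters arrive as a JSON-encoded list; decode them
        #       (json.loads accepts both str and bytes)
        params = json.loads(encoded_params_to_classify)

        # 3. predict() works on a list of inputs and returns a list of
        #    results; we classify a single set of values, so take the first
        result = self.model.predict([params])[0]

        # 4. Serialise the value to JSON before returning it
        return json.dumps(result)
```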

  1. We’re implementing the classify method, which will accept encoded_params_to_classify. These parameters will be encoded in JSON
  2. Decoding the parameters
  3. The model is able to perform classification over a list of inputs, and as such, it returns a list of results. In this exercise we will classify only a single set of data, hence we will access only the first element of the result
  4. We have to serialise the value to JSON before returning it

The Elixir side

First, let’s create a fresh new Elixir project:
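
The project name below is just a placeholder:

```shell
$ mix new py_classifier
$ cd py_classifier
```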

Next, add the Jason dependency in mix.exs; we will use it for JSON encoding:
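
In the deps list (the version constraint is indicative):

```elixir
defp deps do
  [
    {:jason, "~> 1.1"}
  ]
end
```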

and issue the command:
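
```shell
$ mix deps.get
```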

to install the dependencies.

Next, let’s implement a function that will call the corresponding process in the Python node:
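
A sketch of such a module; the module name, the feature keys and their order are assumptions (the order has to match what the model was trained on), while the registered name and node are the ones used so far:

```elixir
defmodule PyClassifier do
  # 1. Reference to the registered process on the Python node
  @py_process {:py_process, :"py@127.0.0.1"}

  # Feature keys and their order are assumptions; they must match the
  # column order the model was trained on.
  @feature_keys [:sepal_length, :sepal_width, :petal_length, :petal_width]

  def classify(params) when is_map(params) do
    # 2. Turn the map of values into an ordered list and encode it as JSON
    encoded =
      @feature_keys
      |> Enum.map(&Map.fetch!(params, &1))
      |> Jason.encode!()

    # 3. Call the Python process; 4. decode the JSON reply
    # (to_string/1 copes with the reply arriving as a binary or a charlist)
    @py_process
    |> GenServer.call({:classify, encoded})
    |> to_string()
    |> Jason.decode!()
  end
end
```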

  1. This defines a reference to the specific process in the :"py@127.0.0.1" node
  2. Transform a map of values to a list of parameters, and encode it to JSON
  3. Send the call to the process with the params
  4. Decode the JSON response

Trying it all together

Python side

Given we already have a corresponding entry defined in the Makefile, issue the following command in one terminal window:
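
Assuming the target name from earlier:

```shell
$ make python_second
```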

Elixir side

In another terminal window, start the Elixir node:
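
Again, the node name and cookie must match the Python node:

```shell
$ iex --name elixir@127.0.0.1 --cookie COOKIE -S mix
```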

Once this completes, let’s issue the following:
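
For example, with the sketch of PyClassifier above:

```elixir
PyClassifier.classify(%{
  sepal_length: 5.1,
  sepal_width: 3.5,
  petal_length: 1.4,
  petal_width: 0.2
})
# => the predicted species, e.g. "Iris-setosa"
```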

What’s happened

We managed to create a process in the Python node that responds to calls. We trained our first Machine Learning model and, finally, we managed to load the model into our Python code. This allowed our Elixir application to interact with it without knowing that the node it sends messages to, and receives messages from, isn’t Elixir or any other BEAM-based language.

I don’t know about you, but for me this is extremely exciting: we were able to make our program classify numeric values, all without a single if statement! The computer was able to learn the classification rules from the data set alone!

Today, it only predicts flowers, but tomorrow it might be the engine that will recommend your next car, or predict the price you’ll sell your house for!
