Machine learning in the cloud, the easy way

Nabil Servais
3 min read · Apr 11, 2019


Google Cloud Platform released a new serverless service based on Knative: Cloud Run.

From an ops perspective, this is very good news: deploying serverless applications across multiple clouds becomes easier (that is the goal of Knative).

As a developer, it is also great news: I don't need to care about how it runs, I just have to expose an HTTP port and make my applications stateless.

Let me show how it can simplify development and quickly get some code into production.

Olivier Wulveryck (@owulveryck) is one of the contributors to Gorgonia and the main contributor to onnx-go. He made a great demo of his work on his blog, using WebAssembly generated by Go.

The demo is a drawing application that guesses the number you drew. There is no magic: it is based on Gorgonia, a library that facilitates machine learning in Go, and onnx-go, the Go interface to the Open Neural Network Exchange (ONNX).

The concept is very simple: you take an ONNX model (in this case MNIST), load it with onnx-go, and run it with a backend, in this case Gorgonia. For more details, read (again) Olivier's blog article; he explains this kind of thing better than I do.
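To make the load-model-then-run flow concrete, here is a minimal sketch based on the onnx-go README; the model file name is an example, and the input is left as an empty 28×28 grayscale tensor:

```go
package main

import (
	"fmt"
	"io/ioutil"
	"log"

	"github.com/owulveryck/onnx-go"
	"github.com/owulveryck/onnx-go/backend/x/gorgonnx"
	"gorgonia.org/tensor"
)

func main() {
	// Create the Gorgonia computation graph and bind it to an ONNX model.
	backend := gorgonnx.NewGraph()
	model := onnx.NewModel(backend)

	// Decode the ONNX model into the backend (file name is an example).
	b, err := ioutil.ReadFile("mnist.onnx")
	if err != nil {
		log.Fatal(err)
	}
	if err := model.UnmarshalBinary(b); err != nil {
		log.Fatal(err)
	}

	// Set the input: MNIST expects a 1x1x28x28 grayscale image.
	input := tensor.New(tensor.WithShape(1, 1, 28, 28), tensor.Of(tensor.Float32))
	model.SetInput(0, input)

	// Run the graph and read the prediction scores.
	if err := backend.Run(); err != nil {
		log.Fatal(err)
	}
	output, err := model.GetOutputTensors()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(output[0]) // one score per digit, 0 through 9
}
```

In the real demo this logic sits behind an HTTP handler that receives the drawing from the browser.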

To run it on Cloud Run, you have to containerize it. Nothing difficult: Gorgonia and onnx-go have no cgo dependencies, so the only thing needed is a classic Go build image.

If you use the Cloud Build service (gcloud), you can use a multi-stage image.
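The build and deploy steps boil down to two commands; a sketch, where the project ID, service name, and region are examples (at the time of writing, Cloud Run was in beta, hence `gcloud beta`):

```shell
# Build the container with Cloud Build and push it to the registry.
gcloud builds submit --tag gcr.io/my-project/mnist-reader

# Deploy the image to Cloud Run.
gcloud beta run deploy mnist-reader \
  --image gcr.io/my-project/mnist-reader \
  --region us-central1 \
  --allow-unauthenticated
```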

For the build stage we can afford to take time and use a big image, but at runtime we need the smallest image possible. Don't forget that a container can be destroyed or respawned anywhere, anytime.

For Go apps without cgo, Alpine is a good choice, and if you don't have any external dependencies you could even use "FROM scratch". I chose to store the images generated from the drawings in Google Cloud Storage for future use, which is why the runtime image needs ca-certificates.
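Putting those choices together, the multi-stage Dockerfile might look like this sketch (binary name, Go version, and paths are assumptions, not the exact file from the repo):

```dockerfile
# Build stage: full Go toolchain, slow but convenient.
FROM golang:1.12 AS build
WORKDIR /app
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o /mnist-reader .

# Runtime stage: minimal Alpine image with CA certificates,
# needed to talk to Google Cloud Storage over TLS.
FROM alpine:3.9
RUN apk add --no-cache ca-certificates
COPY --from=build /mnist-reader /mnist-reader
EXPOSE 8080
CMD ["/mnist-reader"]
```

Cloud Run sends traffic to the port the container listens on (8080 by default, via the `PORT` environment variable), so the app should read that variable rather than hard-code a port.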

Ok, now we have a container ready to use. But I made some modifications, so I had to fork the project.

I also added a way to store the images (it could use goroutines in a future version) and CORS support. The static files are uploaded to a bucket and served from there.

The result: https://storage.googleapis.com/demo-onnx-gorgonia/index.html

The code is located here:

https://github.com/blackrez/onnx-go/tree/cloud-run/example/gorgonia/mnist-reader

The final YAML!
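The manifest embedded in the original post is not reproduced here; as a hedged sketch, a minimal Knative Serving manifest for a Cloud Run service of this kind could look like the following (service name and image path are examples):

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: mnist-reader
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/mnist-reader
```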

I still have a lot of work to do for proper industrialisation, and I want some cool features like evaluation of the results. But in a reduced time frame and with little means, I was able to build a quick prototype of a machine learning application with autoscaling.

An easy, simple, and quick way to deploy apps.

Thanks to Olivier and the Gorgonia developers for making this application, and thanks to the Knative team (and the GCP teams in this case) for helping developers publish apps more and more easily.
