Serving up cool and refreshing neural nets on-device with Swift

Carson Farmer
Published in Textile
3 min read · Dec 14, 2017

Neuralyzer is like a brain for your phone!

Today we are announcing our second open source release in recent months: Neuralyzer, an artificial neural network (ANN) library for Swift that supports on-device learning with a variety of neural network architectures, optimizers, loss functions, and layer types.

We created Neuralyzer because we wanted to a) train models on-device, using b) an API similar to our offline workflow (i.e., TensorFlow and/or Keras). Additionally, c) most ANN frameworks for Swift target large convolutional neural networks (for image classification and the like) and aren’t designed for on-device training, so Neuralyzer addresses this gap in our workflow. Design goals for the project include a) ease of use (our data scientists aren’t Swift developers), b) speed (we use Accelerate wherever possible), and c) a lightweight footprint (it’s a small library with no external dependencies and minimal components; we even use our own matrix library).

To give you a better idea of what we’ve come up with, here is a quick example that fits a simple neural network to the famous Iris data set:

Iris Example

The Iris flower data set (or Fisher’s Iris data set) is a multivariate data set introduced by the British statistician and biologist Ronald Fisher in his 1936 paper “The use of multiple measurements in taxonomic problems” as an example of linear discriminant analysis. The data set consists of petal and sepal measurements for 3 different types of irises (Setosa, Versicolour, and Virginica), stored in a 150x4 JSON array. The rows are the samples and the columns are: Sepal Length, Sepal Width, Petal Length, and Petal Width. See the Wikipedia article for more information about this data set.

In this example, we’ll attempt to train a neural network to classify the type of Iris (Setosa, Versicolour, or Virginica) using each flower’s petal and sepal length and width. If you follow along with this example in Neuralyzer’s Playground, the data are stored as a JSON array in the Playground’s Resources folder (`iris.json`).

We start by loading the data and decoding it using Swift 4’s Codable niceties. Once we have a reference to the JSON data, we can convert it into an array and extract the inputs (X) and targets (Y).
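As a rough illustration of this step, here is a minimal sketch using plain Foundation. The actual Playground reads `iris.json` from its Resources folder; here we inline a couple of rows, and the row layout (four measurements followed by an integer class label) is an assumption for the sake of the example.

```swift
import Foundation

// Two sample rows, inlined in place of the Playground's `iris.json`.
// Assumed layout per row: sepal length, sepal width, petal length,
// petal width, class label (0 = Setosa, 1 = Versicolour, 2 = Virginica).
let json = """
[[5.1, 3.5, 1.4, 0.2, 0],
 [7.0, 3.2, 4.7, 1.4, 1]]
""".data(using: .utf8)!

// Codable does the heavy lifting: decode straight into nested arrays.
let rows = try! JSONDecoder().decode([[Double]].self, from: json)

// Split each row into inputs (X: the four measurements)
// and targets (Y: the class label).
let X = rows.map { Array($0[0..<4]) }
let Y = rows.map { Int($0[4]) }
```
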

Model setup and training proceed pretty much the same as in other neural network packages; we specify a 3-layer neural network with 4 input features (petal and sepal lengths and widths), a 16-unit hidden layer, an 8-unit hidden layer, and a 3-unit output layer. We’ll use sigmoid activation functions on the two hidden layers, and softmax on the output layer (for classification probabilities).
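That architecture might be specified along these lines. This is a hypothetical sketch: the layer and parameter names below mimic a Keras-style API, and Neuralyzer’s actual types may be spelled differently (see the Playground for the real code).

```swift
// Hypothetical, Keras-style model definition — not Neuralyzer's confirmed API.
let model = Sequential(layers: [
    Dense(inputs: 4,  outputs: 16, activation: .sigmoid),  // hidden layer 1
    Dense(inputs: 16, outputs: 8,  activation: .sigmoid),  // hidden layer 2
    Dense(inputs: 8,  outputs: 3,  activation: .softmax)   // class probabilities
])
```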

As we’re doing classification, we use a CategoricalCrossentropy loss function, and will use RMSProp as our optimization routine. Here I’ve increased the learning rate to 0.1 to make our model adapt a bit faster. We should be able to achieve > 80% accuracy in under 20 epochs with these settings.
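In code, that configuration might look like the following. Again, the names here follow the Keras conventions the post leans on; Neuralyzer’s actual spellings are assumptions on my part.

```swift
// Hypothetical loss and optimizer setup, mirroring the Keras-style names
// described above.
let loss = CategoricalCrossentropy()
let optimizer = RMSProp(learningRate: 0.1)  // raised to speed up adaptation
```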

The actual backpropagation training proceeds much as it would in Keras or another high-level wrapper around TensorFlow. Along the way, we can evaluate our model’s accuracy, and even time each epoch, using simple Swift commands.
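A training loop in that style might look like this. The method names (`fit`, `evaluate`) are assumptions modeled on Keras, not Neuralyzer’s confirmed API; the timing uses plain Foundation.

```swift
import Foundation

// Hypothetical Keras-style training loop with per-epoch timing.
for epoch in 1...20 {
    let start = Date()
    model.fit(X, Y, loss: loss, optimizer: optimizer)  // one backprop pass
    let accuracy = model.evaluate(X, Y)                // fraction classified correctly
    let seconds = Date().timeIntervalSince(start)
    print("epoch \(epoch): accuracy \(accuracy), \(seconds)s")
}
```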

Finally, we can produce predictions, and export our fitted model to a Swift Dictionary or JSON object.
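That last step might be sketched as follows. `predict`, `toDictionary`, and the model’s Codable conformance are all assumptions about the API used to illustrate the described workflow.

```swift
import Foundation

// Hypothetical prediction and export step.
let probabilities = model.predict(X)           // per-sample class probabilities
let asDictionary = model.toDictionary()        // plain Swift Dictionary
let asJSON = try! JSONEncoder().encode(model)  // works if the model is Encodable
```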

And there you have it! A very simple, fast, and super lightweight feed-forward artificial neural network library in Swift. As you can imagine, this opens up a whole slew of possibilities for on-device learning using neural networks.

So check out our official open source release of Neuralyzer over here, and let us know what you think. Neuralyzer is a work in progress (let’s say α stage), so input is very much welcome: if you find it useful, or you have issues, ideas, merge requests, or complaints, please let us know. And of course, if you know of any other nice neural network implementations for Swift, we’d love to hear about those too.
