
A full PyTorch-like experience on iOS using NimTorch

[Screenshot: NimTorch code running in the Blink shell on an iOS device]

Why PyTorch-like?

In short: We are actually using NimTorch.

Why do we care? We already have ONNX, Core ML, TensorFlow Lite, etc…

This is the full experience. You can literally use and deploy the same code you have on your workstations. No intermediate steps, no translation, no lite versions.

You can train, modify and be creative with your neural networks within your apps.

Some use cases:

  • Unleash transfer learning: use pre-trained VGGs to detect objects tailored to your app’s end users, or fine-tune and run OCR models directly on devices rather than in the cloud, e.g. when sending data over the internet is not an option (see the sketch after this list).
  • Train style transfer networks to generate camera filters, audio filters, etc., straight in your apps, without having your customers disclose their data.
  • Ship partially trained models and finish training directly on your customers’ devices, allowing full privacy and customization.
  • Web 2.0 of neural networks: reduce your company’s costs by having your customers’ devices do some of the work you would normally do on your servers and GPUs.
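
To make the first bullet concrete, here is a minimal sketch of the frozen-backbone idea, assuming the PyTorch-mirroring tensor API that NimTorch exposes (torch.randn, mm, t and the usual arithmetic operators); the backbone features, shapes and labels are hypothetical placeholders, not a real VGG pipeline:

import torch

# Hypothetical on-device fine-tuning sketch: the pre-trained backbone is
# frozen, so only a small linear head is updated with the user's own data.
let
  features = torch.randn(32, 512)   # stand-in for frozen backbone outputs
  labels   = torch.randn(32, 10)    # stand-in for app-specific targets

var head = torch.randn(512, 10)     # the only weights that get trained

let lr = 1e-4
for step in 0 ..< 100:
  let
    pred = features.mm(head)
    diff = pred - labels
    grad = features.t().mm(2.0 * diff)  # gradient of the squared error w.r.t. head
  head -= lr * grad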

What about libTorch?

So far it does not compile properly on iOS. More importantly, you would have to use C++ or serialized models. Goodbye, productivity!

Also, PyTorch itself currently distinguishes between mobile builds and desktop builds.

With NimTorch, instead, you get a desktop build of PyTorch’s low-level components running on iOS.

Show me some code!

This is the same kind of code you saw running in the screenshot above.
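
As a minimal sketch, assuming the PyTorch-mirroring tensor API that NimTorch exposes (torch.randn, mm, t and elementwise arithmetic), a tiny two-layer model trained with plain gradient descent looks roughly like this; the sizes are arbitrary and the gradients are written out by hand to keep the example to a handful of well-known ops:

import torch

# A tiny two-layer model (two linear layers, no activation, so the
# hand-written gradients stay trivial) trained with plain gradient descent.
let
  x = torch.randn(64, 100)   # random input batch
  y = torch.randn(64, 10)    # random targets

var
  w1 = torch.randn(100, 50)
  w2 = torch.randn(50, 10)

let lr = 1e-6
for step in 0 ..< 500:
  # forward pass
  let
    h = x.mm(w1)
    y_pred = h.mm(w2)
    diff = y_pred - y
    loss = (diff * diff).sum()   # sum of squared errors

  # backward pass, written out by hand
  let
    grad_y_pred = 2.0 * diff
    grad_w2 = h.t().mm(grad_y_pred)
    grad_h = grad_y_pred.mm(w2.t())
    grad_w1 = x.t().mm(grad_h)

  # gradient descent update
  w1 -= lr * grad_w1
  w2 -= lr * grad_w2

The point is that this is ordinary desktop-style tensor code: there is no export, conversion or “lite” step between your workstation and the device.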

What’s the relationship between PyTorch and NimTorch?

At Fragcolor Inc. we love PyTorch; in fact, NimTorch uses PyTorch’s own low-level components (C10, ATen, etc.). We rebuild weekly, merging with upstream, and we actively contribute back.

You can find more info in our previous article: Introducing NimTorch.

How about performance?

Thanks to the power of Nim we are dealing with 100% native code, and we already largely outperform the Python counterpart. In-depth benchmarks are coming soon!

Moreover, current-generation iPhones, and especially iPads, are extremely powerful. These devices already run plenty of neural network inference with excellent results.

If there is enough demand and growth, we might also consider adding proper GPU support!

Can I write code on the go?

No, this is a limitation of iOS: neither JIT compilation nor spawning processes is allowed.

That being said, this could be worked around by using Nim’s JavaScript backend or Nim’s compile-time VM.

Where do I get it?

The easiest way is to contact us. We don’t publish this specific build yet, since the process is a bit tedious and the CMake and Xcode magic involved needs more documentation.

NimTorch is open source; fetch it here: https://github.com/fragcolor-xyz/nimtorch

Contact us!

Notes:

  • We used the open source Blink shell to provide a shell environment and run the demo shown above; no jailbroken devices were involved.
