On-Device AI — What I know so far

What is on-device inference?

On-device inference is the process of making predictions with a trained model that runs directly on the device itself, rather than on a remote server. It is becoming more popular because it offers lower latency and better privacy than the cloud-based approach. However, running such compute-intensive workloads on small devices is challenging because of their limited processing power and tight energy budgets.
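To make this concrete, here is a minimal sketch of on-device inference using TensorFlow Lite's Python interpreter (the same flow applies via the Android/iOS runtimes). The model file name `model.tflite` and the 224x224 input shape are assumptions for illustration, not from any particular app.

```python
import numpy as np
import tensorflow as tf

# Load a pre-converted TensorFlow Lite model ("model.tflite" is a
# placeholder name; in a real app this file ships inside the app bundle).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching an assumed input shape: a single 224x224 RGB
# image, as in common image classifiers.
input_data = np.random.rand(1, 224, 224, 3).astype(np.float32)

# Run inference entirely on the device: no network call is involved.
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
predictions = interpreter.get_tensor(output_details[0]["index"])
print(predictions)
```

Everything happens locally, which is exactly where the latency and privacy benefits come from.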

What is on-device training?

The answer is in the question! 😂 It means training your model on the device itself. Just like on-device inference, the major challenges are the limited computational power of such devices and their energy consumption. But on-device training brings extra advantages. Your model can learn from the user's data: since it runs on the device, it can observe the user's behavior directly and become personalized to that user. Because training happens on the device, there is no need to upload this data to the cloud, so data privacy is preserved. And since you don't need to host a server to train your model, it saves you money as well.
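Here is a minimal sketch of the personalization idea: keep a shared feature extractor frozen and fine-tune only a small head on local data. Plain Keras is used just to show the concept (on a phone this would typically go through something like TensorFlow Lite's on-device training support), and all shapes, layer sizes, and data here are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# A tiny illustrative model: a frozen feature extractor plus a small
# trainable head (sizes are arbitrary placeholders).
base = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
])
base.trainable = False  # shared weights stay fixed on the device

head = tf.keras.layers.Dense(2, activation="softmax")
model = tf.keras.Sequential([base, head])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Hypothetical local user data that never leaves the device.
local_x = np.random.rand(32, 8).astype(np.float32)
local_y = np.random.randint(0, 2, size=32)

# A few quick epochs personalize the head to this user's behavior,
# without uploading anything to a server.
model.fit(local_x, local_y, epochs=3, verbose=0)
```

Freezing the base keeps the training cost small enough for a phone, since only the head's weights need gradients.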

Mobile apps which do training/inference on mobile

Here are some popular mobile apps and features that perform on-device inference and training on your phone.

Frameworks

All product names, logos, and brands are property of their respective owners

Processors that support On-Device AI

Samsung Exynos
Qualcomm Snapdragon
