Our take on human-machine interfaces: Bio-potentials, Deep Learning, and implementation on the Apple Watch

Human-machine interfaces evolve at a slower pace than the other technologies we are used to. The mouse and keyboard, introduced in the Mother of All Demos, were one such landmark. Another landmark was the multi-touch screen, introduced by Steve Jobs at the iPhone reveal. At Wearable Devices we are taking a shot at the next evolutionary step in interfaces. I know this sounds a bit grandiose, but I tell it as it is 😀

The touchscreen is not a good interface for a smartwatch. We would like to control the smartwatch single-handedly and effortlessly, using only our fingers.


We crave effortless command of the digital world around us; however, false positives (FPs) ruin the experience. Here is our approach to the problem.

Mudra Band for the Apple Watch

Controlling digital devices with finger movements is a holy grail. We all want to be able to command the growing digital world around us effortlessly and intuitively. One of the difficulties with such a technology is that unintentional movements get recognized as gestures: we may wave a hand or scratch a surface, and that movement will be (incorrectly) recognized as a gesture.

With the Mudra Band we aim to reach this holy grail. The band captures neural signals sent from your brain, through the wrist, to your fingers. Our patented SNC sensors capture the signals, while our deep learning AI algorithms decipher the…
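The false-positive problem described above has a simple first line of defense: reject any classification the network is not sure about. Here is a minimal C++ sketch of that idea; the gesture label set, the Prediction struct, and the 0.9 threshold are all illustrative assumptions, not our production code.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <iterator>
#include <string>
#include <vector>

// Illustrative gesture label set; the real one is an assumption here.
const std::vector<std::string> kGestures = {"tap", "pinch", "swipe"};

// Hypothetical classifier output: a label plus its softmax confidence.
struct Prediction {
    std::string label;
    float confidence;
};

// Reject low-confidence predictions so that a hand wave or a scratch,
// which tends to produce a diffuse probability distribution, maps to
// "none" instead of firing a gesture.
Prediction DecideGesture(const std::vector<float>& probs,
                         float threshold = 0.9f) {
    auto it = std::max_element(probs.begin(), probs.end());
    auto idx = static_cast<std::size_t>(std::distance(probs.begin(), it));
    if (*it < threshold) {
        return {"none", *it};  // treat as unintentional movement
    }
    return {kGestures[idx], *it};
}

int main() {
    // A diffuse distribution (e.g. from waving the hand) is rejected.
    Prediction p = DecideGesture({0.40f, 0.35f, 0.25f});
    std::printf("%s (%.2f)\n", p.label.c_str(), p.confidence);
}
```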


The challenge: acquire bio-potentials from the surface of the skin through an electrode, then process those data streams and classify each pattern of signals as the correct finger movement.
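To make that one-sentence challenge a bit more concrete, here is a rough C++ sketch of the shape such a pipeline might take. Everything in it is an assumption for illustration: the channel count, the window size, the baseline-removal step, and the classify callback standing in for the deep network.

```cpp
#include <cstddef>
#include <cstdio>
#include <deque>
#include <string>
#include <vector>

// Assumed parameters, for illustration only.
constexpr std::size_t kChannels = 4;      // number of electrodes
constexpr std::size_t kWindowSize = 256;  // samples per classification window

using Sample = std::vector<float>;  // one reading per channel, in microvolts

// Stage 1: remove the slowly drifting DC offset that skin-electrode
// contact typically introduces (a crude high-pass step).
Sample RemoveBaseline(const Sample& raw, Sample& baseline, float alpha = 0.01f) {
    Sample out(kChannels);
    for (std::size_t c = 0; c < kChannels; ++c) {
        baseline[c] += alpha * (raw[c] - baseline[c]);
        out[c] = raw[c] - baseline[c];
    }
    return out;
}

// Stages 2 and 3: accumulate a sliding window, then hand it to a classifier.
class Pipeline {
  public:
    // `classify` stands in for the deep network; its interface is assumed.
    explicit Pipeline(std::string (*classify)(const std::deque<Sample>&))
        : classify_(classify), baseline_(kChannels, 0.0f) {}

    // Feed one multi-channel sample; returns a label once a window is full.
    std::string Push(const Sample& raw) {
        window_.push_back(RemoveBaseline(raw, baseline_));
        if (window_.size() < kWindowSize) return "";
        std::string label = classify_(window_);
        window_.pop_front();  // slide the window by one sample
        return label;
    }

  private:
    std::string (*classify_)(const std::deque<Sample>&);
    Sample baseline_;
    std::deque<Sample> window_;
};

// Dummy stand-in for the deep network (assumption for illustration).
std::string DummyClassify(const std::deque<Sample>&) { return "tap"; }

int main() {
    Pipeline pipeline(&DummyClassify);
    for (std::size_t n = 0; n < kWindowSize; ++n) {
        std::string label = pipeline.Push(Sample(kChannels, 1.0f));
        if (!label.empty()) std::printf("gesture: %s\n", label.c_str());
    }
}
```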

Nowadays, when you’re developing technology, especially consumer electronics, you want the best of the best in terms of user experience. At Wearable Devices we are developing Mudra, a neural input technology in the form factor of a watch strap that acts as an extension of the hand into the digital world. It detects neural signals at the wrist and translates them into control functions on digital devices. An intuitive and natural HMI is the missing piece of the puzzle for smartwatches and smart glasses. …


Let’s say you want to develop a mobile app that includes deep learning functionality. However, deep learning is only part of the software stack. Google’s TensorFlow deep learning framework is ideal for such usage. It is written in C++, with a C++ API, but surprisingly there is no example of C++ usage on Android… Google currently supports only a Java API on Android, via JNI (libtensorflow_inference.so).

We found a seemingly innocent comment regarding TensorFlow C++ on Android here. Pete Warden from the TensorFlow team points to the benchmark tool as an example of cross-platform usage (Linux PC…
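For readers who want to see what that cross-platform C++ usage boils down to, here is a minimal sketch using the standard TensorFlow 1.x Session API, which is the same API the benchmark tool exercises. The file name and the input/output tensor names and shapes are placeholders, not values from our model.

```cpp
#include <memory>
#include <vector>

#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/lib/core/status.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/core/public/session.h"

int main() {
    // Load a frozen GraphDef from disk; the file name is a placeholder.
    tensorflow::GraphDef graph_def;
    TF_CHECK_OK(tensorflow::ReadBinaryProto(
        tensorflow::Env::Default(), "frozen_model.pb", &graph_def));

    // Create a session and register the graph with it.
    std::unique_ptr<tensorflow::Session> session(
        tensorflow::NewSession(tensorflow::SessionOptions()));
    TF_CHECK_OK(session->Create(graph_def));

    // Feed one dummy input window; tensor names and shapes are assumptions.
    tensorflow::Tensor input(tensorflow::DT_FLOAT,
                             tensorflow::TensorShape({1, 256}));
    input.flat<float>().setZero();

    std::vector<tensorflow::Tensor> outputs;
    TF_CHECK_OK(session->Run({{"input", input}}, {"output"}, {}, &outputs));
    return 0;
}
```

The same source compiles for a Linux PC or, via the NDK toolchain, into a native Android binary, which is the portability point Pete Warden’s comment makes.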


At Wearable Devices we are building the next big thing in human-machine interfaces. So what does this have to do with deep learning and biopotentials? Biopotentials are electric potentials (typically on the scale of microvolts) measured between points on living cells. We measure such biopotentials directly at the wrist. This phenomenon holds the key to unlocking a truly great Human-Machine Interface (HMI). You can think of a biopotential HMI as a “spy” that listens in on the conversation between the brain, the nervous system, and the wrist, and translates it into a language we can understand.
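Some context on why those microvolts are hard to work with: at that scale, 50/60 Hz mains interference can be orders of magnitude larger than the signal, so some form of notch filtering typically comes before any learning stage. The sketch below is a textbook RBJ-cookbook biquad notch in C++, purely illustrative; the sampling rate and Q factor are assumptions, and this is not a description of our actual pipeline.

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// RBJ-cookbook biquad notch filter, e.g. to suppress 50 Hz mains hum
// before microvolt-scale biopotentials reach the classifier.
class NotchFilter {
  public:
    NotchFilter(double f0_hz, double fs_hz, double q = 30.0) {
        const double w0 = 2.0 * kPi * f0_hz / fs_hz;
        const double alpha = std::sin(w0) / (2.0 * q);
        const double a0 = 1.0 + alpha;
        b0_ = 1.0 / a0;
        b1_ = -2.0 * std::cos(w0) / a0;
        b2_ = 1.0 / a0;
        a1_ = -2.0 * std::cos(w0) / a0;
        a2_ = (1.0 - alpha) / a0;
    }

    // Direct-form I update: y[n] = b0 x[n] + b1 x[n-1] + b2 x[n-2]
    //                              - a1 y[n-1] - a2 y[n-2]
    double Process(double x) {
        double y = b0_ * x + b1_ * x1_ + b2_ * x2_ - a1_ * y1_ - a2_ * y2_;
        x2_ = x1_; x1_ = x;
        y2_ = y1_; y1_ = y;
        return y;
    }

  private:
    double b0_, b1_, b2_, a1_, a2_;
    double x1_ = 0, x2_ = 0, y1_ = 0, y2_ = 0;
};

int main() {
    NotchFilter notch(50.0, 500.0);  // 50 Hz hum at 500 Hz sampling (assumed)
    double y = 0.0;
    for (int n = 0; n < 500; ++n) {
        double x = std::sin(2.0 * kPi * 50.0 * n / 500.0);  // pure hum input
        y = notch.Process(x);
    }
    // After the transient dies out, the hum is strongly attenuated.
    std::printf("residual after 1 s: %g\n", y);
}
```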

So why is the above…

Leeor Langer

CTO of Wearable Devices - Data Science Architect
