Real Time Activity Recognition with accelerometer and gyroscope sensor (MYO)

HiuKim Yuen
Aug 26, 2018 · 2 min read

We have been collaborating with a dermatologist on a research project about curing itch. This article focuses on the technical side, so I won’t go into details about the medical theory. In short, our objective is to detect and predict itch and scratch actions, so that measures can be taken to interrupt the behaviour.

Our end goal is to use EMG, accelerometer and gyroscope signals on the hands to detect scratch actions, and this project (article) is a proof-of-concept. With limited resources on equipment, we decided to go for the MYO armband, which is an affordable and easy-to-use option.

The video below shows the final result. We are able to detect four simple activities, “Relaxing”, “Working”, “Moving” and “Scratching”, in real time with the MYO armband.

Source code:

Data Collection

MYO provides an easy-to-use SDK to collect EMG and IMU data from the armband. For our research experiments, we want to collect multiple trials of certain activities (e.g. scratching), each around 2 to 3 seconds long. To facilitate the data collection process, we have included a `` script in the source repository.
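As an illustration, a trial recorder can be sketched as below. This is not the SDK code from the repository: `read_imu_sample` is a hypothetical stub standing in for the armband's IMU callback, and the 50Hz rate and 2.5-second duration simply mirror the trial lengths described above.

```python
from collections import deque

# Hypothetical stub: in the real setup this value would come from the MYO
# SDK's IMU callback (3-axis accelerometer + 3-axis gyroscope).
def read_imu_sample():
    return [0.0] * 6  # ax, ay, az, gx, gy, gz

def record_trial(duration_s=2.5, rate_hz=50):
    """Collect one activity trial (roughly 2-3 s of IMU samples)."""
    samples = []
    n_samples = int(duration_s * rate_hz)
    for _ in range(n_samples):
        samples.append(read_imu_sample())
        # With a live sensor, the SDK callback paces this loop at ~50 Hz.
    return samples

trial = record_trial()
print(len(trial))  # 125 samples for 2.5 s at 50 Hz
```

Each of the 40 repetitions per activity would produce one such trial, labelled with its activity name.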

We repeated each of the four actions, “Scratch”, “Relax”, “Moving” and “Working”, 40 times.


The training goal is to build a classifier that can distinguish between these four activity types. We tried numerous machine learning methods with different feature sets using the EMG and IMU data. It turns out that we can obtain reasonably good results with only the IMU data, using the Fast Fourier Transform (FFT) to extract frequency features. Details of FFT can easily be found on the Internet, so we will skip them here.
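The frequency-feature step can be sketched with NumPy's real FFT; the window length of 40 samples matches the 0.8-second windows used later, while the number of retained bins here is illustrative:

```python
import numpy as np

def fft_features(window, n_keep=20):
    """Magnitude spectrum of one IMU channel window as a feature vector."""
    spectrum = np.abs(np.fft.rfft(window))  # real FFT -> len(window)//2 + 1 bins
    return spectrum[:n_keep]

rng = np.random.default_rng(0)
window = rng.standard_normal(40)  # stand-in for 0.8 s of one channel at 50 Hz
feats = fft_features(window)
print(feats.shape)  # (20,)
```

In practice this would be applied per IMU channel and the resulting vectors concatenated before dimensionality reduction.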

After running FFT on the IMU data, we further reduce the dimensionality using Principal Component Analysis (PCA). The processed data becomes a 40-dimensional feature set. We then build an SVM classifier on these features. With the data collected above, we achieve >90% accuracy.

The source repository provides a Jupyter notebook, `train.ipynb`, that shows the detailed steps.

Real time activity tracking

Finally, we use the trained classifier to do real-time activity tracking. The idea is simple: we collect IMU data continuously with the MYO armband, then run classification on a 0.8-second window (40 data points at 50Hz). Classification runs every 0.4 seconds. The video at the beginning shows the live tracking result. The tracking source code can be found in ``
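The overlapping-window scheme above can be sketched as a simple generator; the window size (40 samples = 0.8 s at 50 Hz) and step (20 samples = 0.4 s) come directly from the text, while the stream itself is stand-in data:

```python
import numpy as np

RATE_HZ = 50
WINDOW = 40  # 0.8 s of samples at 50 Hz
STEP = 20    # classify every 0.4 s

def sliding_windows(stream):
    """Yield overlapping 0.8 s windows every 0.4 s from a sample buffer."""
    for start in range(0, len(stream) - WINDOW + 1, STEP):
        yield stream[start:start + WINDOW]

stream = np.arange(200)  # stand-in for 4 s of one-channel IMU data
windows = list(sliding_windows(stream))
print(len(windows))  # (200 - 40) / 20 + 1 = 9 windows
```

In the live tracker, each window would go through the same FFT, PCA and SVM steps as in training before the predicted activity is displayed.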


The purpose of this article is to share a quickstart approach to activity tracking with sensor data. Optimizing performance and increasing classification accuracy are not the main goals of this proof-of-concept work, so we haven’t provided accuracy reports or comparisons of any kind. All parameters used were chosen more or less by intuition.

SoftMind Engineering and Research

Softmind Engineering and Research Publications
