Offline gesture recognition for “alya-smart-mirror”

ali alsaeedi
Mar 15, 2018


Less than 5 hours are left before the Eclipse Open IoT Challenge 4.0 deadline, and we are super excited to show a new feature of our project, alya-smart-mirror (ASM). In this post we will talk about Alya’s offline gesture recognition support. For now, Alya supports palm and fist gestures using the Matrix Creator.

Architecture

We have used the Matrix Creator’s offline gesture recognition capabilities to detect the user’s hand and recognize its coordinates. MATRIX OS has out-of-the-box support for a powerful computer vision engine that can be used for motion and gesture detection. We have implemented an application that runs on MATRIX OS and uses this computer vision engine. The app gets the gesture data and forwards it through MQTT to an addon on Alya. The receiving addon in Alya is asm-dashboard-addon, which can receive different kinds of data such as sensor readings as well as voice/face/gesture recognition data. When a gesture is detected, an MQTT message is sent to the backend through Amazon IoT. The backend sends the message to Alya through socket notifications, and the mirror then renders the data on the screen.

Architecture
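
To make the flow more concrete, here is a minimal sketch of the publishing side in TypeScript, using the mqtt.js client. It takes a detection object from the computer vision engine and forwards it to the alya-data topic in the payload format shown in the next section. The broker URL and the forwardGesture helper are assumptions for illustration, not the exact code running on the MATRIX board.

import * as mqtt from "mqtt";

// Shape of a detection as reported by the computer vision engine.
interface Detection {
  location: { x: number; y: number; width: number; height: number };
  tag: string; // e.g. "HAND_PALM" or "HAND_FIST"
}

// Assumed broker address; in our setup the message goes through Amazon IoT.
const client = mqtt.connect("mqtt://broker.example.com");

// Hypothetical helper: wrap the detection in the payload format expected by
// asm-dashboard-addon and publish it to the alya-data topic.
function forwardGesture(gesture: "palm" | "fist", detection: Detection): void {
  const message = {
    dataType: `matrix-${gesture}-detected`,
    data: detection,
  };
  client.publish("alya-data", JSON.stringify(message));
}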

How it works

The Matrix board will send an event to the topic alya-data whenever one of the following gestures is detected (a receiving-side sketch follows the list):

  • ‘fist’ when a hand fist is detected, payload:
{
"dataType": "matrix-fist-detected",
"data": {"location":{"x":228,"y":79,"width":133,"height":133},"tag":"HAND_PALM"}
}
  • ‘palm’ when a hand palm is detected, payload:
{
"dataType": "matrix-palm-detected",
"data": {"location":{"x":228,"y":79,"width":133,"height":133},"tag":"HAND_PALM"}
}
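
On the receiving side, a minimal subscriber sketch (again TypeScript with mqtt.js, and an assumed broker address) shows how an addon such as asm-dashboard-addon can handle these events: subscribe to alya-data, parse the JSON payload, and branch on dataType to read the hand’s bounding box.

import * as mqtt from "mqtt";

const client = mqtt.connect("mqtt://broker.example.com"); // assumed broker address

client.on("connect", () => {
  client.subscribe("alya-data");
});

client.on("message", (_topic, payload) => {
  const event = JSON.parse(payload.toString());
  switch (event.dataType) {
    case "matrix-fist-detected":
    case "matrix-palm-detected": {
      // The location box tells us where the hand is in the camera frame.
      const { x, y, width, height } = event.data.location;
      console.log(`${event.data.tag} at (${x}, ${y}), box ${width}x${height}`);
      break;
    }
    default:
      // Other sensor / voice / face data is handled elsewhere by the addon.
      break;
  }
});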

Demo

Alya can smartly identify the position of your hand and whether it is a palm or a fist. In this demo, we show how the fist and palm values change according to the position of the user’s hand in front of Alya.

Palm moving example
Fist moving example

Development

The gestures can be used in many different parts of Alya, such as taking pictures, navigating through video or image results, playing games with Alya, and many other capabilities that can make Alya friendlier and more useful. Gesture detection is one of the main features of Alya, and we are planning to support a wide range of gestures. The team is currently working on integrating Alya with the Xbox Kinect. With the help of the Kinect, Alya can detect full-body gestures, which opens up many additional great capabilities. For example, imagine that Alya is your trainer and helps you with your daily exercises, lets you play complex and interesting games, and much more, in the spirit of the many projects built on top of the Xbox Kinect.
