Offline gesture recognition for “alya-smart-mirror”
With less than 5 hours left before the Eclipse Open IoT Challenge 4.0 deadline, we are super excited to show a new feature of our project alya-smart-mirror (ASM): offline gesture recognition. For now, Alya supports palm and fist gestures using the MATRIX Creator.
Architecture
We use the MATRIX Creator's offline gesture recognition capabilities to detect the user's hand and determine its coordinates. MATRIX OS has out-of-the-box support for a powerful computer vision engine that can be used for motion and gesture detection. We have implemented an application that runs on MATRIX OS, gets gesture data from the computer vision engine, and forwards it over MQTT to an addon on Alya. The receiving addon on Alya is asm-dashboard-addon, which can receive different kinds of data such as sensor readings as well as voice/face/gesture recognition events. When a gesture is detected, an MQTT message is sent to the backend through Amazon IoT. The backend relays the message to Alya through socket notifications, and the mirror then renders the data on the screen.
How it works
The MATRIX board will send events to the topic alya-data
whenever one of the following gestures is detected:
- ‘fist’ when a hand fist is detected, payload:
{
"dataType": "matrix-fist-detected",
"data": {"location":{"x":228,"y":79,"width":133,"height":133},"tag":"HAND_FIST"}
}
- ‘palm’ when a hand palm is detected, payload:
{
"dataType": "matrix-palm-detected",
"data": {"location":{"x":228,"y":79,"width":133,"height":133},"tag":"HAND_PALM"}
}
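On the receiving side, the addon only needs to parse these payloads and branch on dataType. The sketch below shows one way to do that; the handler names are assumptions for illustration, not asm-dashboard-addon's actual API.

```javascript
// Route a raw message from the "alya-data" topic to a gesture handler.
// Returns the handler's result, or null for message types not handled here.
function routeGestureMessage(raw, handlers) {
  const msg = JSON.parse(raw.toString());
  switch (msg.dataType) {
    case 'matrix-fist-detected':
      return handlers.onFist(msg.data);
    case 'matrix-palm-detected':
      return handlers.onPalm(msg.data);
    default:
      return null; // e.g. sensor or voice/face data, handled elsewhere
  }
}

// Usage sketch: with mqtt.js, a subscriber would call this from the
// "message" event, e.g.
//   client.on('message', (topic, raw) =>
//     routeGestureMessage(raw, { onFist: showFist, onPalm: showPalm }));
```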
Demo
Alya can identify the position of your hand and whether it is a palm or a fist. In this demo, we show how the fist and palm values change according to the position of the user's hand in front of Alya.
Development
Gestures can be used in many different places in Alya, such as taking pictures, navigating through video or image results, and playing games with Alya, all capabilities that make Alya friendlier and more useful. Gesture detection is one of Alya's main features, and we are planning to support a wide range of different gestures. The team is currently working on integrating Alya with the Xbox Kinect. With the help of the Kinect, Alya can detect different gestures of your whole body, which opens up many additional capabilities. For example, imagine that Alya is your trainer and helps you with your daily exercises, or plays complex, interesting games with you, among many other features that projects built on top of the Xbox Kinect have shown.
Run
This app needs to be deployed on a Raspberry Pi connected to a MATRIX Creator.
Please follow the steps here to create or deploy applications on MATRIX OS:
Prerequisites:
Then deploy the app:
git clone https://github.com/alronz/alya-smart-mirror.git
cd gestures-core
matrix deploy matrix-gestures
Further reading
- “alya” a DIY modular personalized open-source smart mirror
- “asm-backend” the backend for alya smart mirror
- Offline voice recognition for “alya-smart-mirror”
- Offline face recognition for “alya-smart-mirror”
- Cloud-based Face Recognition for Alya Smart Mirror
- “asm-app” a react native mobile app to configure alya-smart-mirror
- Cloud voice recognition “Alexa” for “alya-smart-mirror”
- https://github.com/alya-mirror/alya-smart-mirror/tree/dev/voice-recognition/matrix-pocketsphinx