Kinect for Hand Tracking and Finger Detection
The official Kinect for Windows SDK provides a skeleton with 20 joints. For body detection, that is good enough to capture users’ body gestures, and it works great in games. But you will not want to use full-body tracking to control a PC after you have actually tried it once: it sounds fancy, but it will exhaust you completely.
Therefore, let’s get back to using our hands to control the PC with Kinect. The first step is tracking the hands, and ideally the fingertips.
OpenNI is a well-known open-source NUI library. It can receive color, depth, and skeleton frames from NUI sensors, and NiTE is a middleware library built on top of OpenNI.
If you do not need extra resources such as the color stream, you can use NiTE directly without initializing OpenNI yourself.
Create the hand tracker:
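A minimal sketch of this step, based on the NiTE2 C++ API (the original code listing is not preserved here, so this follows the pattern of the NiTE2 sample programs; error handling is kept to a minimum):

```cpp
#include <NiTE.h>
#include <cstdio>

int main()
{
    // Initialize NiTE directly; no explicit OpenNI setup is needed
    // when all we want is hand tracking.
    if (nite::NiTE::initialize() != nite::STATUS_OK)
    {
        printf("NiTE initialization failed\n");
        return 1;
    }

    // Create the hand tracker on the default device.
    nite::HandTracker handTracker;
    if (handTracker.create() != nite::STATUS_OK)
    {
        printf("Couldn't create the hand tracker\n");
        return 2;
    }

    // ... gesture registration and the frame loop go here ...

    nite::NiTE::shutdown();
    return 0;
}
```

Compiling this requires the NiTE2 SDK and an attached depth sensor, so treat it as a starting skeleton rather than a drop-in program.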
Because we only track hand positions, without the skeleton frame, we need a gesture to tell NiTE which hand to track. Click, Wave, and Hand Raise are natively supported by NiTE.
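Registering those gestures might look like this; the `registerStartGestures` helper is only for illustration, and it assumes the `nite::HandTracker` created above:

```cpp
#include <NiTE.h>

// Register the built-in start gestures on an already-created tracker.
// GESTURE_CLICK is a forward push-and-pull motion, GESTURE_WAVE a
// side-to-side wave, GESTURE_HAND_RAISE a raised, held hand.
void registerStartGestures(nite::HandTracker& handTracker)
{
    handTracker.startGestureDetection(nite::GESTURE_CLICK);
    handTracker.startGestureDetection(nite::GESTURE_WAVE);
    handTracker.startGestureDetection(nite::GESTURE_HAND_RAISE);
}
```

In practice you will usually register only one start gesture, so that an idle wave does not accidentally begin tracking.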
After setting the gesture as the start signal, we are ready to read frames.
In the frame loop, read each frame and find the hands we want:
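The loop below is a sketch of that step, following the structure of the NiTE2 HandViewer sample: when a start gesture completes, begin tracking a hand at that position, then report every hand NiTE is currently following. The `trackHands` wrapper is an assumption for self-containment:

```cpp
#include <NiTE.h>
#include <cstdio>

// Assumes `handTracker` was created and a start gesture
// (e.g. GESTURE_CLICK) was already registered.
void trackHands(nite::HandTracker& handTracker)
{
    nite::HandTrackerFrameRef frame;
    for (;;)
    {
        if (handTracker.readFrame(&frame) != nite::STATUS_OK)
            continue;

        // When a start gesture completes, begin tracking a hand there.
        const nite::Array<nite::GestureData>& gestures = frame.getGestures();
        for (int i = 0; i < gestures.getSize(); ++i)
        {
            if (gestures[i].isComplete())
            {
                nite::HandId newId;
                handTracker.startHandTracking(gestures[i].getCurrentPosition(),
                                              &newId);
            }
        }

        // Report the 3D position (in mm, sensor coordinates) of each hand.
        const nite::Array<nite::HandData>& hands = frame.getHands();
        for (int i = 0; i < hands.getSize(); ++i)
        {
            if (hands[i].isTracking())
            {
                const nite::Point3f& pos = hands[i].getPosition();
                printf("hand %d: (%.1f, %.1f, %.1f)\n",
                       hands[i].getId(), pos.x, pos.y, pos.z);
            }
        }
    }
}
```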
The above is pure hand tracking: it finds the user’s hand position after a Click gesture.
For further usage, you can apply a convex hull algorithm to the hand contour to calculate the fingertips. OpenCV provides the “convexHull” function to output the hull-point array, and “convexityDefects” for the defect-point array (the valleys between the fingers).
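A hedged sketch of that post-processing step with OpenCV’s C++ API: it assumes you have already segmented the hand in the depth map and extracted its contour (e.g. with `cv::findContours`), and the `findFingertips` name and the depth threshold are illustrative choices, not part of the original post:

```cpp
#include <opencv2/imgproc.hpp>
#include <cstdio>
#include <vector>

// Given a hand contour, compute the convex hull (fingertip candidates)
// and the convexity defects (valleys between fingers).
void findFingertips(const std::vector<cv::Point>& contour)
{
    // Hull as *indices* into the contour; convexityDefects needs this form.
    std::vector<int> hullIdx;
    cv::convexHull(contour, hullIdx, false, false);

    // Each defect is [start_idx, end_idx, farthest_idx, depth * 256].
    std::vector<cv::Vec4i> defects;
    cv::convexityDefects(contour, hullIdx, defects);

    for (const cv::Vec4i& d : defects)
    {
        cv::Point valley = contour[d[2]];
        float depth = d[3] / 256.0f;   // distance from the hull edge, in pixels
        if (depth > 20.0f)             // skip shallow defects; tune per setup
            printf("valley between fingers at (%d, %d), depth %.1f\n",
                   valley.x, valley.y, depth);
    }
}
```

Hull points that sit between two deep defects are good fingertip candidates; the depth threshold filters out noise along the wrist and palm.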

Originally published at bustta.logdown.com on August 14, 2013.
