HP ENVY 17 Leap Motion SE Notebook

LEAP into JavaScript handsfree gestures

It’s been almost three years since the Leap Motion became available, and its popularity hasn’t increased a great deal in that time. Regardless, HP has released a laptop, an all-in-one machine and a keyboard with built-in Leap Motion sensors.

We pre-ordered the sensor months before the release date and started testing it on the first day we received it. A device like this typically gets everyone’s attention and sets high expectations. Before the first test everyone is a little nervous; we all want it to perform at least as well as we’ve seen it on YouTube. We all imagined ourselves organising windows and flicking through image galleries by waving in the air.

Despite the fact that everyone loves the idea of the controller, after the first test only a few of us were blown away.

CNET only gave it 3 stars out of 5 and summarised their review with these thoughts:

“Don’t buy the Leap Motion Controller with the expectation that it will replace your mouse, touch pad, or touch screen for day-to-day computing. For now, it’s for would-be futurists with a penchant for experimentation who are looking for a fun (if limited) glimpse of the motion-controlled future.”

Testing the Leap Motion

It was obvious on the first day that the tiny, £50 USB device is extremely accurate. In the data visualisation demo apps, the 3D models of both of our hands were refreshed 60 times a second, but as soon as we left the playground carefully set up by the creators, we immediately felt that it just doesn’t work right.

The excellent hardware is unfortunately not backed by equally polished software and drivers, so the apps won’t perform as smoothly as expected. You have to repeat certain gestures many times before the software picks up what you wanted, which can be frustrating when the Leap is used as an everyday input device. The good news is that this can potentially be sorted with the release of a new driver, so let’s hope they work hard on new updates!

This is the main reason why I don’t recommend the Leap Motion to the average user. Everyone expects it to replace their good old mouse, but we are just not there yet. On the other hand, the Leap Motion is great fun to develop with! If you are a developer looking for something cool to show off, or a brand new pet project, then this is something you should invest in straight away!

The day I decided to buy a Leap Motion controller was the day I found the JavaScript SDK. This is technology you just wouldn’t expect to be available in the browser.

The JavaScript SDK

Unfortunately the official documentation and getting started guide are a bit dry and take a long time to digest.

Luckily, Leap Motion has released a new, updated SDK featuring “Skeletal Tracking” which is much simpler compared to the original SDK. I highly recommend checking it out. It’s awesome!

The only thing you have to do is load the new JS file in the head of your document from http://js.leapmotion.com/leap-0.6.0.js, and the code below will work straight away. Leap.loop() essentially replaces the older ws.onmessage approach.

var controllerOptions = { enableGestures: true };

Leap.loop(controllerOptions, function(frame) {
    // Body of the callback function, called for every frame
    console.log(frame);
});

The way this works is quite simple. The controller sends its data over USB to the Leap Motion service, which streams it to the browser through a web socket roughly 60 times a second. It’s like an AJAX call that returns a JSON object to read in the success callback, except here the callback fires and a fresh frame object arrives about 60 times a second. This JavaScript object contains the coordinates, speed, angle and other values of each hand, finger and fingertip.
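To get a feel for the shape of that data without a controller plugged in, here is a minimal sketch that parses a made-up frame message the way a web socket handler would. The field names match the JSON sample below; the values are invented for illustration:

```javascript
// A made-up frame message, shaped like the JSON the Leap service streams
var message = JSON.stringify({
  currentFrameRate: 60,
  gestures: [],
  hands: [
    { id: 1, palmPosition: [-45.9, 185.7, -0.02], palmVelocity: [117.1, 16.6, -70.9] }
  ]
});

// Each web socket message is parsed just like an AJAX/JSON response
var frame = JSON.parse(message);

// Read the height (y) of the first palm above the sensor, in millimetres
var palmHeight = frame.hands[0].palmPosition[1];
console.log('Palm height: ' + palmHeight + 'mm');
```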

This is a section of an example JSON we receive from the Leap Motion:

{
    "currentFrameRate": 92.5945,
    "gestures": [],
    "hands": [
        {
            "direction": [
                0.297448,
                0.0924398,
                -0.950252
            ],
            "id": 88,
            "palmNormal": [
                -0.0843021,
                -0.988871,
                -0.122585
            ],
            "palmPosition": [
                -45.9597,
                185.77,
                -0.021841
            ],
            "palmVelocity": [
                117.182,
                16.6257,
                -70.9919
            ],
            [...]

If you open up your browser console you can see that the tracking data is being logged many times a second from inside the Leap.loop() callback function.

Let’s look at one of the returned frame objects with the tracking data and investigate the values that are useful for us:

// Array containing all the tracked fingers
frame.pointables
// The first finger (pointable) object
frame.pointables[0]
// x coordinate of its tip: horizontal position
frame.pointables[0].tipPosition[0]
// y coordinate: height above the sensor
frame.pointables[0].tipPosition[1]
// z coordinate: depth, towards or away from the display
frame.pointables[0].tipPosition[2]
// y coordinate of the centre of the palm
frame.hands[0].palmPosition[1]

This is just a quick teaser. The frame object has length, speed and angle values for both of your hands and all your fingers. Full documentation can be found on the Leap Motion developer portal.
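As a sketch of how those values can be pulled out in one place, the helper below (a hypothetical frameSummary function, not part of the SDK) collects the fields used in this article from any frame-shaped object:

```javascript
// Hypothetical helper, not part of leap.js: collect the values this
// article uses from a frame-shaped object
function frameSummary(frame) {
  var summary = { fingers: frame.pointables.length, hands: frame.hands.length };
  if (frame.pointables.length > 0) {
    var tip = frame.pointables[0].tipPosition;
    summary.tip = { x: tip[0], y: tip[1], z: tip[2] };
  }
  if (frame.hands.length > 0) {
    summary.palmHeight = frame.hands[0].palmPosition[1];
  }
  return summary;
}

// Usage with a made-up frame; inside Leap.loop() you would pass the real one
var fake = {
  pointables: [{ tipPosition: [12.5, 210.0, -30.0] }],
  hands: [{ palmPosition: [-45.9, 185.7, -0.02] }]
};
console.log(frameSummary(fake));
```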

A simple example

Let’s look at a simple example that changes the CSS of an element using values from the Leap Motion: a div element that follows our fingertip on the screen. The plan is to read the x and y coordinates of the fingertip and assign them to the left and top CSS values of the div.

Let’s add a div into the html, give it some basic styling and position it absolutely. After that we initialise leap.js and add these to the Leap.loop() callback function:

// If any finger is found
if (frame.pointables.length > 0) {
    var fingertipX = frame.pointables[0].tipPosition[0];
    var fingertipY = frame.pointables[0].tipPosition[1];
    // CSS positions need units; note that Leap's y axis grows
    // upwards from the sensor, while CSS top grows downwards
    mydiv.style.left = fingertipX + 'px';
    mydiv.style.top = fingertipY + 'px';
}

This is basically how we read values from the frame object. We can go further by combining several values in conditions, and that’s how we define custom gestures.
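As an illustration of that idea, here is a sketch of a hypothetical “hover” check built purely from frame values: a single extended finger held high and nearly still. The function name and both thresholds are arbitrary choices, not anything from the SDK:

```javascript
// Hypothetical custom gesture: one fingertip held high and nearly still.
// The 250mm height and 20mm/s speed thresholds are arbitrary choices.
function isHovering(frame) {
  if (frame.pointables.length !== 1) return false;
  var v = frame.pointables[0].tipVelocity;
  var speed = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
  return frame.pointables[0].tipPosition[1] > 250 && speed < 20;
}

// A still fingertip at 300mm above the sensor counts as hovering
console.log(isHovering({
  pointables: [{ tipPosition: [0, 300, 0], tipVelocity: [1, 2, 2] }]
})); // logs true
```

Inside Leap.loop() the same function would simply be called with the real frame on every update.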

Working with gestures

We mentioned gestures in the introduction, and you probably thought it would be quite complicated to define them from fingertip and palm coordinates and speeds. The good news is that we don’t have to. Apart from the numeric values, the frame object has a special property called gestures.

As soon as we perform a gesture, it appears in this array as a gesture object whose type property holds the name of the gesture. The recognised types are: circle, swipe, screenTap and keyTap. A screen tap is a forward, a key tap a downward tapping movement of any finger.
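Since each entry carries a type, a simple dispatch over the recognised gestures looks roughly like this (logGesture is just a placeholder name for this sketch):

```javascript
// Dispatch on the type of each recognised gesture in a frame;
// logGesture is a placeholder handler for this sketch
function logGesture(gesture) {
  switch (gesture.type) {
    case 'circle':    return 'finger drew a circle';
    case 'swipe':     return 'hand swiped';
    case 'screenTap': return 'finger tapped forwards';
    case 'keyTap':    return 'finger tapped downwards';
    default:          return 'unknown gesture';
  }
}

// Usage with a made-up gestures array; Leap.loop() supplies the real one
var gestures = [{ type: 'swipe' }, { type: 'keyTap' }];
console.log(gestures.map(logGesture));
```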

One important thing about the swipe gesture is that the gesture name alone doesn’t tell us the direction of the movement. If we need this information in our application, we have to combine it with the fingertip’s x velocity (tipVelocity[0]), which is negative when the finger moves left and positive when it moves right.

The condition will be something like this:

// frame.gestures is an array of gesture objects, each with a type
frame.gestures.forEach(function(gesture) {
    if (gesture.type === 'swipe' && frame.pointables[0].tipVelocity[0] > 0) {
        // this is a right swipe
    }
});
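Put together as a runnable sketch, with a made-up frame standing in for the real one, the check could look like this (isRightSwipe is a hypothetical helper name, not part of the SDK):

```javascript
// Hypothetical right-swipe check: a swipe gesture present in the frame,
// combined with a positive x velocity on the first fingertip
function isRightSwipe(frame) {
  var swiping = frame.gestures.some(function (g) { return g.type === 'swipe'; });
  return swiping &&
         frame.pointables.length > 0 &&
         frame.pointables[0].tipVelocity[0] > 0;
}

// A made-up frame in which the fingertip is moving right
var rightFrame = {
  gestures: [{ type: 'swipe' }],
  pointables: [{ tipVelocity: [350, 5, -10] }]
};
console.log(isRightSwipe(rightFrame)); // logs true
```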

Conclusion and the future

I personally haven’t spent too much time with the Leap Motion, as I prefer the XBOX Kinect for gesture tracking, especially because it recognises full-body gestures.

To level up and get started with the XBOX Kinect 2 sensor and JavaScript, have a look at this getting started guide from Web on Devices:

Get started with XBOX Kinect 2 JavaScript development

There is a separate guide that helps you interpret more complex gestures with the XBOX. You can actually use these principles for Leap Motion gesture tracking, as the SDK and the skeleton data are organised very similarly, but instead of fingers you work with limbs:

XBOX Kinect 2 JavaScript gesture tracking

Let me know if you have done anything interesting with handsfree gesture tracking in JavaScript or other web technologies.