Hand motion capture

Dimitra Blana
The quest for a life-like prosthetic hand
3 min read · Apr 21, 2018

I’ve had lunch and dinner outside in the last three days. In England. In April. I will remember these days fondly in August, when it is cold and rainy.

But science doesn’t stop when the rain does! I spent quite a few hours in our (windowless) lab this week. The reason is that, as I mentioned in an earlier post, we decided to collect a bit more data.

Here’s a reminder of what our experiment is about, for those who, like me, cannot even remember what shirt they are wearing today. The rest of you can skip the next three paragraphs.

We have developed a computer model of the human hand, which is a set of mathematical equations that describe how the hand moves: how the muscles are attached to the bones, how they contract and produce force, and how these forces are applied to the bones and result in movement. If we give our model information about which hand muscles are active, it will estimate the resulting hand movement.
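The post doesn't give the model's equations, but the chain it describes (muscle activation → muscle force → joint torque → movement) can be sketched for a single hinge joint. Everything here is illustrative: the muscle strength, moment arm, inertia, and damping values are made up, and the real hand model involves many muscles and joints.

```python
import numpy as np

def simulate_finger(activation, dt=0.001, steps=500):
    """Integrate one hinge joint driven by one muscle.

    A toy sketch of activation -> force -> torque -> motion;
    all parameter values below are assumptions, not taken
    from the actual hand model.
    """
    max_force = 50.0    # N, assumed maximum muscle force
    moment_arm = 0.01   # m, assumed muscle moment arm
    inertia = 1e-4      # kg m^2, assumed segment inertia
    damping = 5e-3      # N m s/rad, assumed passive damping
    angle, velocity = 0.0, 0.0
    for _ in range(steps):
        # Active muscle torque minus passive damping torque
        torque = activation * max_force * moment_arm - damping * velocity
        velocity += (torque / inertia) * dt
        angle += velocity * dt
    return angle

print(simulate_finger(0.2))  # a stronger activation flexes the joint further
```

Given an activation level, the loop integrates the equations of motion forward in time, which is the same forward-dynamics idea the hand model uses at much larger scale.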

We can record the activity of muscles with a technique called electromyography, and since most hand muscles are located in the forearm, we can use this technique even with people who have lost their hand or were born without one. That means that given information about muscle activity, our computer model can estimate the resulting hand movement, even if the hand is not there.
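Raw electromyography is a noisy, zero-mean signal, so it is usually processed into a smooth "activation envelope" before being fed to a model. The post doesn't describe the actual processing pipeline; a common minimal approach, shown here as a sketch, is to rectify the signal and smooth it with a moving average.

```python
import numpy as np

def emg_envelope(raw_emg, window=200):
    """Estimate a muscle activation envelope from raw EMG.

    Rectify, then smooth with a moving average. This is a
    common first pass, not necessarily the pipeline used in
    the study described in the post.
    """
    rectified = np.abs(raw_emg)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

# Simulated raw EMG: zero-mean noise whose amplitude grows
# as the muscle "activates" over one second.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
raw = rng.normal(scale=0.1 + 0.9 * t, size=t.size)
env = emg_envelope(raw)
```

The envelope rises as the simulated muscle activates, which is the kind of signal a model (or a prosthesis controller) can actually use.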

This is what we want to test with our experiment: if we give our computer model electromyography data from volunteers with intact hands, will it predict the resulting hand movement correctly? (We want intact hands at this point, so we can compare what the model predicts with how the volunteer’s hand actually moves.) If the model predicts the movement successfully, then we could use it to control prosthetic hands.

When we went to Newcastle a couple of months ago, we had a few volunteers control a robotic hand with their electromyography data quite successfully. But because of some hardware problems, we couldn’t actually record the volunteers’ hand movements.

So we’ve decided to redo part of the experiment*. We don’t have a robotic hand in our lab (this is why we went up to Newcastle in the first place), but we don’t actually need the robotic hand to test the model. The robotic hand has an excellent controller of its own, so it copies the model movements perfectly. It basically acts as a physical representation of our computer model. This is useful because by looking at it move we can tell if our model is working correctly, but it’s not essential.

Figure: Vicon markers on the hand, and a stick figure with the estimated positions of the markers

The important part is recording how our volunteers’ hands are moving, which we can do in our lab with the Vicon motion capture system. This involves sticking small markers covered in retroreflective material on the hand. Cameras mounted on the walls of the lab emit infrared light, and this is reflected straight back to the cameras by the markers (thanks to their retroreflectivity). The three-dimensional position of each marker can then be estimated, as long as it is seen by at least two cameras.
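Why does a marker need at least two cameras? Each camera only fixes the direction to a marker, i.e. a ray from the camera centre through the marker's image; a second ray pins down where along that line the marker sits. A simple way to combine two or more rays is to find the point minimising the summed squared distance to all of them. This is a sketch of the geometry, not Vicon's actual reconstruction algorithm; the camera positions below are invented.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of camera rays.

    For each ray, project onto the plane perpendicular to its
    direction; summing these projections gives a linear system
    whose solution is the closest point to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projection off the ray direction
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two hypothetical cameras on opposite walls, both seeing a
# marker at (0, 1, 2); each ray points from camera to marker.
marker = np.array([0.0, 1.0, 2.0])
cams = [np.array([-3.0, 1.0, 2.0]), np.array([3.0, 0.0, 2.0])]
rays = [marker - c for c in cams]
print(triangulate(cams, rays))  # recovers the marker position
```

With only one ray the matrix `A` is singular (any point along the ray fits equally well), which is exactly why a marker seen by a single camera cannot be placed in 3D.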

Motion capture has many fun applications beyond the study of human movement: computer animation, tracking animals and robots, and… playing charades.

*If you’re worried that our trip to Newcastle was wasted, don’t! We still need data that include the robotic hand, to ensure that the entire setup (electromyography recording and processing + computer model + robotic hand) runs fast enough to be used for control. It’s a study in two parts and two Newcastles :)


I am a biomedical engineer, and I develop computer models to help understand and treat movement impairment. I am Greek, living in the UK.