This post is a bit late because earlier this week I was on strike, along with many other UK university staff, protesting against unfair and unnecessary changes in our pensions. If the dispute is not resolved, my work on our Royal Society exhibit (and everything else) will be very limited for the next couple of weeks…
Our visit to Newcastle-upon-Tyne started quite well. It was sunny and we arrived around lunchtime, so the first thing we did was go to the cafeteria and catch up with our Newcastle University colleagues.
After lunch we went to the lab to get things set up. It went downhill from there.
We had to get three devices and one computer model to work together, and we had issues with every single one. Some were easily solved (uhh… is the power switched on?), others required digging into old emails for instructions and activation codes. Near the end of the day, we finally connected everything and got ready for the first test.
Here’s our experimental setup, which we were testing on one of us (Ed Chadwick — we don’t involve recruited volunteers until we have solved all technical problems…):
We had four EMG sensors on the forearm, placed over muscle groups that are relatively independent: the 1st extends the middle, ring and little fingers, the 2nd extends the index finger, the 3rd flexes all fingers, and the 4th extends the thumb. (In practice, it is very difficult to get independent signals from surface EMG sensors. Muscles are tightly packed in the forearm, so even if a sensor is placed over a particular muscle, it can pick up signals from surrounding muscles as well. What a pain!)
We mapped the EMG signals to the equivalent muscles in the computer model, and mapped the output of the model (i.e. the “movement” of the digits) to the robotic hand. At the same time, I was wearing an instrumented glove, which has sensors to record hand movements. I used this to instruct Ed on what movements to make, and keep a record of these movements.
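To give a feel for what that EMG-to-muscle mapping involves, here is a minimal sketch. All of the names, channel assignments and calibration values are invented for illustration; the real lab software is of course more involved:

```python
# Hypothetical sketch of the EMG -> model mapping; all names and
# values are illustrative, not the actual lab software.

# One EMG channel per muscle group (see the sensor list above).
CHANNEL_TO_MUSCLE = {
    0: "extensor_digitorum",   # middle/ring/little finger extension
    1: "extensor_indicis",     # index finger extension
    2: "flexor_digitorum",     # finger flexion
    3: "extensor_pollicis",    # thumb extension
}

def normalise(raw, rest, max_contraction):
    """Scale a raw EMG amplitude to a 0-1 excitation level."""
    span = max_contraction - rest
    return min(max((raw - rest) / span, 0.0), 1.0)

def emg_to_excitations(raw_samples, rest_levels, mvc_levels):
    """Map raw per-channel EMG amplitudes to model muscle excitations,
    using per-channel resting and maximum-contraction calibration values."""
    return {
        CHANNEL_TO_MUSCLE[ch]: normalise(
            raw_samples[ch], rest_levels[ch], mvc_levels[ch]
        )
        for ch in CHANNEL_TO_MUSCLE
    }

# Example frame: channel 1 (index extensor) firing strongly, others near rest.
exc = emg_to_excitations(
    raw_samples={0: 0.05, 1: 0.80, 2: 0.06, 3: 0.04},
    rest_levels={0: 0.05, 1: 0.05, 2: 0.05, 3: 0.05},
    mvc_levels={0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0},
)
```

The computer model then takes these excitations as inputs, and its output (the digit "movement") is sent on to the robotic hand.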
When we hit “go”, it was an exciting moment: nothing crashed! I started moving my hand, Ed copied my movement, we recorded EMG signals from his muscles, processed them, and sent them to the computer model; the model “moved”, and the robotic hand copied the movement.
What we were hoping to see was Ed’s movement copied by the robotic hand via the computer model. (Ed is not an amputee, so we can see his hand moving. But the idea is that an amputee thinking of moving their hand would produce similar EMG.)
What we saw was… not that. The robotic hand seemed to move randomly. It would open its fingers long after Ed had opened and closed his. I tweaked the EMG gains, triple-checked the mappings, pleaded with the computer gods: nothing. The robotic hand pointed when it should have been giving a thumbs-up, and rested when it should have been stretching.
At that point, I proclaimed that I was quitting and moving back to Greece to become an olive farmer. Ed has heard that before, so he wasn’t particularly alarmed (I am prone to melodrama).
Instead, he observed that the robotic hand movements were not actually random, they were just very delayed. A bit of digging revealed that having both the EMG and the instrumented glove running at the same time caused a large delay in the EMG recording. When we turned the glove off, the robotic hand copied Ed’s movements really well!
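One simple way to spot this kind of problem is to timestamp each EMG frame when it is captured and again when the corresponding robot command goes out, then look at the average gap. The sketch below is purely illustrative (the class and numbers are made up), but it shows the idea:

```python
# Illustrative latency check: record the time between EMG capture and
# robot command output over a sliding window. Names and numbers are
# invented, not from the actual lab setup.
import time
from collections import deque

class LatencyMonitor:
    def __init__(self, window=100):
        # Keep only the most recent `window` delay measurements.
        self.samples = deque(maxlen=window)

    def record(self, captured_at):
        """Call when a robot command derived from the EMG frame
        captured at `captured_at` (monotonic time) is sent out."""
        self.samples.append(time.monotonic() - captured_at)

    def mean_delay(self):
        """Average end-to-end delay over the window, in seconds."""
        return sum(self.samples) / len(self.samples)
```

With a monitor like this running, a pipeline that is lagging (as ours was with the glove attached) shows up as a steadily large mean delay rather than as "random" behaviour.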
With just two days to collect our data, we made the decision to remove the instrumented glove from our setup instead of trying to figure out the source of this problem. We were only interested in a limited number of movements anyway (see part 1), so we did not need to record the movements continuously: it is easy to infer them from the EMG.
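For a small, distinct set of movements, that inference can be as simple as checking which EMG channels are active. A minimal sketch, with the movement labels, patterns and threshold all invented for illustration:

```python
# Minimal sketch of inferring a movement from which EMG channels are
# active, instead of recording movements with the glove. The labels,
# patterns and threshold are illustrative only.

THRESHOLD = 0.3  # normalised excitation above which a channel counts as "on"

# Active-channel patterns (channels as in the sensor list above).
MOVEMENT_PATTERNS = {
    frozenset([0, 1, 3]): "open hand",   # all extensors active
    frozenset([2]): "close hand",        # finger flexors only
    frozenset([1]): "point",             # index extensor only
    frozenset([3]): "thumbs up",         # thumb extensor only
    frozenset(): "rest",                 # nothing above threshold
}

def infer_movement(excitations):
    """Classify one frame of normalised per-channel excitations
    (a dict mapping channel number to a 0-1 level)."""
    active = frozenset(ch for ch, e in excitations.items() if e > THRESHOLD)
    return MOVEMENT_PATTERNS.get(active, "unknown")
```

Anything fancier (overlapping muscle activity, graded movements) would need a proper classifier, but for a handful of well-separated gestures a lookup like this is enough to label the recordings.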
I’m happy to say that everything went well after that. When I next post about our “open experiment”, I will explain what we are going to do with the data we collected.