Rock, Paper, Scissors

Reviewing the Myo Gesture Control Armband

We were excited to get a copy of the Myo, a gesture control armband made by ThalmicLabs that is currently in pre-order. Here’s what comes in the box:

I downloaded Myo Connect Beta, launched it, and then slid the Myo band onto the thickest part of my forearm with the logo facing up. The video mentioned that the device needs to be snug and have direct skin contact so that the EMG (electromyography) sensors around the band can detect the electrical signals produced by muscle movement. Then it just needed to “warm up” for a few minutes before syncing could begin.

Syncing is easy: hold your arm like you’re a waiter at a restaurant and sweep it out, as if to say, “Look at this delicious Baked Alaska.” You’ll get a little buzz from the Myo if you do it correctly. The setup then walks you through a series of interactions that use the Myo’s main gestures.

According to the ThalmicLabs website, the band also contains a highly sensitive nine-axis IMU with a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. I’m assuming that this is there to report motion, speed, pitch, and yaw, but only rotation detection was immediately evident in the demo.

Controls for a few popular programs like Spotify, Keynote, and Acrobat Reader are built into Myo Connect, so I fired up Spotify and gave it a go. Here’s what I found:

Gesture recognition can be coarse

It feels like you’re playing Rock, Paper, Scissors with your computer, and the computer keeps winning. Spotify was tricky to control: the Myo recognized my gestures about 50% of the time, and fist gestures were recognized only around 10% of the time.

I think that part of the reason was user error. It is unclear how long the “unlocked” state (where the Myo recognizes gestures) lasts. Sometimes the Myo recognized a gesture while locked. Other times we needed to perform the unlocking gesture and then immediately perform the second gesture for recognition to occur. Starting out, this looked hilarious, but it quickly became embarrassing when I cranked Dolly Parton’s “Jolene” all the way up and couldn’t get it to turn back down.

These poor results could have just been me, so I handed the Myo around to a few of my Instrument teammates to try out, but the results were the same. We tried recalibrating via Myo Connect; again, we experienced the same results.

We switched applications to HandyBrowser, and the results were a little better. The app had a few suggestions that helped, including a warning about the combo-gesture timeout and helpful directions for disabling auto-lock. I’m not sure about the fist-gesture-based scroll, but it worked better than fist-gesture-based volume in Spotify. Thank you, Maik.

Augmenting an experience with the Myo should make that experience better

Keynote was our last test, and it turned out to be the most popular use of the Myo among the designers. It was simple and it worked every time. The app relies on the three gestures that are most consistently recognized by the Myo: unlock, wave left, and wave right. That’s really all Keynote needs, and in this case, doing less actually makes for a better experience.
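For illustration, here is a minimal sketch of what that kind of three-gesture mapping looks like, assuming a hypothetical on_gesture() callback and gesture names. The actual Keynote connector in Myo Connect is not implemented this way; this just shows how little logic the experience needs.

# Minimal sketch of a three-gesture presentation mapping (hypothetical API).
# Gesture names and the on_gesture() hook are illustrative only.

SLIDE_ACTIONS = {
    "wave_left": "previous_slide",   # wave toward the body: go back one slide
    "wave_right": "next_slide",      # wave away from the body: advance one slide
}

def on_gesture(gesture, unlocked):
    """Return the slide action for a gesture, or None if nothing should happen."""
    if not unlocked:                  # the unlock gesture gates everything else
        return None
    return SLIDE_ACTIONS.get(gesture)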

Fatigue is a real issue

I got the best response from the Myo when a gesture used a lot of muscle contraction. I found myself making movements that fell outside my standard range of motion, which caused fatigue after a moderate amount of time. I assume that this will get better as the Myo evolves, but in the meantime, it’s something for experience designers to consider:

Not all gestures are equal

Waving outward fights against the way our wrists work. Waving towards the body is a lot easier and can be sustained longer, even if that motion needs to be extreme. The double tap was the second easiest gesture to make and was almost always recognized by the sensors. Making a fist and then spreading your fingers requires the most effort and may be the most difficult for users with motor control issues or arthritis.

Good experience design can smooth the rough edges

The Myo has the potential to be a nice interface device, and even in these early stages there are some applications that are good enough to use. It would be easy to make them better. Take Spotify for example. These are the current interactions:

Waving towards your body to go back to the previous track and closing your hand to control volume both feel counterintuitive. Opening your fingers to let the music flow feels right, but in practice, to trigger it consistently I had to go from a closed fist to spread fingers, and that could also trigger the volume control.

Here is how I would modify the interaction:

First, do away with the timed unlock, or at least extend it to 5 seconds so you can get a gesture off and recognized before time runs out. Double tap is the most reliable gesture, so it should be paired with the most important function: start and stop. The fist gesture is gone, replaced by the open hand combined with rotation to control volume. A slow wave in scrolls/pans through tracks, and a double wave in advances one track at a time. A double wave out loads the previous track.
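To make the proposal concrete, here is a rough sketch of the remapped controls. The on_gesture() callback, the gesture names, and the player object are all hypothetical; the real Myo SDK exposes its own pose and orientation events, so treat this as a sketch of the mapping, not of the API.

import time

# Hypothetical Spotify gesture mapper implementing the remapping proposed above.
# The player object is assumed to expose play_pause(), next_track(),
# previous_track(), scroll_tracks(), and adjust_volume().

UNLOCK_WINDOW_SECONDS = 5.0  # the extended unlock window proposed above

class SpotifyGestureMapper:
    def __init__(self, player):
        self.player = player
        self.unlocked_until = 0.0

    def unlock(self):
        # Called when the unlock gesture is recognized.
        self.unlocked_until = time.monotonic() + UNLOCK_WINDOW_SECONDS

    def on_gesture(self, gesture, roll_delta=0.0):
        if time.monotonic() > self.unlocked_until:
            return                               # ignore gestures outside the unlock window
        if gesture == "double_tap":              # most reliable gesture, most important action
            self.player.play_pause()
        elif gesture == "wave_in":               # slow wave in: scroll/pan through tracks
            self.player.scroll_tracks()
        elif gesture == "double_wave_in":        # advance one track
            self.player.next_track()
        elif gesture == "double_wave_out":       # load the previous track
            self.player.previous_track()
        elif gesture == "fingers_spread":        # open hand plus rotation controls volume
            self.player.adjust_volume(roll_delta)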

Conclusions

I think that ThalmicLabs is onto a good thing here, and I’m looking forward to seeing how the Myo evolves over time. I assume that the coarseness of some of the gesture recognition will be resolved as more people use the device and ThalmicLabs acquires a bigger dataset to draw on. With this data should come greater control and less reliance on the kind of broad gestures that characterize this current moment in interaction design. Hopefully we will eventually have controls that can be seamlessly incorporated into everyday life and take me ever further away from touching a screen.

Have any experience with the Myo that you would like to share? Catch any hands with too many fingers on them? Excited about some new wearable technology that we should take a look at? We’d love to read about it in the comments. We’re planning to follow this article up with a developer’s take on the Myo and thoughts on building a gesture-based interaction language.

Matt Sundstrom is an experience designer who has been working in the interactive field since 2001. A unique blend of illustration and experience design allows him to take the 30,000 ft view of complex problems and communicate solutions through sketching and collaboration. Find him over at Instrument and on Twitter at Mattink.

Instrument is an independent digital creative agency in Portland, Oregon. We launch brands, products, campaigns and interactive experiences for every screen.

Images © Matt Sundstrom 2015
