Smart Bionics: Our Plan To Revolutionize the Field of Bionics with Mixed Reality & Machine Learning
When I learned that my child was going to be born with only one hand, it didn’t take me long to recognize that I had been offered a tremendous opportunity. I looked at the current state of prosthetic and bionic hands, and I saw a glaring unmet need: No one had managed to create an open source bionic hand with the functionality of a human hand. I looked at my own life — a designer working in additive manufacturing at Autodesk, one of the top tech companies in the world, in San Francisco, one of the tech capitals of the world. After a few conversations with people who were experts in the multiple fields required to build such a hand — robotics, programming, mechanical engineering, industrial design, and more — I realized that I could build a team that could rise to this challenge. This realization solidified into the nonprofit called The Luke Hand.
There’s never been a better time for a revolution in bionics. As the powers of mixed reality and machine learning grow in the massively interconnected space of global open source development, humanity is rapidly gaining the ability to create novel solutions to wicked problems in ways that have only been imagined in science fiction. Our team of designers, developers, and engineers is focused on using these emerging technological superpowers to solve one wicked problem in particular: Helping my son — and ultimately, all people with hand amputations — transition from disabled to superabled. We’re working hard to help make science fiction science fact in the field of bionics.
How? Here’s a high-level overview of our plan…
We’re starting by creating a live, immersive, Mixed Reality simulation of The Luke Hand that’s controlled by myoelectric sensors worn on the affected forearm. We’re using the HoloLens to project the simulation of The Luke Hand onto the amputee’s residual limb, making it look and behave like an actual bionic hand. EMG data streaming from the myoelectric sensors feeds into a machine learning system, where a support vector machine classifier decodes the person’s intended movement and triggers commands for the bionic hand.
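To make the decoding step concrete, here is a minimal sketch of an SVM-based intended-movement classifier. It assumes windows of multi-channel EMG have already been reduced to fixed-length feature vectors; the gesture labels, channel count, and synthetic data below are illustrative assumptions, not the project’s actual pipeline.

```python
# Sketch of an SVM decoder for EMG-driven gesture commands.
# All shapes, labels, and data here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Fake training data: 200 feature windows x 8 EMG channels,
# each window labeled with one of 3 intended gestures.
X = rng.normal(size=(200, 8))
y = rng.integers(0, 3, size=200)  # 0 = rest, 1 = open, 2 = close

# Standardize features, then fit the SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# At runtime, each new feature window is decoded into a gesture
# command that can drive the (simulated or mechanical) hand.
window = rng.normal(size=(1, 8))
command = clf.predict(window)[0]
print(command in (0, 1, 2))  # the decoded label is always a known gesture
```

In a real pipeline the raw EMG stream would first be segmented into short windows and summarized by features such as mean absolute value per channel before reaching the classifier.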
This immersive, closed-loop bionic simulation, projected onto the arm in Mixed Reality, and optimized with machine learning, creates an ideal environment to learn how to control a bionic hand — even if you’ve never had a hand, like my son. This system enables people to explore the range of possible motions of the bionic hand, including how best to grip and manipulate different types of (virtual) objects.
When this bionic training system has been tuned to an individual’s unique muscular and behavioral patterns, and when it can be used to control an actual, mechanical bionic hand, it will disrupt the field of bionics, in which current devices fall far short of the functionality the term “bionic” implies. That’s why we’re designing the mechanical version of The Luke Hand so that its control system can easily receive “intelligence uploads” from the personalized training system: once training is complete, the system uploads a personalized control profile, and all the intelligence created through personal interaction with the simulated hand drives the mechanical hand — an intelligent, customized bionic control system.
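One simple way to picture the “intelligence upload” is as a serialized control profile the hand’s controller loads at startup. The sketch below is a hypothetical example of such a profile; every field name, value, and the JSON format itself are assumptions for illustration, not The Luke Hand’s actual interchange format.

```python
# Illustrative personalized control profile, serialized for handoff
# from the training system to the mechanical hand's controller.
# Field names and structure are hypothetical.
import json

profile = {
    "user_id": "anon-001",
    "emg_channels": 8,
    "gesture_labels": ["rest", "open", "close"],
    # Per-channel gains learned during simulation training (made up).
    "channel_gains": [1.0, 0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.95],
    "classifier": {"type": "svm", "kernel": "rbf"},
}

# Serialize, then restore as the hand's firmware might at startup.
blob = json.dumps(profile)
restored = json.loads(blob)
print(restored["gesture_labels"][1])  # prints "open"
```

Keeping the profile in a plain, documented format like this would also fit the project’s open source goals, since anyone could inspect or extend it.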
Every part of this system is being developed and shared as a well-documented open source asset. You’re welcome to download and fork our work on our GitHub repo.
This is the 10,000-foot view. We’re currently in the early phase, and many details remain to be worked out along the way. We’re working on it.
If you’d like to learn more about The Luke Hand, our origin story, and our team, check out this interview with the Autodesk ReMake team. You’re also welcome to read this Open Letter To The Luke Hand’s Board of Advisors. And while you’re at it, join our community on Facebook.
Have suggestions or comments? Want to get involved? Let us know in the comments.