Final Prototype

(Instinct Prosthetics App for Dynamic Productivity)

Explorations in Rapid Prototyping: Exercise 9


Concept

For my final prototyping project, I chose to explore the field of wearable technology once more by building on my True Interactions mobile app prototype from Exercise 8.

The True Interactions prototype allowed users to create their own positions for their prosthetic hand, but they still had to manually select which position they wanted the hand to form. That app was designed as a controller for a prosthetic device that receives position commands from the phone; this time I wanted to experiment with the idea of a prosthetic hand that responds automatically to the user’s environment.

Over the past couple of years, Near Field Communication (NFC) technology has become standard in most smartphones. A familiar application of NFC is the Google Wallet and Apple Pay systems, which let people make purchases at stores by simply touching their phone to the checkout terminal.

Instinct Prosthetics allowing the user to automatically grab a water bottle.

The great thing about NFC is that the patch triggering a response requires no power of its own, and these patches can already have tasks programmed into their memory that activate features on a smartphone (e.g., turn on Bluetooth and automatically pair with the car stereo). In my scenario with prosthetic hands, the phone could program a hand position into an NFC patch, and the prosthetic hand could theoretically have a sensor that reacts automatically to that patch whenever the two are close.
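To make the idea concrete, here is a minimal sketch of how a hand position might be packed into a patch’s memory. The HandPosition struct, its field names, and the byte layout are all assumptions I’m making for illustration; a real product would more likely use a standard NDEF record.

```cpp
// Hypothetical record a patch could store. The layout is an assumption
// for this sketch, not an NFC or NDEF standard.
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct HandPosition {
  uint8_t patchId;   // which laser-cut patch this position belongs to
  uint8_t flex[5];   // per-finger flex, 0 = fully open ... 180 = fully curled
};

// Pack the record into the raw bytes a phone would write to the patch.
size_t encodePosition(const HandPosition &pos, uint8_t *buf, size_t len) {
  if (len < sizeof(HandPosition)) return 0;  // buffer too small
  memcpy(buf, &pos, sizeof(HandPosition));
  return sizeof(HandPosition);
}
```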

This addition to prosthetic devices could be cheaper to develop than prosthetic hands that respond to muscle and nerve stimuli. It wouldn’t be as robust as those devices, but it has the potential to help users with daily tasks like opening doors at home, using a computer keyboard, and grabbing their favorite water bottle.

THE GOAL - Use NFC patches to naturally trigger the customized hand positions for a user’s prosthetic hand.

Design

As a final prototype, my design process was influenced by several different techniques I learned over the previous nine weeks:

  • 3D Printed Models
  • Mobile App prototyping using Axure RP
  • 2D Model prototypes using a laser cutter
  • Behavioral Prototyping

The Raptor Hand by e-Nable

3D Printed Hand

I first had to figure out how to obtain a 3D printed prosthetic hand. After a lot of research, I learned that there are many open-source models online with the files needed to print the parts. It looked simple enough, but after looking into the availability of the 3D printers on campus and the possibility of ruining the initial prints, I chose to contact a research group that had access to its own 3D printed prosthetic hands. The group let me borrow a couple of hands based on the open-source files for the Raptor Hand by e-Nable, which can be found here: http://enablingthefuture.org/upper-limb-prosthetics/the-raptor-hand/

Since the Raptor Hand is controlled manually by the user’s own wrist movements, the user group that would benefit from it is people with partially developed hands past the wrist. The hand is limited to a simple grip, driven by the tension of the wires that run from the wrist down to the finger joints. By loosening the Phillips screws that set the wire tension, I was able to create different hand positions for a behavioral prototype test.

Mobile App Prototype

Building off the True Interactions prototype I made in Exercise 8, I wanted to make this app prototype feel more complete. The top problems to fix from the previous prototype were the clarity of the app’s purpose and the slide-out menu, which did not simplify the app’s workflow.

I addressed the first problem by designing an introduction page with the Instinct Prosthetics title and logo, followed by a button that instructed the user how to pair their prosthetic hand with the application. I then removed the slide-out menu and instead spent more time on the conditions and variables behind a “Register New Position” button that drops down each time a new position is added.
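Under the hood, the behavior those conditions and variables emulate is just a growing list of registered positions. A minimal sketch of that state, with entirely hypothetical names, might look like this:

```cpp
// Minimal model of the state behind the "Register New Position" flow.
// All names here are hypothetical; Axure emulates this with conditions
// and variables rather than real code.
#include <iostream>
#include <string>
#include <vector>

struct Position {
  int patchNumber;   // which NFC patch will activate this position
  std::string name;  // e.g. "Water bottle grip"
};

std::vector<Position> registered;  // grows each time a position is added

// Each registration appends an entry; re-rendering this list is what makes
// the "Register New Position" button drop down in the prototype.
void registerPosition(int patchNumber, const std::string &name) {
  registered.push_back({patchNumber, name});
}

int main() {
  registerPosition(1, "Water bottle grip");
  registerPosition(2, "Keyboard typing");
  for (const auto &p : registered)
    std::cout << "Patch " << p.patchNumber << " -> " << p.name << "\n";
}
```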

The biggest difference between the Instinct Prosthetics application and my original True Interactions app is the implementation of NFC technology. When registering a new position for the prosthetic hand, the position needs to be tied to a specific NFC patch, because each position is unique and thus needs a unique patch to activate it. I implemented a loading screen that displays the instructions for registering the hand position to the NFC patch.
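The Axure prototype only simulates this registration step. As a rough sketch of what the real write could look like, here is an Arduino program standing in for the phone, assuming Adafruit’s PN532 breakout as the writer and NTAG2xx tags as the patches; the record layout carries over from the earlier sketch and is my own assumption.

```cpp
// Sketch: write a hand-position record to an NFC patch (NTAG2xx assumed).
// Wiring, pins, and the record layout are assumptions for illustration.
#include <Wire.h>
#include <Adafruit_PN532.h>

#define PN532_IRQ   2
#define PN532_RESET 3
Adafruit_PN532 nfc(PN532_IRQ, PN532_RESET);

// Patch ID followed by five finger angles (a water-bottle grip, say).
uint8_t record[8] = {1, 40, 170, 170, 170, 170, 0, 0};

void setup() {
  nfc.begin();
  nfc.SAMConfig();  // put the PN532 into tag-reading/writing mode
}

void loop() {
  uint8_t uid[7], uidLen;
  // Block until a patch is held against the reader, then store the record.
  if (nfc.readPassiveTargetID(PN532_MIFARE_ISO14443A, uid, &uidLen)) {
    // Pages 4 and 5 are the first user-writable pages on NTAG2xx tags.
    nfc.ntag2xx_WritePage(4, record);
    nfc.ntag2xx_WritePage(5, record + 4);
    delay(2000);  // crude debounce so a single tap only writes once
  }
}
```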

Try it out! http://4mavkw.axshare.com/

Laser Cut NFC Patches

It is possible to buy NFC patches, but since they usually cost about a dollar each, I chose to draw up some quick sketches in Rhino and laser cut them in the design lab. Each patch has a number on it, and upon creating a new position for the prosthetic hand, the user indicates that patch number in the provided text field. This helps the user know which patch is associated with which position.
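On the hand side, the matching read could look something like the following: the sensor polls for a patch, reads the stored record back, and drives one hobby servo per finger. Again, the pins, pages, and byte layout are assumptions carried over from the sketches above.

```cpp
// Sketch: read a stored position off a patch and move the fingers to it.
#include <Wire.h>
#include <Servo.h>
#include <Adafruit_PN532.h>

#define PN532_IRQ   2
#define PN532_RESET 3
Adafruit_PN532 nfc(PN532_IRQ, PN532_RESET);
Servo fingers[5];  // one servo per finger; pin choices are placeholders

void setup() {
  nfc.begin();
  nfc.SAMConfig();
  const uint8_t servoPins[5] = {5, 6, 9, 10, 11};
  for (int i = 0; i < 5; i++) fingers[i].attach(servoPins[i]);
}

void loop() {
  uint8_t uid[7], uidLen, raw[8];
  // Poll briefly for a patch; on a hit, read the two pages written earlier.
  if (nfc.readPassiveTargetID(PN532_MIFARE_ISO14443A, uid, &uidLen, 100)) {
    if (nfc.ntag2xx_ReadPage(4, raw) && nfc.ntag2xx_ReadPage(5, raw + 4)) {
      // raw[0] is the patch ID; raw[1..5] are the per-finger angles.
      for (int i = 0; i < 5; i++) fingers[i].write(raw[1 + i]);
    }
  }
}
```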


Instinct Prosthetics in action!

App prototype: http://4mavkw.axshare.com/

Testing & Reflection

For my user test scenario, I asked two users to go through the steps of pairing the application with the prosthetic hand, registering custom hand positions to the NFC patches, and activating the hand positions by waving the prosthetic hand over the patch they had programmed. When a participant reached the last task, I used a screwdriver to adjust the tension and position of the fingers on the hand, which I noted as a disruption to the flow of the test.

From these tests, I wanted to see whether the users found all of the steps for registering positions intuitive, and whether they found the positions practical for the tasks they needed to do. Unfortunately, neither of my participants had a hand disability. I had to note this distinction because my results would not necessarily reflect users who actually use or need a prosthetic hand.

However, I still received some important feedback from the tests.

  1. Pairing the hand to the application - One of my participants noted that there should be a different way to pair the hand with the application (the current method requires the user to close and open the hand). This makes sense because the prosthetic hand is not necessarily supposed to be controlled manually.
  2. Locking and unlocking the position - Both participants commented on the process of unlocking the hand from its current position after activating it on an NFC patch. The prosthetic hand would need to know when it is appropriate to stay locked in a position and when to unlock; for example, it should stay locked while holding a water bottle but also know when to release it.
  3. Further customization of hand positions - One participant wished there were more customization options when creating a position to register. This was understandable because the app only showed a small sample of possible hand positions. The participant also noted that it would help to specify how tight a grip the hand should make.

Future Iterations

Although I did not get to test my product with a potential user, the feedback I received did show me ways to improve the product in higher-fidelity prototypes. The improvements I note below are still focused on iterations that do not include a fully functional 3D printed prosthetic hand equipped with servos, an Arduino, an NFC sensor, and any other components I would need to make a complete device.

  1. Rather than having the hand pair to the application through movement of the hand, there should simply be a Bluetooth button on the prosthetic hand that pairs it to the application.
  2. The hand should have a way to recognize when certain objects require it to hold a position until the appropriate time to unlock the grip. As an initial design, there could be an unlock button pressed by the user’s other hand to release a grip that was activated by an NFC patch (a rough sketch of this logic follows this list).
  3. The position customization options in the current prototype get the basic point across, but there should be more specific control over the position of individual fingers on the prosthetic hand. Along with this, it would be interesting to let the user set how strong they want the grip to be for each position.
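To give items 2 and 3 a starting point, here is a minimal sketch of the lock/unlock state logic, assuming a physical unlock button on the hand and a per-position grip-strength value; every name here is hypothetical rather than part of the current prototype.

```cpp
// Sketch of the proposed lock/unlock behavior: an NFC read locks the hand
// into a position until a physical unlock button releases it. A strength
// value scales the stored angles. All names are hypothetical.
#include <stdint.h>

enum class HandState { Idle, Locked };

struct GripController {
  HandState state = HandState::Idle;
  uint8_t target[5] = {0};  // servo angles the fingers should hold

  // Called when the NFC sensor reads a position off a patch.
  void onPatchRead(const uint8_t flex[5], float strength /* 0.0 - 1.0 */) {
    if (state == HandState::Locked) return;  // ignore patches while holding
    for (int i = 0; i < 5; i++)
      target[i] = static_cast<uint8_t>(flex[i] * strength);
    state = HandState::Locked;               // hold until the user unlocks
  }

  // Called when the unlock button on the hand is pressed.
  void onUnlockPressed() {
    for (int i = 0; i < 5; i++) target[i] = 0;  // return fingers to open
    state = HandState::Idle;
  }
};
```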