Gloves for Virtual Reality — Dev Diary #3
TL;DR: We have split the project into two stages. In stage 1 we build a single joint; in stage 2 we extend it to a full finger. This post covers our progress on stage 1: we have designed and 3D printed our single joint, and we are developing the game environment to test it.
We have split the project into two stages (Figure 1). In stage 1 we will build a single joint, and in stage 2 we will build three joints interlinked with each other, forming the glove for one finger. In this and the upcoming posts, we will cover the development progress of the single joint.

Motor selection
To replicate real-life interaction with our fingers, we need strong motors. But so far we have not investigated the exact motor requirements for our application. Instead, we plan to build our first joint with an off-the-shelf motor and use it as a baseline to improve upon. So we bought a brushed DC gearmotor with an extended motor shaft and a 297.92:1 gear ratio (the extended rear shaft makes approximately 298 revolutions for each revolution of the gearbox output shaft) from Pololu (Figure 2).
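To make the gear ratio concrete: with an incremental encoder mounted on the extended rear shaft, the joint angle can be recovered by dividing the accumulated encoder counts by both the encoder resolution and the gear ratio. Here is a minimal sketch in Java; the counts-per-revolution value is a hypothetical example, not a spec of the encoder we will actually use:

```java
public class JointAngle {
    // Pololu gearbox ratio: the rear shaft turns ~297.92 times per output-shaft turn.
    static final double GEAR_RATIO = 297.92;
    // Hypothetical encoder resolution: counts per revolution of the rear shaft.
    static final double COUNTS_PER_REV = 12.0;

    /** Convert accumulated encoder counts to the joint angle in degrees. */
    static double countsToDegrees(long counts) {
        double rearShaftRevs = counts / COUNTS_PER_REV;
        double outputShaftRevs = rearShaftRevs / GEAR_RATIO;
        return outputShaftRevs * 360.0;
    }

    public static void main(String[] args) {
        // Counts corresponding to one full revolution of the output shaft:
        long counts = Math.round(COUNTS_PER_REV * GEAR_RATIO);
        System.out.println(countsToDegrees(counts)); // approximately 360 degrees
    }
}
```

A nice side effect of the high gear ratio is angular resolution: every encoder count on the rear shaft corresponds to only a tiny fraction of a degree at the joint.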

3D printing the joint
First, we wanted to make a simple joint to test our approach, so we came up with the following design to 3D print (Figure 3).

But we soon found that this model could be vastly simplified. The simplified model is shown in Figure 4.

We used the Makerbot Replicator 3D printer at our school's Imaging Center Services to print the model (Figure 5).


In-Game environment
At the same time, we were building the game environment to test the single joint, using the jMonkey game engine.

In Figure 5, the red cylinder represents the motor, and the blue extension represents the current angle of the joint.
As mentioned previously, the current angle of the joint will be measured using readings from the incremental encoder. If, at a particular joint angle, the finger is colliding with a virtual object (notice that the angle indicator turns red), the game will send a signal to the hardware control system to trigger the motor. The motor will spin up or down, preventing the physical finger from passing through the game object, which gives the player tactile immersion in virtual reality: the player will feel as if they are holding an object of that shape in real life.
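The trigger decision described above can be sketched as a simple comparison between the measured joint angle and the angle at which the virtual finger first contacts the object. This is an illustrative simplification, not our actual control code; in practice the contact angle would come from the game engine's collision detection rather than being passed in directly:

```java
public class HapticTrigger {
    /**
     * Decide whether the motor should resist further flexion.
     *
     * @param jointAngleDeg   current joint angle, derived from the incremental encoder
     * @param contactAngleDeg hypothetical angle at which the virtual finger
     *                        first touches the game object
     */
    static boolean shouldEngageMotor(double jointAngleDeg, double contactAngleDeg) {
        // Engage the motor as soon as the finger reaches the virtual surface,
        // so the physical finger cannot pass through the game object.
        return jointAngleDeg >= contactAngleDeg;
    }

    public static void main(String[] args) {
        System.out.println(shouldEngageMotor(30.0, 45.0)); // free motion: false
        System.out.println(shouldEngageMotor(45.0, 45.0)); // contact: true
    }
}
```

In a real control loop this check would run every frame: the game reports contact, the hardware side compares it against the live encoder reading, and the motor is driven to hold the joint at the contact angle.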
Now that we have the 3D model of the joint and the game environment to test it in, our next step is to connect the two.
Any constructive feedback would be most appreciated.
This research is funded by the Bennett Undergraduate Electrical Engineering Summer 2017 Research Fellowship at South Dakota State University and supervised by Dr. Zhen Ni, Assistant Professor in the Department of Electrical Engineering and Computer Science at South Dakota State University.
