An unspoken language

Technology is amazing: we have devices in our pockets that can translate between dozens of languages, and they’ve begun to bridge the gap between people who speak different languages. There is one language, however, that hasn’t seen as much technological progress as the others: sign language.

According to the World Federation of the Deaf, there are around 70 million people worldwide whose first language is a sign language. Gallaudet University estimates there are anywhere from 500,000 to two million ASL users in the US alone. That’s a lot of people.

Seeing the lack of support for translation between oral and signed languages, Clive Chan, Colin Daly, Wilson Wu, and I decided to do something about it. We spent a weekend at McHacks, a 24-hour hackathon held at McGill University in Montreal, Québec, building a project we felt would aid in sign language communication.

Logistics

Having brainstormed in advance, we began gathering supplies for our project. Wilson and Colin bought wood, servos, and string. Colin also 3D-printed hand “bones” so that we could get the flexibility of a real hand. Unfortunately for us, no one told Colin that people usually have one left hand and one right hand, so he accidentally printed two left hands. Luckily, he caught his mistake a couple of hours before we had to leave and quickly printed a fragile right hand (nothing hot glue couldn’t fix).

When we arrived on the day of the event, we lugged all of our supplies into the building and were told to head to the opening ceremonies. We figured this would be a quick walk to a neighbouring building, but we were wrong. The ceremonies were held in a building almost all the way up a very steep hill; this would become an obstacle later, but we didn’t know it yet.

👌… ✌️… 👆… Hack!

When hacking began, we wasted no time. Clive and I created a couple of GitHub repos and began coding. Clive got to work playing with the Leap Motion API: detecting movements and writing them into a format that could be translated back into hand movements through servos. At the same time, I set up Node (incorrectly) and began working on the code to control the servos.
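The capture side looked something like the minimal sketch below. This assumes the leapjs npm package; our actual code serialised the readings into servo-ready data rather than printing them:

```js
// Minimal Leap Motion capture sketch using the leapjs package.
// Logs fingertip positions every frame; the real code turned
// these readings into servo angles instead of printing them.
const Leap = require("leapjs");

Leap.loop(function (frame) {
  frame.hands.forEach(function (hand) {
    hand.fingers.forEach(function (finger) {
      // finger.type: 0 = thumb through 4 = pinky
      // finger.tipPosition: [x, y, z] in millimetres from the sensor
      console.log(finger.type, finger.tipPosition);
    });
  });
});
```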

Meanwhile, Wilson and Colin got to work on the hardware side of things. They began by assembling the box that would encase our hands. It was at this point that Colin realised he hadn’t brought a hammer with him. After much panic, Colin and Wilson settled on using a cast-iron clamp as a makeshift hammer for the weekend.

As they hammered nails into the wood, Clive and I hammered our fingers into the keyboard. Every once in a while, Clive would wave his hands over the Leap Motion like a wizard or the servos plugged into my breadboard would rotate. Things were going according to plan.

As is often the case at hackathons, the smooth sailing didn’t last for long. Something was missing. Wilson and Colin had finished the box and begun threading tendons (nylon string) through the 3D-printed bones of the hands. As they put together the right hand, they noticed a distal phalanx was missing: the bone at the tip of the index finger. In true hacking fashion, Colin pulled out a piece of wooden dowel and an X-Acto knife and began to whittle a new one.

You can hardly tell one of the fingers is made of wood!

Eventually, both hands were built and some code was written. It was time to put the hardware and software together and see if we could make the hands act like hands. Of course, keeping with the theme of the weekend, something was missing: Colin had left the power supply cord at home, which meant we couldn’t test anything until we found a desktop power cable.

Colin and Wilson went around to various buildings on campus in search of the cable that would let us test the hands. They ventured near and far, but had no luck. We called all of the local computer repair shops, but they were either closed or didn’t have what we were looking for. Eventually, Colin and Wilson went down to the first floor and spotted a power cable tucked behind the front desk of the very building we were in. An organiser gave us permission to borrow it, and we could finally test our hands!

Our moving hands!

Next, we had to get voice transcription and x/y/z movement of the hands working.

Since Nuance was sponsoring the hackathon, we decided to use their API for voice transcription. Clive went over to the Nuance table and spent most of the night there, figuring out how to recognise spoken English.

Meanwhile, Colin and Wilson began mounting the arms in the wooden box they had built earlier. I took a quick nap, and when I woke up, the arms were in place and the two of them had started wiring up the servos. I joined in on the wiring action. Colin and Wilson then went to sleep while I finished hooking all of the wires up to the breadboard.

When Wilson and Colin woke up, we got to work calibrating the system. Unfortunately, we were using an Arduino Mega which, despite having 14 PWM ports, can only drive 12 servos at a time, meaning we would need two boards to control all 14 of our servos.

We rewired everything to use two Arduino Unos (which, we discovered only after the rewiring, could handle just 6 servos each, one short of the 7 per board that we needed). Since we now had to control two boards, my code had to change. Unfortunately, the Johnny-Five library isn’t extensively documented for multiple boards (or perhaps we were having trouble comprehending things on little sleep). The library silently accepted an undefined board reference instead of throwing an error, which left us able to control only one board; after a stressful hour or so, we resolved it and could control the servos again.
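For the curious, the fix boiled down to giving every servo an explicit board reference. Here’s a rough sketch of the shape of the final setup (the ids, ports, and pins are placeholders, not our actual wiring):

```js
// Two-board setup with Johnny-Five; ids, ports, and pins are placeholders.
const five = require("johnny-five");

const boards = new five.Boards([
  { id: "A", port: "/dev/ttyACM0" },
  { id: "B", port: "/dev/ttyACM1" },
]);

boards.on("ready", function () {
  // Gotcha: looking up an unknown id returns nothing, and Servo
  // quietly falls back to a default board instead of throwing an
  // error; that's how we spent an hour driving only one board.
  const thumb = new five.Servo({ pin: 9, board: boards.byId("A") });
  const index = new five.Servo({ pin: 9, board: boards.byId("B") });

  thumb.to(90); // degrees
  index.to(45);
});
```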

At this point, time was running low; the hackathon was, after all, only 24 hours long. All that remained was getting the voice-to-sign-language translation working, and the project would be complete.

Clive returned from the Nuance table and began testing his voice recognition software and his sign language mappings on the hands. Colin and Wilson also began coding up words in sign language for our robot to sign. With just minutes before the end of the hackathon, our project signed “hello”. We had done it!
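The word mappings themselves were conceptually simple: a word becomes a timed sequence of servo poses. Something in the spirit of this sketch, where every name and angle is invented for illustration:

```js
// Hypothetical word-to-sign mapping: each word is a sequence of
// poses (target servo angles in degrees) held for some duration.
const signs = {
  hello: [
    { targets: { wrist: 90, index: 0, middle: 0 }, holdMs: 400 },
    { targets: { wrist: 60, index: 0, middle: 0 }, holdMs: 400 },
  ],
};

// servos: a map from names to johnny-five Servo instances
function play(word, servos) {
  let delay = 0;
  for (const step of signs[word]) {
    for (const [name, angle] of Object.entries(step.targets)) {
      setTimeout(() => servos[name].to(angle), delay);
    }
    delay += step.holdMs;
  }
}
```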

Voice to sign language!
Sign language mirroring!

During demos, we got to talk to a lot of people about our project, and it was cool to hear the other hackers’ thoughts. One hacker (I didn’t catch his name) mentioned that where he was from, the deaf community faced a serious shortage of funding for teachers and translators, and that something like our project could help alleviate that. Others thought it was cool, and some just asked us to make it give the middle finger.

When judging was over, we were told that we had made the top three and were being asked to present at the closing ceremonies. We were pumped, but then realised it meant we had to lug our machine up the giant hill.

Being careful not to dislodge the wires, Wilson and I grabbed the box while Clive took the power supply and Colin carried his laptop. We slowly hiked up the hill and, after what felt like hours, arrived at the ceremonies. We presented our project and made some hand puns. We even came away with first place!

So, what did we learn?

McHacks taught us a lot of things, the most important being that where there’s a will, there’s a way. So many things went wrong over the weekend, and it would have been easy to give up, but we pushed through it all and came away with a really cool project!

Some other things:

  • We learned some basic words in sign language
  • Node.js is good for absolutely everything
  • A distal phalanx is a bone in your hand
  • I’m pretty good at trigonometry

We definitely want to keep improving this project in the future! If you like it, feel free to check it out on GitHub and Devpost, or share it with your friends!