Lets Converse - The Sign Language Translator App

Srinivasa Raghavan
Published in Newolf Society
Sep 2, 2020 · 2 min read

Since the dawn of time, humans have communicated with each other in one form or another. Communication helps people understand one another by expressing their ideas and views.

Starting from gestures and signs and moving on to speech and alphabets, communication has come a long way, and it continues to change every single day with new technology. Humans have gone from sending messages by carrier pigeon to sending emails today!

“Good communication is just as stimulating as black coffee, and just as hard to sleep after.”

Photo by Antenna on Unsplash

Speech is a powerful tool, but not everyone is fortunate enough to have it. Can we imagine how hard it would be for people to communicate and express their thoughts if they cannot speak? Even if they have learnt sign language and do their best to communicate with it, would the average person understand them? Unfortunately, because of this barrier, deaf and mute members of society are not able to communicate freely!

The ASL alphabet (source: Google)

So now the question arises: what are we doing to solve this?

Presenting Lets Converse, a sign language translator app! Built using image processing and deep learning, it helps users understand the meaning of signs (currently based on American Sign Language).

Sign language is the noblest gift God has given to deaf people.

Whatever the sign may be, take a picture in the app and you instantly get the letter it stands for! It is as easy as that: just one click of a finger.

The recognition model is a convolutional neural network (ConvNet) developed with PyTorch, and the app itself is built with Flutter, so it runs across platforms.
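To make the idea concrete, here is a minimal sketch of what such a pipeline could look like in PyTorch. This is not the model from the repository linked below: the input size (64x64 RGB), layer widths, checkpoint name and image path are all assumptions for illustration only.

```python
# Minimal sketch (assumed architecture, not the repo's actual model):
# a small ConvNet that classifies a hand-sign photo into one of the
# 26 ASL alphabet letters.
import string

import torch
import torch.nn as nn
from PIL import Image
from torchvision import transforms


class SignNet(nn.Module):
    def __init__(self, num_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))


# Preprocess the photo the same way the training images were prepared (assumed).
preprocess = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])


def predict_letter(model: SignNet, image_path: str) -> str:
    """Return the predicted ASL letter ('A'-'Z') for one photo."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 64, 64)
    with torch.no_grad():
        logits = model(batch)
    return string.ascii_uppercase[logits.argmax(dim=1).item()]


if __name__ == "__main__":
    model = SignNet()
    # model.load_state_dict(torch.load("sign_model.pt"))  # hypothetical checkpoint
    model.eval()
    print(predict_letter(model, "hand_sign.jpg"))  # hypothetical input photo
```

In the app, the Flutter front end captures the photo and hands it to a model like this; the real architecture and weights live in the repository below.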

The model used is available at the link below!

GitHub: https://github.com/JayaramKarthik/SignLanguage/tree/master

Note: The model is still in the testing stage and the app is still in development. Any views or opinions can be mailed to sraghavsrg@gmail.com.
