Humans For AI
4 min read · Jun 12, 2023
Person Reaching out to a Robot · Free Stock Photo — PEXELS, www.pexels.com/photo/person-reaching-out-to-a-robot-8386434/. Accessed 19 May 2023.

Using AI to Translate Signed Languages

by Xavier Contreras

Translating signed languages into spoken words is very difficult. Even though American Sign Language (ASL) and British Sign Language (BSL), for example, are both used in English-speaking countries, they are not mutually intelligible. Mutual intelligibility is the relationship between two distinct languages that are similar enough for their speakers to understand each other. With over 300 signed languages (1) corresponding to different regions of the world, the task of translating them becomes even more daunting. Not only are none of the signed languages mutually intelligible with one another, but there are also different styles within the same signed language. These differences can be variations in speed as well as in the vocabulary used. (2)

There have been attempts to translate ASL using gloves with sensors to track finger and hand movements since as early as the 1980s, but none of these iterations has been very successful: they are usually limited in their vocabulary, or they are inconsistent. Many of the gloves could only translate the letters of the ASL alphabet, so any sign that is not a fingerspelled letter was left out. People who use sign language know that signing each letter one by one, known as fingerspelling, is mostly reserved for names or for when the correct sign isn’t known. Another problem with the development of these gloves is that they tended to omit input from deaf people and other sign language users. SignAloud, a 2016 project created by two undergraduate students at the University of Washington, won them the Lemelson-MIT Student Prize for being able to recognize some basic ASL signs. But some people in the Deaf community and some linguists did not feel it was an accurate portrayal of the language. Lance Forshay, director of the ASL program at UW, said that he was “surprised and felt somehow betrayed because they obviously didn’t check with the Deaf community or even check with ASL program teachers to make sure that they are representing our language appropriately.” (3)

Sign language uses more than just the alphabet or the correct sign to communicate. It also includes the speed and intensity with which people sign with their hands, arms, and the rest of their body. Facial expressions are also very important when signing, not only to express emotion but to change the meaning of words. Just as a change of tone can signify sarcasm in spoken language, facial expressions can communicate something different from what the hands are explicitly saying. This is something the gloves are unable to capture. There is an inherent variability in the way people sign that cannot be properly captured by tracking hand movements alone. Does this mean that translating sign language is impossible? Of course not; it just means that there is a lot more data to take into account. Luckily for researchers, modern AI systems can take in massive amounts of data and, using machine learning, produce outputs that humans can verify to ensure the models are learning correctly.
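To picture why hand tracking alone falls short: a recognizer that looks only at hand coordinates throws away the facial and body signals described above. The sketch below shows one simple way such streams could be combined into a single input for a model. Every name, dimension, and number here is purely illustrative, not taken from any real sign-recognition system:

```python
def flatten(nested):
    """Flatten an arbitrarily nested list of numbers into one flat list."""
    out = []
    for item in nested:
        if isinstance(item, list):
            out.extend(flatten(item))
        else:
            out.append(item)
    return out

def fuse_features(hands, face, pose):
    """Concatenate all tracked streams into one feature vector.
    A gloves-style, hands-only model would see only the first part."""
    return flatten(hands) + flatten(face) + flatten(pose)

# One fake video frame (all zeros), with illustrative landmark counts:
hands = [[[0.0] * 3 for _ in range(21)] for _ in range(2)]  # 2 hands x 21 3-D landmarks
face = [[0.0, 0.0] for _ in range(68)]                      # 68 2-D facial landmarks
pose = [[0.0, 0.0] for _ in range(17)]                      # 17 2-D body keypoints

vec = fuse_features(hands, face, pose)
print(len(vec))  # 126 + 136 + 34 = 296 features per frame
```

The point of the sketch is only proportional: in this made-up layout, well over half of the per-frame signal comes from the face and body, which hand-tracking gloves never see.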

There are already groups around the world using AI to translate sign language into spoken or written language. SignAll, a company from Hungary, is working on several applications to translate ASL. They have programs for professional and classroom use, as well as an ASL learning app that can be accessed on all devices. Using machine vision to track head, hand, and body movements, the app narrows the translation down to three possible choices, allowing the signer to select the message that best fits their signs. This process would not be possible without AI, since the amount of information being compiled and processed is too much for traditional algorithms. SignAll consults the Deaf community and has linguistic experts on its team to ensure that its technology correctly represents the community it wants to help. SignAll works in partnership with Gallaudet University, the world’s leading university for the Deaf and hard of hearing, which has compiled the largest ASL vocabulary database in the world.
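The “narrow it down to three choices” step can be pictured as simple top-k selection over a recognizer’s confidence scores. This is a minimal, hypothetical sketch, not SignAll’s actual method; the sign names and scores are invented for illustration:

```python
def top_candidates(scores, k=3):
    """Return the k highest-scoring signs, best match first,
    so the signer can pick the one they actually meant."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Made-up confidence scores a vision model might assign to one gesture:
scores = {"HELLO": 0.61, "GOODBYE": 0.05, "THANK-YOU": 0.22, "PLEASE": 0.12}
print(top_candidates(scores))  # ['HELLO', 'THANK-YOU', 'PLEASE']
```

Offering a short ranked list instead of a single guess is a common design choice when a model is uncertain: it keeps the human signer in the loop as the final check on the translation.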

SignAll’s mission is to be a bridge between the deaf and hearing communities. They do this through their products SignAll Chat, SignAll Lab, and SignAll Online. By tearing down language barriers, their technology will help more people interact with each other, bringing humanity closer together little by little. SignAll’s CEO Zsolt Robotka said, “SignAll not only benefits the Deaf, but the hearing individuals they interact with on a daily basis. Improved communication benefits everyone involved — friends, family, work colleagues, businesses, and more.” (4)

References:

(1) “International Day of Sign Languages.” United Nations, 2017, https://www.un.org/en/observances/sign-languages-day.

(2) Cheng, Katelyn. “Deaf Culture: Sign Language ‘Accents’ or ‘Styles’.” Start ASL, 6 July 2021, https://www.startasl.com/sign-language-accents-or-styles/.

(3) Erard, Michael. “Why Sign-Language Gloves Don’t Help Deaf People.” The Atlantic, 10 Nov. 2017, https://www.theatlantic.com/technology/archive/2017/11/why-sign-language-gloves-dont-help-deaf-people/545441/.

(4) SignAll, https://www.signall.us/.