Google Translate addresses its bias issue

Thomas Moore Devlin
Published in Babbel On · 3 min read · Dec 7, 2018

This past week, Google unveiled a redesign of its very popular Translate service. The most noticeable changes were in appearance, but one of the biggest was in how the app deals with grammatical gender. In a blog post, Google’s product manager James Kuczmarski explained how the company is addressing the bias inherent in its product. It’s one of the first concrete steps anyone has taken toward fixing gender-bias problems in artificial intelligence.

But let’s back up a moment. Why is Google Translate biased at all? Algorithms, after all, seem like they should be neutral. In reality, it comes down to how Google Translate actually translates. It doesn’t take a dictionary and go word by word; instead, it draws on the vast resources of the internet, ingesting huge amounts of existing text and trying to produce the most likely translation based on what it has already seen.

For the most part, this system works pretty well. There was, however, one very conspicuous issue: when more than one translation was viable for a sentence, the system was forced to choose a single one, and the way it chose was deeply flawed.

The most famous example had to do with the Turkish language. Turkish pronouns don’t mark gender; “he,” “she” and “it” are all the same word, o. But when you put a Turkish sentence into Google Translate, the algorithm had to pick the single translation it thought was “best.” This led to translations like these:

o bir doktor → he is a doctor
o bir hemşire → she is a nurse

Because “he is a doctor” appears on the internet far more often than “she is a doctor,” Google Translate would pick the masculine pronoun, and for the same reason it would pick the feminine one for “nurse.” What this and other examples showed was that Google Translate reproduced gendered stereotypes in its translations. The bias itself comes from very old human prejudices embedded in the text the system learns from, but the example showed how readily artificial intelligence perpetuates them.
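To make the mechanics concrete, here’s a deliberately simplified sketch in Python. The sentence counts are invented and the real system is a neural model rather than a frequency table, but the selection pressure is the same: forced to pick one candidate, it picks the one the data favors.

```python
from collections import Counter

# Invented counts standing in for how often each English sentence
# appears in the training data (illustration only, not real figures).
CORPUS = Counter({
    "he is a doctor": 900,
    "she is a doctor": 300,
    "he is a nurse": 100,
    "she is a nurse": 700,
})

# Both English candidates are valid translations of each
# gender-neutral Turkish sentence.
CANDIDATES = {
    "o bir doktor": ["he is a doctor", "she is a doctor"],
    "o bir hemşire": ["he is a nurse", "she is a nurse"],
}

def pick_one(source: str) -> str:
    """Pick whichever candidate the 'corpus' saw most often."""
    return max(CANDIDATES[source], key=lambda s: CORPUS[s])

for sentence in CANDIDATES:
    print(sentence, "->", pick_one(sentence))
# o bir doktor -> he is a doctor
# o bir hemşire -> she is a nurse
```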

Google’s solution, then, was to go in and tweak Google Translate so that it shows both “He is a doctor” and “She is a doctor” for a gender-neutral source sentence. It’s certainly a welcome improvement, and it is now applied when translating from English into Spanish, French, Italian and Portuguese, or when translating from Turkish into English.
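Continuing the same toy setup from above, the fix amounts to returning both gendered candidates instead of silently choosing one. Again, this is an illustrative sketch over invented data, not Google’s actual implementation:

```python
# Hypothetical gendered candidates, keyed by Turkish source sentence.
TRANSLATIONS = {
    "o bir doktor": {"masculine": "He is a doctor", "feminine": "She is a doctor"},
    "o bir hemşire": {"masculine": "He is a nurse", "feminine": "She is a nurse"},
}

def translate_both(source: str) -> dict[str, str]:
    """Return both gendered translations for an ambiguous source."""
    return TRANSLATIONS[source]

for label, text in translate_both("o bir doktor").items():
    print(f"{text} ({label})")
# He is a doctor (masculine)
# She is a doctor (feminine)
```

The hard part in a real system is deciding when a source sentence is genuinely gender-ambiguous in the first place; the toy lookup above sidesteps that question entirely.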

This is all a good thing, but it’s only the beginning of what will be needed to really fix bias in artificial intelligence. The Turkish example happens to have caught on as a representative case of bias, but it’s far from the only one, and some biases will be far less visible than others. Amazon, for example, had to scrap a job-recruiting tool after discovering it was biased against women. Clearly, it’s going to take a lot of work to get human biases out of the products we create. In the long run, however, it will make the world a better and more equitable place.
