Whose Story Is This?

Monika Mani Swiatek
Dec 1, 2020 · 4 min read

AI-powered translators and autocorrect are praised as powerful tools, but they still fail when translating into gendered languages, giving voice only to a “man”.

This is how I feel when my text is autocorrected to fit a male narrative

A few days ago, I tried Microsoft Word’s translation feature. I had a short story written in English, but I realised it would be more suitable for a Polish audience (given the current political and human rights turmoil). I’m a native Polish speaker and wanted to see how the translator would perform.

The translation wasn’t bad, but while I wrote my story with a female narrator in mind, in the Polish version she magically changed into a male. In Polish, past-tense verbs carry the speaker’s gender: a woman says “napisałam” (“I wrote”) while a man says “napisałem”, and the translator chose the male forms throughout.

Taking a breath…

It’s both interesting and infuriating that when a language has two versions depending on the sex of the narrator (he or she), the default is “he”, seen as the default version of a human being…

A similar issue in the world of design was described by Caroline Criado Perez in her book Invisible Women. While manufacturers turn products pink to make them more “feminine” (still a crappy adjustment), in language processing, systems simply force the male version on you.

Progress and… a few steps back

For a while now, it’s been more common in the press, scientific papers, and everyday conversation to be inclusive, which in certain languages means including she/he combos or using the two versions interchangeably.

Unfortunately, this applies only to content written by people, people who write consciously with their audience in mind. AI systems are programmed in a way that doesn’t care about it, much like, presumably, the people who set them up.

Not only translations

I’ve had my phone for about six months, and when I’m writing in Polish, autocorrect still changes verbs from female to male; it’s infuriating! I tried to teach it that I’m a woman and want to use the female forms. I kept correcting it, but it was stubbornly resistant and wouldn’t even use the words I had saved in my personal vocabulary. And it affects not only singular forms but plural ones too, which causes a headache when I tell my family what my wife and I were up to over the weekend.

When you’re a woman, it’s more difficult to be heard, and if systems treat all users as men, the female perspective gets ignored while a male narrative is imposed by default. You can either give up or keep correcting it, unless you do the work (which developers should have done in the first place) and provide enough data for the system to understand that you’re “the other”, the female version, and that it has to get used to that and stop interfering.

Not so “smart” systems

This is one of those situations where a human translator is better than an AI-powered system, because we, as people, can read the context and draw conclusions, while automated systems have a default setting and switch to the female forms only if “she” is explicitly mentioned in the original text.

And not many seem to care…

So I ask: if AI “can” do so much and is advertised as something clever, why is there no simple option to choose whether I want a piece of text translated from a female perspective? I’m writing messages, I AM a WOMAN, and I want to keep it that way!

They know but don’t care

Many sign-up processes ask about gender. Why can’t this information be used in our favour at least once?

Google! Microsoft! Can you hear me?!

Doesn’t it just sound like negligence? More people should call it out.

Have you tried?

Thanks for reading.

If you’d like to share your experience with translators or autocorrect, please feel free to add a comment.

If you’re a native English speaker and would like to know what exactly I’m writing about, just drop me a line; I’m happy to explain what gendered language is about.

I’ve described my experience; if you know systems that work better, please let me know. I’m happy to find out about good products and thoughtful designers out there.

In my next post, I’ll write about how male and female voices are used in AI-powered systems and how their use promotes harmful stereotypes we once fought against but that are now being encoded and scaled up in modern systems.

For disclosure: I’m not an AI or linguistics professional, but a woman working in IT who tries to understand the world and shares the exploration process.
