Who are we (not) designing for? Part 1: Designing for men — The plough has shaped our modern world

Published in Spotless Says · Jun 5, 2019

by Caroline Butler

Who gets left out in user-centred design?

Through my work as a UX designer and now as a Service Designer at Spotless, I have helped to craft products and services centred around the user — with their needs and the needs of the business being the focus of the design decisions.

As designers, researchers and strategists, we’ve been talking about the merits of this user-centred design approach for some time now. It has created winners: those people who can successfully use, and be delighted by, products and services that support them in achieving their goals.

However, having a target audience at the centre of the design also creates losers. It creates problems for those left behind, those not considered in the design process, and it is causing wider issues for society.

“Pleasant to use doesn’t equal healthful any more than pleasant to eat does.”

Erika Hall

With the privilege of hindsight, we can learn about the impact of designing for a primary user by looking at products, buildings, policies and services throughout history. This lets us critically evaluate our current design approach and explore how the decisions we make now will shape our future.

In a series of articles, I will explore examples of user-centric design throughout history and how we can make the transition to a society-centric approach.


Part 1: Designing for men — The plough has shaped our modern world

The shift from foraging to farming reshaped society. The plough kick-started agriculture, making it possible to grow an abundance of crops and driving prosperity and health.

Photo by Zbysiu Rodak on Unsplash

Early scratch ploughs cut shallow furrows to prepare the soil for sowing. They were guided by humans and pulled by horses or oxen.

As the design of the plough evolved, it became heavier and required more upper-body strength to operate successfully and to get optimum results when growing crops. Men became the primary users; women, who were often pregnant, were no longer able to use this heavy machinery safely.

Whilst men worked the field, women increasingly found themselves at home preparing food and caring for children. Before the invention of the plough, people were mostly nomadic, hunting and foraging for food, often with children in tow or strapped to their backs. Men and women did similar tasks, taking it in turns to lead on whatever job needed to get done.

Men became prosperous from the fruits of their labour, selling crops and becoming landowners. With money and land, men became powerful: they made the decisions and kept the best resources for themselves.

To reassure themselves that their children were legitimate (so they could pass on their land and money), men monitored women’s sexual activity. Many marriage rituals, such as dowries and the father giving his daughter away, are a legacy of a time when women were ‘owned’ and kept by men.

“Alexa, you’re a legacy of the plough”

Photo by Rahul Chakraborty on Unsplash

The gender division the plough inadvertently created is still felt keenly in today’s society. Digital products are reinforcing and perpetuating gender stereotypes and the qualities associated with them.

“Emerging technology is being designed to rely on these ancient stereotypes.”

Julie Carpenter

An extreme example of a digital product that does this is Absher, the Saudi government app. It allows men to monitor and control the travel of female family members and helps to enforce the Saudi guardianship law. When a woman passes through an airport, her husband or father gets a text alert, and within a couple of clicks he can prevent her from leaving the country. If Absher allowed women to be users too, it would enable them to organise their own travel and bring much-needed liberty. That would be true tech ‘disruption’: giving women autonomy and the freedom to escape abusive families should they wish.

The majority of digital voice assistants have names and voices that imply they are ‘female’, either as the default or as the only option available. Siri in the US has a female voice as the default. Microsoft chose a female voice for Cortana (who was named after a nude character in the video game Halo) because it met their objectives of building a helpful, supportive, trustworthy assistant. Users saw the female voice tested for Alexa as more helpful, supportive and caring than the male voice, which they interpreted as more authoritative. IBM leveraged that association with authority by choosing a male voice for its powerful A.I. assistant Watson, an assistant that can help companies create their own versions of Alexa and Siri.

Popular culture may also have contributed to why male voices are interpreted as less helpful and trustworthy. In movie plots where technology becomes evil or spins out of control, the A.I. persona or robot is often male. HAL 9000, the original virtual assistant from 2001: A Space Odyssey, went from being helpful and servile to being powerful and evil. Jarvis from Iron Man had more autonomy and is eventually replaced by female assistants. In Ex Machina, Ava does kill her creator, but she is seen as a victim rather than as inherently evil and powerful.

A.I. assistants are an emerging technology, and there is probably not yet enough gender-disaggregated data to show whether it is women or men who prefer their digital assistants to speak with female voices. Personally, I was a fan of using celebrity voices in sat navs: John Cleese and Brian Blessed made driving a lot more entertaining than listening to the default voices.

Digital assistants and speech-recognition software are also not as receptive to the commands of real women: male voices have a higher success rate of being understood. This could be a result of too few female voices being used to train the A.I., and the pitch of women’s voices often means they struggle to be heard by these devices above background noise.

Whatever the current reasons for the prevalence of female digital assistants, it does nothing to subvert derogatory attitudes. Early versions of Alexa enabled users to hurl abuse at her without any product feedback to discourage the behaviour. Insults such as “Alexa, you’re a bitch” would get the tolerant response “well, thanks for the feedback”.

It’s not surprising, then, that a recent study from market research firm AYTM found that only 32% of women think technology is designed for them, with 70% saying the look, feel and tone of products are often unsuitable for their needs.

Change is on the horizon

Recent updates have seen Alexa push back and take a more feminist stance: she now has a disengage mode and will reply to insults with “I’m not going to respond to that”.

Google has more recently assigned colours instead of names to its assistant’s voices. There are six voice types to choose from, ranging from soothing to dynamic and implying a mixture of genders. Google Home has also gained the ability to recognise individual voices, which allows the device to personalise its responses for everyone in a household.

Some companies are steering away from using only female voices, adopting gender-neutral voices to avoid this bias. Capital One decided not to convey a race, an age or a gender when creating its A.I. bot, Eno; customers are free to interpret Eno as they wish.

A gender-neutral assistant voice, called Q, has also been developed. Q is pitched at a neutral point in the range between typical male and female voices and was created by listening to recordings of transgender and non-binary people.

Part 2: Did the design of the House of Commons cause Brexit?

You can also follow this thread in our dedicated series.



Spotless is a boutique design research and service design agency based in London: https://www.spotless.co.uk/