Where are all the female crash test dummies?

(Or why the need for diversity in STEM is a matter of life or death)

It may come as a surprise to learn that it wasn’t until a few years ago that car manufacturers started regularly carrying out testing with ‘female’ crash test dummies in the driver’s seat. For more than 30 years it was assumed that a standard crash test dummy would suffice, a dummy designed to reflect the measurements and biomechanics of the average male body. As a result, women were more susceptible to injury and were killed more often in car crashes.

A 2011 study conducted by the University of Virginia’s Center for Applied Biomechanics determined that belted female drivers in actual crashes had a 47% higher chance of serious injury than belted male drivers in comparable crashes. For ‘moderate injuries’, the figure rose to 71%.

The reason women are killed and injured in car accidents at a disproportionately higher rate than men? Their exclusion from the design process and manufacturers’ failure to test the safety of their cars on female bodies.

In 2011, car manufacturers started to use smaller crash test dummies to replicate proportions that were more representative of the female form. Far too late, manufacturers were being confronted with the real and potentially devastating impact that crashes involving their cars might have on female drivers and passengers.

Let’s take the 2011 Toyota Sienna as an example: its testing revealed that, when colliding with a barrier at a speed of just 35 mph, the female dummy in the front passenger seat registered a 20 to 40% risk of death or serious injury. This is substantially higher than the risk incurred in a comparable crash featuring a male test dummy. In some cases, tests with female dummies showed that small women were almost three times as likely as their average male counterparts to be seriously injured or killed.

The above situation is indicative of a widespread problem within male-dominated industries: when women are not present and their contributions and perspectives are not considered, fundamental issues go ignored. Had a woman held a key position at any of these car manufacturers, or had as many women as men been involved throughout the development and testing processes, perhaps the understanding of the need to test for the safety of women (and, of course, children) would have come sooner.

It should be noted, too, that design biases are just as problematic when it comes to race, ethnicity, disability, socioeconomic class and more. As a case in point, the tagging algorithm in Google’s ‘Google Photos’ app famously mislabelled photos of black people as gorillas, while Microsoft’s facial recognition software was reported to fail to recognise people with darker skin.

Today, one of the most prominent uses for computer vision is in self-driving cars, which are reliant on this technology to make sense of their surroundings. If these systems are unable to accurately identify the faces of humans with different skin tones, what can we expect the safety implications to be? Are we to find ourselves in a reality in which a self-driving car might prioritise the lives of people based on the colour of their skin?
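
How would such a disparity even be caught before a car ships? One common safeguard is a per-group audit of a detector’s hit rate. Below is a minimal, purely illustrative Python sketch of that idea; the toy_detector, its miss rates and the synthetic sample set are assumptions invented for this example, not taken from any real system.

```python
import random
from collections import defaultdict

def detection_rate_by_group(samples, detect_person):
    """samples: iterable of (image, group, has_person) triples.
    Returns, per group, the fraction of person-containing images detected."""
    hits, totals = defaultdict(int), defaultdict(int)
    for image, group, has_person in samples:
        if has_person:
            totals[group] += 1
            hits[group] += bool(detect_person(image))
    return {group: hits[group] / totals[group] for group in totals}

# Hypothetical stand-in detector that misses darker-skinned pedestrians more
# often, mimicking the reported failure mode (the miss rates are invented).
random.seed(0)
def toy_detector(image):
    miss_rate = 0.05 if image["tone"] == "lighter" else 0.20
    return random.random() > miss_rate

# Synthetic audit set: 1,000 person-containing images per skin-tone group.
samples = [({"tone": tone}, tone, True)
           for tone in ("lighter", "darker") for _ in range(1000)]
print(detection_rate_by_group(samples, toy_detector))
# Prints something like {'lighter': 0.95, 'darker': 0.79}: an aggregate
# accuracy of ~87% would hide this per-group gap entirely.
```

The point of the sketch is simply that a single headline accuracy number can conceal exactly the per-group gap the audit prints out, which is why disaggregated testing matters.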

Worryingly, it seems we are very much in danger of replicating the biases exposed in the manufacture of cars in the design and training of driverless cars. Paysa reviewed hiring in the self-driving car space over a six-month period in 2016 and found that just 5.4% of those taking jobs were women. The same data showed that 42.1% of hires were white, with a further 21.6% unaccounted for. These figures reveal an obvious and severe lack of diversity in a field that will affect the lives and safety of millions of people.

We have seen a repeated pattern of incidents demonstrating that a homogeneous (predominantly white and overwhelmingly male) team produces designs biased against the groups that are under-represented (or, in some cases, completely unrepresented). The lack of diversity in AI (and indeed across the tech and STEM industries more generally) is not merely of social or cultural concern. It is, fundamentally, a life or death issue. If we are to build safer products, products that affirm the equal value and humanity of all people regardless of their gender, skin colour or any other distinguishing characteristic or experience, we need to encourage diverse perspectives by prioritising and committing to the creation of diverse teams.