Live from SXSW: Tackling Fairness: Gender, Diversity & Algorithmic Bias

Data scientist Ines Marusic joined a panel at The Female Quotient's Girls Lounge to discuss how Artificial Intelligence (AI) is changing the game, as well as perceptions of women in STEM. Here Ines expands on her views on this important topic.

Ines Marusic at The Female Quotient Girls Lounge

Over the last decade or two, AI has gone from a niche academic discipline to a leading technology of our time. One of the major gender concerns about AI is that it will simply replicate (if not amplify) the biased and stereotyped modes of thinking that are pervasive in society.

This raises an important ethical question: how do we create AI that represents the kind of thinking we want for the world we live in, rather than merely reflecting the "lowest common denominator" of beliefs and mindsets on gender issues that unfortunately exists today?

Algorithmic fairness

Machine learning systems learn from training data, and very often that data contains biases that exist in our society. This bias can be absorbed or even amplified by the systems, leading to decisions that are unfair with respect to gender or other sensitive attributes (e.g. race).
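To make this concrete, here is a minimal sketch with purely synthetic data (the hiring scenario, thresholds, and all numbers are invented for illustration): a model fit to historically biased decisions simply recovers the bias, even though it is "just learning from the data".

```python
import random

random.seed(0)

def historical_decision(gender, skill):
    # Synthetic biased ground truth: equally skilled women
    # were held to a higher bar in the historical data.
    threshold = 0.5 if gender == "M" else 0.7
    return skill > threshold

# Generate synthetic "training data" produced by the biased process.
data = [(g, random.random()) for g in ["M", "F"] for _ in range(1000)]
labels = {(g, s): historical_decision(g, s) for (g, s) in data}

def learned_cutoff(group):
    # A naive per-group skill cutoff "learned" from the data:
    # the lowest skill score that was ever hired in that group.
    hired = [s for (g, s) in data if g == group and labels[(g, s)]]
    return min(hired)

# The learned cutoffs land close to the biased thresholds
# (about 0.5 for men, about 0.7 for women): the model has
# absorbed the historical bias rather than corrected it.
print(learned_cutoff("M"), learned_cutoff("F"))
```

Nothing in the learning step is malicious; the unfairness enters entirely through the labels the system is trained to imitate, which is why curating and auditing training data matters.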

The topic of fairness was discussed at SXSW, for instance at the excellent panel Hacking Racial Bias in Artificial Intelligence, featuring Timnit Gebru of Microsoft Research, Ayanna Howard of Georgia Tech / Zyrobotics, Jeff Nelson of Cinchapi and Princess Sampson of Kingonomics. You can view our highlights from the session here.

The panel at Hacking Racial Bias in Artificial Intelligence.

The machine learning research community has recently started to address the issue of fairness in a variety of ways. One simple approach is to ignore the sensitive attribute altogether, but this often does not yield a fair procedure, because the sensitive attribute may be correlated with other variables. More recently, Kusner et al. (2017) proposed a causal definition called counterfactual fairness: a decision is fair toward an individual if it coincides with the decision that would have been taken in a counterfactual world in which the sensitive attribute were different (e.g. the decision made if a job applicant were a man instead of a woman, with all other features the same). Last month, Chiappa and Gillam (2018) proposed a refinement called path-specific counterfactual fairness: a decision is fair toward an individual if it coincides with the one that would have been taken in a counterfactual world in which the sensitive attribute were different along only the 'unfair' pathways.
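The failure of the "just drop the sensitive attribute" approach can be sketched in a few lines. Everything below is a toy causal model with invented numbers: gender influences a proxy feature (a career gap, say), and a "gender-blind" decision rule that uses the proxy still produces a large disparity. Flipping the sensitive attribute in the counterfactual, proxy and all, is the spirit of the counterfactual fairness test described above.

```python
import random

random.seed(1)

def generate_applicant(gender):
    # Toy causal model: gender influences the career-gap proxy,
    # while underlying skill is distributed identically.
    gap_years = random.gauss(2.0 if gender == "F" else 0.5, 0.5)
    skill = random.gauss(5.0, 1.0)
    return {"gender": gender, "gap_years": gap_years, "skill": skill}

def decide(applicant):
    # "Gender-blind" rule: never looks at gender directly,
    # but penalises the gap, which is correlated with gender.
    return applicant["skill"] - applicant["gap_years"] > 2.5

def counterfactual(applicant, new_gender):
    # Counterfactual world: same person, different gender, so the
    # gender-caused part of the proxy shifts too (toy model only).
    cf = dict(applicant, gender=new_gender)
    if new_gender != applicant["gender"]:
        shift = 1.5 if new_gender == "F" else -1.5
        cf["gap_years"] = applicant["gap_years"] + shift
    return cf

women = [generate_applicant("F") for _ in range(2000)]
men = [generate_applicant("M") for _ in range(2000)]
rate_f = sum(decide(a) for a in women) / len(women)
rate_m = sum(decide(a) for a in men) / len(men)

# Despite identical skill distributions, women are accepted far
# less often: ignoring gender did not make the rule fair.
print(rate_f, rate_m)
```

A decision rule would pass the counterfactual test only if `decide(a)` and `decide(counterfactual(a, other_gender))` agreed for every applicant; here they frequently disagree, because the proxy carries the sensitive attribute's influence into the decision.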

Research in algorithmic fairness is in many ways still in its infancy, but the hope is that it will help combat gender, racial and other forms of bias in algorithmic decision-making.

AI developments that could improve the lives of women around the world

Around the world, women spend a disproportionate amount of their time on routine, unpaid household work. If we were able to use AI to automate some of these activities, these routine tasks could be simplified or eliminated. Tools such as washing machines and kitchen appliances have already simplified a lot of household work in both higher-income countries and emerging economies. Innovations such as home-cleaning robots may one day make a leap forward in automating or streamlining many more tasks.

The time saved by automating these activities could allow these women to have time for a paid job or further education — which may bring economic prosperity to their families and enable them to be financially independent. What’s more, research has shown that there may be intergenerational benefits for the children of earning mothers.

Impact of women on AI

Having more women in AI could help combat the gender bias in training sets, could help design AI tools that address problems relevant to women, and, more generally, would help ensure we create an AI that represents the kind of world we want to live in.

Despite its name, artificial intelligence is a technology designed by people and intended to be used by people. As such it is very reflective of its creators. It is important that the demographics of AI creators reflects the demographics of the people using this technology — which, ultimately, is the demographics of the world.

Advancing Women in the Workplace

We are proud of the impact this research has had across the corporate world; it is a widely quoted reference on the status of women in the workplace. We also use the findings of that research ourselves to help drive us forward towards equality, and these findings are built into our values. Indeed, guided by this research, we have taken various steps to help attract and retain women, from basics such as flexible working to a global sponsorship initiative and training programs in unconscious bias and inclusive leadership.

Another important piece is having access to mentors. At QuantumBlack, our data scientists have access to mentors not only inside but also outside the company, for instance by attending events such as the annual Women in Machine Learning workshop, which is co-located with NIPS, the leading machine learning conference.

References

Kusner, M. J., Loftus, J. R., Russell, C., and Silva, R. Counterfactual fairness. In Advances in Neural Information Processing Systems 30, pp. 4069–4079, 2017.

Chiappa, S., and Gillam, T. P. S. Path-Specific Counterfactual Fairness. arXiv:1802.08139, 2018.

Other interesting sessions we attended and our highlights from Day 2 of SXSW:

> When Health Care Goes High-Tech

> We Want It Now: The Future of AI is in Your Hands

> Unfinished Business: Race, Gender And Equality

The Female Quotient discusses Unfinished Business: Race, Gender & Equality with members from The Atlantic.

We’re live reporting from SXSW!

#SXSW #SXSWi #SXSW2018

Written by QuantumBlack

An advanced analytics firm operating at the intersection of strategy, technology and design. www.quantumblack.com @quantumblack

QuantumBlack, a McKinsey company, helps companies use data to drive decisions. We combine business experience, expertise in large-scale data analysis and visualisation, and advanced software engineering know-how to deliver results.
