Technologist Ethics

Barry Leybovich
Published in Life with Barry · 2 min read · Jun 13, 2017

I just finished reading an article about an ex-Facebook executive, and I’m sharing it in part because it’s so disappointing.

Paradigms that are programmatically (as opposed to socially) enforced are incredibly dangerous, because the driving forces behind them (product managers, developers, salespeople) can deflect responsibility for the institutions they build and blame the users instead.

When Coursera was building a recommendation engine, the developer (Emma Pierson, herself a computer science PhD) noticed that it would push female students away from computer science: not actively, but by exclusion. The prediction was based on completely reasonable, real data: few women had enrolled in computer science courses (and likely other STEM fields). As a result, the engine predicted that women would not be interested in these courses and declined to recommend them, steering women toward more ‘liberal-arts’-type courses instead. Implementing this system, however, would have served not just to increase course participation, but also to create an institutional system reinforcing existing societal views of women in STEM.
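The mechanism is worth spelling out. Below is a minimal sketch of the dynamic (my own illustration with made-up numbers, not Coursera’s actual system): a naive recommender that conditions its predictions on gender faithfully reproduces the skew in its historical enrollment data and stops recommending CS to women, while the same model with gender removed does not.

```python
# A minimal sketch (hypothetical data, not Coursera's real system) of how
# a naive recommender trained on skewed historical enrollment can exclude
# by prediction rather than by intent.

# Hypothetical historical enrollments: (gender, took_cs)
history = [("f", True)] * 5 + [("f", False)] * 95 \
        + [("m", True)] * 30 + [("m", False)] * 70

def interest_rate(records):
    """Fraction of users in `records` who enrolled in a CS course."""
    took = sum(1 for _, took_cs in records if took_cs)
    return took / len(records)

THRESHOLD = 0.15  # recommend CS only if predicted interest exceeds this

# Gender-aware model: condition the prediction on gender.
by_gender = {g: [r for r in history if r[0] == g] for g in ("f", "m")}
for g, recs in by_gender.items():
    rate = interest_rate(recs)
    print(f"gender={g}: predicted interest {rate:.2f} -> "
          f"{'recommend' if rate > THRESHOLD else 'exclude'} CS")

# Gender-blind model: pool everyone, as Coursera chose to do.
pooled = interest_rate(history)
print(f"pooled:   predicted interest {pooled:.2f} -> "
      f"{'recommend' if pooled > THRESHOLD else 'exclude'} CS")
```

With these numbers, the gender-aware model excludes women (0.05) while recommending CS to men (0.30); the pooled model (0.175) recommends CS to everyone. The model isn’t wrong about its data; it’s faithful to it. That is exactly the problem: prediction quietly becomes enforcement.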

These programmatic systems are all the more dangerous because they’re so opaque, and they will only become more ingrained as machine learning, artificial intelligence, and big data become ubiquitous. With metrics like churn rate and advertising revenue at the fore, what incentive will technologists have to behave ethically? Coursera decided to take gender data out of its recommendation engine, but it’s up to each business to make the right choice. Facebook, it seems, did not.

Barry Leybovich
Product Manager, Technology Enthusiast, Human Being; Contributor to Towards Data Science, PS I Love You, The Startup, and more. Check out my pub, Life with Barry.