Mind to Mind panelists suggest keeping the human factor in algorithms

By Emily Lemmerman

More and more decisions are being made by algorithms — from how much an individual is charged for auto insurance, to hiring at large employers such as Wal-Mart and Target, to how a web platform like Facebook decides whether an account is fraudulent.

Each decision made algorithmically has consequences for real people, said Jeff Larson, a news app developer at ProPublica. Considering these consequences is key to how we should think about preventing bias in algorithms, he added.

“Keep that human factor in mind,” Larson said. “What happens if no one gets to see that resume? What happens if someone gets classified as a high flight risk?”

Larson works on ProPublica’s Machine Bias series, which reports on how algorithms can produce discriminatory results. The series’ work includes identifying tech companies that sell products by algorithm and examining racial discrimination in auto insurance prices.

Titled “The Dark Side of Data,” the panel featured Larson along with two professors, Ben Hansen of the University of Oregon and Ifeoma Ajunwa of Cornell University. The panelists discussed how to uncover discrimination in an algorithm, the future of regulating algorithms and the pitfalls of relying on them.

On uncovering what makes an algorithm discriminatory, the panel discussed strategies and theories for exposing shortcomings, given the proprietary, and often secret, nature of such algorithms. There is no way to pinpoint whether an algorithm is discriminating except by the results it produces, the panelists said.

However, “just because you can’t get the algorithm, that’s not a barrier to getting the story,” Ajunwa said. “You need to describe what’s happening.”

The panel also addressed why algorithms are difficult to regulate. Machine learning algorithms depend heavily on the datasets they are trained on, and they continually update the weight they give to certain statistical outcomes as time passes. Ajunwa, a lawyer by training, noted that the law looks at “disparate treatment, as opposed to disparate impact,” which makes it harder to regulate rapidly changing algorithms that update based on their data inputs.

Hansen, an economist, has examined the unintended discriminatory impact of ban-the-box legislation. His work, published in a National Bureau of Economic Research (NBER) paper and featured in The Atlantic, showed that when employers don’t know a potential employee’s conviction history, they use race as a proxy, which leads to hiring discrimination against black and Latino men.

“There’s the role of machine bias and also the role of human bias,” Hansen said.

Ajunwa, who is also a faculty associate at the Berkman Klein Center for Internet & Society at Harvard, studies the regulation of discriminatory algorithms and of data collection on workers. Her forthcoming paper, “Hiring by Algorithm,” investigates the discriminatory nature of the automated online hiring practices now used by many employers.

Her interest in algorithms stems from a common complaint among people recently released from incarceration. Many of them, she said, pointed to “the lack of a ‘human element’ in hiring” as a key change in society, driven by online hiring practices. “Hiring by Algorithm” advocates a more holistic evaluation of workers, one in which employees have more information, not less.

Ajunwa also suggested that giving employers more health information could result in more discriminatory behavior, such as not hiring an applicant based on illness history. Her piece in the Harvard Business Review, “Workplace Wellness Programs Could Be Putting Your Health Data at Risk,” detailed how health data collection as part of workplace wellness programs is unregulated.

Emily Lemmerman is a junior studying Sociology at Stanford. She is part of a journalism class covering the Mind to Mind Conference.
