The unforgiving side of data

Seth Wickham
SI 410: Ethics and Information Technology
Feb 13, 2021

Context matters. We can agree that this is a baseline principle for our entire lives.

Running enthusiasts often journal their runs along with the weather conditions, how much rest they got, and what they ate that day. Sports fans debate the GOAT (greatest of all time) of their respective sport, often citing their candidate’s teammates, opponents, and era when making the case.

Yet none of this contextual analysis is afforded to under-resourced individuals applying for jobs, schools, and life-changing loans, because the schools and corporations evaluating them rely so heavily on uncontextualized data.

Image via “Popular Mechanics” magazine

In the book “Weapons of Math Destruction”, mathematician Cathy O’Neil tells the story of Sarah Wysocki, a promising teacher who was fired after a new test-based evaluation program determined that she had not performed well enough compared to her colleagues in the district.

In reality, the evaluation rested on too small a sample and was blind to many of the elements that influence students’ test scores. As Wysocki says in the book, “there are so many factors that go into learning and teaching that it would be very difficult to measure them all.”

O’Neil also describes how corporations now use hiring algorithms that factor in prospective job candidates’ credit scores. This makes it even harder for low-income parents to break out of their income bracket and escape poverty. One bad month in which circumstances force them to miss a bill payment can stick with them indefinitely if corporation after corporation deems them unhirable because of it. The resulting lack of income brings more financial problems and a further decline in their credit score, and the cycle repeats.
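To make that feedback loop concrete, here is a toy sketch in Python. Everything in it is a hypothetical assumption of mine (the cutoff value, the size of the monthly score drop, the function names); it is not a description of any real hiring system, only an illustration of how a screen that folds in credit score can keep pushing the same applicant further down.

```python
# Toy simulation of the feedback loop described above. The cutoff, the
# monthly score drop, and the names are illustrative assumptions, not
# details from O'Neil's book or any real screening system.

CREDIT_CUTOFF = 650  # assumed minimum credit score the automated screen requires


def passes_screen(credit_score: int) -> bool:
    """A simplistic automated screen that rejects anyone below the cutoff."""
    return credit_score >= CREDIT_CUTOFF


def simulate_applicant(starting_score: int, months: int = 6) -> None:
    """Show how repeated rejections can drag a borderline score lower."""
    score = starting_score
    for month in range(1, months + 1):
        if passes_screen(score):
            print(f"Month {month}: score {score}, passes the screen")
            return
        # Rejected: still no job, another bill slips, the score drops further.
        score -= 20
        print(f"Month {month}: rejected; score falls to {score}")
    print("The applicant never gets past the automated screen.")


# One missed payment left this applicant just under the cutoff,
# and the screen itself keeps them there.
simulate_applicant(starting_score=640)
```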

Data is an extremely useful tool, and one I hope to draw useful conclusions from in my future career. But if we rely on it too heavily, without factoring context into the numbers, the conclusions we draw are unreliable and can have dangerous consequences for people.

HR staff should keep a human element in their recruiting, and decision-makers must still be able to evaluate a person in a nuanced way that accounts for their potential, instead of basing opinions solely on numerical scores that accentuate unfixable past mistakes.

Since data-driven decision-making is likely here to stay, the most important point is that data scientists and programmers must incorporate context and broader understanding when building programs that evaluate their fellow human beings.

Because without context, false conclusions would be drawn about us all, even in minute things like our progress as runners or our debates over LeBron vs. MJ. No human, and no argument, can be reduced to 1s and 0s.
