The Subjective Problem with Objective Algorithms

When comparing a machine and a human, the differences are pretty easy to name: humans have emotions and thoughts, while machines run processes and plot to take over the world. Okay, maybe not that last one, but it is true that many aspects of modern society rely on technology. Our smartwatches automatically recognize when we're active. YouTube recommends videos we're actually interested in. After a while, it makes you wonder: how do they know all of this?

[Image: Pew Research Center, "7 things we've learned about computer algorithms": https://www.pewresearch.org/fact-tank/2019/02/13/7-things-weve-learned-about-computer-algorithms/]

Frankly, these things are pretty much guesses made by algorithms. Cathy O'Neil, a self-proclaimed lover of all things mathematical, talks about the surge in automated decision-making she witnessed after the 2008 housing crash. She describes how these algorithms are often seen as "fair and objective." However, O'Neil saw trouble in this opacity and separation. She brought up a school district that implemented an assessment tool to weed out bad teachers. Because the tool judged teachers on their students' year-over-year score growth, students who arrived with scores inflated by a previous teacher appeared to regress. The outcome was that the honest teachers, the ones who didn't inflate scores, lost their jobs.

This reminded me of my own experience working at a popular retail chain. My job was to complete online orders by running around the store collecting products, then packing them up in a box to ship. Every aspect of this was monitored: the time it took to grab all of the items, the number of items I couldn't find, the time it took to pack them into a box, and how long the whole day's worth of orders took. This seems pretty reasonable and motivating, right? Except that moving around a crowded store is very different from moving around an empty one.

At any given moment a customer might need help, but my order timer kept ticking the entire time, and if my speed score dipped too low, a nasty chain of events kicked off. My manager would talk to me, the store manager would talk to her, the district manager would talk to our store manager, and then store hours would get cut. With fewer employees on the floor, I had to help customers far more often, which lowered my speed score even further and started the cycle again. The real consequence of this system was that the employees doing exactly what they were supposed to do were punished.
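To see why that loop only spirals downward, here is a toy sketch in Python. Every number in it (minutes per order, time lost per customer interruption, the score cutoff that triggers hour cuts) is an assumption invented for illustration; the real system's metrics were never visible to us.

    # Toy simulation of the speed-score feedback loop described above.
    # All thresholds and numbers are made up for illustration.

    def speed_score(base_minutes_per_order, interruptions_per_order):
        # Each customer interruption adds time the order timer still counts
        # (assumed: 5 minutes lost per interruption).
        minutes = base_minutes_per_order + 5 * interruptions_per_order
        return 60 / minutes  # orders completed per hour; higher is better

    staff = 6          # employees on the floor (assumed)
    interruptions = 2  # customer requests per order at full staffing (assumed)

    for week in range(4):
        score = speed_score(10, interruptions)
        print(f"week {week}: staff={staff}, "
              f"interruptions/order={interruptions}, score={score:.2f}")
        if score < 4.0:         # management's cutoff (assumed)
            staff -= 1          # hours get cut...
            interruptions += 1  # ...so each remaining employee fields more customers

Run it and the score never recovers: each round of hour cuts adds interruptions, which slows every order, which trips the cutoff again.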

In both cases, a human judgment was handed off to an imperfect algorithm intended to accomplish one thing, and it instead triggered the opposite. The question remains whether these are just bad algorithms, or whether the greater problem lies in the push for cold objectivity itself.
