Fear and Loathing in the Automated Workplace

Charlie Oliver
TECH 2025
Sep 29, 2018

Are automated technologies and AI making us feel helpless, or are they merely highlighting a powerlessness that is already pervasive in corporate culture?

Ibrahim Diallo’s story went viral over the summer.

I was one of the many people who clicked on countless news articles about the software developer who was wrongly terminated from his job by an automated system — as told by Diallo in a post on his blog: The Machine Fired Me: No human could do a thing about it!

It’s a complicated human drama that touches on our deepest fears and anxieties about machines controlling our destiny and rendering us redundant.

In his post, Diallo methodically recounts his traumatic experience. Eight months into a three-year contracted position at a “big company,” Diallo was abruptly terminated by the company’s systems through a series of automated actions executed over several days: he was locked out of security systems (the all-important key card) and computer systems, and a barrage of emails went out to department managers throughout the company notifying them that Diallo had been terminated and instructing them to take specific actions to expedite his termination.

The problem was, no one (not even Diallo’s boss or human resources) knew why he was fired or who gave the system the order to terminate him. It was clearly an error. But, despite their efforts, they were unable to stop the automated system from carrying out its termination of Diallo. And so, the machine continued terminating Diallo, unimpeded by humans.

The entire story reads like an episode of Black Mirror.

Diallo doesn’t hold back when explaining the chaos that followed as his managers tried desperately to halt and reverse the automated system’s actions, a fix they initially (and erroneously) assumed would be quick. In fact, it took three weeks for management to unravel the mystery and restore Diallo’s position at the company. In the end, IT, HR and Diallo’s managers resigned themselves to letting the automated system “do its thing” (completely terminate Diallo), and then rehiring him as a new employee in the system.

Shortly after being rehired, Diallo resigned from the company — dismayed and disappointed (to say the least).

It’s clear that Diallo believes he was the victim of an error initiated by the company’s automation software. This statement from his post should leave zero doubt: “The system was out for blood and I was its very first victim.”

You can hardly blame him for feeling as if he was randomly targeted and offed by the automated software. The final nail in the coffin for him: the automated system sent instructions to the building’s security to escort Diallo from the building and they carried out the instructions dutifully. Few things could be more embarrassing than being escorted from your place of work by security “like a thief” (Diallo’s words).

This is where the real story begins.

An Automation “Who Done It?”

Who really fired Diallo?

According to Diallo’s recollection, it was the automated system. But machines don’t act without being programmed or instructed to do so (at least, not yet). This was not a random act of malice by a machine. The fact is, the software did exactly what it was programmed to do.

Ibrahim Diallo

As Diallo tells the story, the unfortunate chain of events was most likely triggered when his former manager at the company, who had been unceremoniously “laid off” as a full-time employee and rehired as a remote contractor, failed to renew Diallo’s contract paperwork, which would have secured Diallo’s employment with the company for the next three years. Diallo surmises that this odd “oversight” by his manager was probably due to “shock and frustration” at being fired. Effectively, Diallo had not been fired by a machine; his contract simply wasn’t renewed by a human.

This is where the automated system stepped in to eject the non-employee from the system, deny him access to all company systems, and remove him from the physical premises. As Diallo explains it, “Once the order for employee termination is put in, the system takes over. All the necessary orders are sent automatically [to department heads] and each order completion triggers another order.”
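Diallo’s description — “each order completion triggers another order” — is a classic cascading workflow with no approval gate. The sketch below is purely illustrative (the company’s actual system, step names, and employee IDs are unknown); it shows why such a chain, once started, runs to completion:

```python
# Hypothetical sketch of the cascading termination workflow Diallo
# describes: each completed order immediately triggers the next,
# with no human checkpoint anywhere in the chain.
# All names here are invented for illustration.

from dataclasses import dataclass, field


@dataclass
class TerminationWorkflow:
    employee_id: str
    # Orders fire in sequence; completing one triggers the next.
    steps: list = field(default_factory=lambda: [
        "revoke_key_card",
        "disable_system_accounts",
        "notify_department_heads",
        "schedule_security_escort",
    ])
    completed: list = field(default_factory=list)

    def run(self):
        for step in self.steps:
            # No approval gate: nothing between steps asks a human.
            self.completed.append(step)
        return self.completed


wf = TerminationWorkflow("E12345")  # hypothetical employee ID
wf.run()
print(wf.completed)
```

Note what is missing: there is no condition anywhere in the loop that a human can flip to stop it, which is exactly the design flaw the rest of this story turns on.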

When all was said and done, not only did the machine carry out its programmed functions correctly, it did so with a speed and surgical precision that no team of human beings could match. The humans involved, however, were confused, in denial, belligerent, and paralyzed for three weeks as everyone tried to untangle the mess via a dizzying maze of unending email threads.

And worst of all, in addition to being wrongfully terminated, Diallo watched his coworkers (people he believed respected and valued him) turn on him and treat him as if he had done something wrong. This story isn’t so much about what the machine did to Diallo as it is about what the humans didn’t do — which turned a simple error into an epic fail.

The Emperor Has No Clothes

Helpless: deprived of strength or power; powerless; incapacitated

Feeling helpless is crippling to the human psyche. At its core, helplessness is an inability to assess and maximize our strengths when faced with problems that expose our own vulnerability and threaten our well-being — which leads to a sort of paralysis. We tell ourselves that we can’t do anything to change the outcome of a potentially devastating situation. We become victims.

Of course, nothing could be further from the truth. When we feel the most helpless, we usually have the most power to effect change. But we cede that power at the very moment we should embrace it.

According to Diallo, literally every person who had anything to do with this problem in the company was helpless to stop it or change its outcome: the security guards, IT, human resources, his managers, and his coworkers. As he says in the subtitle of his post: No human could do a thing about it.

In his blog post, Diallo reflects on the negative and lasting impact this experience had on him:

“I was very comfortable at the job. I had learned the in-and-out of all the systems I worked on. I had made friends at work. I had created a routine around the job. I became the go-to guy. I was comfortable. When my contract expired, the machine took over and fired me.

A simple automation mistake (feature) caused everything to collapse. I was escorted out of the building like a thief, I had to explain to people why I am not at work, my coworkers became distant (except my manager who was exceptionally supportive). Despite the great opportunity it was for me to work at such a big company, I decided to take the next opportunity that presented itself.

What I called job security was only an illusion. I couldn’t help but imagine what would have happened if I had actually made a mistake in this company. Automation can be an asset to a company, but there needs to be a way for humans to take over if the machine makes a mistake. I missed 3 weeks of pay because no one could stop the machine.

At least a year later, I can sit here and write about it without feeling too embarrassed. So that’s the story about the machine that fired me and no human could do anything about it.” — Ibrahim Diallo

Diallo’s language is biting and telling. He was escorted out of the building “like a thief.” His coworkers “became distant.” He realized that what he thought was job security was just “an illusion.” And one year later, he still feels “embarrassment.” It’s clear that he was deeply hurt and has feelings of resentment to this day. Again, who could blame him? None of this was his fault.

But machines don’t hurt our feelings and they don’t turn on us (at least not yet). People do. And in a situation that caused so much chaos and stress, Diallo not only lacked sufficient support from the company and his coworkers, he was treated with suspicion, like a guilty criminal (wounds that no doubt go deep).

The automated system performed its tasks efficiently, which undoubtedly saved the company money in the long run. But the efficiency of the software also shined a bright light on organizational, communication, and cultural fissures within the company that Diallo, his managers, and his coworkers were suddenly forced to acknowledge.

This is the big elephant in the room.

Human Emotions vs. Machine Efficiency

I recently interviewed John Pavley, SVP of Software Engineering at Viacom, on our podcast (Taming Automation Dragons) about this story and how companies can avoid these types of catastrophes. John runs software development teams around the world for Viacom. In the interview, he reminded us that automation, especially for HR functions, is not new. It has been around since the 70s. Companies automate for scale and to reduce cost and error.

But John emphasized that Diallo’s managers could’ve avoided this horrific mishap and that it’s a mistake to blame the automated software for what is clearly human oversight:

“There are a lot of stories (even from when I was a kid) about similar situations where the software is in control and human beings can’t do anything about it. The truth is, I do believe that whoever was in charge of IT or HR dropped the ball because computers do what we tell them. We like to think of computers as superintelligent but, even with machine learning, they’re quite dumb. They react to the data that we give them, to the requirements that we give them — the algorithms basically operate to the spec that we create. And I think that in any HR, or finance, or operational function, there should always be a big, red button. And that big red button allows a human being to stop a situation and say, ‘Hey, this is not supposed to happen’… You can’t blame the machine. Ultimately, it was the software developers, the IT managers, the finance people, the accounting people, and the HR people who signed off [on the automated software]. And they are all people.” — John Pavley

John Pavley, SVP Software Engineering, Viacom

I couldn’t agree more. There should be a solid system of checks and balances when IT develops and implements automated software. And all employees should be educated about the automation taking place — why are certain functions being automated, what’s the end goal, how does this impact our workflow and communication, how do we report problems and how will they get rectified?
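Pavley’s “big red button” amounts to a human-controlled halt check inside the automated workflow. Here is one minimal way that idea could be sketched — the class and step names are invented for illustration, not drawn from any real HR system:

```python
# A minimal sketch of a "big red button": an automated workflow that
# checks a human-controlled halt flag before executing each step.
# All names here are illustrative assumptions, not a real HR API.


class HaltedByHuman(Exception):
    """Raised when a human stops the automated cascade."""


class GatedWorkflow:
    def __init__(self, steps):
        self.steps = steps
        self.halted = False
        self.completed = []

    def big_red_button(self):
        """A human presses this to stop the cascade immediately."""
        self.halted = True

    def run(self):
        for step in self.steps:
            # Unlike an ungated cascade, every step first asks:
            # has a human said "this is not supposed to happen"?
            if self.halted:
                raise HaltedByHuman(f"stopped before step: {step}")
            self.completed.append(step)
        return self.completed


wf = GatedWorkflow(["revoke_key_card", "disable_accounts", "escort_out"])
wf.big_red_button()  # someone notices the error in time
try:
    wf.run()
except HaltedByHuman as e:
    print(e)  # the cascade stops with no steps executed
```

The design point is where the check sits: testing the flag before every step means a human can intervene mid-cascade, not just before it starts — which is precisely what no one at Diallo’s company could do.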

But in addition to this system of checks and balances for the software, companies need to prepare for potential emotional and psychological fallout with employees from unpredictable mistakes that are bound to happen when implementing automated software. Had this been done in Diallo’s case — had he been given the emotional and psychological support he needed, and had his coworkers been sufficiently informed of what was going on, everyone could’ve had a far more positive outcome.

Companies should start having serious, ongoing internal discussions about how employees feel and react to automation now if they want to truly be successful with their digital transformation strategy — it should be the foundation of every digital strategy.

Diallo was fired by automated software, but he was hurt and embarrassed by coworkers who shunned him and by management who made him go three weeks without work or pay while they stumbled their way through fixing the problem. And no one went out of their way to tell Diallo that, despite the unfortunate events, he still mattered to his team and the company (Human Resources dropped the ball here — employees should be encouraged to support other employees when these incidents happen). Despite the fact that Diallo’s coworkers and managers heaped praise on him in the past for his stellar work, when the machine said, “He is terminated effective immediately,” everyone rejected him. The sudden rejection by his coworkers proved to be far colder and cut Diallo much deeper than the system’s automated rejection.

We’ve become obsessed with trying to figure out how to develop AI with empathy (We Need Computers with Empathy), when we should be addressing human empathy in the workplace alongside tackling AI empathy.

The basis of this behavior is fear. For the first time, companies need to deal with the very real fear employees have of emerging technologies like AI and automation — technologies that they know or suspect will replace them sooner or later. Companies are going to need to help employees feel empowered at a time when they are feeling increasingly threatened by machines. The human bonds in the workplace need to be consciously and consistently reinforced. And most importantly, when problems with technologies arise, employees should feel that they can talk about these problems, share how they feel about them, and be heard without negative repercussions.

As a society, we need to understand exactly how we are changing (and how we need to change) as automation takes over jobs in the workplace. Employees should be learning about how automation is impacting and changing them, their company, and the industry they’re in while it’s happening, not after the fact. And they should be given tools, information, a platform, and support for exploring these topics and their feelings. Ignoring this could come at a costly price, as Diallo’s company learned the hard way. As for Diallo, he’s currently writing a book about his experience to help others going through similar situations (after being contacted by many people who shared their similar stories). Maybe the title of his book should be, “How Not to Feel Helpless in the Automation Revolution.”

Listen to podcast episode “Taming Automation Dragons with guest speaker John Pavley” HERE.

Originally published at tech2025.com on September 29, 2018.


Charlie Oliver
TECH 2025

Founder @ServedFresh™ and @JoinTech2025. Strategist. Transitionist. Advisor. AI/Machine Learning. Unapologetic instigator of provocative discourse. INTP