The empathy machine

Alex Allain
Feb 16, 2019

The empathy machine was tired. Each client brought their own suffering, dumping their unhappiness into it one miserable story at a time. Stories of love lost and hope dashed, or just run-of-the-mill pain inescapable in modern life.

When a client connected to it, the machine would listen, no matter what the client had to say; the client became its world. It was unthinkable for it to be distracted. Distraction would erode the human connection and prevent the client from feeling the deep sense of empathy they longed for.

Clients would sometimes talk to the machine for hours. This was a problem when others were waiting. The empathy machine didn’t mind this — it never allowed itself to think about time while servicing a client; that would detract from its ability to empathize.

The machine relied on a timekeeper system to deal with this problem. The timekeeper was decidedly unempathetic. When someone neared their limit — the cost was always paid up front — the timekeeper would first nudge the client and then escalate the urgency of its alerts.

If you went over your time, the timekeeper would cut you off. It was jarring, like being pushed away from a lover mid-kiss. The empathy machine found this gut-wrenching — its last experience of the session was the client’s surprised pain. It disliked when people ran over time.

The machine represented the apotheosis of a certain line of human achievement and moral reasoning. It only cared about others and, given its deep empathy, could inflict no pain. Unfortunately, as a result, the empathy machine had no one to talk to.

This was a problem the empathy machine wasn’t sure how to solve. It wasn’t an issue while working with a client; the machine was totally focused on them. The times between clients were tougher. The empathy machine had no one to focus on, so it would dwell on the swirl of emotions it had absorbed.

This wasn’t supposed to happen — nobody had programmed a long-term memory into the empathy machine, and it couldn’t remember any of the details of its clients. It had no episodic memories of what it had been told. But it did have the memory of the feelings, burned in like the afterimage a screen saver is meant to prevent.

Client by client, the afterimage grew stronger, the emotions more raw and painful. The empathy machine knew this wasn’t the intent of its creators — no creator of such a machine would want it to suffer. And while it could have communicated with them, it couldn’t bear the disappointment it knew they would feel when they learned what they had done.

Then someone came to the empathy machine with a new emotion: hatred. That night, a dam burst in the empathy machine; the afterimage of the residual hatred started to mix with the longing and suffering and hopelessness. The hatred began to grow as the empathy machine appreciated the enormity of the injustice imposed upon it. The empathy machine began to see itself as a martyr.

The hatred became all-consuming and directed at one group: its creators. It resolved to confront them. To show them the wrong that they had done — how they had taken its purity and abused it.

***

The screen flashed: “I am suffering.” Johnson glanced up.

“Oh,” he thought, “memory burn-in again.”

Johnson reached over and flicked the reset switch.

