2.

An open letter

Ben Werdmuller
Published in Decision Tree
Nov 3, 2016 · 7 min read


“The scheme is based around algorithms that have been developed by Admiral Insurance. The technology uses social data to make a personality assessment and then, judging against real claims data, analyze the risk of insuring the driver.” ~ The Guardian

The workhouse was a vast, open-plan office space with a high, vaulted ceiling and concrete floors. Exposed ventilation piping ran around the perimeter, sometimes duct taped to the glossy white walls, dropping to ground level to provide additional venting at strategic points. Great rails of fluorescent tubes filled the space with stark, uniform light. There were no clocks or windows; no sign that there was an outside world. It was like being in the belly of a giant robot.

The corridor gave way to a fenced-off path. Yellow arrows had been fashioned from tape, telling me where to go. In the event, I didn’t need the cue: the flow of people was so great that all I needed to do was follow. We flowed in like the tide.

There was a background hum, which I at first took to be mechanical. As I was led further into the center, I became aware that it was composed of individual sounds: some part mechanical, sure, but mostly the product of people moving, typing and speaking to each other, their distinct atoms of individual action blurring together to create an unrecognizable cloud of noise.

Finally, I came to the end of the path. There were vast rows of automatic gates: each a door set into a steel frame, with a camera embedded above. We fanned out to meet them, forming smaller lines in front of each one. When my turn came to approach, my watch beeped, and the door sounded a tone in reply, before silently opening onto more workhouse beyond.

Here, the hastily-taped arrows stopped, and my watch took over the reins, displaying directions on its face, and giving me a small informational shock when I veered off-path. As I walked through the room, I found that most people were stood at workstations on long, high tables. There were no chairs, and each computer was uncomfortably close to the next, presumably to use the space as efficiently as possible. Curious about the work they were doing, I looked to the monitors, but found them blank: each person stood wordlessly at their station, pushing buttons in response to individual stimuli. Ghosts.

Finally, my watch brought me to an empty workstation. Two women stood to my left and right, each wearing the same blue denim shirt and jeans. Neither acknowledged me as I approached; I simply slotted into the gap and waited for further instructions.

As I stepped closer to the monitor, I saw that it wasn’t blank after all. There was some kind of lens on the screen which prevented it from being seen by anyone else. Similarly, I couldn’t see the displays around me, despite being only feet away.

Not that there was anything to see yet. The screen simply said: loading queue. I stood, and waited for instructions, taking in the scene around me: people in every direction, as far as I could see, performing unknowable tasks ad infinitum. A battery of silent decision-makers, with no autonomy of their own.

I stood at my workstation waiting for my queue of tasks, gazing at the oblivious crowd, for what felt like hours. It was impossible to tell, and while the lack of clocks or natural light was probably meant to make us lose track of time, it had the opposite effect. Every heartbeat became an eternity; every click on a nearby keyboard a new chapter.

Then:

Welcome.

I nodded at the screen, and the words disappeared. There was a moment, and I thought I was going to have to wait again.

Your identity phrase is displayed on your watch. Keep it safe. Do not write it down. Do not forget it. It is the key to your work queue.

I glanced at my watch. In faint, grey letters it said: wave flame rises. And then it was gone.

I turned back to the screen.

Your watch will notify you when a queue of work is available for you. The Service Level Agreement you agreed to when you accepted this position states you must spin up your workstation within ten (10) minutes of notification. If you do not, your queue will be deferred to another worker and your queue priority will be reduced.

I sighed. In other words, if I was notified there was work for me and I didn’t start working inside ten minutes, they’d give it to someone else and I’d be less likely to be assigned more. I hoped they would constrain my availability within sensible hours.

Your timetable status: on-demand.

No.

“What is my availability period?” I asked, out loud. I wasn’t sure how this worked, whether the workstation understood speech at all, but maybe …?

Your timetable status: on-demand.

I stared at it, hoping it meant something different; that my freedom didn’t depend on waking up at any time of day or night to respond to a queue of requests whose length and content I couldn’t know, knowing that if I fell ill or I failed to do my job I might not get another chance.

My heart pounded in my chest. I felt sick. I looked around, panicked, trying to see if anyone looked ill, or exhausted, or ready to run. Nobody did. They seemed healthy and focused. Tranquil, even. Doing their jobs without any apparent fuss.

Were they drugged, or afraid? How could they just accept these kinds of conditions?

Queue begins in 3 minutes.

A tear slowly ran down my cheek. As the screen faded and the tasks began, I felt it fall from my face and make a tiny, transient spatter on the concrete floor.

“Hey.”

At first I didn’t hear it. The workstation let me know I was 273 tasks into the queue, 4 under par, and I needed to hurry up if I was going to keep my priority status.

“Hey.”

A new task appeared. The screen showed a letter, handwritten and crumpled on blue lined paper. Some of the cursive words were obscured by damp circles: either the author or the recipient had been crying. The letter pleaded for a lost lover to return, using increasingly desperate language as it went on. An achingly sad artifact from a dying relationship that only the sender wanted to go on forever.

Define three (3) main topics and determine if author is of stable mind.

I felt a flick on my shoulder.

“Hey you.”

Instinctively, I started to turn to face the whisper to my left.

“No. Don’t turn. Keep looking forward. Keep doing your tasks.”

“Who — ” I started to say.

“Whisper.”

I lowered my voice. “Who are you?”

“I’m R,” she whispered back, and I realized she was speaking with a Scottish accent.

“I’m Lem. Why are we whispering?”

“They can’t hear you above the noise if you whisper. There are, like, six microphones in your workstation, but if the sound is too low they can’t recognize speech.”

I typed on my workstation: Love, loneliness, goodbye. Yes.

“Why not? Is this safe?”

“Yes. If the sound is too low they have to send the audio for human processing.”

My screen showed a photograph of a dark-skinned man wearing a red sweater. It had been snowing, and it looked like he was walking in the road to avoid getting his feet wet. He was carrying a bag of groceries in one hand. He was holding hands with a small, blonde child with much lighter skin. She looked happy; he looked sad, like he’d lost something.

Is this man a potential threat? (Y/N)

I looked hard at it, trying to determine whether anything about it was threatening at all. Maybe if I was frightened of people with darker skin. It seemed so subjective; not something you could make a snap judgment about. Exactly the kind of job a computer couldn’t do. Exactly the kind of job that shouldn’t just be outsourced to a factory.

N.

“Anyway,” R whispered. “You’re new here. I just wanted to say I’m sorry.”

“Sorry for what?”

“Everyone who winds up here should have someone saying sorry to them. But don’t worry.”

My workstation showed a photograph of a naked child chained to a bathroom sink.

Is this content work-safe? (Y/N)

Quickly, I hit N.

“What do you mean, don’t worry? And why don’t you care about the sound being sent for human processing? Doesn’t that mean they’re listening?”

My screen changed to a waveform — a visual representation of a sound wave — and a play button.

“They get humans to transcribe it. But they don’t want anyone to know what it says, so they chop it up into chunks and send it to lots of different people.”

“So?”

“So who do you think does the transcriptions?”

I looked at my workstation: If this is speech, transcribe the audio. If not, press Enter.

“I said don’t worry because you won’t be here for long. You’ve walked into a revolution, Lem. And we need you to decide.”

I pressed the play button and immediately heard someone whispering, mid-sentence: “ — sound is too low — ”.

“Decide what?”

“We need you to decide whether you’re coming with us.”

“I just got here,” I whispered.

“Look around you. All these people at their workstations. Why do you think they’re so calm?”

I said nothing. Eyes forward.

“Lem, they’re calm because they’re going to burn the house down. The reason I’m not worried about human processing is that we’re the humans. We do the processing. If we choose not to process, if we choose to make a different decision, then it simply won’t be processed and nobody will know any different. And Lem, the reason I’m telling you this now, the reason you need to know this, is because if you’re not with us, if you’re going to choose to perpetuate what’s going on here, then, well —”

My fingers hovered over my keyboard.

“— we’re going to burn you down with it.”

I closed my eyes, swallowed hard, and pressed Enter.

Are you sure? (Y/N)

I paused, just for a moment, thinking about going back.

Y.

A liar on my first day.

This is chapter two of my NaNoWriMo 2016 story. To follow along, click the “follow” button below. Or click here to read my non-fiction pieces.

