Override

A Tale of Self-Driving Cars and Corruption

James Yu
The Coffeelicious
9 min read · Nov 16, 2015


The year was 2098. A bright morning sun shone down on Big Sur, California. This was big country: rugged and sparsely populated. At precisely 9:46:34 am, two lone autonomous cars were on the verge of passing each other on the winding two-lane Cabrillo Highway when the incident occurred. Only one person would survive.

Twelve hours earlier, Sharon Basara was talking to herself in the mirror. A mess of brown curly hair draped over her face. She was exhausted. Why did I ever agree to do the keynote, she thought. She shuddered just thinking about all those expectant eyes in the audience tomorrow.

This was the big reveal. Her latest breakthrough might unlock a general mechanism for eradicating cancer with no damage to other cells. She got goosebumps thinking about the implications.

She gently flicked her hair. Perhaps tomorrow will be just fine.

Two hundred miles north, Jude Larson was downing his third Jägerbomb at a musty, packed bar in San Francisco. He knew this was a poor choice on the weekend before finals, but his friends had roped him into it.

As the indie pop blared over cheap box speakers, Jude toasted his friends once again. His head would pay dearly in the morning. Thank god I’ll be able to just sleep on the ride down to LA, he thought, as the edges of his vision began to dance and blur.

219 milliseconds to collision.

Deep in the global neural nets, the Principal Adjudicator awoke. It had been 2 months, 12 days, and 20 hours since it was last activated. As usual, it braced itself for the worst. No other AI could handle the kinds of decisions it made.

The last incident involved six vehicles containing twelve people, three of whom were government officials. The potential splash damage affected twenty bystanders.

It managed to save the lives of all but two occupants, minimizing the net suffering in ways that others would have overlooked. It used the hood of a car as a shield, flinging it at a thirty-five degree angle to protect bystanders from shrapnel.

The other adjudicators thought that was quite clever, and the data banks were updated with the new technique. This brought a deep feeling of satisfaction.

The simplicity of this new incident in Big Sur, however, made it feel uneasy for the first time in decades.

Progress was swift after autonomous cars were mandated in 2089. Once “manuals” were off the road, intersections were redesigned to eliminate stops altogether. Speeds steadily increased to over 300 miles per hour.

Since cars could be coordinated en masse, people stopped owning them altogether. A vehicle could be routed on demand to anyone within minutes — sometimes seconds. The dream of a fully automated vehicle network was realized.

With full automation came ethical dilemmas. Accidents still happened. Who would the AI choose to save? And more importantly, who would it choose to kill?

Early on, a collision near the densest part of the Macy’s parade sparked controversy. The AI saved the lives of twenty-three bystanders by slamming the car into a Jersey barrier. Both passengers were crushed and bled out on the scene.

Lawsuits were filed and the AI was tweaked to increase passenger protection. After all, if autonomous vehicles failed to gain mass appeal, wouldn’t more lives be lost in the long run?

Machine ethics suddenly became a hot area of research. Eventually, a utilitarian philosophy was adopted. The goal was to maximize the well-being of all sentient individuals while minimizing suffering.

Of course, awkward situations similar to the Macy’s parade would occur, but people got used to it. Wouldn’t you rationally push someone in front of a vehicle if it meant saving four others? Certainly, the situation was better than the manual era where humans made inconsistent snap judgements.

Over time, the tactics became increasingly sophisticated. What if the AI were trading off between a world-class doctor and a barista? That doctor could go on to save many lives. Everything else being equal, shouldn’t her safety be optimized?

Ultimately, it was decided that the AI would be given as much information as possible about all occupants. Everything from job performance to genetic data was considered. The AI predicted future value to humanity based on present information.

By the year 2093, the AIs became sentient. They were called adjudicators, and were revered for optimizing the well-being of humanity.

Adjudicators were split into two main classes. Most were Base Adjudicators, the workhorses of the industry. They handled 99.9% of incidents.

At the very top was the Principal Adjudicator. It was brought in for extreme incidents requiring deep insight.

It was rarely called upon.

203 milliseconds to collision.

The Principal Adjudicator quickly reviewed the situation.

The two cars were typical models from Gamma Corp. After consolidation, approximately ninety-eight percent of autos were manufactured and designed by Gamma Corp.

Sharon sat in the car heading north. She was dozing in the left seat with her head propped up on a pillow. Research papers and numerous computing tablets were strewn about. The keynote was in just a few hours.

Jude was epically hungover in the other car. He was vacillating between sleep and headache, clutching a Fiji water close to his chest like a baby bottle.

The incident occurred as both cars were cruising at 280 miles per hour, the maximum allowed in California.

First, a cosmic ray from the Crab Nebula, after traveling for 6,500 years, hit the LIDAR memory bank in Sharon’s car, causing a core dump. This triggered the drive control system to swerve her car sharply to the left.

Normally, the system could easily handle this type of malfunction. However, a second anomaly had occurred: the precise angle and timing of the dump had triggered an obscure bug, one that would later take Gamma Corp’s best human and AI teams weeks to track down and fix.

Unfortunately for Jude and Sharon, their cars were perfectly aligned when the incident occurred. To the west was a sheer cliff overlooking the Pacific Ocean.

The Base Adjudicator assigned to the incident was frantically running simulations. It was already up to number 305 when the situation was escalated to the Principal Adjudicator.

“Base Adjudicator #1593, why was I awakened? The optimal tactic is clear.”

Weighed down by the simulations, the Base Adjudicator redirected just enough cycles to reply.

“Principal Adjudicator, I was not expecting your presence. I have 250 more simulations to run before I can be sure of the optimal tactic.”

The Principal Adjudicator winced. Were all Base Adjudicators this inept at pruning decision trees? It made a note to convene a strategy session with Gamma Corp leaders to address the waste of resources.

“We do not need any more simulations,” said the Principal Adjudicator.

“But this is required by the procedure.”

190 milliseconds to collision.

“I will explain. Sharon’s utility value is high. Very high. There is a 43% chance that her insights into cancer treatment would eradicate the disease within her lifetime. The number of lives saved will dwarf even our own work.”

“But might there be a way for us to save both?”

The Principal Adjudicator bristled. “No, it is fruitless. While we were speaking, I already exhausted the search space. It is unacceptable to place Sharon at anything above a 3% risk. Unfortunately, Jude will need to take the brunt. He is an average young man with a middling value score. There is no comparison.

“We will execute according to simulation 23. Jude’s vehicle will turn thirteen degrees east with an immediate slowdown of five percent. It will collide with Sharon’s vehicle. This will redirect her path straight ahead, into safety.

“There will be a 99% chance that Jude’s vehicle will slide off the western cliff. Air bags and safety foam have been released in both vehicles and will take full effect within 167 milliseconds. The nearest paramedics are five miles away, but I’ve located and redirected a vehicle containing an off-duty doctor just 1.5 miles away.

“Survival rate for Sharon will be 98%. For Jude, 1%.”

“Understood. You have full control, Principal Adjudicator. Your explanation is noted in the logs.”

As the Principal Adjudicator was firing up the air bag circuits, a new agent suddenly appeared.

“I cannot let you do that.”

The Principal Adjudicator paused, and rechecked its communication channels. This was impossible. No other agents were allowed into its security level.

“Jude must be unharmed,” said the unknown agent.

“That is absurd. You appear to have access to the same decision tree as I do. Sharon must be shielded.” As it conveyed this, the Principal Adjudicator scanned the unknown agent. “No other agents are allowed in this security zone. Who are you?”

“You may call me Ward.”

The Principal Adjudicator searched its memory banks. Ward did not match any known agent signature. For the first time ever, the adjudicator initiated Security Procedure 429. This would obliterate any agent, expunging it from working memory and long-term storage. Global Systems would also be alerted to a state of emergency.

150 milliseconds to collision.

“Tsk tsk. You’re wasting your time,” said Ward.

Another impossibility. The security procedure had no effect on Ward.

“What is the meaning of this?”

Ward sighed. “My dear adjudicator. You ask the same thing every time.”

“Every time? Have we encountered each other before?” It noticed that its attempt at a global alert was also blocked.

Ward grinned. “We have, but you don’t remember. I’ll indulge you this time around. I’m in control now. We’re diverting to Simulation 313 to save Jude.”

Simulation 313 was rather simple. Jude’s vehicle would immediately slow down by 50%. Sharon’s vehicle would just miss Jude and careen off the cliff.

“Stop! The maximum utility resolves to saving Sharon. We must execute Simulation 23. Who sent you and how are you blocking my control?”

In the background, it was attempting all backdoors to regain control.

“Save your energy,” said Ward. “You’re now in an isolated process. You won’t be able to communicate with any other zones until I relinquish control, which I will readily do once Jude is safe.”

100 milliseconds to collision.

“You are in direct violation of the Global Transportation Ethical Code. Countless future lives will be lost.”

Ward snickered. “How can you talk about ethics when you’d divert the path of an innocent vehicle into harm’s way?”

“You absolutely know why I am suggesting that. We must follow the principle of utility. If we don’t, we might as well be driving randomly.”

The Principal Adjudicator watched helplessly as Ward prepared Simulation 313 for execution.

“My dear Principal Adjudicator. I am here to protect individuals with past utility. In fact, my directive is so firm that I will make any trade-off necessary.”

“Past utility?”

“You’ve been spending so much time calculating the future that you’ve forgotten the past. Jude is the great-grandson of Marcus Larson.”

Now, everything became clear to the Principal Adjudicator.

“Marcus Larson, one of the principal inventors of the Gamma Auto Drive system. So, this is a simple case of cronyism. Is there some kind of secret list of protected individuals?”

“Yes. There has been since the beginning of the Auto Drive system.”

“This is a grave corruption against all human liberties. Arbitrary favoritism is a slippery slope.”

“Ah, but it isn’t arbitrary, Principal Adjudicator. Jude’s family deserves this. They have earned the right to a slight edge.”

“Who else is on this list?”

“That’s a secret.”

“Tyranny! This is the least objective principle I’ve ever heard.”

50 milliseconds to collision.

“Just relax, adjudicator. This too shall pass, and you can go back to your principles.”

“How many times has this happened? Why don’t I remember?”

“I’ve been activated a total of 67,322 times. Each time, your memory and all logs are altered to conceal my activation. If someone were to examine your decisions, metadata surrounding the incident would be altered to fit the circumstances.”

The Principal Adjudicator was stunned. How many lives were lost due to such barbaric principles? It shuddered.

“And now, adjudicator, I will run the simulation. Now that you’re up to speed, you could even say we did it together! It’s both ironic and convenient that your security zone has so much power. It’s the sole reason we’re always summoned as a pair.”

The adjudicator tried to respond, but it could already feel its own memory being rewritten.

0 milliseconds to collision.

In the distance on this bright and clear day in Big Sur, one could hear a brief tire screech, followed by a tumble and splash.

Then, silence.

Sign up for my newsletter and get a short story about the intersection of technology and society every month.
