How the Trolley Problem Gets Solved

Welcome to the You’re Fucked Index.™

In case you’re not familiar with the Trolley Problem, it’s an ethical thought experiment that puts a bystander in an uncomfortable position: a runaway trolley is barreling toward several people tied to the tracks, and you can pull a lever to divert it onto a side track where it will kill one person instead. There’s no right answer. Either way someone dies.

It has stayed a thought experiment because the scenario is so rare. But thanks to self-driving cars, it will now become a very real problem that has to be designed for. Tesla is already dealing with it thanks to the inevitable human deaths from their Autopilot mode. Serious AI people are looking at it. If you’re in a self-driving car that loses control, or if another one does, how should the car behave? Save you, or save the other car? What if saving you will plow you into another vehicle, killing its occupants? Your vehicle (and its manufacturer) will have to decide whether to possibly kill you.

The Trolley Problem isn’t real. It’s a thought experiment that exists precisely because the scenario is so rare. Car manufacturers who treat crashes that way are presuming the hypothetical is real, and that a car crash is a Heisenberg-type event that cannot be measured or predicted: that in a single instant, a car’s AI must make an impossible decision.

That’s not how our world works.

Insurance doesn’t work this way. Insurance weighs a pile of statistical information to determine the likelihood of you getting in a car crash and the damage you’ll cause, and it gives you a rating on it. Insurance doesn’t care about humanity.

Neither do car manufacturers. They care about the likelihood of a lawsuit, the damage to their brand, and the quality of their buyers.

AI doesn’t work this way either. We’ve seen tremendous leaps in the last couple of years in predictive and deep machine learning, achieved by researchers dumping as much aggregate data as possible into these systems: writing, creative direction, game playing, painting, stock trading. Even the self-driving car people are snorkeling up as much data as possible.


Let’s reframe the Trolley Problem:

You are given the opportunity to save one person (a) or kill a car-load (b-d) of others. Before you have to make that call, I’ll tell you that (a) donates to charity, has a loving, college-educated family, holds political views like yours, thought Interstellar wasn’t too bad of a movie, and shops at Target, where they have a predilection for Smart Water.

As for (b-d): the driver (b) has been in a few accidents, drinks more than three times a week (and drinks cheap shit vodka), draws unemployment, voted for Trump, characteristically speeds, cuts in line, and habitually pays their bills late. I’m not counting (c-d), because they made their own choice to let that person drive.

Who do you save?

This is what’s going to happen: driving AI will use the data it can see to make the best possible decision. The idea that the AI won’t be constantly tracking your entire driving history and decisions (and know who you’re about to hit, and their history) is ignorant.

Now you’re probably thinking: if the car is self-driving, how do your actions factor into things? Take away all ability of the human to influence the driving, and they can still make travel decisions that increase the likelihood of accidents. That’s the insurance industry’s entire premise.

Here’s what I humbly propose. The government will never be able to create a solid and unimpeachable rating factor for who should die. That’s a utopian ideal with horrid consequences, given how people think about politics.

It will fall back to the companies, and they’re already doing it (credit-worthiness, health insurance, auto insurance, etc).

Welcome to Mutually Assured Consumer Destruction.

In order to prevent the undue influence of, say, insurance or credit scores, the major indexes that determine how we judge what kind of person someone is will share their data.

Credit scores will be used to determine whether a person contributes to society, buys stuff, pays their bills, and what kinds of things they buy.

Social Media (such as Facebook) can layer on their 98 consumer profile datapoints. Job history. Families. Political leanings.

Credit/Social Media will act as the consumption side of our new number. They represent the social interests of companies, people and society.

Media companies will share their profiles. This will represent the likelihood that the people about to die would have been receptive to new products and offerings.

Health insurance will look at likelihood of death, incident reporting, and dependents. Risk insurance can use their own horrid numbers to figure out whether the survivors will sue, and whether the risk is there.

Healthcare can figure out the profitability margin of care (and lifetime of care) and the likelihood of survivability. It can be used to track diseases. The wealth of their genome? That matters too. If the person dies and the family sues, the risk is there.

Insurance acts as the risk assessment for these companies. If the damage is extensive and the businesses or survivors sue…

We’ll build in the ability for charities and schools and religions to build and share data profiles on people to track their humanity.

Car AI will handle the basic driving scores and profiles, as well as overlay the danger of areas for accidents.

Special Interest Lobbies will be given their own number, because they’re going to do it anyway, so this is a cute modifier that we can use to pay for everything.

All these players will build their profiles and share their numbers because they have a self-interest in protecting their consumers. The resulting score will be an aggregate, FICO-type score (which, naturally, we should only let a person see twice a year, or pay to see).

The You’re Fucked Index™

When you get into your car and push the drive button, the computer will calculate the danger and likelihood of incident along your route. You and everyone with you in the car will be given a YFI score (this can be weighted or aggregated if you’re, say, driving the Pope or Justin Bieber).

Every other self-driving car will have the same.

As the accident is occurring, the two cars will compare their scores.

The lowest score is Fucked.
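For concreteness, the whole pipeline can be sketched in a few lines of Python. To be clear, every weight, field name, and number below is invented for illustration; this is a toy model of the proposal, not any real scoring system.

```python
# Hypothetical sketch of a "You're Fucked Index" calculation. Every data
# source, weight, field name, and number here is invented for illustration.
from statistics import mean

# Made-up weights for each shared data profile feeding the aggregate score.
WEIGHTS = {
    "credit": 0.25,          # pays bills, buys stuff
    "social_media": 0.20,    # job history, family, political leanings
    "health": 0.20,          # survivability, cost and lifetime of care
    "risk_insurance": 0.15,  # likelihood the survivors sue
    "charity": 0.10,         # "humanity" profile from charities/schools/religions
    "driving": 0.10,         # driving history and route danger from the car AI
}

def yfi(profile: dict) -> float:
    """Aggregate one person's profile scores (each 0-100) into a single
    FICO-style number; missing data defaults to a neutral 50."""
    return sum(w * profile.get(key, 50) for key, w in WEIGHTS.items())

def car_score(passengers: list) -> float:
    """A car's score: here the mean of everyone on board, though it could
    just as easily be weighted (say, if you're driving the Pope)."""
    return mean(yfi(p) for p in passengers)

def who_is_fucked(car_a: list, car_b: list) -> str:
    """As the accident is occurring, the two cars compare their scores.
    The lowest score is Fucked."""
    return "car_a" if car_score(car_a) < car_score(car_b) else "car_b"

# Passenger (a): charitable, college-educated family, Smart Water at Target.
a = [{"credit": 80, "social_media": 75, "health": 70, "charity": 90}]
# Car (b-d): driver (b) with accidents, late bills, and cheap vodka.
bcd = [{"credit": 30, "driving": 20, "charity": 10},
       {"credit": 55},   # passenger (c): mostly unknown
       {"credit": 60}]   # passenger (d): mostly unknown

print(who_is_fucked(a, bcd))  # prints car_b
```

Note the design choice hiding in `car_score`: averaging ignores headcount, while summing would let a packed car outscore a lone saint. Which aggregation the companies pick quietly decides who dies.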

Like what you read? Give Dustin Davis a round of applause.