Self-Driving Car Thought Experiment.


You are driving in the middle lane of a three-lane motorway.

To the left, there is an SUV. To the right, there is a motorcyclist. Directly in front, there is a delivery truck.

Without warning, the load of crates the truck is carrying comes loose and falls into the lane ahead. It happens too quickly, and your car is travelling too fast for braking alone to prevent a crash.

Your self-driving car can swerve into the right lane, hitting the motorcyclist, which would likely be fatal to them. However, this option would likely injure the fewest other people.

Your car can stay in the middle lane and crash into the fallen crates, likely killing you, the passenger. However, this causes the least harm to those around you.

Or your car can swerve to the left into the SUV, which is carrying a family. Unlike the motorcyclist, the SUV's occupants have more protection. However, hitting the SUV increases the chance of injuring several people at once.
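For anyone who thinks in code, here is a minimal sketch of what a purely utilitarian version of this decision might look like. Everything in it is an invented assumption for illustration: the probabilities, the injury counts, the fatality weight, and the Option structure are not from any real vehicle or standard.

```python
# A hedged, purely illustrative harm-minimisation policy.
# Every number below is a made-up assumption, not real data.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    p_fatality: float        # assumed probability the manoeuvre kills someone
    expected_injuries: float  # assumed number of people injured

# Hypothetical figures for the three manoeuvres in the scenario.
OPTIONS = [
    Option("swerve_right_into_motorcyclist", p_fatality=0.9, expected_injuries=1.0),
    Option("stay_in_lane_hit_crates",        p_fatality=0.7, expected_injuries=1.0),
    Option("swerve_left_into_suv",           p_fatality=0.1, expected_injuries=3.0),
]

FATALITY_WEIGHT = 10.0  # arbitrary choice: how much worse a death is than an injury

def expected_harm(opt: Option) -> float:
    """Score an option as weighted expected harm (lower is better)."""
    return FATALITY_WEIGHT * opt.p_fatality + opt.expected_injuries

choice = min(OPTIONS, key=expected_harm)
print(f"Chosen manoeuvre: {choice.name} (harm score {expected_harm(choice):.1f})")
```

With these made-up numbers the policy swerves into the SUV; nudge FATALITY_WEIGHT or any probability and it picks a different victim. That sensitivity is exactly what makes the poll question below hard: someone has to choose the weights in advance.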

Poll Question:

Who should programmers program the car to injure or kill in an unavoidable crash?

Bonus:

Should the people programming self-driving cars be prosecuted for preemptive homicide?

With this scenario in mind, would it be ethical to use self-driving cars?

Vote below to see results!