When we talk about self-driving cars (cars controlled by computers), people often jump to the “what if” scenarios.
The one they reach for most is: “What if the car has to choose between running over a pedestrian or an animal, or swerving into a wall and thereby killing or badly injuring the driver?”
I often hear this “what if” posed to show just how dangerous self-driving cars will be. People ask: how will the computer make that decision? Whose life will it value more?
But here’s another interesting question: what would YOU do if you had to choose between running over a pedestrian and driving your car into a wall? What would any human do in that situation?
What would you do if you had to choose between someone else’s life and your own well-being?
It’s telling that we demand to know what a computer would do, yet we rarely stop to consider how humans make decisions that impact life and death.