Should a car be able to decide who to kill and who to spare?

The first known fatality involving automated driving occurred this week, when the driver of a Tesla Model S was killed while the car was operating on its Autopilot setting. Neither the driver nor the Autopilot system recognised a tractor-trailer crossing the road, as its white side blended into the brightly lit sky.

With large car makers and technology companies investing heavily in self-driving technology to improve safety, convenience and comfort for drivers, they must also confront the ethical implications of the software that controls these vehicles. The question of who bears responsibility for deaths caused by these cars is complicated further by the variety of laws across different countries. Car manufacturers are therefore researching this question cautiously, taking into account the philosophies and opinions of ethicists.

For example, PA Consulting conducted a survey asking people whether an autonomous vehicle should sacrifice its occupants to prevent the deaths of pedestrians in the event of an accident; 76% of respondents said yes. However, when the same respondents were asked whether they would buy a car that prioritised pedestrians' safety over their own, they were far less enthusiastic.