Author
Carolyn Mason
Publication date
2016
Publisher
Science Media Centre, Royal Society of New Zealand
Summary
“The researchers found that the majority of their research participants accept that it is morally right to program autonomous vehicles (AVs) to kill the occupant of the car to save ten pedestrians. As the researchers point out, this result agrees with utilitarianism, that is, the moral theory that holds that the right action is the one that, of all the available options, maximises happiness. It is also arguably consistent with other moral theories. Would a rational person want to live in a world where AVs were programmed to kill ten pedestrians rather than kill the occupant of the vehicle? If the answer is ‘no’, then according to Kantian ethics, programming AVs in this way would be immoral. It also seems reasonable to believe that a virtuous person would want to drive an AV that would kill the occupant of the vehicle rather than kill ten pedestrians. If so, this position is consistent with virtue ethics.
“Bonnefon et al. also found that the majority of their research participants would prefer to buy a car that would kill ten pedestrians rather than the occupant of the AV. In contrast, the majority of research participants both believed that it was ethical to program an AV to kill one pedestrian if doing so would save ten, and would prefer to own an AV that would do so. Bonnefon et al. conclude that “there seems to be no easy way to design algorithms that… reconcile moral values and… self-interest” (1576).