January 25, 2019


A Study on Driverless-Car Ethics Offers a Troubling Look Into Our Values (Caroline Lester, Jan. 12th, 2019, The New Yorker)

We should be wary of drawing broad conclusions from the geographical differences, particularly because about seventy per cent of the respondents were male college graduates. Still, the cultural differences were stark. Players in Eastern-cluster countries were more likely than those in the Western and Southern countries to kill a young person and spare an old person (represented, in the game, by a stooped figure holding a cane). Players in Southern countries were more likely to kill a fat person (a figure with a large stomach) and spare an athletic person (a figure that appeared mid-jog, wearing shorts and a sweatband). Players in countries with high economic inequality (for example, in Venezuela and Colombia) were more likely to spare a business executive (a figure walking briskly, holding a briefcase) than a homeless person (a hunched figure with a hat, a beard, and patches on his clothes). In countries where the rule of law is particularly strong--like Japan or Germany--people were more likely to kill jaywalkers than lawful pedestrians. But, even with these differences, universal patterns revealed themselves. Most players sacrificed individuals to save larger groups. Most players spared women over men. Dog-lovers will be happy to learn that dogs were more likely to be spared than cats. Human-lovers will be disturbed to learn that dogs were more likely to be spared than criminals.

In its discussion, the paper skims over the uglier aspects of the study to identify "three strong preferences" that might provide a starting point for developing a standardized machine-ethics framework: sparing human lives, sparing more lives, and sparing young lives. The paper concludes with a soaring look into the future, and recasts machine ethics as a "unique opportunity to decide, as a community, what we believe to be right or wrong; and to make sure that machines, unlike humans, unerringly follow these moral preferences." But, when I asked Shariff what he thought of the human prejudice shown in the data, he laughed and said, "That suggests to us that we shouldn't leave decisions completely in the hands of the demos."
