March 12, 2019

THEN THERE'S THE BEN SHAPIRO RULE...:

How Should Self-Driving Cars Choose Who Not to Kill?: A popular MIT quiz asked ordinary people to make ethical judgments for machines (MORGAN MEAKER, Feb 15, 2019, Medium)

In an earlier study, you found that people thought an autonomous vehicle should protect the greater number of people, even if that meant sacrificing its passengers. But they also said they wouldn't buy an autonomous car programmed to act this way. What does this tell us?

Azim Shariff: People recognize it is more ethically responsible to save more lives. But people are self-interested, and it might be a hard sell to do what's ethical. When Mercedes-Benz said that if they could only save one person, they would save the driver and not the pedestrian, public outrage made them retract that statement. This demonstrates an interesting dilemma for car companies. If you say your car would prioritize the passenger, there will be public outrage. If you decide to treat all life equally but imperil the life of the person who bought the car, you might actually take a hit to your bottom line; people might not buy that car. The manufacturers we've talked to want the decision taken out of their hands; they want regulation.


...where you just release the wheel so as to have no moral culpability for the results.

Posted by at March 12, 2019 3:30 AM
