

The huge, unexpected ethical question self-driving cars will have to tackle

Would it rather hit a car full of passengers or a wall, injuring its own passenger?

SELF-DRIVING CARS don’t just pose technological and regulatory challenges.

There are ethical questions that need answering, too.

Like, if an autonomous vehicle finds itself in a situation where it will either hit a person in the road or an oncoming car, which will it choose?

Or, if one or the other were inevitable, would it rather hit another car full of passengers, or crash itself into a wall, potentially hurting its occupants?

Lesser of two evils

A recent study called “Autonomous vehicles need experimental ethics,” highlighted by MIT Technology Review, explores these questions and more.

“Some situations will require AVs to choose the lesser of two evils,” the study reads.

For example, running over a pedestrian on the road or a passer-by on the side; or choosing whether to run over a group of pedestrians or to sacrifice the passenger by driving into a wall. It is a formidable challenge to define the algorithms that will guide AVs confronted with such moral dilemmas. In particular, these moral algorithms will need to accomplish three potentially incompatible objectives: being consistent, not causing public outrage, and not discouraging buyers.

The study doesn’t try to dictate the “right” or “wrong” answer to any of these questions, but makes the point that how companies answer them will have an impact on the adoption of autonomous vehicles.

Who’s responsible? 

If you knew that your self-driving car was pre-programmed to sacrifice you, the occupant, rather than a group of pedestrians in an inevitable crash situation, would that make you less likely to want one? And if a car’s control algorithm decides what to hit, who would be held legally responsible — the passenger or the manufacturer?


The researchers — Jean-François Bonnefon, Azim Shariff, and Iyad Rahwan from France’s Toulouse School of Economics — conducted three surveys showing that people seem to be “relatively comfortable” with the idea that autonomous vehicles should be “programmed to minimise the death toll in case of unavoidable harm.”

MIT Technology Review previously reported that a professor at Stanford University led a workshop earlier this year, bringing together philosophers and engineers to discuss this very issue.

“It is going to kill”

“The biggest ethical question is how quickly we move,” Bryant Walker Smith, an assistant professor at the University of South Carolina who studies the legal and social implications of self-driving cars, told MIT Technology Review. Every year, 1.2 million people die in car crashes, so he believes that moving forward too slowly with self-driving car technology is an ethical problem in its own right:

“We have a technology that potentially could save a lot of people, but is going to be imperfect and is going to kill.”

Business Insider reached out to Google to see if its researchers have been tackling these ethical questions and will update if we hear back.

- Jillian D’Onfro.

Read: Google’s self-driving cars have been in 12 accidents, but none were their fault

Read: Tesla’s autopilot mode for cars ‘hopefully’ won’t hit pedestrians, says CEO

Published with permission from
Business Insider