Emerging Technologies

How self-driving cars can crash ethically


Can self-driving cars program their way out of sticky situations? Image: REUTERS/Stephen Lam

Andrew Nusca
Digital editor, Fortune


Imagine you’re in a self-driving car in the middle of a busy freeway, say the 101, in Los Angeles. Imagine that you suddenly have to swerve because a couch falls off a truck driving in front of you. Stomping on the brakes isn’t an option—behind you is a big rig that’s been tailgating you for the last mile. To your left is a sport-utility vehicle that could probably handle the impact if you swerve into it. To your right is a sedan that might not.

You have to swerve either way. So: left or right?

“These are decisions that need to be thought about or programmed in advance,” said Patrick Lin, director of the Ethics and Emerging Sciences Group at California Polytechnic State University. “Either way leads to problems.” In either case, you’re targeting a vehicle because of its class, through no fault of its driver.
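To make that concrete, here is a deliberately crude sketch, in Python, of what “programmed in advance” can mean. Everything in it is hypothetical: the harm scores, the vehicle classes, and the choose_swerve rule are invented for illustration, not drawn from any real vehicle’s software.

```python
# Hypothetical sketch of a pre-programmed swerve choice. The scores below
# are invented placeholders, not real crash data or any vendor's policy.
from dataclasses import dataclass

# Assumed relative harm of striking each vehicle class (made-up values).
ASSUMED_HARM = {
    "suv": 0.3,        # assumed better able to absorb an impact
    "sedan": 0.6,
    "motorcycle": 0.9,
}

@dataclass
class Neighbor:
    side: str            # "left" or "right"
    vehicle_class: str   # key into ASSUMED_HARM

def choose_swerve(neighbors: list[Neighbor]) -> str:
    """Swerve toward the neighbor whose class has the lowest assumed harm."""
    return min(neighbors, key=lambda n: ASSUMED_HARM[n.vehicle_class]).side

if __name__ == "__main__":
    scene = [Neighbor("left", "suv"), Neighbor("right", "sedan")]
    print(choose_swerve(scene))  # -> "left": the SUV is targeted by design
```

The value judgment lives in the table of scores and in the decision to rank by them; a programmer made both choices long before the couch fell off the truck.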

That’s the problem facing autonomous, a.k.a. self-driving, cars. At an invite-only dinner co-hosted by Fortune and The Drive at the 2016 Los Angeles Auto Show, experts agreed that autonomous technologies, which promise to let us sit back while the car takes the wheel, come with challenges of their own.

Take safety. “That’s one of the big opportunities for automated vehicles,” said Chris Gerdes, chief innovation officer for the U.S. Department of Transportation. “It could have a huge impact on saving lives.”

Ninety-four percent of so-called last actions leading up to an automotive collision are the result of human judgment (read: errors), Gerdes said. “Self-driving cars have this promise of removing the human from that equation,” he said. “That’s not trivial.”

The catch: with self-driving cars, you’ve shifted the error from human drivers to human programmers, Gerdes said. Machine learning techniques can improve the result, but they aren’t perfect.

And then there are ethical concerns. If you program a collision, that means it’s premeditated, Lin said. Is that even legal? “This is all untested law,” he said.

On the other hand, automated vehicles can see the environment around them in 360 degrees, Gerdes said. Humans can’t do that. Self-driving cars won’t drink too much at dinner, and they won’t get tired.

“If you put a vehicle on an open road, I suspect an automated one will be much [safer] than a human[-driven] one,” Gerdes said.

Image: Boston Consulting Group

He referenced the so-called trolley problem. The gist: five people are riding a trolley car, and a pedestrian crosses the tracks in front of it. In an inevitable crash, do you let the trolley run off course and potentially kill its five passengers? Or do you throw a switch and almost certainly kill the pedestrian?
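Framed as pure engineering, that choice reduces to arithmetic over expected casualties. The toy comparison below shows the shape of the calculation; every probability in it is invented for illustration.

```python
# Toy expected-harm comparison for the trolley variant described above.
# Both probabilities are made up; only the structure of the math matters.

def expected_harm(p_fatal: float, people_at_risk: int) -> float:
    """Expected fatalities for one option: probability times people exposed."""
    return p_fatal * people_at_risk

# Option A: stay the course; the trolley may crash with five passengers aboard.
stay = expected_harm(p_fatal=0.2, people_at_risk=5)     # 1.00 expected deaths

# Option B: throw the switch; the lone pedestrian is almost certainly killed.
switch = expected_harm(p_fatal=0.95, people_at_risk=1)  # 0.95 expected deaths

print(f"stay course: {stay:.2f}, throw switch: {switch:.2f}")
```

On these made-up numbers the switch “wins,” yet nothing in the calculation captures premeditation, consent, or who bears responsibility, which is where the conversation goes next.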

That’s an engineering question, Gerdes said. But what happens when you take more safety precautions—say, you post a sign saying that it’s dangerous for pedestrians to cross the trolley tracks? Does that change the equation?

That’s an ethics question, Lin said. “I think ethics is vitally important in these engineering problems but it’s not necessarily a literal translation of philosophical problems,” he said. People hate the trolley-car conceit “because it’s so fake,” he added. Ethics experiments tend to isolate variables. The real world is not so clear-cut.

Still, any time you program two tons of steel and glass moving at 60 m.p.h. to do something, you’re making an ethics call, Lin said. “Ethics are not facts,” he said. “These are things that need to be debated.”
