Designed to kill? The uncomfortable ethics of driverless cars

Should a driverless car ever kill its occupant for the greater good?

Nov 09, 2015
Driverless cars could disrupt traditional industries.

Human error causes more than 90 per cent of car crashes, so replacing a human driver with an autopilot could drastically reduce the road toll.

But removing the errors made by human drivers also means removing their capacity for ethical decision-making in real time – or, at least, delegating it to the manufacturer.

Associate Professor Robert Sparrow, of Monash University, warned last week’s driverless cars conference in Adelaide that in some circumstances, a driverless car should be designed to kill its occupant for the greater good.

The road toll, Sparrow argues, is always in a sense a public policy decision: a trade-off between speed, driver competence and the value of human life. Until the technology exists to prevent driverless cars from ever crashing, the machine will inevitably have to confront some terrifying ethical quandaries.

Kill the driver?

“We need to build ethics into driverless cars,” Sparrow said, because “sometimes, driverless cars will be forced to choose what to crash into”.

“We could make incredibly safe (cars) now by running them at 20 kilometres an hour,” he said.

“(However) there’s a sense in which road fatalities always represent a policy … trade-off between our speed, driver competence and the value of human life.”

Because driverless cars will inevitably have to confront deadly situations, “in some circumstances, your automated vehicle should be designed to kill the driver”, he said.

This could be necessary when “the choice is between crashing into pedestrians and crashing into a tree”.

“It should be designed (that way) if any other choice causes more fatalities,” said Sparrow.

“Bit of a hard thing to sell to people, though.”
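
To make the design choice concrete, the sketch below is a minimal, hypothetical fatality-minimising chooser. The names and figures are invented for illustration; this is not any manufacturer's actual decision logic. Given the crash options physically available, it simply picks the one with the lowest expected death toll, giving the occupant no special weight.

```python
# Hypothetical sketch of the rule Sparrow describes: among the crash
# options still available, pick whichever minimises expected fatalities,
# even if that option kills the occupant. All names and numbers are
# illustrative assumptions, not real decision logic.
from dataclasses import dataclass

@dataclass
class CrashOption:
    description: str
    expected_fatalities: float  # estimated deaths if this option is taken
    kills_occupant: bool        # whether the car's own occupant would die

def choose_crash(options: list[CrashOption]) -> CrashOption:
    """Return the option with the fewest expected fatalities,
    with no special weight for the occupant over bystanders."""
    return min(options, key=lambda option: option.expected_fatalities)

# Sparrow's pedestrians-versus-tree case, with invented figures:
options = [
    CrashOption("swerve into the pedestrians", expected_fatalities=2.0,
                kills_occupant=False),
    CrashOption("crash into the tree", expected_fatalities=1.0,
                kills_occupant=True),
]
print(choose_crash(options).description)  # -> crash into the tree
```

Under such a rule, “kill the driver” is not a special case: it is simply what falls out of the minimisation whenever the occupant's death is the lowest-toll option available.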

Following the law could make you a ‘target’

Another thought experiment Sparrow proposed was an unavoidable crash in which the car must hit one of two motorcyclists.

One of the riders is wearing a helmet, as per the law; the other is not.

Which rider should the driverless car choose to hit?

“We want to encourage motorcyclists to wear motorcycle helmets, but if you’ve got the choice of crashing into (one of) two motorcyclists, it’s less likely to produce a fatality if you crash into the guy who’s wearing the motorcycle helmet,” said Sparrow.

“So, if you’re a law-abiding (motor)cyclist, then you’ve just made yourself a target for the autonomous vehicles.

“That’s not exactly the incentive structure we’re looking for when we’re thinking about road accidents.”
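
The perverse incentive follows directly from the same minimisation. A self-contained toy version, with fatality probabilities invented purely for illustration:

```python
# The same hypothetical fatality-minimising rule applied to Sparrow's
# two-motorcyclist case. The probabilities are invented for illustration.
fatality_risk = {
    "hit the helmeted rider": 0.3,    # the helmet makes a death less likely
    "hit the unhelmeted rider": 0.8,
}
target = min(fatality_risk, key=fatality_risk.get)
print(target)  # -> hit the helmeted rider: obeying the law makes you the target
```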

Beware the driverless school bus

Sparrow asked the audience to imagine a driverless school bus placed in the ethical quandary of crashing into a tree or crashing into a single cyclist.

“It looks as though, if you’re really just concerned to save human lives, what you should do is (to) run over the one cyclist,” he said.

“That makes Google Bus a very dangerous thing to be around.

“It will almost always be safer for that machine to decelerate by crashing into you than it would be to decelerate by crashing into a fixed object.


“Again, you can fix all of these issues just by slowing the machine down.

“Eventually, I suspect, there will be a technological solution, but how many years (will pass) before we can really produce accident-free driverless vehicles?”
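
Sparrow's bus example reduces to the same arithmetic, with the passengers' lives weighing on one side of the ledger. With expected-fatality figures invented purely for illustration, the rule points the bus at the bystander:

```python
# Sparrow's school-bus case under the same hypothetical rule.
# Expected-fatality figures are invented for illustration only.
expected_fatalities = {
    "decelerate into the lone cyclist": 1.0,  # one likely death
    "decelerate into a tree": 3.0,            # a loaded bus into a fixed object
}
print(min(expected_fatalities, key=expected_fatalities.get))
# -> decelerate into the lone cyclist
```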

‘Drunken robots’

If human error is the cause of most car crashes, should humans be allowed to drive at all?

Sparrow argued that, eventually, manual driving should be made illegal, for the greater good.

“If you can show me that replacing our existing vehicle fleet with a driverless vehicle fleet will reduce road fatalities, then I’m all for it,” he said.

“Eventually, a human driver will be like a drunken robot.

“If you take the safety argument seriously, then this should be mandated, and we should be moving to take the existing fleet off the road as soon as possible.

“If driverless vehicles are safer, then it should be illegal to drive.”

Is it the road’s fault?

Last week, Transport Minister Stephen Mullighan revealed several of the state’s most expensive roads projects would be “hard-wired” to communicate with semi-autonomous vehicles.

But if a human being is “partially” driving the car, which, itself, is “communicating” with the infrastructure in the road, as well as with other driverless vehicles, who, or what, is responsible if there’s a crash?

“It’s one thing to say that the company who sold me the car is responsible for the road fatality (but) if that car is communicating with the infrastructure of the road (and) with other vehicles then there’s a tremendous dispersion of responsibility,” Sparrow said.

“I think (that) is going to be really difficult.

“It’s one thing to say ‘you’re driving my car, you’re responsible for what happens as a result’… it’s another thing to say ‘this car doesn’t have a steering wheel, (the manufacturer is) driving it’.”

How concerned should we be?

Sparrow said that, “at the end of the day”, he was not overly concerned by these issues, because removing human error from driving would undoubtedly make the roads safer.

However, he said, “taking the driver out of the (equation) places somebody else in the situation of decision-maker,” and “you really want to read the fine print if you’re getting into this vehicle”.
