
Why are self-driving cars getting in crashes? They're too good at following the law

[Image: Volvo autonomous car]
The problem with perfection? Humans just aren’t. And that tension is creating real problems for the all-too-perfect autonomous car industry. According to a new report from Bloomberg, self-driving vehicles have a crash rate twice as high as that of human-driven vehicles, and the reason is all too ironic: autonomous cars are just too good at obeying the law. And sadly, humans aren’t.

This is great news considering California has just required humans to “be present inside the vehicle and be capable of taking control in the event of a technology failure or other emergency.” But it isn’t a technology failure that seems to be causing the problem. Rather, it’s the rigidity with which these cars have been programmed to behave, which unfortunately doesn’t allow for any error, human or machine.


So now, the big question among carmakers and programmers has taken a strange turn: should cars be taught how to break the law in order to protect themselves, their passengers, and other drivers? “It’s a constant debate inside our group,” Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, told Bloomberg. While autonomous vehicles are taught to stick to speed limits, signal carefully, and do all the things our driver’s ed teachers told us to do, humans aren’t as careful. So when these self-driving cars try to merge onto the highway without speeding up, problems tend to arise.

Self-driving cars haven’t been at fault in any of the accidents they’ve been in, but the fact remains that they’re still involved in accidents, some more serious than others.

According to a study from the University of Michigan’s Transportation Research Institute in Ann Arbor, “Driverless vehicles [are] … usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.” So unless we’re planning on reprogramming humans at large, this could be a long-term problem for autonomous cars.

But programming cars to be a bit more daring can also quickly create a new suite of issues. “It’s a sticky area,” said Brandon Schoettle, who co-authored the Michigan study. “If you program them to not follow the law, how much do you let them break the law?”
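To make Schoettle’s question concrete, here’s a minimal, purely hypothetical sketch of what that policy knob might look like in code. Nothing here reflects any real manufacturer’s software; the function name, the margin parameter, and the numbers are all invented for illustration.

```python
# Hypothetical illustration of the trade-off described above. The
# over_limit_margin_mph parameter is the "how much do you let them
# break the law?" knob; names and values are invented, not any
# vendor's actual code.

def target_merge_speed(speed_limit_mph: float,
                       traffic_flow_mph: float,
                       over_limit_margin_mph: float = 0.0) -> float:
    """Return a merge speed that keeps pace with surrounding traffic,
    capped at the posted limit plus an explicitly allowed margin."""
    cap = speed_limit_mph + over_limit_margin_mph
    return min(traffic_flow_mph, cap)

# A strictly law-abiding car merges at 55 mph into 65-mph traffic ...
print(target_merge_speed(55, 65, over_limit_margin_mph=0.0))  # 55.0
# ... while a 5-mph margin lets it come closer to matching the flow.
print(target_merge_speed(55, 65, over_limit_margin_mph=5.0))  # 60.0
```

The point of the sketch is that any such leniency has to be an explicit, tunable number somewhere, which is exactly why it’s a “sticky area”: someone has to decide what that number is and defend it.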

These are questions that will certainly need to be answered before driverless cars start sharing our highways en masse.

Lulu Chang