Safety Expectations Too High for Self-Driving Cars?
Automated cars should help reduce crashes by being more alert than human drivers are. But researchers at the University of Michigan Transportation Research Institute say the safety expectations for self-driving cars may be too optimistic.
Analysts Michael Sivak and Brandon Schoettle point to several areas of concern. For one, they note that some crashes are due to vehicle failure, not human error. They argue that the added technological complexity of self-driving cars could make such vehicles more prone to equipment failure than conventional ones.
They also observe that human drivers acquire predictive skills as they gain experience on the road. The researchers question whether such capabilities can be programmed into a computer-controlled car well enough to surpass the skill of today's safest group: middle-aged drivers.
Finally, the UMTRI researchers fret over the dynamics of the decades-long transition period during which conventional and autonomous vehicles will share the road. They note that drivers often rely upon eye contact with each other to determine how to proceed. But when self-driving and conventional cars are mixed together, they theorize, the lack of such feedback could make driving a conventional vehicle less safe.
RELATED CONTENT
- U.S. in No Hurry to Regulate Autonomous Vehicles
  The National Highway Traffic Safety Administration says the emerging technology involved in self-driving cars is too new to be tightly regulated.
- Bill on Self-Driving Cars Stalls in Senate
  Congressional efforts to make it easier to develop self-driving cars in the U.S. have stalled in the Senate despite strong bipartisan support.
- Self-Driving Chevy Bolt Ticketed for Driving Too Close to Pedestrian
  Police in San Francisco ticketed the backup driver of a self-driving Chevrolet Bolt for allowing the car to drive too close to a pedestrian in a crosswalk.