Nearly 400 crashes in the United States in 10 months have involved cars using advanced driver assistance technologies, the federal government’s top auto safety regulator revealed on Wednesday.
The findings are part of a broad effort by the National Highway Traffic Safety Administration to determine the safety of advanced driving systems as they become more common.
In 392 incidents recorded by the agency from July 1 last year to May 15, six people died and five were seriously injured. Teslas running Autopilot, the more ambitious Full Self Driving mode or one of their associated features were involved in 273 of the crashes. Five of those Tesla crashes were fatal.
The data was collected as part of an NHTSA order issued last year requiring automakers to report crashes involving cars equipped with advanced driver assistance systems. Dozens of manufacturers have rolled out such systems in recent years, including features that let drivers take their hands off the wheel in certain conditions and that help them parallel park.
NHTSA’s order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with automakers.
“Until last year, NHTSA’s response to autonomous vehicles and driver assistance was frankly passive,” said Matthew Wansley, a professor at New York’s Cardozo School of Law who specializes in emerging automotive technologies. “This is the first time the federal government has directly collected crash data from these technologies.”
Speaking to reporters ahead of Wednesday’s release, NHTSA Administrator Steven Cliff said the data, which the agency will continue to collect, “will help our investigators quickly identify potential patterns of emerging defects.”
Dr. Cliff said NHTSA would use the data as a guide in establishing rules or requirements for the design and use of these systems. “These technologies hold great promise for improving safety, but we need to understand how these vehicles perform in real-world situations,” he said.
But he cautioned against drawing conclusions from the data collected so far, noting that it does not take into account factors such as how many cars from each manufacturer are on the road and equipped with these types of technologies.
An advanced driver assistance system can steer, brake and accelerate vehicles on its own, although drivers must remain alert and ready to take control of the vehicle at all times.
Safety experts are concerned that these systems allow drivers to relinquish active control of the car and could lull them into thinking their car is driving itself. When technology malfunctions or cannot handle a particular situation, drivers may not be ready to take control quickly.
About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver assistance technologies, which helps explain why Tesla vehicles account for nearly 70% of the crashes reported in the data published on Wednesday.
Ford Motor, General Motors, BMW and others have similar advanced systems that allow hands-free driving in certain highway conditions, but far fewer of these models have been sold. These companies, however, have sold millions of cars over the past two decades that are equipped with individual components of driver assistance systems. Components include something called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which adjusts a car’s speed and automatically brakes when traffic slows.
In Wednesday’s statement, NHTSA said Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, GM, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
The data includes cars with systems designed to operate with little or no driver input, and separate data on systems that can simultaneously steer and control the speed of the car but require constant driver attention.
Automated vehicles, which are mostly still in development but are being tested on public roads, were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries and 108 in no injuries. Many of the crashes involving automated vehicles were fender benders or bumper taps, because the vehicles are operated mainly at low speeds and in city driving.
In more than a third of the 130 accidents involving the automated systems, the car was stopped and hit by another vehicle. In 11 crashes, a car with such technology was traveling straight ahead and collided with another vehicle changing lanes, the data showed.
Most incidents involving automated systems occurred in San Francisco or the Bay Area, where companies like Waymo, Argo AI and Cruise are testing and refining the technology.
Waymo, which is owned by Google’s parent company and runs a fleet of driverless taxis in Arizona, was involved in 62 incidents. Cruise, a division of GM, was involved in 23. Cruise recently began offering driverless taxi rides in San Francisco, and this month it received permission from California authorities to start charging passengers.
None of the cars using the automated systems were involved in fatal crashes, and only one crash resulted in a serious injury. In March, a cyclist struck a Cruise-operated vehicle from behind as both traveled down a street in San Francisco.
NHTSA’s order for automakers to submit the data was prompted in part by crashes and deaths over the past six years involving Teslas operating on Autopilot. Last week, NHTSA expanded an investigation to determine whether Autopilot has technological and design flaws that pose safety risks.
The agency has investigated 35 crashes that occurred while Autopilot was engaged, including nine that have resulted in 14 deaths since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped with their lights flashing.
In November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of Full Self Driving — a version of Autopilot designed for use on city streets — after rolling out a software update that, the company said, could cause crashes by unexpectedly activating the cars’ emergency braking system.
The NHTSA order required companies to provide crash data when advanced driver assistance systems and automated technologies were used within 30 seconds of impact. Although this data provides a broader picture than ever before of the behavior of these systems, it is still difficult to determine whether they reduce accidents or improve safety.
The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations. Automakers were allowed to redact descriptions of what happened in crashes, an option that Tesla as well as Ford and others used regularly, making it harder to interpret the data.
Some independent studies have explored these technologies, but have not yet shown whether they reduce accidents or improve safety.
J. Christian Gerdes, professor of mechanical engineering and director of Stanford University’s Center for Automotive Research, said the data released Wednesday was helpful, up to a point. “Can we learn more from this data? Yes,” he said. “Is it a real gold mine for researchers? I don’t see that.”
Because of the redactions, he said, it was difficult to gauge the ultimate usefulness of the results. “NHTSA has a much better understanding of this data than the general public can get by just looking at what’s been published,” he said.
Dr. Cliff, the NHTSA administrator, was wary of drawing conclusions from the results. “Data can raise more questions than it answers,” he said.
But some experts said the newly available information should prompt regulators to be more assertive.
“NHTSA can and should use its various powers to do more — rulemaking, star ratings, investigations, further investigations and soft influence,” said Bryant Walker Smith, an associate professor in the schools of law and engineering at the University of South Carolina who specializes in emerging transportation technologies.
“This data could also lead to other voluntary and involuntary disclosures,” he added. “Some companies would gladly provide more context, including miles traveled, crashes ‘avoided’ and other indicators of good performance. Prosecutors will look for patterns and even cases in this data.”
Overall, he said, “it’s a good start.”
Jason Kao, Asmaa Elkeurti and Vivian Li contributed research and reporting.