Self-driving and driver-assistance technology linked to hundreds of car crashes
Over the course of 10 months, nearly 400 car crashes in the United States involved advanced driver-assistance technologies, the federal government’s top auto-safety regulator revealed on Wednesday, in its first large-scale release of data on these increasingly common systems.
In 392 incidents cataloged by the National Highway Traffic Safety Administration from July 1 last year to May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode or any of their associated component features were involved in 273 crashes.
The revelations are part of a comprehensive effort by the federal agency to determine the safety of advanced driving systems as they become increasingly common. Beyond the futuristic allure of self-driving cars, many automakers have rolled out automated components in recent years, including features that allow drivers to take their hands off the steering wheel under certain conditions and that help them parallel park.
In Wednesday’s release, NHTSA revealed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
“These technologies promise to improve safety, but we need to understand how these vehicles perform in real-world situations,” said Steven Cliff, the agency’s administrator. “This will help our investigators quickly identify potential defect trends that are emerging.”
Dr. Cliff, speaking to reporters ahead of Wednesday’s release, also warned against drawing conclusions from the data collected so far, noting that it does not account for factors such as the number of cars from each manufacturer that are on the road and equipped with these types of technologies.
“The data can raise more questions than they answer,” he said.
Approximately 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assistance technologies — which offers one explanation for why Teslas accounted for almost 70 percent of the reported crashes.
Ford, GM, BMW and others have similar advanced systems that allow hands-free driving under certain conditions on highways, but far fewer of those models have been sold. These companies have, however, sold millions of cars over the last two decades equipped with individual driver-assistance features. They include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.
Dr. Cliff said NHTSA would continue to collect crash data involving these types of features and technologies, noting that the agency would use it as a guide in creating any rules or requirements for how they should be designed and used.
The data was collected under an order NHTSA issued a year ago requiring automakers to report crashes involving cars equipped with advanced driver-assistance systems, also known as ADAS or Level 2 automated driving systems.
The order was prompted in part by crashes and fatalities over the last six years involving Teslas operating in Autopilot. Last week, NHTSA expanded an investigation into whether Autopilot has technological and design flaws that pose a safety risk. The agency has looked at 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas using Autopilot crashed into emergency vehicles that had stopped with their lights flashing.
Under the order issued last year, NHTSA also collected data on crashes or incidents involving fully automated vehicles that are still largely in development but are being tested on public roads. The manufacturers of these vehicles include GM, Ford and other traditional automakers, as well as technology companies such as Waymo, which is owned by Google’s parent company.
These types of vehicles were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were fender benders or bumper taps because the vehicles are mainly operated at low speeds and in city driving.
Waymo, which operates a fleet of driverless taxis in Arizona, was involved in 62 incidents. GM’s Cruise division, which has just begun offering driverless taxi rides in San Francisco, was involved in 23. One minor crash involving an automated test vehicle made by Pony.ai, a startup, resulted in the recall of three of the company’s test vehicles for a software fix.
NHTSA’s order was an unusually bold move for the regulator, which has come under fire in recent years for not being more assertive with automakers.
“The agency collects information to determine if these systems in the field pose an unreasonable risk to safety,” said J. Christian Gerdes, professor of mechanical engineering and director of Stanford University’s Center for Automotive Research.
The problems with Tesla’s Autopilot system
Claims of safer driving. Tesla cars can use computers to handle some aspects of driving, such as changing lanes. But there are concerns that this driver-assistance system, called Autopilot, is not safe. Here is a closer look at the issue.
An advanced driver-assistance system can steer, brake and accelerate a vehicle on its own, though drivers must stay alert and be ready to take control of the vehicle at any time.
Safety experts are concerned because these systems allow drivers to relinquish active control of the car and may lull them into believing that their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may be unprepared to take control quickly.
NHTSA’s order required companies to provide data on crashes in which advanced driver-assistance systems and automated technologies were in use within 30 seconds of impact. Although this data gives a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety.
The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than not using them in the same situations.
“The question is: What is the baseline against which we’re comparing this data?” said Dr. Gerdes, the Stanford professor, who from 2016 to 2017 was the first chief innovation officer for the Department of Transportation, of which NHTSA is a part.
Some experts, however, say that comparing these systems with human driving should not be the goal.
“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less often than other planes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.
“Crashes on our roads are equivalent to several plane crashes every week,” he added. “Comparison is not necessarily what we want. If there is a crash that these driving systems contribute to — a crash that otherwise would not have happened — that’s a potentially fixable problem that we need to know about.”