On November 20th, the National Transportation Safety Board (NTSB) released the results of its investigation into the fatal 2018 Uber crash in Tempe, Arizona, believed to be the world's first pedestrian death involving a self-driving car.
But rather than pinning the blame on Uber's robotic car, investigators instead highlighted the many human errors that culminated in the death of 49-year-old Elaine Herzberg. And they sounded a warning: it could happen again.
"If your company tests automated driving systems on public roads, this crash was about you," NTSB chairman Robert Sumwalt said in his opening statement at the hearing yesterday.
When the board read its findings on the probable cause of the crash in Tempe, much of the blame fell on Rafaela Vasquez, the operator of the vehicle at the time of the collision. Vasquez was never called out by name, but her failure as a monitor of the automated driving system was put on stark display by the NTSB.
In the minutes before the impact, Vasquez allegedly streamed an episode of The Voice on her phone, in violation of Uber's policy prohibiting phone use behind the wheel. In fact, investigators determined she had looked down at her phone, and away from the road, for over a third of the total time she had been in the car up to the moment of the crash.
Vasquez's failure to "monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal cell phone" was cited as the leading cause of the crash. But she shares the blame with her employers at Uber, where a woefully inadequate safety culture also contributed to Herzberg's death, the board said. Likewise, the federal government bore its share of responsibility for failing to better regulate autonomous vehicle operations.
"In my opinion, they've put technology advancement here before saving lives," NTSB board member Jennifer Homendy said of the National Highway Traffic Safety Administration (NHTSA), which is responsible for regulating vehicle safety standards.
At the time of the crash, Uber's Advanced Technologies Group had no standalone safety plan or guidance document defining the roles and responsibilities of individual employees in managing safety, said Michael Fox, senior highway accident investigator at the NTSB. The company also lacked a safety division and had no dedicated safety manager responsible for risk assessment and mitigation. In the weeks before the crash, Uber made the fateful decision to reduce the number of operators in each vehicle from two to one, removing an important redundancy that could have helped prevent Herzberg's death.
Not only was Vasquez alone in the car at the time, but her complacent attitude toward the car's automated driving system also contributed to the crash. And that attitude was badly misplaced. The car's sensors detected Herzberg crossing the street with her bike 5.6 seconds before the collision. But even though the system continued to track Herzberg right up to the crash, it never correctly identified her as a pedestrian in the road, nor did it accurately predict her path.
"Automation complacency … needs to be in everyone's vocabulary," said Bruce Landsberg, an NTSB board member.
Vasquez was completely unaware of this failure until it was too late. One of the implicit lessons of the Uber crash and the subsequent NTSB investigation is the underutilization of safety drivers in self-driving vehicles, said Mary "Missy" Cummings, director of the Humans and Autonomy Lab at Duke University. Uber and other companies testing self-driving cars typically hire independent contractors as safety drivers to ride around in the cars and rack up miles. They are seen as little more than bodies in seats. Instead, they should be treated as critical partners in a testing protocol who can provide valuable feedback, Cummings said.
"Of course, this will cost money," she said. "Despite everyone's lip service that safety is crucial, no one I know of supports safety drivers in this way."
Uber's aggressive corporate culture – its drive to test autonomous vehicles even when the technology wasn't ready – has been exposed not only through this investigation, but also through the lawsuit filed by Waymo, the self-driving company spun out of Google, which accused Uber of stealing its self-driving trade secrets.
"The driver had one chance to save her life," a former Uber ATG employee who was with the company at the time of the crash told The Verge, "but Uber had dozens."
Uber ATG was under tremendous pressure to show results to the company's new CEO, Dara Khosrowshahi, who reportedly considered shutting down the division because of its rising R&D costs. That pressure led the group to cut corners, though the company has since made great strides in addressing these failures.
NTSB board members reserved their most blistering criticism for the federal government. Homendy accused NHTSA of prioritizing technological progress over saving lives, and she called the agency's voluntary guidance "laughable."
The voluntary safety guidelines were first introduced under President Obama, who feared that restrictive rules for self-driving car testing could stifle innovation. Those rules have become even more relaxed under President Trump, who went a step further by eliminating an all-star federal vehicle automation committee that was meant to serve as a "critical resource" for the Department of Transportation. Trump axed the committee without even telling any of its members, The Verge recently reported.
So far, only 16 companies have submitted voluntary safety reports to NHTSA, many of which amount to little more than "marketing brochures," said Ensar Becic, project manager and human performance investigator in the NTSB's Office of Highway Safety. That represents only a fraction of the more than 60 companies testing self-driving cars in California alone.
"I mean, you might as well say we want your opinions, but we don't really need them," Homendy said during the hearing. "So why do it?"
The Department of Transportation has released three versions of its automated vehicle safety guidance, and it plans to release a fourth version containing lessons learned from the Tempe crash, Joel Szabat, acting under secretary for policy at the department, said during a Senate hearing on November 20th. (The original document is titled "Automated Driving Systems: A Vision for Safety." "They ought to rename it 'A Vision for Lack of Safety,'" Homendy quipped.)
Today there are no federal laws requiring AV operators to demonstrate the safety of their vehicles before testing them on public roads, nor to provide data on disengagements or failures of their automated driving systems. The government's role is purely reactive: recalling a part if it proves defective, or opening an investigation after a crash.
In its final report, the NTSB recommends changing that. AV operators should be required, not merely encouraged, to submit safety assessments before testing their vehicles on public roads, it argues, and there should be a continuous evaluation process to determine whether operators are meeting their safety goals.
But the day after the report was released, NHTSA's top administrator testified at a Senate hearing that Congress should pass a law to speed the deployment of fully self-driving cars that lack traditional controls like steering wheels and pedals. At the moment, the agency is only permitted to exempt a total of 25,000 vehicles per year from federal motor vehicle safety standards.
"We're hearing from industry that this cap may be too low," said NHTSA acting administrator James Owens.
A previous attempt to lift the restrictions on cars without human controls flopped: Senate Democrats blocked the bill, citing inadequate safety provisions. A second attempt is in the works, but it remains to be seen whether it can win enough votes to pass.
Overly restrictive federal regulation at this stage of a rapidly evolving technology is likely to do significantly more harm than good, said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University. "Just because Uber and its operator behaved badly back then doesn't mean everyone else should be punished," he said.
Autonomous vehicles are meant to save lives, not take them. And if people don't trust the companies building the technology, the life-saving potential of self-driving cars will never be realized. The cars will roam the streets, empty and underused, until their operators pull the plug.
Polls show that half of American adults believe automated vehicles are more dangerous than traditional human-driven ones. Those opinions are already hardening, and it is unclear what can be done to undo the damage the Uber crash has caused.