
Uber's self-driving car did not know pedestrians could jaywalk



The software in the self-driving SUV that killed a woman in Arizona last year was not designed to detect pedestrians outside of a crosswalk, according to new documents released as part of a federal investigation into the incident. It is the most damning disclosure in a trove of new crash-related documents, but other details indicate that, in many different ways, Uber's self-driving effort did not account for how humans actually behave.

The National Transportation Safety Board, an independent federal safety panel that more often investigates plane crashes and major trucking incidents, released documents related to its 20-month investigation into the Uber crash. The panel will issue a final report on the incident in two weeks.

More than 40 of the documents, which span hundreds of pages, delve into the details of the March 18, 2018, incident, in which the Uber test vehicle, operated by 44-year-old Rafaela Vasquez, killed a 49-year-old woman named Elaine Herzberg as she crossed a darkened road in the city of Tempe, Arizona. At the time, a single operator monitored each experimental car's operation and software as it drove around Arizona. Video footage released in the weeks following the crash showed Vasquez reacting with shock in the moments just before the collision.

The new documents indicate that some errors were clearly related to Uber's internal structure, what experts call "safety culture." For one, the self-driving program did not include an operational safety division or a safety manager.


The most glaring errors were software-related. Uber's system was not equipped to identify or handle pedestrians walking outside of a crosswalk. Uber's engineers also appear to have been so worried about false alarms that they built in an automated one-second delay between detecting a hazard and taking action. And Uber chose to turn off a built-in Volvo braking system, which the carmaker later concluded might have dramatically reduced the speed at which the car hit Herzberg, or perhaps avoided the collision altogether. (Experts say the decision to turn off the Volvo system while Uber's software did its work was technically sound, because it would be unsafe for the car to have two "masters.")

Much of that explains why, despite the fact that the car detected Herzberg with more than enough time to stop, it was traveling at 43.5 mph when it struck her and threw her 75 feet. When the car first detected her presence, 5.6 seconds before the collision, it classified her as a vehicle. Then it changed its mind to "other," then to vehicle again, back to "other," then to bicycle, then to "other" again, and finally back to bicycle.

It never guessed that Herzberg was on foot, for a simple, galling reason: Uber did not tell its cars to look for pedestrians outside of crosswalks. "The system design did not include a consideration for jaywalking pedestrians," reads the NTSB's Vehicle Automation Report. Worse, each time the system tried a new guess, it restarted from scratch the task of predicting where the mysterious object, Herzberg, was heading. It was not until 1.2 seconds before the impact that the system recognized that the SUV was going to hit Herzberg, that it could not steer around her, and that it needed to slam on the brakes.
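A toy sketch of that failure mode, in hypothetical Python (none of these names or structures come from Uber's actual software): if a tracker throws away an object's movement history every time the classifier changes its label, the object never accumulates enough observations to show a velocity, so it is never projected into the vehicle's path.

```python
from dataclasses import dataclass, field


@dataclass
class TrackedObject:
    label: str                                      # current guess, e.g. "vehicle", "bicycle", "other"
    positions: list = field(default_factory=list)   # recent (x, y) observations, in meters

    def observe(self, position, label):
        # Flawed design: a change of classification wipes the movement
        # history accumulated under the previous label.
        if label != self.label:
            self.label = label
            self.positions = []
        self.positions.append(position)

    def predicted_velocity(self):
        # With fewer than two retained observations there is no basis for a
        # velocity estimate, so the object looks stationary and harmless.
        if len(self.positions) < 2:
            return (0.0, 0.0)
        (x0, y0), (x1, y1) = self.positions[-2], self.positions[-1]
        return (x1 - x0, y1 - y0)


# The object keeps moving across the road, but every relabeling resets the
# history, so the predicted velocity never leaves (0.0, 0.0).
obj = TrackedObject(label="vehicle")
for step, label in enumerate(["vehicle", "other", "bicycle", "other"]):
    obj.observe((0.0, step * 1.5), label)
    print(label, obj.predicted_velocity())
```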

That triggered what Uber called "action suppression," in which the system held off on braking for one second while it confirmed "the nature of the detected hazard." That was a second in which the safety operator, Uber's most important and last line of defense, could have taken control of the car and hit the brakes herself. But Vasquez was not looking at the road during that second. Then, with 0.2 seconds left before the collision, the car sounded an alarm, and Vasquez grabbed the steering wheel, disengaging the autonomous system. Almost a full second after hitting Herzberg, Vasquez hit the brakes.
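A minimal sketch of the arithmetic the documents describe, assuming only a fixed one-second verification window before the software is allowed to act; the function name and the braking cutoff below are illustrative, not Uber's actual design.

```python
def respond_to_hazard(time_to_impact_s, suppression_window_s=1.0):
    """Decide what the software can still do once the verification delay ends."""
    # "Action suppression": hold off for a fixed window while the system
    # confirms the nature of the detected hazard.
    remaining_s = time_to_impact_s - suppression_window_s
    if remaining_s <= 0:
        return "impact"
    if remaining_s < 0.5:
        # Too little time left for effective automatic braking, so fall back
        # to alerting the human operator (the 0.5 s cutoff is illustrative).
        return "sound alarm for operator"
    return "brake"


# With 1.2 s to impact, the one-second delay leaves only 0.2 s, so the
# system can do little more than sound an alarm for the safety operator.
print(respond_to_hazard(1.2))
```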

