Radar in Uber's self-driving vehicle detected pedestrian Elaine Herzberg more than five seconds before the SUV crashed into her, according to a new report from the National Transportation Safety Board. Unfortunately, a series of poor software design decisions meant the vehicle did not begin braking until 0.2 seconds before the fatal crash in Tempe, Arizona.
Herzberg's death occurred in March 2018, and the NTSB published its first report on the case in May of that year. That report made it clear that badly designed software, not malfunctioning hardware, was responsible for the crash that killed Herzberg.
But the new report, released Tuesday, marks the end of the NTSB's 20-month investigation. It provides much more detail about how Uber's software worked – and how everything went wrong in the final seconds before the crash.
A misclassification timeline
Like most self-driving software, Uber's software attempts to classify each object it detects in one of several categories – such as a car, bicycle, or "other." Based on this classification, the software then calculates a velocity and probable trajectory for the object. This system failed catastrophically in Tempe.
The NTSB report contains a second-by-second timeline of what the software "thought" as the car approached Herzberg, who was pushing a bicycle across a multi-lane roadway far from any crosswalk:
- 5.2 seconds before impact, the system classified her as an "other" object.
- 4.2 seconds before impact, she was reclassified as a vehicle.
- Between 3.8 and 2.7 seconds before impact, the classification switched several times between "vehicle" and "other."
- 2.6 seconds before the collision, the system classified Herzberg and her bicycle as a bicycle.
- 1.5 seconds before the collision, she was "unknown."
- 1.2 seconds before the collision, she became a "bike" again.
Two things are remarkable about this sequence of events. First, the system never classified Herzberg as a pedestrian. According to the NTSB, that's because "the system design did not include a consideration for jaywalking pedestrians."
Second, the constantly changing classification prevented Uber's software from accurately calculating her trajectory and realizing that she was on a collision course with the vehicle. You might think that if a self-driving system sees an object moving into the vehicle's path, it would hit the brakes even if it wasn't sure what kind of object it was. But that's not how Uber's software worked.
The system used an object's previously observed locations to calculate its velocity and predict its future trajectory. However, "if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories," the NTSB reports.
In practice, this meant that because the system couldn't tell what kind of object Herzberg and her bike were, it kept behaving as if she wasn't moving.
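The flaw the NTSB describes can be illustrated with a minimal sketch (the class and method names here are assumptions for illustration, not Uber's actual code). A tracker estimates velocity from an object's position history, and a classification change wipes that history – so a freshly relabeled object has no motion history and looks static:

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """Illustrative tracker: keeps a history of (time, position) observations."""
    label: str
    history: list = field(default_factory=list)  # [(t, x), ...]

    def observe(self, t, x, label):
        # The flaw NTSB describes: a label change wipes the tracking
        # history, so the object looks brand-new on the next update.
        if label != self.label:
            self.history = []
            self.label = label
        self.history.append((t, x))

    def velocity(self):
        # With fewer than two observations there is no motion estimate:
        # the object is effectively treated as "static".
        if len(self.history) < 2:
            return 0.0
        (t0, x0), (t1, x1) = self.history[-2], self.history[-1]
        return (x1 - x0) / (t1 - t0)

obj = TrackedObject("other")
obj.observe(0.0, 0.0, "other")
obj.observe(1.0, 1.4, "other")
print(obj.velocity())             # 1.4 m/s: motion detected
obj.observe(2.0, 2.8, "vehicle")  # reclassification wipes the history
print(obj.velocity())             # 0.0: object appears static again
```

Each reclassification in the timeline above would have reset the tracker this way, repeatedly discarding evidence that Herzberg was moving across the road.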
From 5.2 to 4.2 seconds before the crash, the system classified Herzberg as a vehicle and decided she was "static" – not moving – and thus unlikely to travel into the car's lane. As her classification flipped between "vehicle" and "other" over the following second, the system at times recognized that she was moving but predicted that she would stay in her current lane.
When the system reclassified her as a bicycle 2.6 seconds before the collision, it again predicted that she would stay in her lane – a mistake that is much easier to make if you've thrown out the object's previous position data. At 1.5 seconds before impact, she became an "unknown" object and was once again classified as "static."
It was only 1.2 seconds before the crash, when she had already started moving into the SUV's lane, that the system recognized a collision was imminent.
At this point, it was probably too late to avoid a collision, but hitting the brakes might have slowed the vehicle enough to save Herzberg's life. That's not what happened. The NTSB explains why:
"When the system detects an emergency situation, it initiates action suppression. This is a one-second period during which the [automated driving system] suppresses planned braking while the system verifies the nature of the detected hazard and calculates an alternative path, or the vehicle operator takes control of the vehicle."
The NTSB states that, according to Uber, the company "implemented the action suppression process due to concerns about the developmental automated system identifying false alarms," which could cause "the vehicle to engage in unnecessary extreme maneuvers."
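The suppression logic the NTSB describes can be sketched as a simple timing gate (function and parameter names are assumptions for illustration): for one second after a hazard is detected, planned emergency braking is withheld while the hazard is re-verified.

```python
def plan_braking(hazard_detected_at, now, confirmed, suppression_window=1.0):
    """Illustrative sketch of the "action suppression" logic the NTSB
    describes. Names and structure are assumptions, not Uber's code.

    For `suppression_window` seconds after a hazard is detected, planned
    emergency braking is suppressed while the system verifies the hazard;
    braking can only begin once the window has elapsed."""
    if hazard_detected_at is None:
        return "no action"
    elapsed = now - hazard_detected_at
    if elapsed < suppression_window:
        return "suppress braking (verifying hazard)"
    return "brake" if confirmed else "no action"

# Hazard detected 1.2 s before impact: braking can start only after the
# 1.0 s window, i.e. 0.2 s before impact — matching the Tempe timeline.
print(plan_braking(hazard_detected_at=0.0, now=0.5, confirmed=True))
print(plan_braking(hazard_detected_at=0.0, now=1.0, confirmed=True))
```

With the hazard recognized only 1.2 seconds out, the one-second gate by itself consumes nearly all of the remaining reaction time.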
As a result, the vehicle did not begin braking until 0.2 seconds before the fatal crash – too late to save Herzberg's life.
Even after this one-second delay, the NTSB says, the system didn't necessarily apply the brakes at full strength. If a collision could be avoided with hard braking, the system braked hard, up to a fixed maximum level of deceleration. But if a crash was unavoidable, the system applied less braking power, initiating a "gradual vehicle slowdown" while alerting the driver to take over.
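That decision rule can be sketched with basic kinematics (the specific numbers and names here are assumptions for the sketch, not values from the report). Stopping from speed v within distance d requires deceleration v²/(2d); if that exceeds the vehicle's maximum, the crash is deemed unavoidable and only a gentle slowdown is commanded:

```python
def braking_command(speed, distance, max_decel=9.0, gentle_decel=3.5):
    """Illustrative braking-force selection in the style the NTSB
    describes. All numbers and names are assumptions for this sketch.

    speed: current speed in m/s; distance: gap to the obstacle in m.
    Deceleration needed to stop within `distance` is v^2 / (2d)."""
    required = speed**2 / (2 * distance)
    if required <= max_decel:
        return required      # crash avoidable: brake hard enough to stop
    return gentle_decel      # crash unavoidable: gradual slowdown, alert driver

# At 17 m/s with 40 m to spare, stopping needs ~3.61 m/s^2: brake to stop.
print(round(braking_command(17.0, 40.0), 2))
# With only 5 m left, stopping would need ~28.9 m/s^2 (impossible here),
# so the sketch commands only the gentle 3.5 m/s^2 slowdown.
print(braking_command(17.0, 5.0))
```

The counterintuitive consequence is that precisely when a crash becomes unavoidable, the logic brakes *less*, rather than scrubbing off as much speed as possible before impact.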
A 2018 report from Business Insider's Julie Bort suggested possible reasons for these strange design decisions: the team was preparing to give a demonstration to Uber's newly hired CEO, Dara Khosrowshahi, and engineers were asked to reduce the number of "bad experiences" for riders. Shortly afterwards, Uber decided to "turn off the car's ability to make emergency decisions on its own, like slamming on the brakes or swerving hard."
The ability to swerve was eventually re-enabled, but the restrictions on hard braking remained in place until the fatal crash in March 2018.
The Uber vehicle was a Volvo XC90, a model that comes with a sophisticated emergency braking system of its own. Unfortunately, before the 2018 crash, Uber automatically disabled Volvo's collision-avoidance system whenever Uber's own technology was active. One reason, the NTSB said, was that Uber's experimental radar used some of the same frequencies as Volvo's radar, creating a risk of interference.
Since the crash, Uber has redesigned its radar to operate at different frequencies from Volvo's radar, allowing Volvo's emergency braking system to remain active while Uber tests its own self-driving technology.
Uber also says it has revised other aspects of its software. There is no longer an "action suppression" period before braking in an emergency. And the software no longer throws out past location data when an object's classification changes.