The driver of a Tesla Model S that crashed into a fire truck on a California freeway last year failed to notice the vehicle ahead because of "overconfidence" in Autopilot, the National Transportation Safety Board said Wednesday. Investigators also determined that another probable cause of the crash was the design of Autopilot itself, "which allowed the driver to disconnect from the driving task."
No one was injured in the crash, but the driver's use of Autopilot has drawn regulatory and media attention to the incident. There have been numerous reports of Tesla owners using Autopilot at the time of a crash, as well as a handful of fatalities. Tesla has consistently said that drivers who use Autopilot are safer than those who do not.
The crash occurred Jan. 22, 2018. The Tesla was traveling in the HOV lane behind another vehicle; when that vehicle changed lanes to the right, the Tesla accelerated and struck the rear of the fire truck at a recorded speed of about 31 mph. Autopilot had been engaged for a total of 29 minutes and four seconds before the crash, but the driver's hands were detected on the steering wheel for only 78 seconds of that time. Autopilot issued "several" hands-off alerts in the final 13 minutes before the crash.
"For most of the time the system was engaged, it did not detect applied steering torque (hands on the steering wheel)," the NTSB states.
After the lead vehicle changed lanes, the Model S began accelerating again toward its adaptive cruise control set speed of 80 mph. Autopilot issued a forward collision warning 0.49 seconds before impact, but "the automatic emergency braking system was not activated," the NTSB concludes.
Autopilot is a Level 2 semi-autonomous driver assistance system that combines adaptive cruise control, lane-keeping assistance, self-parking and, more recently, the ability to change lanes automatically. It relies on a suite of sensors, including eight cameras, radar and ultrasonic sensors.
Car safety experts note that adaptive cruise control systems like Autopilot rely primarily on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects, but not stationary ones. It also has difficulty detecting objects that are not moving in the car's direction of travel, such as a vehicle crossing the road.
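To see why a stopped fire truck can slip past a radar-based system, consider how such trackers typically separate moving vehicles from roadside clutter. The sketch below is purely illustrative (it is not Tesla's code, and the function and threshold names are hypothetical): a filter that discards returns with near-zero ground speed, to avoid braking for signs and bridges, will also discard a stopped vehicle.

```python
# Illustrative sketch (assumption, not Tesla's implementation): a radar
# tracker that filters out stationary returns. Automotive radar measures
# each return's range rate (closing speed via Doppler). Returns whose
# inferred ground speed is near zero look like fixed clutter -- signs,
# overpasses, parked cars -- so many trackers drop them to avoid
# constant false braking.

def relevant_targets(detections, ego_speed_mps, stationary_threshold=1.0):
    """Keep only radar returns that appear to be moving over the ground.

    detections: list of (range_m, range_rate_mps), where range_rate is
    the closing speed toward the ego vehicle.
    """
    targets = []
    for range_m, range_rate in detections:
        # Object's ground speed = ego speed minus closing speed.
        ground_speed = ego_speed_mps - range_rate
        if abs(ground_speed) > stationary_threshold:
            targets.append((range_m, range_rate))
        # Otherwise the return is treated as stationary clutter and
        # ignored -- which is how a stopped truck can go unreacted-to
        # once the moving lead vehicle changes lanes.
    return targets

ego = 21.0  # ego vehicle speed, ~47 mph in m/s
detections = [
    (50.0, 5.0),   # slower-moving lead car (ground speed 16 m/s): kept
    (30.0, 21.0),  # stopped truck, closing at full ego speed: filtered out
]
print(relevant_targets(detections, ego))  # only the moving lead car remains
```

Under this simplified model, once the lead car leaves the lane, the only remaining return ahead is one the filter has classified as clutter, and the cruise controller accelerates back to its set speed.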
Previously, Tesla CEO Elon Musk has blamed crashes involving Autopilot on driver overconfidence. "When there is a serious accident, it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency," Musk said last year. That squares with the NTSB's conclusion that the driver's "inattention and overconfidence" played a role in the January 2018 crash.
In a statement, a Tesla spokesperson noted that although Autopilot will shut off when a driver repeatedly ignores warnings to remain engaged in driving, the automaker continues to issue updates to make the advanced driver assistance system "smarter, safer and more efficient."
"Since this incident occurred," the spokesperson added, "we have made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they are activated."