
The United States opens investigation into Tesla’s Autopilot over emergency vehicle crashes

The owner of a 2021 Tesla Model Y reported to the car safety agency that on November 3 in Brea, California, the vehicle was in FSD Beta mode “and while taking a left turn, the car went into the wrong lane and I was hit by another driver in the lane next to my lane.”

The car “warned halfway through the turn” and the driver tried to take control, “but the car took control of itself and forced itself into the wrong lane,” the report states. The car was seriously damaged on the driver’s side, the owner added.

“NHTSA is aware of the consumer complaint in question and is in communication with the manufacturer to gather further information,” an NHTSA spokesperson told Reuters on Friday.

Tesla did not immediately comment.

Earlier this month, Tesla recalled nearly 12,000 U.S. vehicles because of a communication failure that could trigger a false collision warning or unexpected automatic emergency braking.

The recall came after a software update for vehicles with FSD Beta. Tesla said that, as of October 29, more than 99.8% of the recalled vehicles had installed a software update resolving the issue and that no further action was needed.

FSD is an advanced driver assistance system that handles some driving tasks but, Tesla says, does not make vehicles fully autonomous. Its features “require a fully attentive driver”, the company says.

Last month, NHTSA raised concerns about the use of FSD. “Despite Tesla’s characterization of FSD as a ‘beta’, it is capable of and is being used on public roads,” the agency said.

In August, NHTSA opened a formal safety probe into Tesla’s Autopilot, another driver assistance system, after a dozen crashes involving Tesla models and emergency vehicles.
