
Tesla did not fix an Autopilot problem for three years, and now another person is dead



On May 7th, 2016, a 40-year-old man named Joshua Brown was killed when his Tesla Model S collided with a tractor-trailer that was crossing his path on US Highway 27A, near Williston, Florida. Almost three years later, another Tesla owner, 50-year-old Jeremy Beren Banner, was also killed on a Florida highway under similar circumstances: his Model 3 collided with a tractor-trailer crossing his path, shearing the roof off in the process.

There was another notable similarity: investigators found that both drivers were using Tesla's advanced driver assistance system, Autopilot, at the time of their respective crashes.

Autopilot is a Level 2 semi-autonomous system, as defined by the Society of Automotive Engineers, that combines adaptive cruise control, lane keeping assist, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but Brown's and Banner's deaths raise questions about those claims and suggest that Tesla has neglected to address a major weakness in its flagship technology.

There are some big differences between the two crashes. For one, Brown's and Banner's cars had completely different driver assistance technologies, even though both are called Autopilot. The Autopilot in Brown's Model S was based on technology supplied by Mobileye, an Israeli startup since acquired by Intel. Brown's death was partly responsible for the two companies parting ways in 2016. Banner's Model 3 was equipped with a second-generation version of Autopilot that Tesla developed in-house.

This suggests that Tesla had a chance to address this so-called "edge case", an unusual circumstance, when it redesigned Autopilot, but so far it has failed to do so. After Brown's death, Tesla said the camera had failed to recognize the white truck against a bright sky. The US National Highway Traffic Safety Administration (NHTSA) found that Brown had not been paying attention to the road and cleared Tesla of fault. It determined that he had set his car's cruise control to 74 mph about two minutes before the crash, and that he should have had at least seven seconds to notice the truck before crashing into it.

Federal investigators have not yet reached a conclusion in Banner's death. In a preliminary report released on May 15th, the National Transportation Safety Board (NTSB) said that Banner engaged Autopilot about 10 seconds before the collision. "From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver's hands on the steering wheel," the NTSB said. The vehicle was traveling at 68 mph when it crashed.

In a statement, a Tesla spokesperson framed it differently, turning the passive "the vehicle did not detect the driver's hands on the wheel" into the more active "the driver immediately removed his hands from the wheel." The spokesperson did not respond to follow-up questions about what the company has done to address this issue.

Tesla's CEO Elon Musk has previously blamed crashes involving Autopilot on driver overconfidence. "When there is a serious accident, it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency," Musk said last year.

The latest crash comes at a time when Musk is touting Tesla's plans to deploy a fleet of autonomous taxis by 2020. "A year from now, we'll have over a million cars with full self-driving, software, everything," he said at a recent "Autonomy Day" event for investors.

These plans will be moot if federal regulators decide to crack down on Autopilot. Consumer advocates are calling on the government to open an investigation into the advanced driver assistance system. "Either Autopilot can't see the broad side of an 18-wheeler, or it can't react safely to it," said David Friedman, vice president of advocacy at Consumer Reports, in a statement. "This system can't dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed most."

Car safety experts note that adaptive cruise control systems like Autopilot rely mostly on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects but poor at detecting stationary ones. It also has trouble detecting objects, like a vehicle crossing the road, that are not moving in the car's direction of travel.

Radar returns from stationary objects are sometimes ignored by the vehicle's software to deal with the generation of "false positives," said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University. Without this filtering, the radar would "see" an overpass and report it as an obstacle, causing the vehicle to slam on the brakes.
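To make Rajkumar's point concrete, here is a minimal, purely illustrative sketch, not Tesla's actual logic, and with every name in it hypothetical, of how a driver assistance stack might discard stationary radar returns to avoid phantom braking. The side effect is that a trailer sitting across the road, which has little or no closing speed of its own, can be discarded along with the overpasses:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    """One hypothetical radar detection: distance to the object and range rate."""
    range_m: float          # distance to the object, in meters
    range_rate_mps: float   # how fast that distance is changing (negative = closing)

def filter_stationary_returns(returns, ego_speed_mps, margin_mps=1.0):
    """Keep only returns that look like moving vehicles.

    For a stationary object (an overpass, a sign, or a truck stopped across
    the lane), the range shrinks at roughly the ego vehicle's own speed, so
    its estimated ground speed is near zero. A naive filter like this one
    drops those returns to suppress false positives, and in doing so it can
    also drop a crossing tractor-trailer.
    """
    kept = []
    for r in returns:
        estimated_object_speed = ego_speed_mps + r.range_rate_mps
        if abs(estimated_object_speed) > margin_mps:
            kept.append(r)   # treated as a genuine moving target
    return kept

# Example: at 30 m/s (about 67 mph), a car ahead doing 25 m/s is kept,
# but a trailer sitting across the road looks stationary and is filtered out.
detections = [RadarReturn(80.0, -5.0), RadarReturn(60.0, -30.0)]
print(filter_stationary_returns(detections, ego_speed_mps=30.0))
```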

On the computer vision side of the equation, the algorithms that use the camera's output must be trained to detect trucks that are perpendicular to the vehicle's direction of travel, he added. In most traffic situations, there are vehicles ahead, behind, and to the side, but a perpendicular vehicle is much less common.

"Essentially, the same event repeats after three years," Rajkumar said. "This seems to indicate that these two issues are still not addressed." Machine learning and artificial intelligence have inherent limitations. If sensors "see" what they have never or rarely seen before, they do not know how to handle these situations. "Tesla doesn't handle the known limitations of AI," he added.

Tesla has not yet explained how it intends to fix this problem. The company publishes a quarterly report on the safety of Autopilot, but that report is short on details. This means that experts in the research community lack the hard data that would allow them to compare the effectiveness of Autopilot with other systems. Only Tesla has a complete picture of Autopilot's logic and source code, and it guards those secrets closely.

"We need detailed exposure data related to when, where and what conditions drivers utilize Autopilot," said Bryan Reimer, researcher at MIT Transport and Logistics Center, in an email to The Verge "so that we can begin to better quantify the risk with respect to other vehicles of the same age and class. "

Other Tesla owners have spoken about Autopilot's trouble detecting trucks crossing the vehicle's path. An anonymous Twitter user who goes by the handle @greentheonly has "hacked" a Model X and posts their findings on Twitter and YouTube. They did this to "observe Autopilot from the inside," they said in an email to The Verge. In March, their Model X encountered a tractor-trailer perpendicular to the road, much like in the Brown and Banner crashes. The vehicle would have tried to drive under the truck had the driver not intervened.

According to @greentheonly's data, the semi was never marked as an obstacle. But they decided not to tempt fate: "I did not try to approach the trailer and see if any of the inputs would change (but I was not counting on it)."

