An employee drives a Tesla Motors Inc. Model S electric car, equipped with Autopilot hardware and software, hands-free on a freeway.
Two consumer safety groups are calling for federal and state investigations into Tesla's semi-autonomous technology in the wake of several fatal crashes linked to the system earlier this year.
The investigations could pose a major threat to the California electric car manufacturer, as Tesla CEO Elon Musk has promised that a fully autonomous version of his technology, called Autopilot, will be released this year. He said during a conference call this week that the company expects to generate significant revenue from fleets of "robotaxis" it intends to roll out in 2020 using Autopilot.
"We feel Tesla is violating deceptive-practices laws at both the federal and state levels," said Jason Levine, head of the Washington, D.C.-based Center for Auto Safety, one of two groups that has requested an investigation into both the Autopilot system and Tesla's marketing of the technology.
CAS, along with California's nonprofit Consumer Watchdog, pointed to a series of crashes, injuries and deaths that have occurred in recent years involving Tesla vehicles operating in Autopilot mode. That includes one in May in which a Model S sedan slammed into a parked police car. Two months earlier, a driver was killed when his Model 3 sedan slid under a semi-trailer in Delray Beach, Florida, shearing off the car's roof.
Autopilot was first introduced in October 2015.
However, a recent survey by the Insurance Institute for Highway Safety found that a significant percentage of owners misunderstand what these systems can do, especially their limitations. The survey of 2,000 owners found this was especially true of Autopilot: half of those surveyed believed the system allowed a driver to take their hands off the wheel entirely. For similar systems from other manufacturers, that figure ranged from 20% to just over 30%.
The Autopilot name itself was misleading, IIHS said in its analysis, "signaling drivers that they can turn their minds and eyes elsewhere."
In the March 1 crash in Florida, the National Transportation Safety Board determined that the driver had turned on Autopilot 10 seconds before the collision and did not have his hands on the steering wheel for the final eight seconds.
The agency has made similar discoveries in other crashes, several of them also fatal.
Tesla, for its part, has defended Autopilot. In a statement released in May, it said: "As our quarterly safety reports have shown, drivers who use Autopilot record fewer accidents per mile than those who drive without it."
However, this has not been supported by independent research, and Tesla has had to walk back claims that its safety record was endorsed by the National Highway Traffic Safety Administration.
The carmaker also said in a statement that there is nothing about the name Autopilot that should mislead consumers, suggesting that by the same logic critics should also object to the word "automobile." The company also argued that it has gone to great lengths to make consumers aware of the system's limits, in owner's manuals, on its website and elsewhere.
CAS's Levine dismisses such claims as "legalese," citing the many ways Tesla and Musk have promoted the system. Images released shortly after Autopilot debuted showed Musk and his then-wife driving off with their hands waving out the windows of a Tesla vehicle, and Musk appeared to suggest the system could operate hands-free during a December 2018 interview on the CBS News magazine "60 Minutes."
"They can say they have written language to fulfill their obligations, but their actions show a desire to deceive consumers," Levine said in an interview.
Together with Consumer Watchdog, the center wants both the Federal Trade Commission and the California Department of Motor Vehicles to open immediate probes. The groups claim the automaker violated Section 5 of the FTC Act, as well as California consumer law, arguing that the way Tesla markets Autopilot is "substantially misleading and … likely to mislead consumers into believing that their vehicles have self-driving or autonomous capabilities."

Despite such concerns, Tesla has continued updating the Autopilot system, and Musk earlier this month reiterated earlier promises to introduce a "full self-driving" version before the end of the summer. Last year, the company rolled out new features intended to allow true hands-free operation, amid criticism that the update failed to meet expectations.
But during a conference call with analysts and investors on Tuesday, Musk said the upcoming upgrade "will be quite compelling."
The CEO has promised to put as many as 1 million robotaxis on the road by 2020, a direct challenge to ride-sharing services such as Uber and Lyft, which are developing their own self-driving technologies.
Musk has indicated that the service will provide a new source of revenue for the company. On Wednesday, Tesla posted an adjusted loss of $1.12 per share for the second quarter, far greater than the 40-cent loss expected by analysts surveyed by Refinitiv. The shares have fallen sharply since the report. Anything that disrupts the robotaxi program could complicate Tesla's efforts to return to profitability.
"There is no doubt that (Autopilot) technology is impressive," said CAS chief Levine, but Tesla's continued reliance on what he called "hyperbolic statements" misleads consumers and poses serious safety risks.