The National Transportation Safety Board (NTSB) has issued a preliminary report on a Tesla (TSLA) Autopilot death that took place in Florida on March 1, 2019. This report raises several problematic issues for Tesla, including questions about the various types of liability Tesla may face.
The report is only two pages long, so I urge all readers to go to the original source, linked above. In summary, the Tesla was on Autopilot on a road with a 55 MPH speed limit, but was traveling at 68 MPH. A large truck crossed the road, and for whatever reason the Tesla did not stop.
As a result, the Tesla hit the truck, which sheared off the roof of the car, and the person in the Tesla died. The remains of the car came to rest some distance down the road.
Message to investors in Tesla's last funding round: Take a look at this picture. Elon sold you $2.7 billion of securities based on Tesla's autonomy capabilities. This picture is reality. Tesla cars cannot even see the width of a tractor trailer. $TSLA pic.twitter.com/1EPNLrvoo
– NetflixAndLamp (@NetflixAndLamp) May 16, 2019
Autopilot or not, why didn't the car brake or avoid the obviously large obstacle? I expect this will be the subject of further investigation. (Full disclosure: I am short TSLA stock.)
You would think that if a driver assistance system, such as auto-braking or Tesla Autopilot, had any value at all, it could at least avoid driving straight into a giant truck. If it can't, we are effectively nowhere on the road to a driverless car, such as a robotaxi. Even a small child knows not to drive into a large truck at high speed.
We're not talking about any advanced James Bond style maneuvers here. We are talking about the vehicle seeing a giant truck in front of it and at least braking.
Everyone in the mood to have the car drive you at 68 MPH straight into a giant truck? Me neither.
This incident took place on March 1. On May 2, just two weeks ago, Tesla went to the market to raise over $2 billion in new funding, mainly in convertible debt but also some equity. At that time, Tesla hosted a call with investors where CEO Elon Musk said driverless robotaxis are the dominant part of the company's future value, and that we will see a million of these Tesla robotaxis within the next year.
Most industry experts believe that a truly driverless car, one that can go anywhere a person can drive, at full speed, is very far in the future, nowhere near next year. Given this, it would have been good for an investor to know the basic facts regarding this fatal Tesla Autopilot accident.
But did Tesla mention this during this investor call? Did Tesla say that this car was operating in Autopilot mode at the time of the crash, and that the car evidently failed to brake for the truck? No, it didn't. It kept its investors in the dark as it passed the hat for over $2 billion in new funding.
This raises liability issues. Should current and prospective (in the May 2 deal) Tesla shareholders have been told what Tesla already knew at that time about this Autopilot accident? It is a rhetorical question.
The other type of liability here, which of course would also hurt shareholders in the end, is whether any action following this incident could force Tesla to shut down the Autopilot system. That could certainly be done via OTA update, for all the hundreds of thousands of such cars already in the field.
If that were what Tesla had to do, it would probably also have to refund the thousands of dollars per car that it charged for this functionality. Doing some back-of-the-envelope math here, let's say it had to refund $2,000 per car (very conservative), multiplied by 500,000 cars. That's $1 billion. Having to repay that kind of money to customers would bring Tesla's balance sheet to its knees.
One can also ask about Teslas' remarkable propensity to crash into things, which causes Tesla insurance rates to shoot up. Here is GEICO raising Tesla insurance prices by 25%.
Even when people are not injured, Teslas seem to have a magical inclination to drive off from a parked position straight into a shop window. This incident, for example, happened just a week ago.
Of course, a driver should not let the car go haywire. The driver should always keep his hands on the Tesla steering wheel and watch for things in front of (and around) the car, things which the Tesla obviously misses. However, Tesla's CEO is not a good example of this. Here is a picture of his appearance on 60 Minutes half a year ago, where he proudly looked away while keeping his hands off the wheel:
Given the preliminary results of the NTSB investigation into the fatal crash of Jeremy Banner's Model 3, it is important that I continue to share this image. Why do you think people believe it is safe to operate a Tesla with Autopilot engaged and their hands off the wheel? $TSLAQ pic.twitter.com/LUxeW07YS0
– Mr. Generic (@OffBrandCapital) May 16, 2019
If you find this message from CEO Musk to be, ahem, unwise, just go to Tesla.com/Autopilot. Yes, the actual landing page for this functionality. What is the first thing you see on that page? There is a video of a Tesla driving where the person behind the wheel does not touch it. The text on screen at the beginning of the video reads, "He is not doing anything. The car is driving itself."
Okay then! I do not think the customer can be faulted for not knowing how to behave otherwise: The CEO goes on 60 Minutes showing that it is fine to take your hands off the wheel and look away. Then Tesla's own website has a long video that clearly shows you can keep your hands off the wheel, not for seconds but for minutes!
With all of this, investors also cannot be faulted for beginning to see the risk of multiple liability exposures, which could seriously damage Tesla's robotaxi plans as well as its balance sheet. Tesla should at least have come clean about this before it raised the two billion dollars two weeks ago.