There is a growing perception that the Full Self-Driving (FSD) capability of Tesla’s Autopilot system leaves much to be desired, to put it euphemistically. After a string of troubling road accidents – the precursors to several ongoing FSD-related investigations in the US – and a tabulation by Taylor Ogan, the CEO of Snow Bull Capital, showing that the “miles per disengagement rate on FSD beta is actually getting worse,” something is clearly not working properly at Tesla. Meanwhile, the deposition of an Autopilot executive back in the summer of 2022 (only now made public) has opened a veritable can of worms for the EV giant.
Hello institutional $TSLA investors:
How is your investing in a company run by someone who used a completely fake video to sell so-called “Full Self Driving” any different from investing in a company that used fake tests to sell blood machines?
Enjoy the due-diligence lawsuits!
— Stanphyl Capital (@StanphylCap) January 18, 2023
To wit, Ashok Elluswamy, Tesla’s Head of Autopilot Software, was deposed in July 2022 in relation to a fatal 2018 accident. While most media coverage has concentrated on the juicy tidbit that Tesla’s 2016 FSD promotional video was staged à la Nikola, the deposition’s transcript also highlights the Tesla executive’s very troubling knowledge gaps.
I don’t know if I can put into words how terrible this is, but I’ll try.
This quote is from a deposition of Ashok Elluswamy, Tesla’s Head of Autopilot Software relating to the 2018 fatal Autopilot crash of Walter Huang.
He doesnt know what an Operational Design Domain (ODD) is. pic.twitter.com/SqeE7xp2m4
— Mahmood Hikmet (@MoodyHikmet) January 15, 2023
The Twitter account @MoodyHikmet has effectively summarized this aspect of the deposition. For one, when asked about the Operational Design Domain (ODD) – the set of conditions under which an automated system is designed to operate – Mr. Elluswamy said:
“I’ve heard those words before, but I do not recall much more than that.”
Tesla’s Autopilot head also said he did not recall ever seeing a document on ODD while working at the EV giant. That Mr. Elluswamy appears totally unaware of such a fundamental aspect of any automated system is quite troubling. Of course, it is entirely possible that Tesla uses some other vernacular to lay out Autopilot’s ODD. Even so, his unfamiliarity with fundamental industry parlance is telling. And it gets worse.
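For readers unfamiliar with the concept, an ODD is essentially an explicit, checkable contract stating when an automated driving system is allowed to operate. Purely as an illustration – none of the names or conditions below reflect Tesla’s actual code; this is a hypothetical sketch – here is a minimal Python example of how an ODD might be encoded and checked:

```python
from dataclasses import dataclass

# Hypothetical, simplified sketch of an ODD: a set of conditions under
# which the automated system is validated to operate. Not Tesla's code.
@dataclass
class OperationalDesignDomain:
    road_types: set        # e.g. {"divided_highway"}
    max_speed_kph: float   # speed ceiling the system is validated for
    weather: set           # e.g. {"clear", "light_rain"}
    daylight_only: bool

    def permits(self, road_type: str, speed_kph: float,
                weather: str, is_daylight: bool) -> bool:
        """Return True only if current conditions fall inside the ODD."""
        return (road_type in self.road_types
                and speed_kph <= self.max_speed_kph
                and weather in self.weather
                and (is_daylight or not self.daylight_only))

# Example: a highway-only, fair-weather, daytime ODD.
odd = OperationalDesignDomain(
    road_types={"divided_highway"},
    max_speed_kph=130.0,
    weather={"clear", "light_rain"},
    daylight_only=True,
)
print(odd.permits("divided_highway", 110.0, "clear", True))  # True
print(odd.permits("urban_street", 50.0, "clear", True))      # False: outside ODD
```

The specific fields matter less than the principle: the ODD is a first-class artifact that defines the system’s boundaries, which is precisely why unfamiliarity with the term from the head of an automated-driving software team raised eyebrows.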
As far as safety goes, Tesla Autopilot is designed to be taken over by a human before it gets into a dangerous situation.
You’d think perception-reaction time would be an important consideration for this…
BUT HE DOESN’T EVEN KNOW WHAT THAT IS. pic.twitter.com/5KxIRdFqj1
— Mahmood Hikmet (@MoodyHikmet) January 15, 2023
Perception-reaction time is the interval it takes for a human to perceive and then react to a specific stimulus, be it an auditory or a visual cue. Yet Mr. Elluswamy said he did not recall ever receiving any training on perception-reaction time, and he admitted to guessing “what those words mean.”
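To put the concept in perspective, here is a back-of-the-envelope Python calculation – the 2.5-second figure is AASHTO’s standard highway-design value for driver perception-reaction time, not anything sourced from the deposition – showing how far a vehicle travels during the perception-reaction interval alone:

```python
# Illustrative calculation: distance covered during the driver's
# perception-reaction time, before any braking even begins.
speed_kph = 100.0
reaction_time_s = 2.5                  # AASHTO highway-design value
speed_mps = speed_kph / 3.6            # 100 km/h ≈ 27.8 m/s
reaction_distance_m = speed_mps * reaction_time_s
print(f"{reaction_distance_m:.0f} m")  # ≈ 69 m before the brakes engage
```

In other words, a car doing 100 km/h covers roughly 69 meters before its driver even begins to brake – which is why a system designed to hand control back to a human “before it gets into a dangerous situation” would be expected to account for exactly this quantity.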
this article, where multiple tesla stans confidently offer to show a journalist self driving to prove it works and then get quiet as it repeatedly fucks up, is amazing https://t.co/qT9uTN3BRX pic.twitter.com/hDMQbxAk3Y
— soul nate (@MNateShyamalan) January 18, 2023
Given this state of affairs at Tesla, it is hardly a surprise that Autopilot consistently underperforms the hype that Elon Musk so diligently generates. In a comical development, the New York Times recently reported on a meetup where its reporter tried out Autopilot’s FSD capability in the presence of some die-hard Tesla fans, who fell silent every time the system faltered.
While odds are against Elon and $TSLA BOD in the funding secured trial that started Tues in SFO, case should be irrelevant to TSLA value since any award against BOD should be viewed as a one time event and excl from future earnings. Elon testifies Friday. https://t.co/vpaTEjCWvy
— Gary Black (@garyblack00) January 19, 2023
Meanwhile, Tesla’s ongoing “funding secured” trial is also proving problematic for Autopilot’s FSD capability. While Gary Black of the Future Fund correctly notes that the lawsuit carries minimal ramifications for the company’s financials, it does leave the proverbial avenue wide open for follow-up FSD-related lawsuits.
these consistent claims made directly to investors, I think they may have liability in future investor FSD suits. If the stock price does stay down consistently, the odds of such suits are high. Defense will argue “forward-looking”.
— KSalberta @k_salberta@masto.ai (@k_salberta) January 15, 2023
What do you think of Tesla’s Autopilot strategy? Let us know your thoughts in the comments section below.