There is a growing perception that the Full Self-Driving (FSD) capability of Tesla’s Autopilot system leaves much to be desired, to put it euphemistically. Between a string of troubling road accidents – the precursors to several ongoing FSD-related investigations in the US – and the fact that the “miles per disengagement rate on FSD beta is actually getting worse,” per a tabulation by Taylor Ogan, the CEO of Snow Bull Capital, something is clearly not working properly at Tesla. Meanwhile, the deposition of an Autopilot executive from the summer of 2022, only just now made public, has opened a veritable can of worms for the EV giant.
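As context for that metric, “miles per disengagement” is simply the miles driven on the system divided by the number of times a human had to take over, so a falling figure means more frequent interventions. The numbers in this sketch are made up for illustration and are not from Ogan’s tabulation:

```python
# "Miles per disengagement" with made-up figures (not Ogan's data):
# higher is better; a falling value means more frequent takeovers.
miles_driven = 12_000   # miles logged on FSD beta in some period
disengagements = 300    # times the human driver had to take over
print(miles_driven / disengagements)  # -> 40.0 miles per disengagement
```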

How is your investing in a company run by someone who used a completely fake video to sell so-called “Full Self Driving” any different from investing in a company that used fake tests to sell blood machines? Enjoy the due-diligence lawsuits! — Stanphyl Capital ❌ (@StanphylCap) January 18, 2023

To wit, Ashok Elluswamy, Tesla’s Head of Autopilot Software, sat for a July 2022 deposition relating to a fatal 2018 accident. While most of the media has concentrated on the juicy tidbit that Tesla’s 2016 FSD promotional video was staged, Nikola-style, the deposition’s transcript also highlights the Tesla executive’s very troubling knowledge gaps.

This quote is from a deposition of Ashok Elluswamy, Tesla’s Head of Autopilot Software relating to the 2018 fatal Autopilot crash of Walter Huang. He doesnt know what an Operational Design Domain (ODD) is. pic.twitter.com/SqeE7xp2m4 — Mahmood Hikmet (@MoodyHikmet) January 15, 2023

The Twitter account @MoodyHikmet has effectively summarized this aspect of the deposition. For one, when asked about the Operational Design Domain (ODD) – the set of conditions under which an automated system is designed to operate – Mr. Elluswamy could not define the term. Tesla’s Autopilot head also denied any recollection of ever seeing a document on ODD while working at the EV giant. That the executive in charge of Autopilot software appears unaware of such a fundamental aspect of any automated system is quite troubling. Of course, it is entirely possible that Tesla uses some other vernacular to lay out the Autopilot’s ODD. Even so, his unfamiliarity with basic industry parlance is striking. And it gets worse.
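For readers unfamiliar with the term, an ODD is essentially a machine-checkable envelope of operating conditions outside of which an automated system should not engage. The following is a minimal sketch; the condition names and limits are purely illustrative assumptions, not anything from Tesla or the deposition:

```python
# Hypothetical sketch of an Operational Design Domain (ODD) check.
# Every name and limit here is illustrative, not Tesla's.
from dataclasses import dataclass

@dataclass
class DrivingConditions:
    road_type: str         # e.g. "divided_highway", "urban_street"
    weather: str           # e.g. "clear", "rain", "snow"
    speed_limit_kph: int
    is_daytime: bool

def within_odd(c: DrivingConditions) -> bool:
    """Return True only if every condition lies inside the design envelope."""
    return (
        c.road_type == "divided_highway"
        and c.weather in ("clear", "rain")
        and c.speed_limit_kph <= 130
        and c.is_daytime
    )

# When the vehicle leaves the ODD, the system should disengage and hand
# control back to the driver rather than keep operating outside its design.
conditions = DrivingConditions("urban_street", "snow", 50, False)
print(within_odd(conditions))  # -> False: outside the ODD on three counts
```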

You’d think perception-reaction time would be an important consideration for this… BUT HE DOESN’T EVEN KNOW WHAT THAT IS. pic.twitter.com/5KxIRdFqj1 — Mahmood Hikmet (@MoodyHikmet) January 15, 2023

Perception-reaction time is the interval it takes a human to perceive a stimulus – whether an auditory or a visual cue – and then react to it. Yet Mr. Elluswamy denied any recollection of ever receiving training on perception-reaction time, and he admitted to guessing “what those words mean.”
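To see why this matters for anyone building a driving system, consider the textbook stopping-distance arithmetic: during the perception-reaction interval the vehicle keeps travelling at full speed before braking even begins. The reaction time and deceleration below are generic textbook values, not figures from the deposition:

```python
# Illustrative stopping-distance arithmetic. The 1.5 s reaction time and
# 7 m/s^2 deceleration are generic textbook values, not Tesla figures.
def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    reaction_distance = speed_mps * reaction_time_s       # covered before braking starts
    braking_distance = speed_mps ** 2 / (2 * decel_mps2)  # v^2 / (2a)
    return reaction_distance + braking_distance

# At 100 km/h (~27.8 m/s), the reaction interval alone eats up ~42 m:
v = 100 / 3.6
print(f"{stopping_distance_m(v):.1f} m total, "
      f"{v * 1.5:.1f} m of it before the brakes even engage")
```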

Given this state of affairs at Tesla, it is hardly a surprise that the Autopilot consistently underperforms the hype that Elon Musk so diligently generates. In a comical development, the New York Times recently reported on an outing during which its reporter tested the Autopilot’s FSD capability in the presence of some die-hard Tesla fans, who were left stumped every time the system faltered.

Meanwhile, Tesla’s ongoing “funding secured” trial is also creating fresh trouble for the FSD capability of the Autopilot. While Gary Black of the Future Fund correctly notes that the lawsuit carries minimal ramifications for the company’s financials, it does leave the proverbial avenue wide open for follow-up FSD-related lawsuits.

What do you think of Tesla’s Autopilot strategy? Let us know your thoughts in the comments section below.