Here's an NHTSA report that just came out. It's almost entirely about Autopilot, but there is one mention of an FSD crash with a fatality. Do we know of any fatalities across the 1 billion FSD miles driven?
The report mentions FSD on only one line, stating that of the 60 crashes examined between August 2022 and August 2023, one involved a fatality. This was apparently an at-fault crash, but the report provides no further documentation on it. Autopilot was
For those reading the report, Tesla's recall 23V-838 is the one that amped up driver monitoring and removed the double-pull activation. The report also notes that one of the 111 examined crashes involved an injury related to inadvertent deactivation of Autopilot via steering input, which left TACC continuing to operate on its own. A few notable passages from the report:
A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities.
Unlike peer L2 systems tested by ODI, Autopilot presented resistance when drivers attempted to provide manual steering inputs. Attempts by the human driver to adjust steering manually resulted in Autosteer deactivating. This design can discourage drivers’ involvement in the driving task. Other systems tested during the PE and EA investigation accommodated drivers’ steering by suspending lane centering assistance and then reactivating it without additional action by the driver.
Notably, the term “Autopilot” does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation. Peer vehicles generally use more conservative terminology like “assist,” “sense,” or “team” to imply that the driver and automation are intended to work together, with the driver supervising the automation.