ASSO enabled? Try turning that off. I like it knowing to drive 17 MPH in a twisty subdivision, but 40+ on an open, well-marked road with no traffic but posted 25 MPH signs is a ticket waiting to happen around here. For others that's likely normal.
Yeah, no, no offset enabled. I prefer to adjust that using the thumbwheel because I use different amounts in different contexts.

The more I think about it, it's as though it doesn't apply any brakes while FSDS is enabled, neither mechanical nor regen. It just seems to coast. Next time I'll check to see if regen is indicated, but it sure doesn't feel like it. It just feels like I took my foot off the accelerator when I pass the lower speed limit sign and put it in neutral. But again, in "manual" or AP, everything behaves normally.
 
Do we know of any fatalities in the 1 billion FSD miles driven?
Here's a NHTSA report that just came out. It's almost entirely about Autopilot, but there is a mention of one FSD crash with a fatality.


The report mentions FSD on one line, stating that of the 60 crashes examined between August 2022 and August 2023, one involved a fatality. This was apparently an at-fault crash, but there is no documentation of it in the report; everything else in the report is about Autopilot.
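For scale, here's a rough back-of-envelope comparison (my own sketch, with assumed figures, not anything from the report): take the ~1 billion FSD miles asked about above with that single fatality, against a US average of roughly 1.3 fatalities per 100 million vehicle miles (an approximate recent national figure).

```python
# Back-of-envelope sketch: implied FSD fatality rate vs. the overall US rate.
# Assumptions (mine, not the report's): ~1B cumulative FSD miles, 1 fatality,
# and a US average of ~1.3 fatalities per 100 million vehicle miles.

fsd_miles = 1_000_000_000          # assumed cumulative FSD miles
fsd_fatalities = 1                 # the single fatality noted in the report
us_rate_per_100m = 1.3             # approximate US average (fatalities / 100M miles)

fsd_rate_per_100m = fsd_fatalities / (fsd_miles / 100_000_000)
expected_at_us_rate = us_rate_per_100m * (fsd_miles / 100_000_000)

print(f"Implied FSD rate: {fsd_rate_per_100m:.2f} per 100M miles")      # 0.10
print(f"Expected at the US average: {expected_at_us_rate:.0f} deaths")  # ~13
```

At the national average you'd expect on the order of 13 deaths over a billion miles, so one is far below that; but selection effects (where and when FSD is used, driver takeovers, reporting gaps) make this illustrative at best.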

For those who read the report: Tesla's recall 23V-838 is the one that amped up driver monitoring and removed the double-pull activation. In the report, one injury among the 111 examined crashes is attributed to inadvertent deactivation of Autopilot via steering, with TACC continuing to operate.

A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities.

Unlike peer L2 systems tested by ODI, Autopilot presented resistance when drivers attempted to provide manual steering inputs. Attempts by the human driver to adjust steering manually resulted in Autosteer deactivating. This design can discourage drivers’ involvement in the driving task. Other systems tested during the PE and EA investigation accommodated drivers’ steering by suspending lane centering assistance and then reactivating it without additional action by the driver.

Notably, the term “Autopilot” does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation. Peer vehicles generally use more conservative terminology like “assist,” “sense,” or “team” to imply that the driver and automation are intended to work together, with the driver supervising the automation.
 
It's almost entirely about Autopilot, but there is a mention of one FSD crash with a fatality.

I wonder if that could be the employee crash for which Elon said the vehicle did not have FSD Beta firmware.

Legally, Tesla was required to report the crash to NHTSA because the passenger who survived claimed FSD Beta was enabled.

Relevant quote from Rohan Patel's tweet: "What we do know is that FSD beta software was not downloaded onto the car. Important to note that the regulations force us to report an incident to NHTSA, even if we have verifiable evidence that the ADAS system was not enabled. This creates confusion and doesn't make much sense to me, but is NHTSA's effort to build a wide net of any incident that may have involved ADAS."
 
Nope. That was not FSD, and it was on May 16, 2022.

It would have been in the indeterminate category in any case, even if it had been in the right window.

First, I find it hard to believe that there could be a fatal crash on FSD Beta that doesn't make the news at all.

Second, I'm not positive those table headings necessarily mean the crashes occurred within those date windows. Those date windows are when the analyses were conducted, and the employee vehicle crash would have fallen outside the scope of the detailed analysis because there was no video footage to analyze:

1. A detailed analysis covered 446 crashes occurring from early 2018 through August 2022 that relied on in-depth assessments of video from the subject vehicle onboard cameras, event data recorders (EDR), data logs, and other information; and
2. A supplemental analysis, which focused on 510 Tesla incidents gathered through the Standing General Order (SGO) process from August 2022 to the end of August 2023.
 
  • Like
Reactions: FSDtester#1
First, I find it hard to believe that there could be a fatal crash on FSD Beta that doesn't make the news at all.
Why?
Second, I'm not positive those table headings necessarily mean the crashes occurred within those date windows.
It’s possible, but they do say the first window covered crashes occurring in that window. It would be kind of nonsensical to categorize by analysis date, so I assumed basic competence, I guess.
 
  • Disagree
Reactions: FSDtester#1
I assumed basic competence, I guess

I see you're unfamiliar with Federal agencies.


Tesla crashes are way over-represented in the news. The name drives clicks. Fatal crashes involving Teslas are catnip for journalists. Even when there is no evidence of Autopilot being involved, you'll often see speculation that it was involved. I just can't imagine a verifiable case of a fatal crash involving FSD without a whisper of it in the media.
 
Nope! Note the clearly visible “STOP” lettering in front of the car!
[Attachment: IMG_0813.jpeg]

Nope! 👎
[Attachment: IMG_0814.jpeg]


And these examples are not the occasional outliers.
I must be blind; I can't see "STOP" lettering anywhere in either of those pictures.
 
Yes. An order of magnitude better. Flawless at roundabouts, which it previously could not execute at all. It can also identify and slow down naturally for speed bumps, just to give a couple of examples. Check out Edgecase on X.
I'm bringing into this thread a comment made in the Options thread; it's more appropriate to answer it in detail here.

For one thing, as I've explained in this thread and the V11 threads so many times, there are a number of issues with the way roundabouts were handled in V11. It would sometimes stop before the entry, and sometimes try to get into a roundabout even when a car was coming from the left. That doesn't mean V11 couldn't "execute at all": it handled a number of roundabouts fine in V11, and even somewhat in V10. Before V10 it was indeed very bad. For me, V11 failed at some complicated multilane roundabouts maybe 50% of the time. Simple roundabouts it handled OK, especially when there were no cars around, though it would hesitate to enter.

Now in V12 it handles roundabouts much better. But I'd not go so far as to say "flawless". Just the other day FSD wanted to enter the roundabout when a car was coming from the left, forcing me to disengage.

It is important to understand that it is not a "static" roundabout problem; i.e., FSD has handled that particular roundabout quite well a number of times. But on that particular occasion, two cars exited the roundabout just before passing in front of us, so maybe FSD assumed the third car would exit as well? Anyway, even as the car started coming towards us, FSD began to accelerate, forcing me to disengage (and brake quickly).

This is important to note: don't assume FSD has "solved" roundabouts and sit back and relax. Be ever ready to take over.
 
So you can see the S, and the car is partially over it (barely), as my screen capture of Google Maps (generously) shows.

Someone would have to mount a camera outside the car, I guess. Or I could quickly disengage, run out, and take a picture from outside. Maybe I'll just hang my phone out the window; that might work.

This one was in the range of 5-10 feet.

Anyway, no real need for verification here: so many people have indicated this happens that it isn't really something that needs to be proven.

[Attachment: IMG_0819.jpeg]
 
Tesla crashes are way over-represented in the news. The name drives clicks. Fatal crashes involving Teslas are catnip for journalists. Even when there is no evidence of Autopilot being involved, you'll often see speculation that it was involved. I just can't imagine a verifiable case of a fatal crash involving FSD without a whisper of it in the media.
Actually, if you go through all the crashes mentioned in the NHTSA filings, you won't find all of them mentioned by the media.

When a crash happens, some local media outlet has to report it first. Then it's picked up by the national paparazzi / clickbait media for clicks. If it is not reported locally (or at least not digitally), you won't see it on the usual websites.
 
  • Like
Reactions: AlanSubie4Life
I wonder if that could be the employee crash for which Elon said the vehicle did not have FSD Beta firmware.
I would tend to doubt it, given that NHTSA was working with Tesla. Tesla would certainly provide any information they have that absolves them of culpability.

Even if somebody died while FSD was active, the only issue here is that the driver wasn't properly supervising. All of this is about forcing people to be responsible in their role as a driver. This same assertion could be thrown at cars without any driver assists. Distracted driving costs the country billions. According to NHTSA, $98 billion in 2019. Medical bills, legal fees, property damage, lost productivity, etc. It's one way to generate economic activity, but I think we can do better.

 
Here's one of those v12.3.4 navigation snafus that, in short order, turns into a near miss cluster %^.

A good question is: why is the driver letting the car make that mistake in the first place? Isn't one of the ideas to disengage when it makes an important mistake? Otherwise, how will it ever get better?

I almost always intervene if the selected lane will lead to a missed turn (or, in this case, taking a turn that wasn't needed). Today on my short 5-mile trip, I had to intervene 3 times to change lanes. At least one of those I never had to do with V11. 12.3.4 seems to take the wrong lane ...
 
  • Like
Reactions: Pdubs and Mike1080i
Actually, if you go through all the crashes mentioned in the NHTSA filings, you won't find all of them mentioned by the media.

When a crash happens, some local media outlet has to report it first. Then it's picked up by the national paparazzi / clickbait media for clicks. If it is not reported locally (or at least not digitally), you won't see it on the usual websites.

The NHTSA report lists a total of 29 fatal crashes from Jan 2018 to Aug 2023.

The Tesla Deaths website lists a total of 33 fatal crashes in the news in the USA from Jan 2018 - Aug 2023 that they claim involved Autopilot. 27 of those 33 have the "Reported in NHTSA SGO" column filled out with the unique case ID number.

So about 93% of all fatal crashes involving Autopilot (27 of the 29 in the NHTSA report) have made it into the media. I'd consider that over-represented.
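For transparency, the arithmetic behind that 93%: it's the 27 matched cases over the 29 NHTSA-reported fatal crashes, not over the 33 entries on the news-derived list.

```python
# Arithmetic behind the 93% figure: 27 of the 29 fatal crashes in the NHTSA
# report also appear on the Tesla Deaths list (with an SGO case ID), i.e.
# they were covered in the news.

nhtsa_fatal_crashes = 29   # Jan 2018 - Aug 2023, per the report
matched_in_news = 27       # Tesla Deaths entries with a "Reported in NHTSA SGO" ID

coverage = matched_in_news / nhtsa_fatal_crashes
print(f"{coverage:.0%} of NHTSA-reported fatal crashes made the news")  # 93%
```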
 
  • Informative
Reactions: JHCCAZ and EVNow
The main point was that the Post claimed that this would be the first. So if the Post is correct and Elon is correct, then there haven't been any FSD deaths.

Stock analyst Toni Sacconaghi was on CNBC this morning (not a TSLA bull). He said he's been driving FSD for the last 3 weeks and that FSD has made good progress but has "everyday limitations" and is "still a long ways to go." He thinks:

- full self driving will take 5-10 years,
- it isn't clear that TSLA will be the only company solving it,
- automotive innovations (premium features) are almost always priced away due to global competition and eventually become standard features.
OK, crediting ANY useful data or analysis to a media source or a stock analyst is simply ridiculous. They all have huge ulterior motives and are about worthless for any worthwhile information. I get useful data from a good number of the folks on here with actual experience. Sometimes the experiences are contradictory; that is still useful information. Seriously, I have seen so much stupidity and lies from the media--both sides of the aisle--that I currently trust raw data only. We get raw data on this thread; ignore the liars and idiots elsewhere.
 
  • Like
Reactions: Pdubs
I am getting interesting new behavior on V12.3.4. Before, it always took too long to slow down after a speed limit drop. Now--same software--if no one is in front of me, it not only slows down appropriately, it starts slowing BEFORE the speed limit drop, so that it's at the appropriate speed when passing the sign. I've never seen that before. This is NEW behavior from the same software. However, if we are following another car that ignores the new speed limit, we also ignore the new speed limit.
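For intuition on why it has to start slowing well before the sign, here's a minimal constant-deceleration sketch (the 1.5 m/s^2 comfort value and the 45-to-25 MPH example are my assumptions, not anything from Tesla's planner):

```python
# How far before a speed-limit sign a car must begin slowing to pass the
# sign at the new limit, assuming a constant, comfortable deceleration.

MPH_TO_MPS = 0.44704

def braking_distance_m(v_start_mph: float, v_end_mph: float,
                       decel_mps2: float = 1.5) -> float:
    """Distance to slow from v_start to v_end at a steady deceleration
    (default 1.5 m/s^2 is an assumed comfort value)."""
    v1 = v_start_mph * MPH_TO_MPS
    v2 = v_end_mph * MPH_TO_MPS
    return (v1**2 - v2**2) / (2 * decel_mps2)

# e.g. a 45 -> 25 MPH drop: begin slowing roughly this far ahead of the sign
print(f"{braking_distance_m(45, 25):.0f} m")  # ~93 m, about 300 ft
```

So the "new" behavior of bleeding off speed a few hundred feet early is exactly what you'd want from a planner that targets the sign location rather than reacting at it.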