Does anyone else in this thread have two Teslas? If so, are you noticing this: I have two 2020 Teslas, an AWD 3 and a MYP. Both have been on identical versions of V12.x, now both on V12.3.4. The Y is much, much better at FSD. I have no idea why, but I very rarely have a safety-related intervention with the Y, while with the 3 it's a much more common experience. I have re-calibrated the cameras on the 3; no help. I swear, it's like entirely different versions of FSDS on each.
My wife's M3 with FSD has never had a single intervention on V12.
That's because my wife has never activated V12...
 
Seeing as tons of people here have the same complaint, I doubt this very much. It's all super noticeable and abnormal, as has been confirmed repeatedly here. That's why I posted the videos: they make it very clear what the issue is.

Yeah, you have to be careful that regen is at maximum. Usually, after avoiding the initial sharp hit by using the accelerator, you can let the car take care of most of the rest. You just have to be conservative. In cases of very late reaction, of course, it isn't possible.
Yup, speed control needs improvement, but I just didn't find your videos all that egregious, and there were none that made me uncomfortable. There are more important improvements needed for FSD.
 
Does anyone else in this thread have two Teslas? If so, are you noticing this: … The Y is much, much better at FSD. …
There is a theory that sometimes they do A/B testing with different parameters to see which work better, but as far as I can recall there’s never been confirmation of this.
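If they do run such experiments, the standard way is deterministic bucketing on a stable identifier, so a given car always lands in the same cohort across updates, which would produce exactly the "two identical versions, two different behaviors" effect. A toy sketch of the general technique; nothing here reflects Tesla's actual tooling, and the VINs and experiment name are made up:

```python
import hashlib

def experiment_bucket(vin: str, experiment: str, num_buckets: int = 2) -> int:
    """Deterministically assign a vehicle to an A/B cohort.

    Hashing VIN + experiment name means the same car always gets the
    same cohort for a given experiment, while different experiments
    split the fleet independently. Purely illustrative.
    """
    digest = hashlib.sha256(f"{experiment}:{vin}".encode()).hexdigest()
    return int(digest, 16) % num_buckets

# Two cars in the same household can easily land in different cohorts:
print(experiment_bucket("5YJ3-EXAMPLE-A", "v12-speed-params"))
print(experiment_bucket("5YJY-EXAMPLE-B", "v12-speed-params"))
```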
 
I'm running my second version of V12, specifically V12.3.3, which isn't the latest. On the highway I find it useful for making the drive more relaxing, since I don't have to mind the speed as closely as when my foot determines it. But V12 offers less real control than V11 did: it has more of a mind of its own, my ability to control speed with the steering-wheel thumbwheel is reduced, and dropping into TACC now requires coming to a full stop first, which is less convenient.

As for city driving, V12 is certainly improved, with fewer nervous breakdowns at intersections. But I live in an old city with a lot of history, including driving habits contrary to normal rules, some non-standard intersections, and some laxness in road markings. The result is problems. I also find the navigational routing problematic at times. I was on a suboptimal FSD route the other day when the car decided to make its right turn before we reached the intersection, and I had to stop it from entering an A&W! Driving outside the city, the mapping wasn't up to date with construction; the car again wanted to turn right before the intersection under FSD, and I had to intervene to keep us out of a deep ditch. There was a driveway within 50 ft of the intersection that may have contributed to the confusion.

I always fully share my data with Tesla, but I have my doubts about the near-term future of FSD. To my mind, it isn't good enough to be statistically much safer than human drivers overall if the car still does plainly idiotic things that a human driver would avoid. And that's the problem with the limited form of AI technology being used to develop FSD right now: there is nothing comparable to the human hippocampus that allows our brains to cope more effectively with something never seen before.
 
Does anyone else in this thread have two Teslas? If so, are you noticing this: … The Y is much, much better at FSD. …
My 22 MS has always performed better on FSD than my 21 MY.
 
Does anyone else in this thread have two Teslas? If so, are you noticing this: … The Y is much, much better at FSD. …
This sounds similar to the reports of FSD hitting curbs more frequently with the Model S and X than with the 3 and Y. I suspect it's a limitation of the NN approach. Previously the system used explicit logic to determine its actions, and that logic could be generalized across all vehicles. Now it's "photons in, controls out", so each specific configuration (model, year, HW version, etc.) needs its own training data, and that data is not going to be equally plentiful across the board. It wouldn't surprise me if FSD performs best on the Model Y: since it's the most common Tesla, there is more training data for it, and it makes sense to focus attention there.

I suspect there is no longer an intermediate step that interprets the video to identify objects and their trajectories. Previously, I imagine, the programmers could issue generic commands like "slow to 20 mph" in response to a "collision probable" determination, and the underlying code would compute the pedal and wheel controls necessary to make that happen, based on variables such as vehicle power and road conditions. That scales relatively easily to a variety of vehicles and results in a fairly uniform experience across the board. But with a pure NN, every vehicle configuration needs its own training data. Cybertruck doesn't have any version of FSD yet. (Then again, Cybertruck doesn't even have Autosteer, which AFAIK doesn't use a pure NN.)
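To make the contrast concrete, here's a toy sketch of the two architectures as I understand them. Everything in it is hypothetical and simplified; it's only meant to show why the old planner/controller split scales across models while end-to-end doesn't:

```python
from dataclasses import dataclass

@dataclass
class VehicleParams:
    max_decel_mps2: float  # braking/regen capability; varies by model and year

def rule_based_slow_to(current_mps: float, target_mps: float,
                       p: VehicleParams, horizon_s: float = 2.0) -> float:
    """Old-style split: a planner issues a generic command ('slow to
    target over ~2 s') and this controller turns it into a pedal value
    using vehicle-specific parameters. One planner serves every model."""
    needed_decel = max(0.0, (current_mps - target_mps) / horizon_s)
    return min(1.0, needed_decel / p.max_decel_mps2)  # pedal fraction in [0, 1]

def end_to_end_controls(camera_frames, weights):
    """'Photons in, controls out': pixels map straight to steering and
    pedal outputs. The vehicle's dynamics are baked into the learned
    weights, so each hardware configuration needs matching training data."""
    raise NotImplementedError("stand-in for a trained neural network")

# The same high-level command adapts automatically to different cars:
print(rule_based_slow_to(15.0, 9.0, VehicleParams(max_decel_mps2=4.6)))
print(rule_based_slow_to(15.0, 9.0, VehicleParams(max_decel_mps2=4.0)))
```

In the first style, the per-vehicle differences live in a small parameter table; in the second, they live in the weights themselves.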

I'm new to Tesla, so a question for those who have been around: the V12 updates are rolling out in waves according to model and year. Was it like this with previous versions of FSD, or is this new with the pure-NN FSD?
 
Another example of how superficial and clueless V12 can be. In the past one could assume V12 thought the line of cars was parked, but these cars are moving and showing brake lights.
I'm not sure I understand. Do you never switch to an open lane to go around slow-moving traffic? (I see humans do that all the time to cut to the front of the line.)

The problem would be where it was trying to cut in, but I can't really tell if it was aiming for the pole or if she just disengaged before it attempted to merge.
 
Does anyone else in this thread have two Teslas? If so, are you noticing this: … The Y is much, much better at FSD. …
We have two, both with the same software versions (see sig). I drive both extensively. I haven't noticed any difference in their behaviors.

I may be different from many here because I primarily "use" FSD rather than "test" it. I take over for UPLs (unprotected left turns) or any sketchy situation. If the car starts to slow down too much or comes too close to a curb, I disengage. I don't wait around for it at four-way stops; I press the accelerator.
 
I'm not sure I understand. Do you never switch to an open lane to go around slow-moving traffic? …

This really reinforces @EVNow's point about the user-collected disengagement data being too subjective to be useful. In this case, FSD:

1. Drove down a legal, empty lane
2. Tried to merge into a turn lane at the last minute
3. Was angled toward a curb at the moment of disengagement, but it was also actively turning

Nothing about that was inherently illegal or dangerous until she intervened. Rude? Sure. But you cannot say with 100% certainty that it was a critical safety disengagement.

In my opinion, it's more likely that FSD would have found a space and continued turning, or applied the brakes (V12 behavior we have seen many times), than that it would have driven over the curb and into a pole (V12 behavior we have never seen). But again, that's just my subjective opinion. Only Tesla has the ability to simulate the next couple of seconds and figure out what the system would have done had it not been disengaged.
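To be clear about what that simulation would involve: a counterfactual rollout that resumes the drive from the logged state at the moment of disengagement and lets the policy keep driving in a simulator. A hand-wavy sketch; the policy, simulator, and log format are all stand-ins for things only Tesla has:

```python
def counterfactual_rollout(log, t_disengage, policy, simulator, horizon_s=3.0):
    """Resume a logged drive at the moment of disengagement and let the
    driving policy continue in simulation, to see what it would have done.
    `policy`, `simulator`, and the log format are all hypothetical; only
    someone with the real stack and raw logs could run this for real."""
    state = simulator.reset_from_log(log, t=t_disengage)  # assumed API
    events, t = [], 0.0
    while t < horizon_s:
        controls = policy(state.camera_frames)   # photons in, controls out
        state = simulator.step(controls)
        events.extend(state.safety_events)       # e.g. curb strike, collision
        t += simulator.dt
    return events  # empty => the human intervention may not have been needed
```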
 
Another example of how superficial and clueless V12 can be. …

Your character assassination of FSD notwithstanding, why do people think FSD is smarter than it is? It's a driver-assist system. It's not contemplating the meaning of the universe and existence; it's driving. You, as a human, can ask the deep questions and ponder why the cars are all stacked up. You can conclude that those cars are probably in line waiting for the upcoming turn, but it's really just a guess. I guarantee there are times in your past where you sat patiently in line for an upcoming exit or turn, and only later, when you started to see lots of people changing lanes, found out there was an accident up ahead and that you'd wasted tons of time sitting there like an idiot. But were you an idiot? You made a decision based on the information you had at the time and didn't know there was an accident ahead.

My point is that in that situation, DISENGAGE FSD AND DRIVE MANUALLY! You, as a human with a sophisticated brain that took hundreds of thousands of years to evolve, can outthink a computer that's been learning for just a few years.

*ugh*
 
This really reinforces @EVNow's point about the user-collected disengagement data being too subjective to be useful.
Individually, sure. But collectively, Tesla can examine all the incoming data for patterns. If every Tesla that tried to turn right at that light during rush hour resulted in a disengagement or an automatic reroute, it would show up in data analysis. Combined with the many different ways drivers describe the problem, an analyst would understand the situation before watching any video.
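That kind of pattern-mining is simple once the events are aggregated. A toy sketch of the idea, with an invented event format (Tesla's real telemetry schema is not public):

```python
from collections import defaultdict

def disengagement_hotspots(events, min_attempts=20, min_rate=0.5):
    """Group events by (intersection, hour-of-day) and flag spots where
    most FSD attempts end in a disengagement or reroute. The event format
    is invented: (intersection_id, hour, disengaged: bool)."""
    attempts = defaultdict(int)
    failures = defaultdict(int)
    for intersection_id, hour, disengaged in events:
        key = (intersection_id, hour)
        attempts[key] += 1
        failures[key] += int(disengaged)
    return {k: failures[k] / attempts[k]
            for k in attempts
            if attempts[k] >= min_attempts
            and failures[k] / attempts[k] >= min_rate}
```

A right turn that fails for nearly every car at 8am would surface here long before an analyst watched a single clip.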
 
Individually, sure. But collectively, Tesla can examine all the incoming data for patterns. …

The user-collected data I'm talking about is the Google Form spreadsheet, where anyone can submit their disengagement data and label it as a critical safety disengagement or not.

I think it's fine to use that data to show a reduction in disengagements from V11 to V12, but people are taking the raw number of critical safety disengagements from that spreadsheet and comparing it to services like Waymo, and that just doesn't make sense, for multiple reasons.

Tesla does have quality data on their own disengagements, but they have not shared any of that publicly.
 
The user-collected data I'm talking about is the Google Form spreadsheet.
My apologies; I took that as "user disengagement data going to Tesla". Yes, volunteered data is pretty useless. It takes motivation to submit data, and that motivation, whatever it is, usually skews the data heavily, which is why we have things like double-blind studies. But you already know that :)
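For anyone who wants to see how badly self-selection can skew things, here's a quick simulation: if the chance of submitting a report depends on how the drive went, the rate you see in the spreadsheet can be wildly different from the true one. All numbers are made up:

```python
import random

random.seed(0)

def observed_rate(n_drives=100_000, true_rate=0.05,
                  p_report_bad=0.8, p_report_good=0.1):
    """If drivers who hit a disengagement are much more likely to fill in
    the form than drivers who didn't, the submissions over-represent bad
    drives. All probabilities here are invented for illustration."""
    reported = reported_bad = 0
    for _ in range(n_drives):
        bad = random.random() < true_rate
        if random.random() < (p_report_bad if bad else p_report_good):
            reported += 1
            reported_bad += bad
    return reported_bad / reported

print(f"true rate: 5.0%, rate seen in submissions: {observed_rate():.1%}")
```

With these made-up numbers, a true 5% disengagement rate shows up as roughly 30% in the submissions; the size and direction of the skew depend entirely on who bothers to report.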

Sorry for the distraction.
 
A tweeter experiences FSD improvements without an update. Quote:
... This left turn was jerky 10 out of 10 times yesterday, with clear blue skies at the 8am hour and no traffic, and failed 7 of 10 by turning prior to the median. Today, same conditions: a smooth, perfect turn 10 of 10. And the only thing that is different is an upload and download of some data that was not a new build.
 
Yup, speed control needs improvement, but I just didn't find your videos all that egregious …
Watching a video of it and experiencing it from the driver's seat are very different things. This ridiculous stopping behavior and the need to press the accelerator several times a minute are the two biggest failings of FSD for me.

I've had other cars try to pass and cut in front of me when the car starts doing its slow creep.
 
A tweeter experiences FSD improvements without an update. …
I’ll guess we won’t get a follow-up.

why do people think FSD is smarter than it is?
Yes, exactly. Meanwhile even Elon appears to be confused and is frittering away the focus on SELLING CARS for some pie-in-the-sky dream of robotaxis, which is obviously not a good use of limited resources. Focus on making FSD good; maybe people will like it, and maybe it will improve safety. And make cars: good, cheap ones.

But wasting resources on extremely difficult robotaxis is obviously a dead end at the moment, or at least means starting from scratch. So infuriating. What a waste.
 