What on earth makes you think that? I've been hearing some version of "it's going to be awesome in 3-6 months" for several years now, and then it's not, and then it's all about the next big release or platform change (HW3! DOJO!!! Nothing but nets!!!).

Meanwhile, 12.3.3 can't distinguish between a 4-way and a 2-way stop sign, and the first chance it got, 200 ft from my house, it pulled out in front of a car that had right of way (ROW) coming from the right (I had a stop, they did not), stuttered in the middle of the left turn, and I had to punch it to not get hit.

That was my second drive on 12.3.3.

It needs to go several lifetimes with no mistakes, statistically. I have 600k lifetime miles since 1982 (conservatively; maybe as much as 750k), with 1 at-fault fender bender: 3 mph in reverse in a parking lot (it would not even be counted as a crash by Tesla). It was only reported because my flatbed truck obliterated the bumper, grille, and headlight of a newer Camry in 1995.

The average human goes about 200k miles between accidents. If Tesla releases FSD as soon as it's merely a hair better than the average human (51% vs. 49%), it will still be only 1/3 as good as me. I refuse to use something that could triple my accident rate.
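The arithmetic, spelled out as a quick back-of-envelope sketch (using the rough numbers above; none of this is Tesla data):

```python
# Back-of-envelope version of the claim above, using this post's rough numbers
avg_human_miles_per_accident = 200_000   # approximate population average
my_miles_per_accident = 600_000          # 1 at-fault incident in ~600k miles

# Suppose FSD ships as soon as it is merely at parity with the average human:
fsd_miles_per_accident = avg_human_miles_per_accident

relative_risk = my_miles_per_accident / fsd_miles_per_accident
print(f"Switching to FSD would multiply my accident rate by ~{relative_risk:.0f}x")  # ~3x
```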

What the actual F--- are y'all smoking? This is going to seriously hurt or kill someone, and needs to be shut down.

I could go on about all the other places in the drive where it failed, but one failure is enough. (It took off aggressively from a blind corner, crossing the path of a lane with no stop coming from the left, with no time to check for traffic; it failed to check for traffic at a blind merge on the right a block later; and it refused to attempt a UPL 3 blocks later, disengaging with no warning between 2 lanes while rolling slowly, so I had to hit the brakes.)

It needs to effectively never fail or it's useless. What am I supposed to do if I think it's making a mistake? If I can't see a reason for it pulling off the road into the ditch, it could be one of two things:

1. It's saving my life from an imminent collision I can't see.
2. It's about to crash into the ditch and kill me.

What do I do?

So this was a fun test and now I'm done.

Be careful, people; don't get complacent.
I am truly sorry you're having that experience, Ox. But you're laboring under the false premise that everyone is having your experience. If everyone were, then I'd 100% agree with you. It works very well for me, and as yet I have not experienced major failures like you describe. That doesn't mean I won't experience them.

For now I remain cautiously optimistic and will reject your premise and calls for it to be shut down.
 
Yeah, it is weird to do the wide release (except to lessen the sting of the quarter, I guess).
Picking the correct turn lane is next-level, though. It seems fine to release with that limitation.
I'm just mystified that they released FSD in this state:
1) Doesn't even attempt to go the speed desired by the user.
2) Won't respond in a normal way to red traffic lights.
3) Frequently can't handle stop signs in a remotely normal way.
4) Cuts it super close to curbs on many turns.
5) Simply stops in lanes of traffic in the middle of a maneuver.

It's WAY better than v10/v11 in most other ways though. (And a couple of these were limitations on v11 as well.)
If they could just fix 1 through 4 it would be incredibly awesome and I could totally understand a release.

I expect it will be years before these items are fixed. Then they can start the march of 9s.

They can do free months at any time; people have short memories. I doubt alienation will be much of an issue if people are only subjected to it for a month at a time. The vast majority of people will likely turn it on, say "WTF?" after about 2 minutes, and then turn it off and forget about it.

No doubt FSD has never been ready for prime time. A few minutes behind the wheel and you know something is amiss. It's sluggish, it makes too many mistakes, it's uncomfortable, and it takes too long for the team to fix known problems.

There'll be some improvements here and there, but I doubt there are any easy solutions to HW3/HW4's glaring systemic issues. No matter how they package it, the end product drives essentially the same.

I'm pretty sure Elon was desperate for a distraction and a new revenue source.

I would be very surprised if any OEM jumped on board to license v12.
 
Your yellows might have a 4-second timer, but the NNs were trained on 2- or 3-second timers, so it brakes out of caution?
Multiple experiences yesterday on last-second yellows (very, very few people would have slammed the brakes for them; it was 2 seconds or so). The car tapped the decel; I pressed the accelerator immediately and overrode it, but I think on these particular ones it might have followed through.

Anyway it is just sort of broken.

I’ve never seen a 2-second yellow. Seems hazardous.
 
Since yellow lights vary in timing, with no way to tell what that timing is, and the new NNs are not hand-coded but instead learn how to handle things by watching curated videos, I wonder if that's why we're seeing this behavior. Your yellows might have a 4-second timer, but the NNs were trained on 2- or 3-second timers, so it brakes out of caution?
It could be. I've experienced two different FSD behaviors:
  1. Light turned yellow, FSD quickly stabbed the brakes, and then it proceeded through the intersection.
  2. Light turned yellow, FSD slammed on the brakes, full ABS level, and stopped. (I had no time to safely react, and was lucky nobody was behind me.)
    1. I could have applied the accelerator, but it had braked so much that I didn't think it would be safe to proceed. (No way I could have cleared the intersection before the opposing light turned green.)
    2. And no, there was nobody running the light from another direction that AEB would have been trying to prevent a collision with.
 
Multiple experiences yesterday on last-second yellows (very, very few people would have slammed the brakes for them; it was 2 seconds or so). The car tapped the decel; I pressed the accelerator immediately and overrode it, but I think on these particular ones it might have followed through.

Anyway it is just sort of broken.

I’ve never seen a 2-second yellow. Seems hazardous.
There's a national guideline of 3-6 secs, but emergency vehicles can override the signal at any time (at least around here), so it's absolutely possible for there to be a 1-sec yellow; I've seen 'em.
 
Multiple experiences yesterday on last-second yellows (very, very few people would have slammed the brakes for them; it was 2 seconds or so). The car tapped the decel; I pressed the accelerator immediately and overrode it, but I think on these particular ones it might have followed through.

Anyway it is just sort of broken.

I’ve never seen a 2-second yellow. Seems hazardous.
The feds' standard is 3-6 seconds. Could Tesla be training on 3 as the lowest common denominator?
 
Admittedly I am far from sure what the situation is, and it's been argued over and over again that we have no data (we have no data), but:

If releasing this Supervised version of FSD lowers accident rates for those using it (in a true side-by-side comparison of using vs. not, which we have no data for; particularly for this brand-new, year-old v12, even Tesla does not have that data), then it could well be worth it.

I do have concerns that with the wider release someone will get hurt. But as a committed concern-monger, I can say that I have been wrong every time to date on this point. There have been accidents caused by FSD, but luckily they've not led to someone running straight into a tree (it was close, though!).

It's even possible that it has lowered accident rates. We just have no idea, since Tesla has never published data with the relevant comparison, on a quarterly basis or otherwise.

But just as a thought experiment: it can fail frequently and still improve safety, as long as it is Supervised.

Or it may make things worse the better it gets.

What we can be sure of is that there won't be any shutdown until someone is seriously hurt or killed, and probably not even then (though it depends on what happens).
It took 6 risks in a 1-mile drive that I might not have been able to respond to fast enough to avoid a crash (I could not always see that the way was clear before the car aggressively crossed a lane with ROW). Luckily, no one was coming in most of those situations, and in the one with a car coming I was able to intervene and avoid a crash.

It changed lanes with no signal, took a UPL into my street with no signal, signaled left at a T intersection, then abruptly turned right. Any one of these is disqualifying.

I have no way to square any of this behavior with anything close to a safe product.

I must be holding it wrong.
 
It should show the alternate routes while you're actively using it; then you could just touch the alternate route on the screen.

You should also be able to choose a route as your default.
The way to use FSD Supervised now is to treat it like a drive function, and treat disengagements as just another function. So, put in your navigation route. At a turn that deviates from the route you want, just disengage, drive manually through the turn, and re-engage after the turn. The system will then reroute; again, disengage at the next turn you want to make and do the turn manually. Repeat if the new route is not what you want.

This way you are using FSD Supervised on portions which you feel comfortable with.
 
I feel we will be getting 12.3.3 to 12.4.1 this week and 12.5 by May 2024.
I think our critiques are being addressed live and fixes will be in these two major releases.
Well, I'll accept that they're at least working on the issues. If everything dangerous we've talked about is fixed as soon as May (I'll even settle for June), that would make me ecstatic.
 
I don't buy the "collecting more data" narrative. It makes more sense to me that the trial and the removal of the beta designation were, as has been mentioned, timed for the purpose of FSD revenue recognition. I guess we'll find out at the forthcoming ER...
Yeah, Tesla doesn't need a single person to be using FSD to collect data. In fact, they can collect data from any vision-equipped car on the road, which is almost all of them. James Douma agrees the trial is not for data-collection purposes.

In fact, Tesla benefits more from data where the driver is driving manually than it does from drivers on FSD. You can't really use video input to train on human behavior if that video is of a car being driven on FSD. If you do that, you're training the neural net to drive like a car on FSD, which is, if not incestuous, at least a cyclic dependency you would like to avoid.

Video from manual driving is fed into the neural net. Video and disengagements from users of FSD simply indicate the progress of FSD and where Tesla needs to focus its development attention. I have a hard time believing that Tesla has no idea where to focus next, since there are still lots of not-so-edge cases that need work.
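None of us knows Tesla's actual pipeline, but here's a minimal sketch of the cyclic-dependency point, with an invented clip structure (every name here is hypothetical): manually driven clips become imitation targets, while FSD-engaged clips that ended in a disengagement are kept only as pointers to where the current build fails.

```python
from dataclasses import dataclass

@dataclass
class DriveClip:
    video_id: str
    fsd_engaged: bool    # was FSD driving during this clip?
    disengagement: bool  # did the driver take over at the end?

def imitation_training_set(clips: list[DriveClip]) -> list[DriveClip]:
    """Keep only manually driven footage as imitation targets.
    Training on FSD-driven clips would teach the net to imitate
    itself, the cyclic dependency described above."""
    return [c for c in clips if not c.fsd_engaged]

def triage_set(clips: list[DriveClip]) -> list[DriveClip]:
    """FSD clips ending in a disengagement aren't imitation targets,
    but they do show where the current build fails, which is where
    development attention should go."""
    return [c for c in clips if c.fsd_engaged and c.disengagement]
```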
 
That would make sense. If they focused training on city driving, then the speeds would be lower and the yellows shorter. So now when it sees a yellow light, it assumes that it needs to stop unless it's practically in the intersection.
Can FSD just use the following clues (a sketch follows below)?
1. Use the distance between the car and the intersection, together with the car's speed, to calculate whether it will have enough time to cross the intersection. That's what humans do (though not as accurately as FSD could).
2. If stopping would require hard braking, continue through.

Regardless, the problem rarely existed before.
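Here is a minimal sketch of those two clues as a stop/go rule (every threshold and number is invented for illustration, not Tesla's logic; units are meters, m/s, and seconds):

```python
COMFORT_DECEL = 4.5  # m/s^2, ~0.45 g; braking harder than this feels like a slam

def decide_on_yellow(speed: float, dist_to_stop_line: float,
                     yellow_time_left: float, intersection_width: float) -> str:
    # Clue 1: can the car clear the far side of the intersection before red?
    time_to_clear = (dist_to_stop_line + intersection_width) / speed
    if time_to_clear <= yellow_time_left:
        return "go"
    # Clue 2: if stopping would take uncomfortably hard braking, continue.
    required_decel = speed ** 2 / (2 * dist_to_stop_line)
    if required_decel > COMFORT_DECEL:
        return "go"
    return "stop"

# Identical approach, two assumed yellow timers: 20 m/s (~45 mph),
# 50 m from the stop line, 20 m wide intersection.
print(decide_on_yellow(20.0, 50.0, 4.0, 20.0))  # "go":   clears in 3.5 s
print(decide_on_yellow(20.0, 50.0, 3.0, 20.0))  # "stop": 4.0 m/s^2 is a comfortable stop
```

Note how the same inputs flip from "go" to "stop" when the assumed yellow drops from 4 to 3 seconds, which is consistent with the training-timer theory upthread.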
 
In fact, Tesla benefits more from data where the driver is driving manually than it does from drivers on FSD. You can't really use video input to train on human behavior if that video is of a car being driven on FSD. If you do that, you're training the neural net to drive like a car on FSD, which is, if not incestuous, at least a cyclic dependency you would like to avoid.

Video from manual driving is fed into the neural net. Video and disengagements from users of FSD simply indicate the progress of FSD and where Tesla needs to focus its development attention. I have a hard time believing that Tesla has no idea where to focus next, since there are still lots of not-so-edge cases that need work.
I've wondered this myself (specifically, whether manual driving data is more beneficial than FSD driving with disengagement data). I kinda wish Tesla would offer some sort of "training" mode where we could submit manual drives for training. Allowing us to toggle this on/off per drive would let us adjust our driving behavior so that we could attempt to drive "perfectly" for that trip when we're in the mood to do so.

Of course, "perfect" is subjective, as can be seen in these threads, but it might still be helpful... maybe? None of us really knows what on earth Tesla does with our data, or what they're looking for at any given time...
 