Pretty quick reaction to the lead vehicle two ahead slowing down for an unprotected left turn, switching lanes at an oddly shaped intersection with traffic coming from behind and the lights changing:

View attachment 1023489 (12.2.1 pass turning.jpg)


 
Pretty quick reaction to lead vehicle 2 ahead slowing down for an unprotected left turn to switch lanes at an oddly shaped intersection with traffic coming from behind and lights changing:

View attachment 1023489

It was in the wrong lane. Complete failure to anticipate and move to the other lane smoothly in advance. Instead, a high-risk maneuver in the intersection. With more traffic, it would have been stuck.

This is not good.

This is also one of those hard situations that humans paying attention would deal with effortlessly, but one I would expect to be difficult for a system like this to figure out in advance.

But it was pretty darn clear far in advance that the Tesla was turning left. It took seven seconds for FSD to react.

Note the lead car took the correct path immediately, effortlessly, safely, smoothly.

I am glad it recovered, but it lost a position.
 
Pretty quick reaction to the lead vehicle two ahead slowing down for an unprotected left turn, switching lanes at an oddly shaped intersection with traffic coming from behind and the lights changing:
43:03 Slight movement to avoid paper debris being blown around
51:20 Alternating merge at an uncontrolled intersection
53:51 Smoothly handling a pedestrian striding out into the road at a pedestrian crossing
03:48 Jogger on edge of road with a car coming up fast in the adjacent lane
 
It was in the wrong lane. Complete failure to anticipate and move to the other lane smoothly in advance. Instead, a high-risk maneuver in the intersection. With more traffic, it would have been stuck.

This is not good.

This is also one of those hard situations that humans paying attention would deal with effortlessly, but one I would expect to be difficult for a system like this to figure out in advance.

But it was pretty darn clear far in advance that the Tesla was turning left. It took seven seconds for FSD to react.

Note the lead car took the correct path immediately, effortlessly, safely, smoothly.

I am glad it recovered, but it lost a position.
What you describe points to poor lane planning, which FSD Beta has always been bad at. But on the subject of reaction time to vehicles on the road, this doesn't indicate a seven-second reaction time: the car was already slowing down in response to the lead vehicle well before that; the decision to abort the turn just came very late.

That said, lead-vehicle tracking is probably the easiest task and not a very good test of reaction times. Vehicles or pedestrians that pop out suddenly are probably a more appropriate test.
 
Smoothly handling a pedestrian striding out into the road at a pedestrian crossing
I noticed this one last night. It was not smooth, no matter how much Whole Mars said he didn’t think that guy was going to walk.

He was striding obviously towards a marked crosswalk; what on earth did he think was going to happen?

Car reacted late, and stopped too abruptly when it did respond.

Don’t be fooled by Whole Mars; watch the scene and give yourself credit as an alert driver.
 
Pretty quick reaction to the lead vehicle two ahead slowing down for an unprotected left turn, switching lanes at an oddly shaped intersection with traffic coming from behind and the lights changing:

View attachment 1023489


Yikes! At best, it worked out that time, but it's another example of v12 taking excess risk. Changing lanes in an intersection tends to be illegal and unsafe. FSD should have waited for the intersection to clear before proceeding into it, or changed lanes prior to entering the intersection. It's hard to tell from the repeater lens, but v12's indecisive, last-second lane change appeared to slow the car in the right lane as it made the lane change.

I'm beginning to think that stubby blue path line is telling the story of v12's path planning being too myopic.
 
This is also one of those hard situations that humans paying attention would deal with effortlessly, but one I would expect to be difficult for a system like this to figure out in advance.
Yeah, it can definitely do better planning, but its ability to recover safely shows that the current model architecture can weigh a lot of complexity, and even small signals, at the same time, so maybe it just doesn't have enough training examples or core understanding yet. Sorry if you don't play chess and I should find a better comparison, but AlphaZero's learning process also needed to learn that it had blundered, then tactics for how to recover, then strategies for avoiding dangerous situations in the first place.

Hopefully Tesla has a good way to collect this type of data without requiring people to have it active and ending up in riskier situations. It doesn't help that Omar doesn't disengage, which presumably would send back data showing where improvement is needed.
 
Yeah, it can definitely do better planning, but its ability to recover safely shows that the current model architecture can weigh a lot of complexity, and even small signals, at the same time, so maybe it just doesn't have enough training examples or core understanding yet. Sorry if you don't play chess and I should find a better comparison, but AlphaZero's learning process also needed to learn that it had blundered, then tactics for how to recover, then strategies for avoiding dangerous situations in the first place.

Hopefully Tesla has a good way to collect this type of data without requiring people to have it active and ending up in riskier situations. It doesn't help that Omar doesn't disengage, which presumably would send back data showing where improvement is needed.

It might look better than its true capabilities. We've seen enough hasty v12 decisions to know it isn't processing everything.
 
V12.2.1 seems strange on my car now. It has braked hard at stop lights multiple times. On the way to work this morning, on a street with a 50 mph speed limit, FSD showed 30 mph and kept that number on the screen for more than 2 miles. I dialed the max speed up to 50 and FSD ran at 50 (the 30 mph limit was still shown on the screen).
 
Recently, the car changed lanes without signaling. It just drifted from one lane to the other. Has anyone ever noticed this? I don't recall it happening before, so I suspect this is an FSD 12 thing. I reported it, of course.

It happened on the road below, which I am assuming is not limited access. Speed limit 50 MPH.

View attachment 1023584 (Screenshot 2024-03-01 at 8.09.50 AM.jpg)
 
Recently, the car changed lanes without signaling. It just drifted from one lane to the other. Has anyone ever noticed this? I don't recall it happening before, so I suspect this is an FSD 12 thing. I reported it, of course.

It happened on the road below, which I am assuming is not limited access. Speed limit 50 MPH.

View attachment 1023584
Did it change lanes right after you entered that highway?
 
Sorry if you don't play chess and I should find a better comparison, but AlphaZero's learning process also needed to learn that it had blundered, then tactics for how to recover
Yeah, to me it doesn't seem analogous.

It just seems like this sort of thing is far more complex than chess, which has clear rules. There are hundreds of thousands, if not millions, of variants of this sort of situation. Learning from blunders does not seem simple: there are rules here, but each variant will have different considerations.

I'd be surprised if it ever starts to learn the correct lane to be in even 7-10 seconds in advance. Not to mention that in many cases it requires prior knowledge to know far enough in advance to avoid a problem scenario or lane backup.
 
It really isn’t because if I use FSD I have to be paying attention. So if I am paying attention I want something not inferior.
That's not the point. The point I'm trying to make is that you're thinking only about the top 5% of drivers, not the average.

Oh yes, we don't want FSD (in the end state) to behave like an average human; it should be like the top 5%. But I wouldn't call that behavior odd by human standards.

PS: I want Tesla to concentrate on improving FSD behavior that falls in the "below average" range, or in some cases below the 10th percentile. What we don't know is how easy it is for Tesla to target particular behavior in the "end to end" NN.
 