This morning I had a UPL that FSDS totally misjudged: it waited until a car was closing to try to pull out. FSDS seems so good at everything except inner-city multilane selection and crowded UPLs. Chuck's latest UPL (this morning) is F'n UGLY, more like a Stephen King nail-biter movie.


Yikes. I quit watching after the first attempt. FSD timing appears way off-kilter. Maybe they need to better synchronize the new vehicle control nets, or fix poor object kinematics estimation?

Apparently many more ADAS visits are forthcoming.
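
For what it's worth, the kinematics side of a UPL reduces to a pretty simple gap check. Here's a minimal sketch, with toy names and numbers of my own (not anything from Tesla's actual stack), of the decision the car keeps flubbing:

```python
# Toy gap-acceptance check for an unprotected left turn. Everything here
# (function names, 6 s maneuver time, 2 s margin) is illustrative only.

def time_gap_s(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until the oncoming car arrives, from range and closing speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing; gap is effectively unlimited
    return distance_m / closing_speed_mps

def safe_to_pull_out(distance_m: float, closing_speed_mps: float,
                     maneuver_s: float = 6.0, margin_s: float = 2.0) -> bool:
    """Go only if the gap exceeds the maneuver time plus a safety margin."""
    return time_gap_s(distance_m, closing_speed_mps) > maneuver_s + margin_s

# A car 120 m out closing at 25 m/s (~56 mph) arrives in 4.8 s: too tight.
print(safe_to_pull_out(120.0, 25.0))  # False
```

Misjudge the closing speed by a few m/s and that margin evaporates, which is exactly what "waited until a car was closing" looks like from the driver's seat.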
 
Maybe they need to better synchronize the new vehicle control nets, or fix poor object kinematics estimation?
Coming around to the idea that the lead engineer for this needs to be relieved of duty. It is not a difficult problem! Just simulate it. The problems will show up. If not, fix the simulation.
 
I tried disengaging by jerking the steering wheel slightly. The robot shrieked, gave errors about the speed being too high, and said something about a steering problem. It refused to disengage and kept beeping loudly. The blue stripe finally turned gray and the beeping stopped.

Pretty strange. I’ll stop using the steering wheel method since it seems to be a problem.
 
  • Informative
Reactions: cyborgLIS
I tried disengaging by jerking the steering wheel slightly. The robot shrieked, gave errors about the speed being too high, and said something about a steering problem. It refused to disengage and kept beeping loudly. The blue stripe finally turned gray and the beeping stopped.
I had a similar episode once when I tried to disengage using the yoke: corrective steering prevented me from turning manually, and there were lots of beeps. It all happened very quickly while I was driving very slowly in a parking lot.
 
I’m beginning to have concerns that they will not have the right approach for v12.4.

They appear to be leaning too heavily on real-world results and training. And it does not appear that there is any feedback loop.

Failure to solve this in 12.4.x could easily be a death blow to the entire program. Have to be able to solve the simple problems to have any hope of solving more difficult ones.

Any beer bet that 12.4 will be the one that finally gets the first 9? 🫃🏻
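
On the feedback-loop point above: even a toy loop, where each batch of failures pushes the next release, converges fast. This is a sketch of the generic data-engine pattern only (my assumption of what a feedback loop would mean here, not Tesla's actual pipeline):

```python
import random

# Toy "data engine" feedback loop. The single-threshold "model" and all
# numbers are made up for illustration; this is not Tesla's pipeline.

random.seed(1)
threshold_s = 5.0       # model: smallest gap (seconds) the car will accept
TRUE_SAFE_GAP_S = 8.0   # ground truth: gaps below this are actually unsafe

for cycle in range(4):
    # 1. "Drive": sample gaps; a failure is an accepted-but-unsafe gap.
    gaps = [random.uniform(3.0, 15.0) for _ in range(1000)]
    failures = [g for g in gaps if threshold_s <= g < TRUE_SAFE_GAP_S]
    # 2. "Retrain": push the threshold past the worst observed failure.
    if failures:
        threshold_s = max(failures) + 0.1
    print(f"cycle {cycle}: {len(failures)} failures, threshold -> {threshold_s:.2f} s")
```

Without step 2 the failure count never moves, which is what repeated bad attempts at the same intersection across releases would suggest.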
 
No bets, but there is no such thing as a simple fix in ML. The challenge is that outcomes are probabilistic, not deterministic, so the best you can hope for is that the inference is probably correct. Also, I think even the smartest minds have significantly underestimated self-driving as a challenge. ML is a great way to do brute-force statistical regression and solve certain problems, but there is no guarantee that, as it stands, it is the right approach for this particular problem. If a scenario doesn't exist in the training set, then it cannot be handled during inference. There's just the small problem that your training set merely needs to contain examples of every possible driving outcome in every possible environment that could ever occur.
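
To put a rough number on "probably correct" (the 99% below is a made-up figure for illustration, not a measured FSD stat): per-decision accuracy compounds badly over repeated maneuvers:

```python
# How a "probably correct" per-maneuver rate compounds over many attempts.
# The 99% success rate is an assumed figure, not a measured FSD number.

per_attempt_success = 0.99   # assume the net nails a given UPL 99% of the time
attempts = 100               # roughly one such turn a day for three months

p_clean = per_attempt_success ** attempts
print(f"P(no failures in {attempts} attempts) = {p_clean:.2f}")    # ~0.37
print(f"P(at least one failure)             = {1 - p_clean:.2f}")  # ~0.63
```

So even 99% per decision leaves worse than a coin flip's chance of a clean few months of commuting; the "march of nines" exists for a reason.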
 
The challenge is that outcomes are probabilistic, not deterministic, so the best you can hope for is that the inference is probably correct.
Except that does not seem to be happening here.

Inference and understanding of the vehicles and their positions seem to be perfect.

And for something supposedly probabilistic, it is not like this is infrequent or somehow being tripped up by tricky situations!

Just simulate it! No need to drive! If it doesn’t fail in simulation, make it fail in simulation FIRST.
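
A sketch of what "make it fail in simulation first" could look like: randomize the traffic, run the same kind of gap logic as in the earlier toy, and count unsafe pull-outs. All of it is hypothetical (the policy, the 1.5 s estimation noise, and the thresholds are mine, not Tesla's simulator):

```python
import random

# Monte Carlo sweep of a toy unprotected-left scenario. The policy and
# every number below are illustrative assumptions, not Tesla's simulator.

TIME_NEEDED_S = 6.0   # actual time to clear the turn
MARGIN_S = 2.0        # safety margin the policy tries to keep

def policy_accepts(gap_s: float, est_error_s: float) -> bool:
    """Toy policy: decide using the *estimated* gap, not the true one."""
    return (gap_s + est_error_s) > TIME_NEEDED_S + MARGIN_S

random.seed(0)
trials, unsafe = 10_000, 0
for _ in range(trials):
    gap_s = random.uniform(3.0, 15.0)     # true gap to oncoming traffic
    est_error_s = random.gauss(0.0, 1.5)  # kinematics estimation error
    if policy_accepts(gap_s, est_error_s) and gap_s < TIME_NEEDED_S:
        unsafe += 1                       # pulled out into a too-small gap

print(f"{unsafe} unsafe pull-outs in {trials} simulated UPLs")
```

Even a harness this crude surfaces the failure mode in seconds, no Chuck required; the estimation-noise knob is the interesting one to sweep.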
 
I had a similar episode once when I tried to disengage using the yoke: corrective steering prevented me from turning manually, and there were lots of beeps. It all happened very quickly while I was driving very slowly in a parking lot.
I would prefer to disengage with a gentle tap on the brakes and take control, rather than fidgeting with the steering wheel.
 
  • Like
Reactions: enemji
This morning I had a UPL that FSDS totally misjudged: it waited until a car was closing to try to pull out. FSDS seems so good at everything except inner-city multilane selection and crowded UPLs. Chuck's latest UPL (this morning) is F'n UGLY, more like a Stephen King nail-biter movie.

Chuck always tests it right after installing a new version. My experience is that it takes a couple of days for a new version to “settle in”.

I find that FSD performs much better after a couple of days of using it. (I don’t know why or how; it is just my observation.)
 
I’m beginning to have concerns that they will not have the right approach for v12.4.

They appear to be leaning too heavily on real-world results and training. And it does not appear that there is any feedback loop.

Correct me if I'm wrong, but we haven't actually seen what degree of improvement a full retraining cycle can lead to. Most public testers at this point started on a 12.3.* release (and the various point releases that have come pretty quickly are not full retrainings of the system, as far as I know).

12.4 will be our first real chance to see how much changes.

I personally don't care that much about unprotected lefts, specifically, but I am very interested to get a better picture of:
- how much change they can ship per iteration (meaning a full retraining of the system)
- whether there are noticeable regressions
- how many iterations they can ship per year
 
Time to give up on Chuck's Left Turn. Left turns are dangerous, according to experts and NHTSA:

What's wrong with turning left?
Left-hand turns are generally considered unsafe and wasteful on right-hand driving roads, such as those in the US.

"Left-turning traffic typically has to turn against a flow of oncoming vehicles," explains Tom Vanderbilt, author of the book "Traffic: Why we drive the way we do."

"This can not only be dangerous, but makes traffic build up, unless you install a dedicated left-turn 'phase,' which is fine but basically adds 30 or 45 seconds to everyone else's single time," he said.

A study on crash factors in intersection-related accidents from the US National Highway Traffic Safety Administration shows that turning left is one of the leading "critical pre-crash events" (an event that made a collision inevitable), occurring in 22.2 percent of crashes, as opposed to 1.2 percent for right turns. About 61 percent of crashes that occur while turning or crossing an intersection involve left turns, as opposed to just 3.1 percent involving right turns.

Left turns are also three times more likely to kill pedestrians than right ones, according to data collected by New York City's transportation planners.
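
Quick back-of-envelope on the numbers as quoted (ratios only; this ignores how many of each kind of turn drivers actually make):

```python
# Relative-risk ratios implied by the NHTSA figures quoted above.

print(f"critical pre-crash events: {22.2 / 1.2:.1f}x left vs. right")  # 18.5x
print(f"turning/crossing crashes:  {61.0 / 3.1:.1f}x left vs. right")  # 19.7x
```

Call it roughly a 20x disparity either way, which is why Chuck's turn is such a good stress test in the first place.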
 
Correct me if I'm wrong, but we haven't actually seen what degree of improvement a full retraining cycle can lead to.

Yep.
12.4 will be our first real chance to see how much changes.

Yep, that is why it is one of the most consequential releases ever.

Time to give up on Chuck's Left Turn. Left turns are dangerous, according to experts and NHTSA:
There are a lot more accidents at the nearby traffic-light-controlled intersections.
 
  • Funny
  • Like
Reactions: Dewg and gsmith123
Time to give up on Chuck's Left Turn. Left turns are dangerous, according to experts and NHTSA:
Most collisions at intersections happen when there is a traffic signal.



I’m not sure it’s possible to exit Chuck’s neighborhood doing only right turns.
 