Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
I'm sure you've been frustrated getting stuck behind a slow driver before. How would you feel if every other car on the road was driving through intersections the way FSD currently does? It's NOT acceptable and not human-like either. I would like to blend in with the rest of traffic, not be a public nuisance and inspire contempt for autonomous driving.
What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident actually created? Absolutely not; it's asinine to feel this much anger over something so inconsequential.
 
  • Like
Reactions: FSDtester#1
I'm sure you've been frustrated getting stuck behind a slow driver before. How would you feel if every other car on the road was driving through intersections the way FSD currently does? It's NOT acceptable and not human-like either. I would like to blend in with the rest of traffic, not be a public nuisance and inspire contempt for autonomous driving.
Yeah...I find that the overwhelming majority of the times I disengage or intervene in v12.3.6 it's not because I had to. It's because the car was taking too long at a stop sign, driving too slow (or occasionally too fast) or because it stayed behind another slow driver when I wanted to go faster and pass them.

That needs to be considered when looking at the disengagement rate. If FSD's disengagement rate is x, keep in mind that there's a driver right there using the system as a driver assist. If they are in a hurry, impatient, or don't want to slow other drivers on the road, a disengagement will often be counted that would otherwise not have happened in a Robotaxi situation.
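To make that concrete, here's a toy back-of-the-envelope sketch (every number here is invented) of how much convenience-only disengagements can inflate the apparent rate compared to counting only the safety-critical ones:

```python
# Purely hypothetical numbers illustrating how convenience disengagements
# inflate the apparent disengagement rate versus a safety-only count.
miles_driven = 1000
safety_disengagements = 2        # car was about to do something unsafe
convenience_disengagements = 18  # too slow at a stop sign, wanted to pass, etc.

raw_rate = (safety_disengagements + convenience_disengagements) / miles_driven
safety_rate = safety_disengagements / miles_driven

print(f"raw: 1 per {1 / raw_rate:.0f} mi, safety-only: 1 per {1 / safety_rate:.0f} mi")
# raw: 1 per 50 mi, safety-only: 1 per 500 mi
```

With these made-up numbers the headline rate looks 10x worse than the safety-relevant one, which is the gap a Robotaxi comparison would actually care about.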

That's not to say there aren't a lot of issues to fix still. But I think calling it useless or worthless is a pretty big stretch. It's actually a pretty miraculous system that's better than anything else offered by any other OEM.
 
What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident actually created? Absolutely not; it's borderline stupid to feel this much anger over something so inconsequential.
You are absolutely right. The difference of a few seconds is pretty meaningless in the grand scheme of things.

Having said that though, the reality is that lots of drivers are impatient, and road rage is a real thing. So minimizing how much one "annoys" other drivers, as silly as it sounds, does indeed contribute to safety in the form of reduced road-rage incidents.
 
What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident actually created? Absolutely not; it's asinine to feel this much anger over something so inconsequential.
It's human nature. It's not changing anytime soon and it needs to be factored in. Besides, from a tech point of view, I want the car to drive efficiently and not pass up on perfectly safe opportunities to advance. Whether or not they are disengagement/intervention worthy, they are still mistakes.
 
  • Like
Reactions: JB47394 and Ben W
What is not acceptable is this human behavior of anger and frustration. Do I feel it? Yes. Should I be feeling it, given how small a problem this incident actually created? Absolutely not; it's asinine to feel this much anger over something so inconsequential.
Yes! Why are we acting like humans around here? We must take our medicine. When FSD craps on us, be thankful, and ask for more.
 
  • Funny
Reactions: EVNow and cyborgLIS
That's not to say there aren't a lot of issues to fix still. But I think calling it useless or worthless is a pretty big stretch. It's actually a pretty miraculous system that's better than anything else offered by any other OEM.
Useless is a harsh word, but from a utility point of view, with autonomous driving it's really all or nothing. If it has an even modest chance of making a critical mistake, that means the rider must be engaged in the driving process, and thus cannot use that time to be productive. From a tech point of view, I agree 100% that FSD is a marvel that accomplishes things no other system even attempts.

Actually, I have to add that it does offer some utility as a driver assistance (not replacement) when driving in unfamiliar areas.
 
  • Like
Reactions: Ben W
IMO Tesla's FSD performance has surpassed Waymo's, as it is forced to handle much more challenging scenarios. This includes all the unreasonable 6-lane blind UPLs, needing to go 15% over the speed limit at all times, navigating through parking garages and extremely busy parking lots, etc. Many of these are places Waymo doesn't even touch.
It does a terrible job in busy parking lots.

In a parking garage, Navigation thought it was driving on the street outside the garage. I don't know what it would have done if I had enabled FSD, because it seemed like a bad idea to try it. This was exiting after an event, so traffic was bumper-to-bumper going down the ramps with no room for error.

On a 6-lane UPL, it waited so long to cross the near lanes, and moved so slowly, that traffic approaching from the right forced it to stop in the middle of them before it could reach the far lanes.

Why would it need to go 15% above the limit at all times?
 
Last edited:
I just returned from a ~300 mile trip. V12.3.6 continues to struggle as environmental stimuli increase. Indecisiveness at yellow-light intersections: abrupt brief decel, then quick acceleration through the yellow. Crazy lane-change attempts. Misinterprets the roadway. Accelerates into tailgating a lead vehicle at traffic lights, followed by laggy regen/decel (it would be interesting to measure), resulting in the lead vehicle pulling away and forcing another burst of excess acceleration. And the usual dry wipes, hard braking, excess acceleration, refusal to change lanes after a manual turn signal, and the never-ending max-speed-control snafu.
 
For me, 12.3.6 has the same problem I’ve had the past few releases. My home county is mostly 2-lane rural roads. At many turnoffs, a bypass lane has been added so you don’t have to wait for left-turning traffic. FSD dives into these bypass lanes and comes out again every time it sees one. Really frustrating, and it makes you look like an idiot to other drivers. I don’t know what the thing is thinking: you can see it plot the whole route on the screen, so it can see the new lane is only a few feet long and there is no point entering it only to be immediately forced to merge back.
 
  • Like
  • Informative
Reactions: JB47394 and Ben W
Yes! Why are we acting like humans around here? We must take our medicine. When FSD craps on us, be thankful, and ask for more.
Humans are flawed. A robot will wait until there's zero chance of collision. A human will wait until there's an acceptable risk due to a lack of patience. Right now humans want the robot to take similar acceptable risk, then demonize it if something bad happens, but also want a crash free future.
 
Another week and still no critical safety disengagements in a long time. Some regular disengagements of course.
Got me to thinking: if I never disengaged or used the accelerator pedal, what percentage of my drives would "successfully" reach the destination? Sure, that would mean I'd annoy other drivers, and when FSD missed a turn or took the wrong one it would have to reroute. I think the biggest problem would be coming up to a construction zone where the officer/road-crew person wanted me to stop, since FSD doesn't respond to hand gestures. Except for that type of disengagement, I find it hard to think the percentage wouldn't be over 99%. What do others think?
You're right, it would probably make it 99% of the time. So do absolutely terrible human drivers, even drunk ones. The 99% statistic doesn't make FSD a good or safe driver.

A significant number of accidents are caused by two cars making otherwise-minor mistakes at the same time and place. If only one car makes a mistake, the other car can usually take evasive action to avoid a collision. But just as the average driver breaks the law on average 1000 times for each time receiving a traffic ticket, it's likely that you'll make 1000 minor traffic mistakes before getting unlucky enough that another driver also makes one at the same time and place, leading to a collision. The point is, in this same situation, if you DON'T make the minor mistake, the accident probably won't happen. And that's why small mistakes matter.
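That coincidence argument can be sketched as simple multiplication (the rates below are entirely made up for illustration, and the two drivers' mistakes are assumed independent):

```python
# Invented illustration: if a collision requires BOTH drivers to err in the
# same place at the same time, cutting your own minor-mistake rate in half
# roughly halves the coincident-collision rate as well.
p_my_mistake = 0.01     # chance I make a minor mistake on a given mile (invented)
p_other_mistake = 0.01  # chance another driver errs in that same window (invented)

p_collision = p_my_mistake * p_other_mistake                 # both err together
p_collision_careful = (p_my_mistake / 2) * p_other_mistake   # I make half as many

print(p_collision, p_collision_careful)
```

Under these assumptions, eliminating your own small mistakes directly scales down the rare-but-costly coincidences, which is the whole point: the minor mistakes matter even when each one looks harmless.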
 
For me, 12.3.6 has the same problem I’ve had the past few releases. My home county is mostly 2-lane rural roads. At many turnoffs, a bypass lane has been added so you don’t have to wait for left-turning traffic. FSD dives into these bypass lanes and comes out again every time it sees one. Really frustrating, and it makes you look like an idiot to other drivers. I don’t know what the thing is thinking: you can see it plot the whole route on the screen, so it can see the new lane is only a few feet long and there is no point entering it only to be immediately forced to merge back.
I've experienced the same thing, for both v12 on city streets and v11 on highways. It seems to strongly prefer the leftmost and rightmost lanes, leading it to frequently make such mistakes (for both left-turnoffs and right-turnoffs), when arguably the default bias should be (when a lane splits) to move toward the middle lanes (away from the leftmost/rightmost lanes) unless making or about to make a turn. It sometimes does this even while the nav visualization / path planner still shows the correct lane/path.
 
Last edited:
Humans are flawed. A robot will wait until there's zero chance of collision. A human will wait until there's an acceptable risk due to a lack of patience. Right now humans want the robot to take similar acceptable risk, then demonize it if something bad happens, but also want a crash free future.
The robot will do whatever it's programmed (or, for a neural network, trained) to do. The question of how to appropriately balance risk with reward is a highly nontrivial one, for both humans and robots. An AV (or human) trained for zero risk would never leave the driveway. So no, we don't expect AVs to never make mistakes. But we do expect them to never make stupid, pointless mistakes.

Example: imagine the car needs to exit the freeway soon, and the exit lane is packed and very slow. At what point should the car begin attempting to merge? (At what point should a human attempt to merge?) Regardless of when the attempt begins, it's possible that the merge won't be successful, and the car will miss the exit. Driving aggressively and merging at the last minute is one strategy, with probably a better expected result (time to destination), but also a higher chance of "failure" (missing the exit), and/or a higher chance of having to be "rude" in order to complete the maneuver. If we have our AV set to "Assertive" mode, and it starts trying to merge at the last minute and misses the exit, or else maybe it does force its way in but gets honked at, and/or blocks a bunch of cars behind it in the process, is that a reasonable mistake? What if it happened in an Uber? How harshly would you criticize yourself if you made this mistake?
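One way to frame that merge tradeoff is as an expected-cost comparison; the function and every number below are invented purely for illustration:

```python
# Hypothetical comparison of an early merge vs. a last-minute merge, scoring
# each strategy by expected trip time when missing the exit costs a reroute.
def expected_cost(base_time, p_miss, reroute_penalty):
    """Expected trip time (minutes) given a chance of missing the exit."""
    return base_time + p_miss * reroute_penalty

early = expected_cost(base_time=10.0, p_miss=0.01, reroute_penalty=8.0)  # patient, slow lane
late = expected_cost(base_time=8.0, p_miss=0.20, reroute_penalty=8.0)    # assertive, risky

print(f"early merge: {early:.2f} min, late merge: {late:.2f} min")
```

With these invented numbers the assertive strategy actually wins on expected time, exactly the "better expected result but higher chance of failure" tension described above, and the rudeness cost to other drivers isn't even in the model.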

Right now, FSD v12.3.6 still regularly makes stupid pointless mistakes, such as getting into a left-turn-only lane when it needs to go straight, or failing to shift a few inches over (on an otherwise empty road) to avoid a large pothole or piece of road debris, or failing to accept the obvious right-of-way at an intersection, or stopping in a wide lane in a position that pointlessly blocks the trailing car from making a right turn, then failing to move out of the way when that driver tries to edge past it. It also still has huge problems with U-turns and other common simple maneuvers; e.g. when making a left turn from a divided street with a wide median, it regularly blows through the median's stop sign to cross the opposing traffic lane. [San Vicente Blvd E to Gretna Green Way N: happens every time.] These are some of the pointless mistakes I would expect a properly trained AV (and/or an experienced and attentive human driver) to essentially never make.
 
Last edited:
So no, we don't expect AVs to never make mistakes. But we do expect them to never make stupid, pointless mistakes.
There may be brilliance in exquisitely trained NNs that we cannot conceive of.

There may be a level of possible behavior that is beyond most human conception.

These are likely to be judged harshly by humans in our typical short-sightedness, and this is likely unavoidable.

I offer a real world example in that I know someone who over a period of 6 years of driving, never made a left turn.

This is possible. This was done quite safely and consciously and successfully although it required a lot of maps (pre gps navigation).

If FSD stopped making left turns it would appear maddening to most or all of us, but it is possible to achieve. The consequences of such a choice would likely be profound, on many levels, including cost. And there are likely other hidden possibilities to be discovered.

In a RT world, would humans accept a fee structure based on calculations they could not decipher but that offered levels of choices and fees considering planning optimized around levels of safety, timeliness or cost?

FSD in a human centric environment is going to be surprising and hopefully a bit humbling.
 
  • Informative
Reactions: APotatoGod
It does a terrible job in busy parking lots.

In a parking garage, Navigation thought it was driving on the street outside the garage. I don't know what it would have done if I had enabled FSD, because it seemed like a bad idea to try it. This was exiting after an event, so traffic was bumper-to-bumper going down the ramps with no room for error.....
Actually, it does a good job in parking lots considering it hasn't been trained on them yet. Parking-lot support (aka ASS & Banish) is coming soon but is NOT here yet. In parking lots it is "figuring out" how to drive based on city-street training data.

Also, I live above a gigantic parking deck (about 1¼ miles in circumference) and our parking is in a nested area. I "play" around with enabling FSDS and it will follow lanes and make turns, but of course it has no "idea" where it is, since there is zero training on this so far and no way to navigate.