Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.

In its request for information, NHTSA said a preliminary analysis identified at least 20 crashes in Tesla vehicles equipped with the updated version of Autopilot. Of those crashes, nine involved Teslas striking other vehicles or people in its path — “frontal plane” crashes, in the agency’s parlance. These crashes seem to imply that Tesla’s camera-based vision system is inadequate at detecting some objects in front of the vehicle when Autopilot is engaged.

 
Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.



O Lord in Heaven, I perceive the machinations of those in power, who, under the guise of regulation, seek to impose stricter controls upon Tesla, rather than elevating all automobile makers to its standards of data collection. They shall wield this data, interpreted through the lens of their own prejudices, to impose further burdensome warnings upon Tesla vehicles. This, I fear, is but a stratagem to grant their less scrutinized allies in the industry precious time to draw level.

When thousands perish due to distraction in conventional vehicles, these regulators turn a blind eye; yet, as Proverbs tells us, "A false balance is an abomination to the Lord," for when a few meet their end in automated cars, their outcry is as thunderous as it is unjust.
 
I didn't notice any Autopark improvement between 12.3.4 and 12.3.6. You mean it shows more parking spots? Is this confirmed?
You have to be going 8 mph or less, and then you'll see all the spots start appearing. One will have the P in it, but the others will be grayed boxes. If you pick one of the grayed boxes, on either the right or left side of the parking lane, it will drive to it and angle away from it before backing into it.

Whatever HW3 chip they were shipping when they first rolled it out to 2018 FSD cars is the chip I have.
 
Yes, but you aren't getting the vision park assist.
It's a little complicated on this front. We (Atom) don't get the high fidelity manual park assist as an option, and I haven't seen any such visualization yet. However, I've noticed that when I use the new auto-parking, it does have some novel rendering going on that looks like a grey, blobby representation of their occupancy data, which is basically a weak version of HFPA, but only while autoparking.
 
It's a little complicated on this front. We (Atom) don't get the high fidelity manual park assist as an option, and I haven't seen any such visualization yet. However, I've noticed that when I use the new auto-parking, it does have some novel rendering going on that looks like a grey, blobby representation of their occupancy data, which is basically a weak version of HFPA, but only while autoparking.
Yes, just flat and gray instead of 3D and color. Plus we can't spin the view around to "see" other things.

 
It seems to me that FSD will probably stay at the L2 level for some time. With regular improvements, I'm tempted to think we're getting closer to L3 or L4, and I think this is where a lot of irritation comes from. I think Tesla wants to have it both ways by making robotaxi claims when the system we have now requires constant supervision.

Isn't this the crux of the issue? We want it to succeed, and we want to be able to take our hands off the wheel and look around. I do. And so I've found myself daydreaming and not really focusing while the robot's driving. But this is exactly the time I should be watching, expectantly waiting for those situations where the system has trouble, ready to disengage immediately.

Merging on the freeway is unnerving, especially into an exit lane, where the robot jams us in between other cars. I really wish it did better in this situation so I don't have to try to see everything at once in order to do it myself. We have freeway exits positioned right after onramps, so cars are competing for the same lane. I was also surprised on the freeway when it took too long to slow down and then suddenly hit the brakes. Come on! I can see traffic slowing ahead. Same with picking the wrong lane ahead of my turn. It's surprising it's still doing this, since I expect the computer to think faster than I can.

And if this is bad decision making, does anyone know how the curated data and model-training will sort this out? Given that we've moved away from heuristics and all.
 
...but these appear to be AP crashes. Even if they were on FSD, we'd still need to know the version to judge the relevance to FSDS/V12.

Yep. The Autopilot label gets thrown around a lot, so it's hard to know. The article even refers to Autopilot as the recalled software.

Sounds like NHTSA has more pertinent concerns, specifically the system's inadequate response to in-path objects.
 
And if this is bad decision making, does anyone know how the curated data and model-training will sort this out?
The stuff you're seeing is a result of the limitations of hand-built heuristics, not neural networks. Tesla uses neural networks only while you're on the ramp and then onto the secondary roads. Highway driving and the merges are handled by heuristics.

When Tesla gets around to training a neural network for highway driving, you should see more natural behavior. Neural networks seem to be doing a pretty remarkable job of negotiating complex situations in downtown areas, so we can hope that it'll be equally effective on highway entrance and exit ramps, and on highways in general.
 
Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.



Link: "Given the total number of fatal car crashes in 2022 (42,795), the U.S. average fatal crash rate is nearly 16 deaths per 100,000 vehicles."

That would mean that the 2,000,000 recalled Teslas, if average, would have around 320 fatalities per year, or ~80 since the December recall. NHTSA cites no fatalities at all, only 20 crashes. It appears they are only counting crashes where Autopilot was active, so these numbers are not exactly comparable, but my point is that NHTSA appears to be insisting that the recall should have made FSD eliminate all crashes. They do not even compare the crash rates before and after the recall.
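
For anyone who wants to sanity-check that back-of-envelope estimate, here's a minimal sketch of the arithmetic, assuming the quoted 16-per-100,000 annual rate, the 2,000,000 recalled vehicles, and roughly three months elapsed since the December recall (the three-month window is my assumption, not an NHTSA figure):

```python
# Back-of-envelope check of the "if the fleet were average" estimate above.
# Assumptions: the recalled fleet matches the U.S. average fatal crash rate,
# and roughly three months have passed since the December recall.

US_FATAL_RATE_PER_100K = 16    # fatal crashes per 100,000 vehicles per year (quoted above)
RECALLED_VEHICLES = 2_000_000  # vehicles covered by the December recall
MONTHS_SINCE_RECALL = 3        # assumed elapsed time, not an NHTSA figure

expected_per_year = RECALLED_VEHICLES * US_FATAL_RATE_PER_100K / 100_000
expected_since_recall = expected_per_year * MONTHS_SINCE_RECALL / 12

print(f"Expected fatalities per year if 'average': {expected_per_year:.0f}")  # ~320
print(f"Expected since the December recall: {expected_since_recall:.0f}")     # ~80
# NHTSA's letter, by contrast, cites 20 crashes and no fatalities.
```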

We have seen US airline fatalities reach zero per year for decades on end. This was through the NTSB process of investigating, determining and curing the root cause of each and every crash. NHTSA, in contrast, is making perfection the enemy of improvement.
 
When Tesla gets around to training a neural network for highway driving, you should see more natural behavior. Neural networks seem to be doing a pretty remarkable job of negotiating complex situations in downtown areas, so we can hope that it'll be equally effective on highway entrance and exit ramps, and on highways in general.

Any idea when Tesla will move highway driving to the new stack? Is this something we will see in v12.4, 12.5 etc or is it a v13 thing?
 
Any idea when Tesla will move highway driving to the new stack? Is this something we will see in v12.4, 12.5 etc or is it a v13 thing?
Good question. Perhaps other issues have higher priority and the V11 highway stack is good enough to be a bit lower on the list.

Speaking of merging, maybe they will add highways to v12 before they re-merge FSD with the production branch.

If we wanted yet another incorrect guess, we could ask Elon ;-)
 
Link: "Given the total number of fatal car crashes in 2022 (42,795), the U.S. average fatal crash rate is nearly 16 deaths per 100,000 vehicles."

That would mean that the 2,000,000 recalled Teslas, if average, would have around 320 fatalities per year, or ~80 since the December recall. NHTSA cites no fatalities at all, only 20 crashes. It appears they are only counting crashes where Autopilot was active, so these numbers are not exactly comparable, but my point is that NHTSA appears to be insisting that the recall should have made FSD eliminate all crashes. They do not even compare the crash rates before and after the recall.

We have seen US airline fatalities reach zero per year for decades on end. This was through the NTSB process of investigating, determining and curing the root cause of each and every crash. NHTSA, in contrast, is making perfection the enemy of improvement.

I think NHTSA needs to respond when they detect a series of safety-related issues and/or vehicle owner complaints.

Oversight only gets harder as the software is released abroad, so it's good to nip problems in the bud now.

Update: NHTSA website:

Currently, states permit a limited number of “self-driving” vehicles to conduct testing, research, and pilot programs on public streets and NHTSA monitors their safety through its Standing General Order. NHTSA and USDOT are committed to overseeing the safe testing, development and deployment of these systems – currently in limited, restricted and designated locations.

 
I really hate the sudden, out-of-the-blue red hands-on-wheel alerts. They are just so jarring. You can be cruising along just fine and then suddenly the car is blaring at you to grab the wheel immediately. On my drive to work this morning I got two of them back to back that scared the crap out of me. The first one, I think, was due to a dip in the road where construction had removed a bit of the asphalt. FSD was fine and then suddenly freaked out when the car drove over the dip. And then a moment later, I got a second alert when, I think, a wheel spun a bit on some wet gravel.