Yeah, I bet the problem is more related to picking the safest drivers' videos to feed the NNs. Those drives likely skew to the more conservative side, with lower speeds.
I apologize for my safe driving. /s

(actually, this 'problem' with V12 is one of the reasons I'm interested in getting it and trying it out sooner rather than later. It's possible it matches my driving style. On the other hand, that video of the last-second stop at a stop sign is exactly why I avoid being an early adopter.)
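
To make the selection-bias idea in the quoted post concrete, here's a toy C++ sketch. It has nothing to do with Tesla's actual data pipeline; every clip, score, and threshold below is invented. The point is just that filtering fleet clips by a hypothetical driver-safety score leaves a dataset that averages slower.

```cpp
// Toy illustration of training-data selection bias (all numbers invented).
#include <algorithm>
#include <iostream>
#include <iterator>
#include <numeric>
#include <vector>

struct Clip {
    double mean_speed_mph;  // average speed over the clip
    double safety_score;    // hypothetical 0..1 driver-safety rating
};

static double avg_speed(const std::vector<Clip>& clips) {
    double sum = std::accumulate(clips.begin(), clips.end(), 0.0,
        [](double s, const Clip& c) { return s + c.mean_speed_mph; });
    return clips.empty() ? 0.0 : sum / clips.size();
}

int main() {
    // Fabricated fleet sample: higher safety scores loosely track lower speeds.
    std::vector<Clip> fleet = {
        {28, 0.95}, {34, 0.90}, {41, 0.72}, {47, 0.60}, {55, 0.40}, {62, 0.30}};

    std::vector<Clip> curated;
    std::copy_if(fleet.begin(), fleet.end(), std::back_inserter(curated),
                 [](const Clip& c) { return c.safety_score >= 0.8; });

    std::cout << "fleet avg speed:   " << avg_speed(fleet) << " mph\n"
              << "curated avg speed: " << avg_speed(curated) << " mph\n";
    // The curated set averages noticeably slower -- the kind of conservative
    // skew the post above speculates the NN would inherit.
}
```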
 
Seems damn near perfect to me compared to previous versions, granted I'm not hunting for a chauffeur replacement.
A chain's as strong as its weakest link, and there are a boatload of weak links still. I'm interested to see how my 12.3 issues are addressed (or not) going into the next updates: stop signs, speed, right turns into traffic, glaring nav confusion at my funky intersection, etc.

I hope the march of nines is truly starting.
 
~12 disengagements (~9 for traffic lights (red) or stopped traffic; inability to ease off in time…
I've been trying to pay more attention to 12.3's stopping behavior given that you seem to have way more issues with it, and I wonder if it's more noticeable when there are lead vehicles slowing down. There was a previous comment that stopping at a line is trickier than accelerating from the line, but is that actually the common case for you?

I've noticed more frequent unnecessary deceleration when following another vehicle that is also slowing down for a red light. This could be the end-to-end model being conservative, assuming the lead vehicle might slow down faster, but this is a bit trickier than stopping for a fixed line because the lead vehicle is still moving.
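
For what it's worth, here's one toy way that conservatism could fall out of a simple safety check. This is a generic following-distance sketch in C++, not anything known about Tesla's planner, and all the speeds, gaps, and braking rates are made up:

```cpp
// Sketch: the more pessimistic the follower's assumption about how hard the
// lead vehicle *might* brake, the earlier/harder the follower decelerates,
// even behind a gently slowing lead. All parameters are hypothetical.
#include <iostream>

// Stopping distance from speed v (m/s) under constant deceleration a (m/s^2).
static double stop_dist(double v, double a) { return v * v / (2.0 * a); }

// Deceleration the ego needs so that, if the lead brakes at lead_brake from
// its current speed, the ego still stops at least `margin` meters behind it.
static double required_decel(double ego_v, double lead_v, double gap,
                             double lead_brake, double margin) {
    double lead_travel = stop_dist(lead_v, lead_brake);
    double room = gap + lead_travel - margin;  // distance ego may use to stop
    if (room <= 0.0) return 1e9;               // no room: brake as hard as possible
    return ego_v * ego_v / (2.0 * room);
}

int main() {
    double ego_v = 20.0, lead_v = 15.0, gap = 25.0, margin = 3.0;  // m/s, m
    // Gentle assumption: lead brakes at most 2 m/s^2. Pessimistic: 6 m/s^2.
    std::cout << "assume lead brakes 2 m/s^2 -> ego needs "
              << required_decel(ego_v, lead_v, gap, 2.0, margin) << " m/s^2\n";
    std::cout << "assume lead brakes 6 m/s^2 -> ego needs "
              << required_decel(ego_v, lead_v, gap, 6.0, margin) << " m/s^2\n";
    // ~2.6 vs ~4.9 m/s^2: the pessimistic assumption demands roughly double
    // the braking now, which would feel like unnecessary deceleration.
}
```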
 
What's the OEDR (object and event detection and response) of an end-to-end NN-based system? There's also the fact that Tesla doesn't even define the OEDR of FSD Beta. And there's the CEO of the company saying that the goal of V12 is unsupervised operation.
Tesla has found that draconian driver monitoring is enough to make FSD beta testing as safe as manual driving. I would be curious how the safety would compare to manual driving using the same driver monitoring.
Is there a tweet or something of Elon saying the goal for v12 is unsupervised operation?

The OEDR hasn't changed; all the limitations specified in those 2020 letters still exist four years later. FSD is better at doing turns and navigating roundabouts, but those were never part of the limitations; those are customer-experience improvements.
 
Why? Humans don't. [use more than just vision]
I use vision, prior knowledge of an area, knowledge of the traffic laws for my region, knowledge of the behaviours of different types of vehicles, the ability to read and interpret most signs (some just make no sense, and I live in a bilingual jurisdiction, so temporary highway sign boards flash the message in one language and then the other; sometimes the message is more than one 'screen' long, so you don't see both parts before driving by), and SOUND.

And as my ears started to lose that fine-tuning for sound, I decided to purchase a car with blind-spot monitoring. One bonus of choosing a Tesla with FSD was supposed to be that it would help extend my driving options as my eyes struggle with haloing at night, but that turned out to be my misunderstanding of what FSD was (and likely ever will be, given my current hardware level).
 
I've been trying to pay more attention to 12.3's stopping behavior given that you seem to have way more issues with it, and I wonder if it's more noticeable when there are lead vehicles slowing down. There was a previous comment that stopping at a line is trickier than accelerating from the line, but is that actually the common case for you?

I've noticed more frequent unnecessary deceleration when following another vehicle that is also slowing down for a red light. This could be the end-to-end model being conservative, assuming the lead vehicle might slow down faster, but this is a bit trickier than stopping for a fixed line because the lead vehicle is still moving.
Yesterday it was worse without lead vehicles. (Though it is a mixed bag with lead vehicles, depending on how they stop. For some reason the car tracks the lead vehicle and not the traffic light.)

When looking at this stopping behavior, be sure you're traveling at least 50 mph toward the traffic lights. Below 40 mph, a disengagement is usually not needed.
 
Elon says one thing, Tesla app says another, who do we believe?!

 
On the surface, that seems true. But I bet there's a lot more going on under the hood than meets the eye.
The idea of a procedural (C++) thread somehow monitoring and overriding FSDb when needed is...problematic. Because the better the NN gets at driving, the less confident the C++ code can be in making the decision to override. I can envision the C++ code as being a safety rail for a short time to address glaring flaws in behavior of the NN, but it does not seem like an effective long-term solution. The NN is just going to have to get really, really good all by itself AFAIK.
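
Here's roughly what I mean by a short-term safety rail: a hypothetical C++ supervisor that can only veto glaring violations of hand-written limits. This is not Tesla's architecture; all the types and thresholds are invented for illustration.

```cpp
// Sketch of a procedural "safety rail" over an NN driver (invented types).
// It can only veto crude, obvious violations; anything subtler is exactly
// the judgment call it can't make better than the NN.
#include <algorithm>
#include <cmath>
#include <optional>

struct Control {
    double steer_rad;   // commanded steering angle
    double accel_mps2;  // commanded acceleration (negative = braking)
};

struct State {
    double speed_mps;
    double dist_to_stop_line_m;  // from perception
    bool stop_required;          // red light / stop sign flagged upstream
};

// Returns an override if the NN command breaks a glaring rule, else nullopt.
std::optional<Control> supervisor(const Control& nn, const State& s) {
    // Rule 1: never pass through physically implausible actuation.
    if (std::abs(nn.steer_rad) > 0.6 || nn.accel_mps2 > 4.0)
        return Control{std::clamp(nn.steer_rad, -0.6, 0.6),
                       std::min(nn.accel_mps2, 4.0)};

    // Rule 2: if a stop is required and the NN isn't braking enough to make
    // the line (v^2 / 2d), force the needed deceleration.
    if (s.stop_required && s.dist_to_stop_line_m > 0.5) {
        double needed = s.speed_mps * s.speed_mps / (2.0 * s.dist_to_stop_line_m);
        if (-nn.accel_mps2 < needed)
            return Control{nn.steer_rad, -needed};
    }

    // Everything else: defer to the NN. As the NN improves, the set of
    // situations these crude rules can judge correctly only shrinks.
    return std::nullopt;
}
```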
 
Elon says one thing, Tesla app says another, who do we believe?!



I suppose it depends on your definition of "fine" as Elon is using it there.

In terms of long-term battery health, 95% is better than 100%. 90% is better than 95%. 80% is better than 90%. 50% is better than 80% while you're at it.

But it's increasingly diminishing returns for values of "better" as you drop down.


OTOH, for a long daily commute 100% is better than 95%, 95% is better than 90%, and so on... and at some point you go from "fine" to "won't get the job done."
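
As a quick back-of-the-envelope (made-up range and commute numbers; C++ only because it's handy): pick the lowest daily limit that still covers the trip plus a buffer.

```cpp
// Toy arithmetic for the "fine vs. won't get the job done" tradeoff above.
// The range and commute figures are hypothetical, not from any real car.
#include <iostream>

int main() {
    const double full_range_mi = 300.0;  // hypothetical rated range at 100%
    const double commute_mi = 120.0;     // hypothetical daily round trip
    const double buffer_mi = 60.0;       // reserve for detours/weather

    // Candidate limits, ordered from kindest-to-the-battery upward.
    for (double limit : {0.50, 0.80, 0.90, 0.95, 1.00}) {
        double usable = full_range_mi * limit;
        bool ok = usable >= commute_mi + buffer_mi;
        std::cout << static_cast<int>(limit * 100 + 0.5) << "% -> " << usable
                  << " mi: " << (ok ? "fine" : "won't get the job done") << "\n";
    }
    // With these numbers, 50% (150 mi) fails and 80% (240 mi) is the lowest
    // limit that works -- the "battery health vs. daily need" balance.
}
```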
 
Elon says one thing, Tesla app says another, who do we believe?!

That was 2019. I'd been charging both of ours to 90% until my app started saying 80%, so now I do 80% for day-to-day driving (and typically 90-100% immediately before setting out on a longer trip).