Tesla announced it will offer FSD to competitors

Why do you think that Tesla doesn't have "full OEDR"? And by the way, humans don't have it either. How many deer are hit each year? Squirrels? Armadillos? Let alone walls. We had one truck move an Interstate bridge when its dump bed was slightly raised and hit the bridge at 70 mph. "Full OEDR" is pretty comical.
DDT including fallback? Why do you think that it is missing? Take a look at the CPU architecture: there's a backup online and running, with supervisory algorithms monitoring and waiting for a failure.
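(For what it's worth, the failover pattern being claimed there is an ordinary hot-standby design. A minimal sketch of the idea, purely as illustration: the names and the 0.5 s liveness deadline are invented, and nothing here says whether Tesla's hardware actually achieves fail-operational DDT fallback.)

```python
import time

HEARTBEAT_TIMEOUT_S = 0.5  # hypothetical liveness deadline

class ComputeNode:
    """One of two redundant nodes; both run hot, one drives the actuators."""
    def __init__(self, name: str):
        self.name = name
        self.last_heartbeat = time.monotonic()

    def heartbeat(self) -> None:
        # Called periodically by the node's own control loop.
        self.last_heartbeat = time.monotonic()

    def alive(self) -> bool:
        return time.monotonic() - self.last_heartbeat < HEARTBEAT_TIMEOUT_S

def select_active(primary: ComputeNode, backup: ComputeNode) -> ComputeNode:
    """Supervisory check: fail over to the hot spare if the primary goes quiet."""
    return primary if primary.alive() else backup
```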
If you spent 20 minutes watching the video from Prof. Koopman, or an hour reading J3016, you would make less of a fool of yourself here. Just watch the damn thing so I don't have to write it all out for you.

But the most important thing is that the SAE specs were written around the concept of basic programmatic flow as it existed at the time. Today's AI truly changes a lot of that.
False again. SAE J3016 doesn't care about implementation techniques. It's a classification system. The most recent revision of J3016 is from 2021.
Tesla is improving and it is improving at a rapid pace.
Very rapidly... The world's only Level 2 system that has been "autonomous in 6-12 months" since 2016. Autonomous around 2150, then? The current system isn't much more reliable than two years ago. Getting from, say, 5-10 miles to 100 miles per intervention in city traffic (which it clearly isn't at) after three years of development is meaningless if you need 50,000+ miles.
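Back-of-the-envelope, using those (entirely hypothetical) figures:

```python
# All numbers are the hypothetical ones from the paragraph above.
miles_per_intervention_then = 10      # ~3 years ago, city driving
miles_per_intervention_now = 100      # a generous estimate of today
miles_needed = 50_000                 # plausible bar for unsupervised driving

print(f"gained so far: {miles_per_intervention_now / miles_per_intervention_then:.0f}x")
print(f"still needed:  {miles_needed / miles_per_intervention_now:.0f}x")
# gained so far: 10x
# still needed:  500x
```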

I know, you don't believe that Tesla will ever do it. That's fine, but you aren't going to change the minds of the many who do believe it and who do see the advantages of it.
I'm not here to convert the cultists, but I aim to stop some of the obvious misinformed twisting of facts, falsifications and lies.
 
Just reviewing the levels chart.

I guess if the NHTSA or NTSB ever finds FSD at fault in an incident, that automatically puts the car in Level 3 (Human and CAR are responsible).

And as I implied, FSD is performing somewhere between Level 4 and Level 5 on the Automation and Conditions lines. All it takes is changing the responsibility to make it Level 4.

Traffic Aware Cruise Control with lane following is all that's required for Level 2. The chart makes it seem that it's a small jump between Level 2 and Level 3. Tesla was well beyond the Automation and Condition requirements 4 years ago.

So it comes down to disengagement and safety. Why do you think that Tesla reports that information? Keep safety as good as a human driver's, keep disengagements down, and that starts the path toward removing the steering wheel.
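(The metric in question is easy to state; here's a sketch with made-up trip data, not Tesla's actual reporting format:)

```python
# Hypothetical fleet log; the fields and numbers are invented for illustration.
trips = [
    {"miles": 12.4, "disengagements": 0},
    {"miles": 8.1,  "disengagements": 1},
    {"miles": 30.0, "disengagements": 2},
]

total_miles = sum(t["miles"] for t in trips)
total_disengagements = sum(t["disengagements"] for t in trips)
print(f"{total_miles / total_disengagements:.1f} miles per disengagement")  # 16.8
```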

Perhaps you should look at the actual SAE chart first, and please make sure it is the current one.

Truly, if Tesla were to turn off the nags, the cars would operate in Level 4 mode.

Again, look at the current chart... perhaps read the 41-page SAE document that actually explains everything as well. J3016_202104 is the current version, by the way.
 
I have spent the time reading it. I know that it was written following concepts that the writers believed were obvious at the time. Many of the terms that you are treating as law are actually concepts. It all follows the common methods of classical system design. And no one has been able to implement it.

Tesla moved away from classical system design years ago. It became evident to the team that there were far too many corner cases, which would make the code unwieldy, and that new corner cases would keep appearing. This is the problem that Cruise and Waymo have. From what I know, they have safety drivers drive the routes daily, looking for things that present themselves as issues. And Cruise and Waymo, as well as MB, have limitations of 45 mph.

Tesla is now taking an approach similar to the human brain. Let's stop there for a second. The human brain is CURRENTLY THE ONLY FSD solution out there, and it WILL FAIL J3016 MISERABLY! How can this be? Something that fails J3016 but successfully provides FSD?

Elon has proven that his companies are quite adept at implementing things that "can't be done." SpaceX and their reusable rockets were "impossible." "Electric vehicles would never catch on," yet now the question is when total domination will occur, not if.

So Tesla has now taken the AI approach that better follows the human mind. And because of that, they were able to make some leaps forward relatively quickly. As an example, Navigate on Autopilot was built using the classical process, and it worked. But it was basically the easiest problem, since it was limited to highways, a very restricted environment. Replacing it at the end of last year caused a lot of the software release issues. But it's now using AI and does as well, if not better.

With AI being used as the basis for FSD, optimizing the AI model becomes the important part. And that often just means more and better training. Historically, training often meant rooms of workers classifying pictures. Tesla did this, but it wasn't scalable to the point needed. So they created an AI to classify the clips. And once the clips are categorized, they then go into training. That's a long process, for which Tesla has built Dojo, basically an enormous supercomputer dedicated to AI training. It's just coming online right now.
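Something like this triage loop is what I mean (a rough sketch; the function names are invented, and this is not Tesla's actual pipeline):

```python
# Auto-labeling triage: a trained classifier labels the easy clips itself and
# sends only the uncertain ones to human reviewers. All names are hypothetical.
CONFIDENCE_THRESHOLD = 0.95

def triage_clips(clips, classify_clip, add_to_training_set, queue_for_human_review):
    for clip in clips:
        label, confidence = classify_clip(clip)    # model proposes a label
        if confidence >= CONFIDENCE_THRESHOLD:
            add_to_training_set(clip, label)       # scalable, automatic path
        else:
            queue_for_human_review(clip)           # fall back to manual labeling
```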

Right now, Tesla seems to be dropping new FSD software updates every 3 to 6 months. And once the software is on the car, it doesn't learn any more. So you can't expect the metrics to improve faster than the releases.

Tesla's goal is to provide FSD. Its goal is not to satisfy the line items of J3016. Again, Elon has shown that he can make the impossible possible, and that's often done by ignoring the rules that don't make sense.

At this time, FSD has over 100 million miles of safer-than-human driving.


But since your location is shown as Europe, I can only assume that you haven't experienced FSD, or maybe only for a short period of time. It is perfectly believable that, after reading many of the articles on the Internet, you don't believe that FSD can do what it does today.
But as someone who uses it daily, I know what it can do. In many cases I trust it more than my own driving.

Quote J3016 all you want; I'm just telling you the reality. And at the end of the day, Tesla is really close to being able to show regulators that FSD can drive all over the place as safely as a human. And that's the intent of J3016.
 
Tesla is now taking an approach similar to the human brain. Let's stop there for a second. The human brain is CURRENTLY THE ONLY FSD solution out there, and it WILL FAIL J3016 MISERABLY! How can this be? Something that fails J3016 but successfully provides FSD?

Because you don't understand J3016. No, the human brain would not fail J3016. On the contrary, the human driver is L5.

Definition of L5
The sustained and unconditional (i.e., not ODD-specific) performance by an ADS of the entire DDT and DDT fallback without any expectation that a user will need to intervene.

When manually driving, the human driver provides sustained and unconditional performance of the entire DDT and DDT fallback. So the human driver meets the definition of L5.

And the SAE defines unconditional ODD as:

"Unconditional/not ODD-specific" means that the ADS can operate the vehicle on-road anywhere within its region of the world and under all road conditions in which a conventional vehicle can be reasonably operated by a typically skilled human driver.

So SAE J3016 literally defines the ODD of L5 as the same ODD as a typically skilled human driver! The human driver is basically the standard for L5. So no, it would not fail J3016.

Also, the human brain is not the only FSD solution. Waymo has FSD, Cruise has FSD. Yes, they are geofenced. But they are generalized FSD. They perform the entire DDT and DDT fall-back without human intervention.
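To underline that J3016 is just a classification scheme: the level assignment boils down to a few yes/no questions. A toy sketch of the decision flow (my own simplification, not the spec's normative text):

```python
def sae_level(sustained_lateral: bool, sustained_longitudinal: bool,
              complete_oedr: bool, system_performs_fallback: bool,
              odd_limited: bool) -> int:
    """Rough J3016 level assignment; a simplification, not the spec itself."""
    if not (sustained_lateral or sustained_longitudinal):
        return 0                        # no sustained automation
    if not (sustained_lateral and sustained_longitudinal):
        return 1                        # steering OR speed, not both
    if not complete_oedr:
        return 2                        # driver supervises (does the OEDR)
    if not system_performs_fallback:
        return 3                        # driver is the DDT fallback
    return 4 if odd_limited else 5      # system is fallback: L4 if ODD-limited

# A typically skilled human driver: full DDT + fallback, no ODD limit -> L5
assert sae_level(True, True, True, True, odd_limited=False) == 5
# A geofenced robotaxi: full DDT + fallback inside a limited ODD -> L4
assert sae_level(True, True, True, True, odd_limited=True) == 4
```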

And it is a fallacy that the only valid FSD solution has to be like the human brain (i.e., vision-only, end-to-end). That's just one approach to solving FSD. It is not the only one.
 
As long as there is ANY kind of driver fallback, it can't be above Level 3. Even if Tesla got rid of the nags, it wouldn't get rid of the driver-fallback requirement in various scenarios.
And what are those? I don't seem to have many/any.

And "Fully autonomous based on condition restrictions allows for those few places where it may not currently work to be excluded.

A simple example would be to use the old NoA rules of 4+ lane, divided, restricted-access highways. That's pretty much an easy task for FSD. This is something where Tesla could probably remove the nags and currently get it passed as Level 4. Isn't it?
 
I have spent the time reading it. I know that it was written following concepts that the writers believed were obvious at the time. Many of the terms that you are treating as law are actually concepts. It all follows the common methods of classical system design. And no one has been able to implement it.
You are constantly talking out of your ass. If you've read the document, you clearly haven't understood it.
No one has been able to implement SAE J3016 L4? What about Cruise, Waymo, Weride, Baidu, Deeproute, AutoX?
Tesla moved away from classical system design years ago. It became evident to the team that there were far too many corner cases, which would make the code unwieldy, and that new corner cases would keep appearing.
LOL, you're really bubbled up. Tesla isn't first with anything. Waymo and all the others have used machine learning from the very start, 15 years ago. Heard of the DARPA challenge? DARPA Grand Challenge - Wikipedia

In 2005, machine learning was used by the winner, Stanley, for example. That's 18 years ago.
This is the problem that Cruise and Waymo have. From what I know, they have safety drivers drive the routes daily, looking for things that present themselves as issues. And Cruise and Waymo, as well as MB, have limitations of 45 mph.
Tesla has no limitations, but it handles nothing with any reliability guarantees either. That's because the human is responsible for the driving, and the system only does partial OEDR. By your definition of "works", a Tesla can drive in a snowstorm in complete darkness.
Tesla is now taking an approach similar to the human brain. Let's stop there for a second. The human brain is CURRENTLY THE ONLY FSD solution out there. and it WILL FAIL J3016 MISERABLY! How can this be? Something that fails J3016 but successfully provides FSD?
You prove again and again that you know nothing about J3016, autonomous driving, computer vision or machine learning.
Why don't we take a random YT video from outside the Tesla bubble, showing a system that's at about parity with FSDb. A driver-assist system.


At this time, FSD has over 100 million miles of safer-than-human driving.
Why don't you get behind the wheel of a Tesla on FSD while blindfolded, and let's see if that statement holds true. Let me know how far you get. I'd guess 5-15 minutes.
But since your location is shown as Europe, I can only assume that you haven't experienced FSD, or maybe only for a short period of time. It is perfectly believable that, after reading many of the articles on the Internet, you don't believe that FSD can do what it does today.
But as someone who uses it daily, I know what it can do. In many cases I trust it more than my own driving.
I have "driven" FSDb this year and ridden in a Waymo. Thanks for asking.
 
Then let's go with the simple response. Who drove me and my wife to lunch today? I didn't; she didn't.

Really? Comparing AI today to machine learning in 2004?

And I will absolutely guarantee you that if you "have driven" it, you don't know how well it works. That's just like the Consumer Reports testers who don't like Teslas because they spend too little time in one. They tend to hate the single display; that's because they can't get used to it.
It truly takes a few months to learn FSD. Otherwise you just keep hitting the disengage button.
 
I bet, like some others have suggested, that at most Elon was talking to one of the other executives (maybe Farley) about NACS. Elon suggested they (maybe Ford) consider FSD, and they politely said "yes, we could very well be interested" to appease him. Of course Elon took that and ran with it. While Tesla is making good progress, I just don't see any other manufacturer showing more than a passing interest until/unless Tesla reaches at least L3 on all highways under most conditions.

While I and many others LOVE the L2 in the city, I just don't see the general public clamoring for an L2 system for city driving. It will be more of a hindrance than an improvement, and I don't see it as such a "must have" selling point that other manufacturers will beat a path to Tesla's door to license FSD.
 
And what are those? I don't seem to have many/any.

And "Fully autonomous based on condition restrictions allows for those few places where it may not currently work to be excluded.

A simple example would be to use the old NoA rules of 4+ lane, divided, restricted-access highways. That's pretty much an easy task for FSD. This is something where Tesla could probably remove the nags and currently get it passed as Level 4. Isn't it?

Just because YOU haven't experienced any driver-fallback requirements doesn't mean they don't exist. Just cover up the cabin camera while on Autopilot and see what happens. Yes, yes, you're going to say that's just a "nag"-based requirement for that camera... OK, cover another camera.

Right now, if the car freaks out too much about something, it will put up a "take control immediately" screen. Yes, those cases are few, but they can happen on any type of road in lots of different conditions.

Point being, nags are not the only thing keeping Tesla from going Level 3. And just because YOU don't experience these issues doesn't mean they don't exist. Tesla is going to have to test the car's ability to conduct its own DDT fallback before they can rely on it, and there is no hint that they are planning to do that imminently.

Then let's go with the simple response. Who drove me and my wife to lunch today? I didn't; she didn't.

And this isn't really as "simple" as you want it to be, because you have to define some things... The car didn't get to the location and park itself with no user intervention, nor did it pull up to the curb in front of a restaurant and stop so your wife could get out. And if the car was parked in a driveway at the beginning of your trip, it could not be put into FSDb without a certain level of user intervention beyond stalk activation.

And one scenario does not prove that it is ready for Level 3.
 
Please confirm: running 2023.26.x, Tesla is eye-tracking to make sure you are watching the road, and then hits you with a nag if you aren't.
With this build I did notice AP was smoother, and I got fewer nags by staring forward and watching the road.
 
I worked in the semiconductor industry from 1974. Licensing was a big thing. Back then everybody needed a license from Bell, because non-expired patents taken out by Bell Labs would block you from making much of anything. The usual deal was that you worked up a cross-license agreement with them, with not much money changing hands unless the property on each side was deemed imbalanced. Most of the major players also had cross-license agreements with each other.

Possible interest by each side in licensing is NOT necessarily the same thing as saying that company XYZ has intentions of running an intact copy of Tesla software release n.nn.nnn on their vehicles.
 
Just because YOU haven't experienced any driver-fallback requirements doesn't mean they don't exist. Just cover up the cabin camera while on Autopilot and see what happens. Yes, yes, you're going to say that's just a "nag"-based requirement for that camera... OK, cover another camera.

Right now, if the car freaks out too much about something, it will put up a "take control immediately" screen. Yes, those cases are few, but they can happen on any type of road in lots of different conditions.

Point being, nags are not the only thing keeping Tesla from going Level 3. And just because YOU don't experience these issues doesn't mean they don't exist. Tesla is going to have to test the car's ability to conduct its own DDT fallback before they can rely on it, and there is no hint that they are planning to do that imminently.

And this isn't really as "simple" as you want it to be, because you have to define some things... The car didn't get to the location and park itself with no user intervention, nor did it pull up to the curb in front of a restaurant and stop so your wife could get out. And if the car was parked in a driveway at the beginning of your trip, it could not be put into FSDb without a certain level of user intervention beyond stalk activation.

And one scenario does not prove that it is ready for Level 3.

So, you are saying that the car has to work nearly perfectly, without many/any accidents.

So are you then an L1 or an L2 driver? What are humans in general? Because they certainly can't meet the requirements.

That's indeed one of the problems with the SAE standards: they assume a black-and-white world.
 
What are you even talking about... Look, you said that if Tesla got rid of nags then it would be L4; that is false. Your statement also implies that there are no driver-fallback scenarios currently with Teslas; that is also false, and eliminating them would be required for L4. This is my argument with you at this point. It's not complicated. I countered your statements with some explanations, and you responded by talking about whether humans are L1 or L2 or whatever, which isn't the conversation.

Some people claim that Teslas can do what they cannot yet do, and some go the opposite way and say that Teslas cannot do anything and are death traps... both are incorrect.
 
What I am saying is that the car can do dramatically more than what the anti-FSD videos on YouTube show.
There are so many that show situations where the operators didn't give the car a chance to react. Just because a car looks like it's going to run off a cliff doesn't mean that it will. Reaction time is a huge factor here. It has been shown many times that some drivers will disengage way before others do.

I am not saying that the car will do everything. What I am saying, unlike most of the pundits here, is that the car will do a LOT, much more than they give it credit for.

I *KNOW* the amount of driving that FSD is doing for me and my wife.