> If you spent 20 minutes looking at the video from Prof Koopman

Why do you think that Tesla doesn't have "full OEDR"? And by the way, humans don't have it either. How many deer are hit each year? Squirrels? Armadillos? Let alone walls. We had one trucker take out an Interstate bridge when his dump-truck bed was slightly raised and hit the bridge at 70 mph. "Full OEDR" is pretty comical.
DDT including fallback? Why do you think that it is missing? Take a look at the compute architecture: there's a backup online and running, with supervisory algorithms monitoring and waiting for a failure.
> False again. SAE J3016 doesn't care about implementation techniques. It's a classification system. The most recent revision of J3016 is from 2021.

But the most important thing is that the SAE specs were written around the basic programmatic flow that existed when they were written. Today's AI truly changes a lot of that.
> Tesla is improving and it is improving at a rapid pace.

Very rapidly... The world's only Level 2 system that has been "autonomous in 6-12 months" since 2016. Autonomous around 2150, then? The current system isn't much more reliable than two years ago. Getting from, say, 5-10 miles to 100 miles between disengagements in city traffic (which it clearly isn't at) after three years of development is meaningless if you need 50,000+ miles.
> I'm not here to convert the cultists, but I aim to stop some of the obviously misinformed twisting of facts, falsifications and lies.

I know, you don't believe that Tesla will ever do it. That's fine, but you aren't going to change the minds of the many who do believe it and who do see the advantages of it.
Just reviewing the levels chart.
[Attached image: SAE driving-automation levels chart]
I guess if the NHTSA or NTSB ever finds an instance where FSD is at fault, that automatically puts the car at Level 3 (human and car are responsible).
And as I implied, FSD is performing somewhere between Level 4 and Level 5 on the Automation and Conditions lines. All it takes is changing the responsibility to make it Level 4.
Traffic Aware Cruise Control with lane following is all that's required for Level 2. The chart makes it seem that it's a small jump between Level 2 and Level 3. Tesla was well beyond the Automation and Condition requirements 4 years ago.
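The rows of the chart being argued over here can be sketched as data. This is only an illustrative paraphrase of the SAE J3016 level definitions (who performs the sustained driving task, who does OEDR, who is the fallback, and whether an ODD restricts operation), not the normative text:

```python
# Illustrative paraphrase of SAE J3016 levels 2-5; not the normative text.
from dataclasses import dataclass

@dataclass(frozen=True)
class Level:
    name: str
    sustained_ddt: str   # who performs sustained lateral + longitudinal control
    oedr: str            # who performs object/event detection and response
    fallback: str        # who performs the DDT fallback
    odd_limited: bool    # True if an operational design domain restricts it

LEVELS = {
    2: Level("Partial Automation",     "system", "driver", "driver",              True),
    3: Level("Conditional Automation", "system", "system", "fallback-ready user", True),
    4: Level("High Automation",        "system", "system", "system",              True),
    5: Level("Full Automation",        "system", "system", "system",              False),
}

# The jumps being debated: L2 -> L3 moves OEDR from the driver to the system,
# and L3 -> L4 moves the fallback from the human to the system.
print(LEVELS[2].fallback)  # driver
print(LEVELS[4].fallback)  # system
```

Note how "changing the responsibility" maps onto the `fallback` field: it is the only field that differs between Level 3 and Level 4.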
So it comes down to disengagement and safety. Why do you think that Tesla reports that information? Keep the safety as good as a driver and the disengagements down and that starts the path to removing the steering wheel.
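The disengagement metric at issue is usually expressed as mean miles between disengagements rather than a raw count. A minimal sketch, with hypothetical numbers (the figures below are made up for illustration):

```python
# Mean miles between disengagements; higher is better.
def miles_per_disengagement(miles_driven: float, disengagements: int) -> float:
    if disengagements == 0:
        return float("inf")  # no disengagement observed in the sample
    return miles_driven / disengagements

# Hypothetical fleet: 1,000,000 city miles with 10,000 disengagements
print(miles_per_disengagement(1_000_000, 10_000))  # 100.0
```

On this metric, "keeping the disengagements down" means pushing the number of miles between disengagements up by orders of magnitude.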
Truly, if Tesla were to turn off the nags, the cars would operate in a Level 4 mode.
> I have spent the time reading it. I know that it was written following concepts that the writers believed were obvious when it was being written. Many of the terms that you are referring to as law are actually concepts. It all follows the common methods of classical system design. And no one has been able to implement it.

If you spent 20 minutes looking at the video from Prof Koopman, or spent an hour reading J3016, you would make less of a fool of yourself here. Just watch the damn thing so I don't have to write it all down for you.
False again. SAE J3016 doesn't talk at all about implementation techniques. It's a classification system. The most recent revision of J3016 is from 2021.
Very rapidly... The world's only Level 2 system that has been "autonomous in 6-12 months" since 2016. Autonomous around 2150, then? The current system isn't much more reliable than two years ago. Getting from, say, 5-10 miles to 100 miles between disengagements in city traffic (which it clearly isn't at) after three years of development is meaningless if you need 50,000+ miles.
I'm not here to convert the cultists, but I aim to stop some of the obvious wrongs, falsifications and lies. Like the ones you're spewing out.
Truly, if Tesla were to turn off the nags, the cars would operate in a Level 4 mode.
Tesla is now taking an approach similar to the human brain. Let's stop there for a second. The human brain is CURRENTLY THE ONLY FSD solution out there, and it WILL FAIL J3016 MISERABLY! How can this be? Something that fails J3016 but successfully provides FSD?
> And what are those? I don't seem to have many/any.

As long as there is ANY kind of driver fallback, it can't be above Level 3. Even if Tesla got rid of the nags, it wouldn't get rid of the driver-fallback requirement in various scenarios.
> You are constantly talking out of your ass. If you've read the document, you clearly haven't understood it.

I have spent the time reading it. I know that it was written following concepts that the writers believed were obvious when it was being written. Many of the terms that you are referring to as law are actually concepts. It all follows the common methods of classical system design. And no one has been able to implement it.
> LOL, you're really bubbled up. Tesla isn't first with anything. Waymo and all the others have used machine learning from the very start, 15 years ago. Heard of the DARPA challenge? DARPA Grand Challenge - Wikipedia

Tesla moved away from classical system design years ago. It became evident to the team that there were far too many corner cases, which would make the code unwieldy, and that new corner cases would keep appearing all the time.
> Tesla has no limitations, but it handles nothing with any reliability guarantees either. That's because the human is responsible for driving, and the system only does partial OEDR. By your definition of "works", Tesla can drive in a snowstorm in complete darkness.

This is the problem that Cruise and Waymo have. From what I know, they have safety drivers drive the routes daily, looking for things that present themselves as issues. And Cruise and Waymo, as well as MB, have limitations of 45 mph.
> You prove again and again that you know nothing about J3016, autonomous driving, computer vision or machine learning.

Tesla is now taking an approach similar to the human brain. Let's stop there for a second. The human brain is CURRENTLY THE ONLY FSD solution out there, and it WILL FAIL J3016 MISERABLY! How can this be? Something that fails J3016 but successfully provides FSD?
> Why don't you get behind the wheel in a Tesla with FSD, blindfolded, and let's see if that statement holds true. Let me know how far you get. I'd guess 5-15 minutes.

At this time, FSD has over 100 million miles of safer-than-human driving.
> I have "driven" FSDb this year and ridden in a Waymo. Thanks for asking.

But since your location is shown as Europe, I can only assume that you haven't experienced FSD, or maybe only for a short period of time. It is perfectly believable that, from reading many of the articles on the Internet, you don't believe that FSD can do what it does today.
But as someone who uses it daily, I know what it can do. In many cases I trust it more than my own driving.
You are constantly talking out of your ass. If you've read the document, you clearly haven't understood it.
No one has been able to implement SAE J3016 L4? What about Cruise, Waymo, WeRide, Baidu, DeepRoute, AutoX?
LOL, you're really bubbled up. Tesla isn't first with anything. Waymo and all the others have used machine learning from the very start, 15 years ago. Heard of the DARPA challenge? DARPA Grand Challenge - Wikipedia
In 2004 machine learning was used by many/most contestants. That's 19 years ago.
Tesla has no limitations, but it handles nothing with any reliability guarantees either. That's because the human is responsible for driving, and the system only does partial OEDR. By your definition of "works", Tesla can drive in a snowstorm in complete darkness.
You prove again and again that you know nothing about autonomous driving, computer vision or machine learning.
Why don't you get behind the wheel in a Tesla with FSD, blindfolded, and let's see if that statement holds true. Let me know how far you get. I'd guess 5-15 minutes.
I have "driven" FSDb this year and ridden in a Waymo. Thanks for asking.
> Then let's go with the simple response. Who drove me and my wife to lunch today? I didn't; she didn't.

You did. Assisted by FSDb. Why didn't you both sit in the back seat and share some champagne otherwise?
And what are those? I don't seem to have many/any.
And "fully autonomous based on condition restrictions" allows those few places where it may not currently work to be excluded.
A simple example would be to use the old NoA rules: 4+ lane, divided, restricted-access highways. That's a pretty easy task for FSD. This is something where Tesla could probably remove the nags and currently get it passed as Level 4. Isn't it?
Then let's go with the simple response. Who drove me and my wife to lunch today? I didn't; she didn't.
> And one scenario does not prove that it is ready for Level 3.

...and I would say that, living/driving in an urban environment >85% of the time, almost ALL my drives prove it's NOT L3-ready.
Just because YOU haven't experienced any driver-fallback requirements doesn't mean they don't exist. Just cover up the cabin camera while on Autopilot and see what happens. Yes, yes, you are then going to say "oh, well, that is just a nag-based requirement for that camera"... OK, cover another camera.
Right now, if the car freaks out too much about whatever it sees, it will put up a "take control immediately" screen. Yes, those cases are few, but they can happen on any type of road in lots of different conditions.
Point being, nags are not the only thing keeping Tesla from going Level 3. And just because YOU don't experience these issues doesn't mean they don't exist. Tesla is going to have to test the car's ability to conduct its own DDT fallback before they can rely on it anyway, and there is no hint that they are planning to do that imminently.
And this isn't really as "simple" as you want it to be, because you have to define some things... The car didn't get to the location and park itself with no user intervention, nor did it pull up to the curb in front of a restaurant and stop so your wife could get out. If the car was parked in a driveway at the beginning of your trip, it could not be put into FSDb without a certain level of user intervention besides stalk activation.
And one scenario does not prove that it is ready for level 3.
So, you are saying that the car has to work nearly perfectly, without many/any accidents.
So are you then an L1 or an L2 driver? What are humans in general? Because they certainly can't meet the requirements.
That's indeed one of the problems with the SAE standards: they assume a black-and-white world.
What are you even talking about... Look, you said that if Tesla got rid of nags then it would be L4; that is false. Your statement also implies that there are no driver-fallback scenarios currently with Teslas; that is also false, and eliminating them would be required for L4. This is my argument with you at this point. It's not complicated. I countered your statements with some explanations, and you responded by talking about whether humans are L1 or L2 or whatever, which isn't the conversation.
Some people try to say that Teslas can do what they cannot yet do, and some people go the opposite way and say that Teslas cannot do anything and are death traps... both are incorrect.