Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Why are TACC, AP (and ?FSD) so bad?

There are many if you trust the NHTSA reports (which are public). I disagree with the press being loud wrt FSD. Quite the contrary, given the level of this bait-and-switch scam.

Yes it is nonsense, since your statement was "Yet the early numbers for FSD beta also look good". Now you're talking about miles, but that's irrelevant given the undisclosed number of incidents - which companies with autonomous ambitions are typically required by law to report to the DMV. But not Tesla.

I don't think NHTSA is done. They have decided on a strategy, which seems to be "pick on all the unsafe operations". It's a smart strategy, and really the only way forward as I see it. I think NHTSA will keep doing this until Tesla promises that they've fixed it. Since it's impossible to provide such guarantees in an unbounded ODD, NHTSA will get their chance to strike down this experiment if they choose to. Given Tesla's track record of non-improvement in terms of reliability, this likely won't take too long... <8 miles/DE for over 20 months is laughably bad.

No, I haven't had a chance to get it delivered to my car, and Tesla won't be able to either in the coming 2-3 years. This is something I learned by researching the UNECE R79 amendments and drafts. My car will be six years old by then, and it doesn't even have NoA - and Tesla won't refund. They keep selling FSD in the EU to more sheep.

With regards to reliability, I look at the data. At first I looked at the reports by Dirty Tesla and more recently the Community Tracker: the miles/DE curve is alarmingly flat. 4-8 miles for 30 months of public beta. Not bullish.
Sorry, you clearly have an agenda, so there is little point in discussing this further. Apparently you are angry at Tesla for not giving you something you wanted, which is fine, but sadly this seems to have overflowed into your assessments of FSD. In what way is it "laughably bad" that the car is doing 8 miles between DEs, most of which (as I noted) are "embarrassment driven" because the car is being super-cautious? No one else has ANY form of ADAS system anywhere near FSD, but apparently you "know" how fast they are supposed to progress on FSD, based on .. WHAT exactly?

As for mileage, I stand by my statement .. FSD has driven a LOT of miles, so where are all the accidents if it's so bad? And OF COURSE the press will jump on them instantly, as they do all things Tesla. Why would they not? They have ZERO advertising revenue from Tesla. And of course they DO .. any potential Autopilot accident involving a Tesla is headline news. But, again, after the hysteria has died down, it mostly turns out NOT to be AP/FSD (for example, the Model S that hit a tree at high speed).
 
In what way is it "laughably bad" that the car is doing 8 miles between DEs, most of which (as I noted) are "embarrassment driven" because the car is being super-cautious? No one else has ANY form of ADAS system anywhere near FSD, but apparently you "know" how fast they are supposed to progress on FSD, based on .. WHAT exactly?
Given that supervised RL systems typically have an S-curve, it's not likely to progress faster after 2.5 years than before. As there is very little progress in terms of reliability, it's not likely to become autonomous. The value of an L2 system in a city environment that will likely drive slower and less safely than a human is questionable. Why use it? Ask yourself: will my dad use this product? Will my wife? Why/why not? At present it's a gimmick, like Smart Summon or the x-mas show.
As for mileage, I stand by my statement .. FSD has driven a LOT of miles, so where are all the accidents if it's so bad?
What statement? There are no KPIs available outside of the community tracker and similar efforts - which you just dissed.

There are few accidents because the drivers don't trust the system. If the performance were to, let's say, magically 10x to 80 miles per DE, there would be more accidents (than a human driver) until the system improves 1000x (which it obviously never will on current hardware).

S-curve: The End of Starsky Robotics
Valley of degraded supervision: Tech Roadmap for Automakers Disillusioned With Robotaxis | The Ojo-Yoshida Report
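To make the S-curve point concrete, here's a toy Python sketch. Every number in it is invented for illustration (the logistic parameters, and the round ~500,000 miles per police-reported crash used as a stand-in for human reliability); it only shows the shape of the argument: marginal gains shrink on the plateau, and even a 10x jump to 80 miles/DE leaves an enormous gap to human-level.

```python
import math

def miles_per_de(effort, ceiling=80.0, midpoint=5.0, steepness=1.2):
    """Toy logistic S-curve: miles between disengagements (DE) as a
    function of cumulative engineering effort (arbitrary units).
    All parameters here are invented for illustration."""
    return ceiling / (1.0 + math.exp(-steepness * (effort - midpoint)))

# Marginal gain per unit of effort shrinks once past the midpoint:
early_gain = miles_per_de(5) - miles_per_de(4)   # steep part of the curve
late_gain = miles_per_de(10) - miles_per_de(9)   # on the plateau
print(f"gain near midpoint: {early_gain:.1f} mi/DE per unit effort")
print(f"gain on plateau:    {late_gain:.1f} mi/DE per unit effort")

# Even a hypothetical 10x jump from 8 to 80 miles/DE is far from human
# reliability, if we assume a round ~500,000 miles per police-reported crash:
human_miles_per_crash = 500_000  # assumed round number, not an official stat
print(f"remaining gap after a 10x jump: {human_miles_per_crash / 80:,.0f}x")
```

Disengagements and crashes aren't the same event, of course, so the final ratio is only a rough illustration of the gap, not a measurement.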
 
Given that supervised RL systems have an S-curve, it's not likely to progress faster after 2.5 years than before. As there is very little progress in terms of reliability, it's not likely to become autonomous. The value of an L2 system in a city environment that will likely drive slower and less safely than a human is questionable. Why use it? Ask yourself: will my dad use this product? Will my wife? Why/why not?

If I'm understanding your statement correctly: while Tesla may be using supervised RL for training, that's not the only input that goes into decision-making, and a lot of that I don't think is even in the cars yet; V11, I believe, brings that to the table. While Tesla has been using AI for recognition, they've been using classic logic for decision-making. V11, I believe, is where the classic logic starts to get replaced with AI.
If I asked my wife whether she uses it, she'd think I was crazy! She knows that I know that she uses it EVERY SINGLE DAY.

But as has been stated to you, there are a lot of obvious safety stops built in today. You can see the car often be overly cautious. But if you don't use it, or use it rarely, you probably just see the hesitance and override it.
What statement? There are no KPIs available outside of the community tracker and similar efforts - which you just dissed.

There are few accidents because the drivers don't trust the system. If the performance were to, let's say, magically 10x to 80 miles per DE, there would be more accidents (than a human driver) until the system improves 1000x (which it obviously never will on current hardware).

S-curve: The End of Starsky Robotics
Valley of degraded supervision: Tech Roadmap for Automakers Disillusioned With Robotaxis | The Ojo-Yoshida Report

Try reading FSD Mileage or, even more current, 90 million miles of FSD!!

Yes, 90 million miles of "drivers don't trust the system." And how many accidents have been reported by the press, let alone how many of those press-reported accidents actually end up being FSD?

And instead of reading articles about failed self-driving attempts, try reading ones about attempts that are succeeding!! The ones about failures tell you how not to do it. That's not the way Tesla does it!

Five years later and AV professionals are no longer promising Artificial General Intelligence after the next code commit.

Interesting that ChatGPT seems to be changing that opinion.
 
That second link about 90 million miles is only AP data. Very impressive on its own, but still almost useless: it does not say when AP is activated.

Highways are much safer than city streets, and AP is only used on highways (supposed to be, anyway). The inherent bias of AP miles is not accounted for, as usual, since forever.

I still agree these cars are the safest by far, and I love AP (on the updates where PB is not an issue, like it is for me right now), but we have to be careful about drawing any concrete conclusions from that data. It's just impossible.
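The road-mix bias being described is easy to quantify with invented numbers. Suppose, purely hypothetically, highways see 1 crash per 5M miles and city streets 1 per 1M miles, AP miles are 95% highway, and overall driving is a 50/50 mix. A system exactly as good as the average driver on each road type still looks 2.5x "safer" on raw per-mile stats:

```python
def crash_rate(mix_highway, per_mile_hwy=1 / 5_000_000, per_mile_city=1 / 1_000_000):
    """Blended crashes-per-mile for a given highway/city mileage mix.
    Both per-road-type rates are invented for illustration."""
    return mix_highway * per_mile_hwy + (1 - mix_highway) * per_mile_city

ap_rate = crash_rate(0.95)     # AP miles: assumed to be mostly highway
fleet_rate = crash_rate(0.50)  # all driving: assumed even mix

print(f"AP:    1 crash per {1 / ap_rate:,.0f} miles")
print(f"Fleet: 1 crash per {1 / fleet_rate:,.0f} miles")
print(f"AP looks {fleet_rate / ap_rate:.1f}x safer from road mix alone")
```

An honest comparison would have to condition on road type (and weather, time of day, driver demographics, ...), which is exactly what raw per-mile numbers don't do.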
 
So the caption is wrong?

[attached image: fsd-beta-miles-driven.jpg]


Interesting, since it seems to match the introduction dates of the FSD waves.
 
Never mind, I was talking about an entirely different link than either of those two. Somehow I got sent to the link about how much safer AP is than all other cars per million miles travelled.

Apologies, my answer was irrelevant to your point.

Edit: I see what my point was. That article says something like "FSD is getting amazingly safer," and then links to a story containing only AP stats. So yeah, sloppy reporting.

"Furthermore, with over 90 million miles driven on FSD outside of highways, the published data shows a clear improvement in safety statistics,"

It's just totally misleading. The link is ONLY about AP.
 
Since FSD is a superset of AP, wouldn't that suggest that FSD safety may have increased as well? So maybe not totally misleading.
 
And instead of reading articles about failed self-driving attempts, try reading ones about attempts that are succeeding!! The ones about failures tell you how not to do it. That's not the way Tesla does it!
Like this one? Waypoint - The official Waymo blog: First Million Rider-Only Miles: How the Waymo Driver is Improving Road Safety

I like the transparency, in contrast to Tesla's "safety" reports.
Interesting that ChatGPT seems to be changing that opinion.
In the same way as many in the Tesla community believe that Tesla is an innovator or leader in autonomy, only laymen see ChatGPT as a step towards AGI. It has no idea what it is predicting. "Most likely next word" is what it does. Follow researchers like Yoshua Bengio, Geoffrey Hinton, Yann LeCun, or Gary Marcus on Twitter for a more sober take.
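"Most likely next word" can be shown in miniature. This toy bigram model, trained on a made-up three-sentence corpus, just picks whichever word most often followed the current one; it has no idea what it is saying. (Real LLMs are incomparably larger and predict sub-word tokens rather than whole words, but the training objective is the same flavor of prediction.)

```python
from collections import Counter, defaultdict

# Tiny made-up corpus; "." acts as a sentence-boundary token.
corpus = "the car reads the sign . the car uses a map . a map is wrong .".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))  # -> "car" ("car" followed "the" twice, "sign" once)
```

The model outputs plausible continuations purely from co-occurrence counts, which is the sense in which it "has no idea what it is predicting."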
 
Non-FSD: I drive by the same sign in AP where it goes from 65 to 50 all the time. If a truck is on my right and blocks the sign, it stays at 65. If it sees the sign, it drops to 50. Just sayin'.

(And it shows the sign if it sees it, not if it does not.)

What vehicle are you in when it does this? Can you make a video showing it reading the sign and then not reading the sign?

And sorry, I am not trying to say I don't believe you or anyone else; my point is that there is conflicting info (probably related to unknown variables), and I like to conduct tests to show the different results.
 
Like this one? Waypoint - The official Waymo blog: First Million Rider-Only Miles: How the Waymo Driver is Improving Road Safety

I like the transparency, in contrast to Tesla's "safety" reports.

In the same way as many in the Tesla community believe that Tesla is an innovator or leader in autonomy, only laymen see ChatGPT as a step towards AGI. It has no idea what it is predicting. "Most likely next word" is what it does. Follow researchers like Yoshua Bengio, Geoffrey Hinton, Yann LeCun, or Gary Marcus on Twitter for a more sober take.

And Waymo is having to work hard to convince the public to use its vehicles. But more specifically, Waymo is commercially carrying customers, an EXTREMELY different thing from what Tesla is doing.

But again, so what? The entire world is sorry that you don't like the way Tesla is reporting. Does that make the product any different? No.

It just means that when people show you facts, you have problems refuting them with facts.

How many square miles is Waymo available in? Why isn't it more? That's because they are using hyper-accurate maps and information that does not scale worldwide, let alone nationwide. I dare say that if construction occurs in one of their areas, the cars have to be specifically programmed to get around it. They probably have people driving their territory at least once a day to look for differences.

Waymo is a different technology that has significant problems scaling. It solves their problem, but it's not a generic driverless car.
 
How would one do this?

Well, drtimhill indicated that there are already videos showing it, but those were all AP1 Model S vehicles with the Mobileye tech (that I have come across).

OxBrew implies that he has a situation that is regularly repeatable, so it wouldn't be very hard to set up a GoPro or have a passenger record the scenario.

There have also been implications by various people that this is commonly reported, which would open up the pool of specific signs and people that could test the idea and document it.

I will attempt to find a good speed change location that I could test and document... After finding a good location, it is just a matter of getting into the proper traffic scenario and getting my vehicle and another vehicle into the correct geometry to create the required test conditions.

Actually, the other, easier thing would be to find a place where the speed changes and have someone temporarily cover the speed limit sign... easy enough to do with a cohort on a residential or secondary road with minimal traffic... I'll have to look into trying this too.

As of right now I have two signs that do not get read at all, whether driving on my own, using TACC only, or using full AP... all on FSDb. Unfortunately there is an issue that prevents me from publicly posting any videos. This supports another theory of mine that there is another suspected variable (map data) causing my specific issue. Unfortunately, the theory involves TomTom data, which I do not currently have access to in order to try and account for it.
 
Had an interesting event this morning. Admittedly I wouldn’t call it a “problem” but interesting nonetheless.

On a highway with AP on, traffic slowed to bumper-to-bumper at about 20-30mph. Suddenly a car 2-3 vehicles ahead stopped completely and my AEB fully kicked in flawlessly. Beeping, red on the dash, very definitive braking to avoid rear-ending the car in front. My S stopped (slowed to about 2mph) right about two car lengths behind the car in front of me. In my rear-view I noticed the car behind me not slowing as fast as I felt it needed to, so I hit the accelerator to get closer to the car in front of me and give the car behind me a bit more room to stop. My S still had the red light and was still beeping, and it didn't respond quickly to my press of the accelerator. All in all, no accident occurred, but it did feel odd that my foot on the accelerator was kinda overridden by the AEB… a bit.
 
As @Daniel in SD noted from the manual, you have to press the accelerator hard to override AEB, so your experience was as intended. And it's good to hear it worked as intended. It's never fun testing safety features; we just trust they're there and work.
 
In many instances, overriding the accelerator is exactly what AEB is designed to do.

As is shown in some commercials: AEB kicks in as someone backs out of a driveway. The driver is pressing the accelerator, AEB says no, and the people behind the car are saved.
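A sketch of the arbitration both posts describe. This is a guess at the behavior, not Tesla's actual implementation; all the manual reportedly says is that a firm accelerator press overrides AEB, so the threshold below is purely an assumption:

```python
def aeb_brake_command(aeb_active: bool, accel_pedal: float, hard_press: float = 0.9) -> float:
    """Toy AEB-vs-accelerator arbitration.

    accel_pedal is pedal position in [0, 1]; returns a braking command in
    [0, 1], where 1.0 means full AEB braking. The hard_press threshold is
    an assumption for illustration, not Tesla's real logic.
    """
    if not aeb_active:
        return 0.0
    # A light-to-moderate press is ignored while AEB is active (the
    # "kinda overridden" feel); only a hard press cancels the braking.
    return 0.0 if accel_pedal >= hard_press else 1.0

print(aeb_brake_command(True, 0.3))   # light press: AEB keeps braking
print(aeb_brake_command(True, 0.95))  # hard press: driver overrides
```

A rule like this would explain both experiences: a moderate press feels overridden on the highway, while in the driveway scenario AEB wins against a light press.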
 
See Dirty Tesla for a good example:
(in a Model 3)
PERFECT! Great video, good testing, but the conclusions aren't supported by the data collected in the video.

Let me start by saying my background is designing and running tests on how different electronics work and don't work in different situations. The tests the person in the referenced video does are pretty good and get some interesting results. Unfortunately those results do not prove that the car is READING the speed on speed limit signs. They do seem to show that the car is SEEING the speed limit signs, but the data actually seems to show that the car is using some other data source to get the actual speed data.

At timestamps 5:01, 5:24, 5:57 it didn't display the sign.
At timestamps 6:37, 6:53, 7:35, 7:53 it displayed the sign correctly and changed the speed in the car, but this by itself doesn't prove that it wasn't map-data related.
At timestamps 7:04, 7:27 it displayed the sign, but again this doesn't prove it was vision and not map data.
At 7:15 it displayed a school zone speed limit sign but didn't apply it... there is no indication that Tesla acknowledges school zone markings, therefore this could still be map-data related.

OK, so to summarize the above results: they don't negate the idea that the car uses map data to get speed data. They also show that the car isn't displaying all the speed limit signs that one would believe should be shown in the conditions provided.

Now it gets more interesting with a better testing methodology.

7:58, so this is where he puts up multiple home-made signs in a parking lot. Let's go through all of the signs and analyze.
- First was a 25mph sign. The car started out thinking it was 25mph anyway, and displayed a sign that showed 25mph... OK great, seems to show that it is reading and displaying, right? Well, hold on...
- Second sign, 8mph. The car showed a sign "super far away" and it showed 25mph. He backed up and tried again, and it still displayed the sign "far away" but it showed and applied 15mph.
- Next sign was a blank sign (no number), and it displayed the sign in a more correct location but still showed and applied 15mph, with the person assuming that the car is remembering the last thing it displayed (which was wrong, by the way).
- Next sign, 100mph... same as before, it displayed a sign but still showed 15mph... try number 2 and it showed and applied a sign that said 30mph.
- Last sign, 9,001mph, and it displayed a 30mph sign.
- Lastly he took the sign away completely and it still displayed a 30mph sign.

My last stated opinion was "I don't believe the car ALWAYS sees, displays, and reacts to clearly visible speed limit signs." I will expound on this a bit. First, I believe the car EITHER has problems in the identification of speed limit signs, OR just decides to ignore some signs for unknown reasons (this based on my daily personal observation of two signs that I pass that are NEVER displayed). Second, I believe that the car does REACT to signs that it DISPLAYS on screen (based on the parking lot tests in the referenced video). Lastly, I do NOT believe that the car actually READS the speed on the speed limit signs (based on the parking lot tests in the referenced video). I think that the car uses some other data source or just guesses what it thinks the speed might be.

I have thought about doing this type of parking lot test, but didn't because I didn't want the possibility of getting bad data due to it being a parking lot and not a regularly marked public road... but it turns out there was some good data.

I'm not trying to be argumentative with my assertions; I am looking at this more scientifically because of my own observations (and others') that conflict with the idea that the car visually "READS" speed limit signs. I am not saying that the car cannot read them; I am saying that I don't believe the car currently IS reading them.
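One way to keep tests like the ones above organized is a simple tally of posted value vs. displayed value per trial. The trial data below is transcribed from the parking lot tests as described in this post; the match/mismatch framing is my own, not anything from the video:

```python
# (posted_value, displayed_value) per trial; None = blank sign / sign removed.
trials = [
    (25, 25),    # matched -- consistent with reading, but also with map data
    (8, 25),     # first 8mph try: mismatch
    (8, 15),     # second 8mph try: mismatch
    (None, 15),  # blank sign, yet a speed was shown
    (100, 15),   # first 100mph try: mismatch
    (100, 30),   # second 100mph try: mismatch
    (9001, 30),  # mismatch
    (None, 30),  # sign removed entirely, yet a speed was shown
]

matches = sum(1 for posted, shown in trials if posted == shown)
mismatches = len(trials) - matches
print(f"{matches}/{len(trials)} trials consistent with reading the number off the sign")
```

One match out of eight favors "some other data source, or a guess" over "reads the number off the sign," which is the conclusion stated above.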