Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Why are TACC, AP (and ?FSD) so bad?

I will say one more thing... Proving that the car does READ speed limit signs can POSSIBLY be very easy to do...as long as the car isn't as smart as we hope it is.

The easiest test would be to take a public road and place a speed limit sign that is, say, 5mph or 10mph different from the road's speed limit.

Let's go through the possible results of the above test...

1. Car DISPLAYS test speed limit sign with test sign speed correctly in the car visualization. The car updates the circled road speed on the display with the test sign speed.

This PROVES that the car is READING speed limit signs.

2. Car Displays test speed limit sign with test sign speed correctly in the car visualization but does NOT update circled road speed on the display with the test sign speed.

This probably proves that the car is reading the speed on the test sign but there is a question as to why it didn't update the circled road speed. Likely reason is that the car compared the visual result with another data source and the other data source had a higher priority for action.

3. Car Displays test speed limit sign but displays the incorrect test sign speed. Car does or does not update circled road speed display.

This would either be a general mistake in reading the value, or could fall into the category of the car guessing based on other unknown data.

4. Car doesn't display sign at all and nothing updates.

This is a "who knows what happened" result; more testing required.
 
And who gives a crap? We know that the car can read signs. We know that the car has the speed limit data in its maps. As long as it is following the speed limit, that's all that matters.

And that video (or at least the front picture) may be the reason why Tesla may focus on map data as opposed to signs. But I don't know, and it doesn't matter as long as the car is doing it right.

Remember that if you are going into a decreased-speed area and a truck blocks your view of the speed limit sign, you'll still be responsible for paying the ticket. There have been numerous speed traps over time where speed limit signs were obscured; saying you didn't see the sign didn't get you out of the fine.
 
7:58: this is where he puts up multiple homemade signs in a parking lot. Let's go through all of the signs and analyze.
- First was a 25mph sign. The car started out thinking it was 25mph anyway, and displayed a sign that showed 25mph... ok great, seems to show that it is reading and displaying, right? Well, hold on...
- Second sign, 8mph. The car showed a sign "super far away" and it showed 25mph. He backed up and tried again; it still displayed the sign "far away" but it showed and applied 15mph.
- Next sign was a blank sign (no number), and it displayed the sign in a more correct location but still showed and applied 15mph, with the person assuming that the car was remembering the last thing it displayed (which was wrong, by the way).
- Next sign, 100mph... same as before, it displayed a sign but still showed 15mph; on try number 2 it showed and applied a sign that said 30mph.
- Last sign, 9,001mph, and it displayed a 30mph sign.
- Lastly he took the sign away completely and it still displayed a 30mph sign.
I suspect that the car uses NNs that match against specific speed limit signs, so oddball speed limits that don't really exist will not be 'read' correctly. The car is not doing text recognition; it does sign recognition. So oddball signs might result in whatever standard sign is the closest match. Since there are no real 8, 100, 9,001 or blank speed limit signs, the results are going to be somewhat random.
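If that guess is right, a classifier trained only on standard sign classes can never output an oddball value; it has to snap to a class it knows. A toy illustration (the class list and the numeric nearest-match rule are my assumptions; a real image classifier matches on pixels, which is why the video's results look random rather than numerically nearest):

```python
# Standard US speed-limit values a sign recognizer might be trained on.
STANDARD_SIGNS = [15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80]

def classify_sign(value_on_sign: int) -> int:
    """A sign *classifier* can only emit one of its known classes,
    so an oddball sign gets forced onto some standard class.
    Here we fake that with numeric distance for illustration."""
    return min(STANDARD_SIGNS, key=lambda s: abs(s - value_on_sign))

print(classify_sign(8))     # -> 15  (an '8' sign cannot be read as 8)
print(classify_sign(100))   # -> 80  (a '100' sign cannot be read as 100)
```

The point isn't which class wins, it's that values like 8, 100 and 9,001 are simply not in the output vocabulary.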
 
I think the whole speed setting shouldn't even be in FSD. It's like how the V11 highway implementation doesn't have discrete follow-distance settings anymore; it's just "Chill", "Moderate", "Aggressive" (don't remember the exact naming). This self-driving program does all this complicated planning, moving in and out of lanes and slowing for speed bumps and narrow corridors and turns and so forth; trying to map a discrete "cruise control" speed setting onto it just seems silly. I assume it's only temporary and will just get replaced with Chill/Moderate/Aggressive personalities that weigh it toward higher or lower speeds.

The model should predict the speed based on the conditions like a human would, so that includes weather, the type of roadway, traffic, narrow corridors, turn curvature, pedestrians, schools... really a giant set of features that are hard to define. And as input the model should also include the last seen speed sign, and maybe map or fleet speed data, kinda like how they input the number of lanes from map data into the lanes network so it can weigh in on the vision-based prediction.
 
It does seem to show that the car is SEEING the speed limit signs but the data actually seems to show that the car is using some other data source to get the actual speed data.
Agreed that his testing is rather messy and sporadic. And yes, it's important to distinguish between the car SEEING the speed limit sign and ACTING on the sign. I don't think there is any argument about the car seeing the signs, since it picks them up in the car display. But does it actually ACT on the signs?

There are probably videos around, but rather than Google all day, I can reference two data points. The first is that there was a LOT of discussion (and, more importantly, demonstration) about how the car would pick up speed limit signs on roadside billboards and act on them. Tesla seems to have fixed this, as the reports of this died down. More to the point, I have personally, several times, seen the car react to construction-zone temporary speed limit signs. In each case the car displayed the sign, popped it into the "Speed Limit" display on the screen AND slowed down in response to the sign. Since these were temporary signs, I highly doubt there was any map data to trigger this behavior, which seems pretty conclusive to me. One remaining possibility is that the car was actually responding to traffic cones (it does slow down for these also), but in the cases where the car "saw" a speed limit sign it maintained the reduced speed beyond the cones until it happened to hit another speed limit sign. I find this pretty persuasive.

The definitive test, of course, would be to place a "real" speed limit sign on a road that reduces the speed from (say) 35mph to 25mph and see how the car responds with and without this sign. The problem there, of course, is the legality of such an exercise.

There is also the issue of how the car resolves discrepancies between signage and map data. Which should win? My money would be on "lowest speed wins" for safety reasons, but I don't think anyone has tested this, and it obviously influences general testing, since if it IS "lowest wins" you can wave "900mph" signs all day long and never get the car to respond. It's also unclear to me if the car is looking for generic "XXXmph" signs or is simply trained to see individual signs with specific speeds on them, which again would impact test methodology.
 
You can reduce the collision warning level, maybe it was too sensitive for your driving style.
I've never had autopilot unable to follow a curve... maybe you were driving a bit fast? Were you at or close to the speed limit?
For the phantom braking, were you on a divided highway? If not divided, then it's documented that autopilot isn't meant for that. This comment might apply to the curve too...
Be prepared for when it happens. At a steady 30mph, it was happily aiming for a central curb in a left 70ish-degree bend. I took over controls. When merging from the right side onto a freeway that has a left bend, it took me so close to the freeway wall on my right that I took over control. I estimate about 3 feet from the wall. Would it have hit? Maybe not, but when there is 10 feet of lane on the left, why not use it?
 
That's not news. It can take some curves and not others. Over the years, it can take more and more curves but not all.

That's an essential factor. Autopilot can adjust its speed to accommodate some curves but not others.

When it does not, drivers need to memorize which curves and manually reduce their speed ahead of those curves in the future.


Again, to be consistent with the system, sometimes there are no phantom brakes, and sometimes there are. The 8-car pile-up accident in San Francisco was on a well-divided highway, and it still happened.

View attachment 907499
Wow! Look at all the idiots behind the Tesla causing the accident and pile-up. Will the NTSB open an inquiry into why the human software (organic brain) in all those vehicles behind the Tesla failed to stop in time?? Oh right, they already did, and issued a bunch of safe-following-distance recommendations as a consequence of the 1974 Independent Safety Board Act passed by Congress.
 
I think the whole speed setting shouldn't even be in FSD. It's like how the V11 highway implementation doesn't have discrete follow-distance settings anymore; it's just "Chill", "Moderate", "Aggressive" (don't remember the exact naming). This self-driving program does all this complicated planning, moving in and out of lanes and slowing for speed bumps and narrow corridors and turns and so forth; trying to map a discrete "cruise control" speed setting onto it just seems silly. I assume it's only temporary and will just get replaced with Chill/Moderate/Aggressive personalities that weigh it toward higher or lower speeds.

The model should predict the speed based on the conditions like a human would, so that includes weather, the type of roadway, traffic, narrow corridors, turn curvature, pedestrians, schools... really a giant set of features that are hard to define. And as input the model should also include the last seen speed sign, and maybe map or fleet speed data, kinda like how they input the number of lanes from map data into the lanes network so it can weigh in on the vision-based prediction.
It's interesting that you wish AP could predict traffic like a human does. I believe that it does... like my grandma. She freaks out when a semi passes by in the next lane, doesn't slow down until she is almost up the tailpipe of the vehicle ahead and then backs away (because she couldn't see it clearly from far away), hesitates at stop signs, sometimes tries to drive through stop signs and red lights, and scares me half to death taking hairpins at a higher speed than prudent (oh, wasn't expecting that turn to be so sharp). :cool:
 
Just recently took another long drive in my 2022 Model S LR and tried out TACC and AP again. (I didn't purchase FSD since it's not ready for prime time).
Still serious problems:
- Multiple phantom braking episodes on deserted straight roads (single and double lane)
- Multiple "Collision warning" panic attacks... again on empty straight roads
- AP doesn't handle curves below a certain radius and drifts out of its lane
- AP does adjust speed when the posted speed changes (good) but TACC doesn't (bad)

I don't know if FSD has these same problems but I certainly don't want to spend more money to find out.
I'd like TACC and AP to just do what was advertised... control the speed of the car and keep it in its lane. (And please, stop panicking)
I agree with you on all those issues. I used to drive a multi-million-dollar vehicle whose autopilot tried to kill me and my passengers so many times it became routine. AP drives like an old grandma. Way too conservative, but I am set to "Chill", max following distance, brake on release of accelerator, and max collision avoidance. There are less conservative modes that I will eventually try out and that might help with your issues, but I like my Tesla safety score of 99, which makes insurance more affordable than my Prius's.
 
Well then, TACC is a TERRIBLE L2 driver assist. My mistake for assuming it was as good at NOT slamming on the brakes as the adaptive cruise control on our Volt, Caddy and Mustang on regular roads. I NEVER had phantom braking on those and didn't even know phantom braking was a thing until I got a Tesla. And it sucks.

To answer the question, Tesla can't do as good a job as Mobileye at controlling the speed of an L2 car. Before Tesla got into a pissing match and wanted to oversell the Mobileye-based AP back in 2016, Tesla used Mobileye and it was good. Then the TechnoKing oversold it, Mobileye pulled out of Tesla, and Tesla hasn't been able to do as good a job since on their own. At least that's my understanding from descriptions by MS owners that had the old, good Mobileye system.
I agree wholeheartedly. My wife has a 2021 Model 3 with Enhanced Autopilot, and it has phantom braked on the highway three times. It has radar, USS and cameras on the side and front.
I have a 2016 Tesla Model X with FUSC and the Mobileye system, and I use TACC and lane keeping, changing lanes with the turn signal. Never had phantom braking; it simply slows down at a reasonable pace when in danger.

Not sure why and not wishing for a pissing match, just my observations.
 
I agree with you on all those issues. I used to drive a multi-million-dollar vehicle whose autopilot tried to kill me and my passengers so many times it became routine. AP drives like an old grandma. Way too conservative, but I am set to "Chill", max following distance, brake on release of accelerator, and max collision avoidance. There are less conservative modes that I will eventually try out and that might help with your issues, but I like my Tesla safety score of 99, which makes insurance more affordable than my Prius's.
Agreed. We did test this on my wife's Model 3: by expanding the follow distance, and with her driving in Chill Mode, the phantom braking is greatly reduced, in our opinion.
 
EVERYONE drives on the interstate and can be distracted while in the flow of traffic. Or simply it is raining, or the tires are not 100%, or there's oil on the road. Or over 100 other issues. In the end, slamming on your brakes on the highway is bad whether the car does it or the human does. The car should be able to be made to do it better than the human.

So the argument is that if the car stops in the middle of the highway and gets hit in the rear, it is the other driver's fault. Fact is, technically you are right. Fact is, this is a terrible argument; the damn car should not stop for very little reason on the HIGHWAY.

This is a Software/Hardware/Tesla/Technology problem. At some point, ladies and gentlemen, we have to realize we are using beta software that does dumb stuff sometimes.

For the good of Tesla and society, hold them accountable, and that way, like any good athlete, they can take the criticism and get better.
 
EVERYONE drives on the interstate and can be distracted while in the flow of traffic. Or simply it is raining, or the tires are not 100%, or there's oil on the road. Or over 100 other issues. In the end, slamming on your brakes on the highway is bad whether the car does it or the human does. The car should be able to be made to do it better than the human.

So the argument is that if the car stops in the middle of the highway and gets hit in the rear, it is the other driver's fault. Fact is, technically you are right. Fact is, this is a terrible argument; the damn car should not stop for very little reason on the HIGHWAY.

This is a Software/Hardware/Tesla/Technology problem. At some point, ladies and gentlemen, we have to realize we are using beta software that does dumb stuff sometimes.

For the good of Tesla and society, hold them accountable, and that way, like any good athlete, they can take the criticism and get better.
They are under investigation by regulatory agencies for complaints of that very thing. What more do you want?
 
Perhaps the reason it's all so bad is that Tesla never designed it to be a supervised driver aid, but is trying to sell it as such. If they wanted a driver assist, they would have made the highway hands-free, and could easily have done so. That is the most boring part of driving and can take large chunks of time where boredom leads to poor attention. But Tesla is set on robotaxi, so all of this effort (free testing by paying customers) is to develop that, yet they continue to call it a Level 2 system, probably to evade regulation. But Musk is a scammer, and this is what we get (or actually don't get). So we suffer with Musk's delusions and get used by him not just for free (slave) labor, but paying labor. You would have expected that a guy who could get a spaceship to back up and land could also get a car to park, but swing and a miss. Let's blame the parking sensors and get rid of them, instead of the CEO, or whatever Musk thinks he is.
 
How about a way to opt out of all beta features? That includes auto-wipers, AP, etc. How about a dumb cruise option for everyone who wants it (it is already in the software, just hidden if you have AP/EAP/FSD)?
Have you reached out to Tesla and offered feedback on features you'd like to see in the vehicles? If enough people provide the same feedback, perhaps the company will make adjustments. We've seen Elon react to some people's requests and add features, so the company may be open to the possibility.


That being said - you can also try working with Tesla on solving some of your issues if that's something you want to do.

Try:
Reinstalling your firmware (through the service menu, or by opening a service ticket and having them remotely reinstall)
Washing and thoroughly drying your car - then recalibrating your cameras
Checking your GPS for accuracy. Zoom in on the map and make sure your car is EXACTLY where it should be - even a little off can cause problems.
If you still experience severe problems, such as AEB phantom braking, constant red-wheel take over requests, or your GPS is inaccurate, open a service ticket as you may need to have your FSD computer, MCU, camera(s), or cabin camera enclosure replaced.

If you are just pissed and venting, then I offer my sympathy for your situation, and wish you could have a better experience like many of us do.
 
You would have expected that a guy who could get a spaceship to back up and land could also get a car to park, but swing and a miss.

Scientifically, the first problem is easier. It's not easy to engineer the details, but what is required is known exactly from physics: a certain amount of fuel, proper control systems (a solved problem with Kalman filters and their successors), precise accelerometers, baffles in fuel tanks to prevent dynamic sloshing, exact knowledge of self and landing position, and high-accuracy throttleable rocket engines.

The second is a hard machine-learning problem with only vision or ultrasonics, fuzzy solutions, and lots of experimentation required. Short-range lidar would help and permit more controlled 'robotics' algorithms, but would increase costs substantially.
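As an aside on why the control side counts as "solved": a scalar Kalman filter is just a couple of closed-form update equations. A minimal textbook sketch (nothing to do with SpaceX's actual code), estimating one noisy quantity such as altitude:

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=0.5**2):
    """Scalar Kalman filter: estimate a slowly varying true value
    (e.g. altitude) from a stream of noisy measurements."""
    x, p = measurements[0], 1.0       # initial estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += process_var              # predict: uncertainty grows over time
        k = p / (p + meas_var)        # Kalman gain: trust in the new reading
        x += k * (z - x)              # update estimate toward the measurement
        p *= (1 - k)                  # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# Noisy readings around a true value of 10.0 get smoothed toward it.
readings = [10.3, 9.6, 10.4, 9.8, 10.1, 9.9]
est = kalman_1d(readings)
print(round(est[-1], 2))
```

Contrast that with the parking problem, where there is no comparably exact model to filter against.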
 
Scientifically, the first problem is easier. It's not easy to engineer the details, but what is required is known exactly from physics: a certain amount of fuel, proper control systems (a solved problem with Kalman filters and their successors), precise accelerometers, baffles in fuel tanks to prevent dynamic sloshing, exact knowledge of self and landing position, and high-accuracy throttleable rocket engines.

The second is a hard machine-learning problem with only vision or ultrasonics, fuzzy solutions, and lots of experimentation required. Short-range lidar would help and permit more controlled 'robotics' algorithms, but would increase costs substantially.
Yeah, if you're trying to make a car work on vision alone. But another way is just to do what the automakers that have offered this feature for many years do, instead of making people who want their car to park pay for a GoFundMe.
 
Agreed. We did test this on my wife's Model 3: by expanding the follow distance, and with her driving in Chill Mode, the phantom braking is greatly reduced, in our opinion.
They are under investigation by regulatory agencies for complaints of that very thing. What more do you want?
I don't care about the regulatory aspects; to me it is more about the technology being perfected for the good of all, since at some point driving would clearly be safer if humans did not drive.

So as to my aspirations/wishes/wants: I want them to get it right and autonomous driving to succeed. I want Tesla to be more honest and straightforward in the sales price, and to explain its issues in detail to potential users, with the seriousness the risks deserve. I want Tesla to provide it as a free upgrade to Enhanced Autopilot while they are all over the map on their direction; hell, it was just within the last 6 months that they announced basically abandoning multiple technologies they were using in favor of vision only. I want them to make options for all those cars they sold on promises they will not be able to deliver on.

But then again, I also want to win the lotto. I probably have as good a chance of that as of all the above happening.

I am disappointed, because from my perspective Tesla was originally a company that cared about its employees, and its employees were passionate about their founder and their cars, and loved sitting with you and going over their products with pride.

So I also want Tesla to figure out how to get that magic back. Again, I like dreaming.
 
Scientifically, the first problem is easier. It's not easy to engineer the details, but what is required is known exactly from physics: a certain amount of fuel, proper control systems (a solved problem with Kalman filters and their successors), precise accelerometers, baffles in fuel tanks to prevent dynamic sloshing, exact knowledge of self and landing position, and high-accuracy throttleable rocket engines.

The second is a hard machine-learning problem with only vision or ultrasonics, fuzzy solutions, and lots of experimentation required. Short-range lidar would help and permit more controlled 'robotics' algorithms, but would increase costs substantially.
Thank you. As the old expression goes, "it's not rocket science." The engineer will answer: rocket science is not that difficult.
Teaching a car to drive among humans is a much more complex problem with many more variables.

Love this post.