
ampra · New Member · Sep 4, 2023 · Texas
I have got to be honest. Full Self-Driving capability in a Tesla goes further than any other vehicle designed by man (or woman), but it left me wanting more.
I bought my Model Y Long Range on Friday 9/1 and took it for a little ride from Houston to Corpus Christi on Sunday 9/3. Here are my first reactions to FSD on Monday 9/4.
From the moment I engaged the Full Self-Driving AutoSteer capability, I was hugely disappointed given the hype it has created. Here's what happened:
a. There were worn-out tire pieces on the road, and I occasionally saw roadkill. If a human had been driving the car, they would have slowed down and avoided those. Tesla, the car (not the company), does not observe those things and drives right over them.
b. Here in Texas, the speed limits on highways/freeways are higher than in the rest of the country. I was driving between 75 and 80 mph for the better part of my drive. Some state highways are single lane, so cars pile up (unless you want to cross to the other side of the road and speed ahead). Most cars drive at that high speed, and any sudden braking or acceleration could lead to disaster. FSD did not realize that: it dropped from 80 to 50 mph within seconds when it noticed flashing yellow "go slow" lights on a road with a 75 mph speed limit. I tell you, this sudden braking was scary (perhaps scarier for the driver following me, since they did not know that FSD was engaged). I highly recommend a visual signal (a blue light or something) at the tail suggesting that the car in front is being driven by a computer and not a human.
c. When I drive the car, I keep it about a couple of feet away from the left lane markings. I increase that (i.e., steer slightly to the right) when there are vehicles coming from the opposite direction on single-lane roads, just to be safe. FSD always drove with a bias toward the left marking, sometimes too close (for my comfort) to the oncoming vehicles.
d. There were multiple occasions when the lane (on a single-lane highway) forked into a left-only turn: the left lane marking moved further out to the left to create a left-only lane while the right one stayed straight. On one occasion, when a heavy-duty gas guzzler was tailgating me too closely at high speed, FSD veered toward the left as the marking moved left, then tried to come back to the straight lane when it realized it did not have to turn left. I had to take over to avoid getting hit from behind.
There's a long way to go before we (at least I) can rely fully on the car to drive safely on its own. Just asking the driver to keep their hands on the wheel isn't enough.
 
Everything you've brought up is a valid criticism of the system. The bit that will change over time - if you stick with it - is that you'll get progressively more comfortable with it. You'll find out what it can and cannot do and learn to use it like you would any other tool. You'll learn that it drives a certain way and that while it's not your way of driving, it can be a serviceable way. FSDb is certainly not an engage-and-forget kind of system. You'll learn that it doesn't handle certain things and you'll either become more alert at those times or you'll just disengage it because you don't feel like clowning around with it right then.

So spend some more time with it and see if you find it to be a useful tool for some of your driving.
 
I agree with JB. I would add: if you do want to play with it some, try using it in areas where there isn't much traffic. I find there is far less pressure when no other cars are present at a 4-way stop or whatever. It does better than I expected at difficult stuff and a lot worse than I expected at what I would think are easy things. If you do end up using it for a bit, you will start to learn the areas it has problems with, or the situations where it will probably do something stupid, and just preemptively disengage. I fully understand the disappointment, especially if you had been watching a bunch of YouTube videos of people using FSDb, because they will often lead you to think it's much better than it is. (Not that they are all purposely trying to deceive, but it can be hard to know what it feels like from a video, and a lot of them have a bias and will make excuses for bad FSDb behavior or downplay it.)

I would say: if you like playing with FSDb for the tech, testing, and novelty, give it time. If you just want a system that can drive you, then... maybe we will get a step closer with v12 (just know that after a major update like that, it's usually a two-steps-forward, one-step-back kind of thing for a bit, until some point releases happen).

Edit: Also, thank you for sharing your experience. It's always interesting to get people's real feelings about using it for the first time; it's a good test to see where it really is, kinda like the "wife test".
 
(quoting ampra's post above: "doesn't drive like a person")

I've been re-thinking the end-to-end neural network training debate (on other threads), i.e. Tesla's new strategy of building in no intermediate perception or labeling stages at all, and only using observations and human driving in a generative/corrective architecture.

The quoted example from ampra is a situation where the end-to-end training on human driving will do better. Most of the time it will learn "natural human behaviors" better. There are socialized behaviors on certain roads and conditions that are not easily describable in any driving manual or algorithm, and yet collectively humans do them---often by watching other cars in the same environment.
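For anyone unfamiliar with the term, here is a minimal sketch (my own illustration, nothing like Tesla's actual stack) of what "end-to-end" means in this context: raw camera frames map straight to driving controls, trained to imitate logged human driving, with no hand-built perception or labeling stage in between.

```python
# Minimal illustration of end-to-end imitation learning (my sketch, not Tesla's architecture).
import torch
import torch.nn as nn

class EndToEndDriver(nn.Module):
    def __init__(self):
        super().__init__()
        # One network from pixels to controls; no separate object detection,
        # lane labeling, or rule-based planner in between.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)  # outputs: [steering, acceleration]

    def forward(self, frames):
        return self.head(self.backbone(frames))

model = EndToEndDriver()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# One training step on a toy batch: the "label" is simply what the human driver
# did, so socialized habits (lane bias, slowing for debris) get absorbed
# implicitly if they appear often enough in the data.
frames = torch.randn(8, 3, 120, 160)   # batch of camera frames (toy data)
human_controls = torch.randn(8, 2)     # logged human steering/accel (toy data)
loss = nn.functional.mse_loss(model(frames), human_controls)
opt.zero_grad(); loss.backward(); opt.step()
```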

So, supposing Tesla stays with an L2+++++ system, the end-to-end training has a high likelihood of eventually making a customer-acceptable driver *assist* product. It will generally feel 'nice', and it will give the illusion (falsely, in my opinion) that robotaxis are around the corner, because with a sample size of one, a human can't personally tell the difference between 100 hours MTBF (a stretch goal for end-to-end FSD in 3 years; for an average FSD driver that means a needed correction/disengagement roughly every 2 months, which is still ~100x better than today) and the (wild-ass guess) 100,000 - 1,000,000 hours needed for a regulator-approved robotaxi carrying the elderly or children.
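To make that gap concrete, here is a quick back-of-envelope check; the driving-hours-per-month figure is my own assumption, not something stated above:

```python
# Back-of-envelope check on the MTBF gap described above.
# Assumption (mine, not from the post): an average FSD user logs ~50 driving hours/month.
hours_per_month = 50

mtbf_targets_hours = {
    "end-to-end FSD stretch goal (per the post)": 100,
    "regulator-approved robotaxi, low end (guess)": 100_000,
    "regulator-approved robotaxi, high end (guess)": 1_000_000,
}

for label, mtbf in mtbf_targets_hours.items():
    months_between_corrections = mtbf / hours_per_month
    print(f"{label}: one needed correction every ~{months_between_corrections:,.0f} months")

# At ~50 hours/month, 100 h MTBF works out to roughly one correction every
# 2 months -- already rare enough that a single driver can't distinguish it,
# by feel, from the 100,000+ h MTBF a regulator would want before the car
# carries passengers unsupervised.
```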

Realistically, Tesla has no robotaxi software & service research or development at all. There is no testing or programming for anything like what an actual revenue-generating robotaxi will need, starting with the much, much lower failure rate (a factor of 100x-10,000x improvement is needed) and continuing through all the other services: maps better than what can be acquired cheaply or free, dropoff/pickup dynamics, safety, customer service, calling, fleet routing, fleet charging, etc.

Maybe in fact that's the actual plan underneath it all? Waymo is not minting free cash at all---and Elon loves making money.

Tesla will sell a robotaxi hardware base to other vendors---they will have a strong manufacturing and cost advantage over anyone outside the Chinese automakers---and let external resellers deal with the programming and the additional sensors and compute needed, and those resellers take all the blame and flak.

End to end driver assist will help sell Teslas.

For Tesla's human-driven cars, it will forever be a "driver assist", with unpredictable and unexpected failure modes in odd circumstances, but infrequent enough---and still with all the L2 disclaimers---that 70% of customers think it works well enough. Musk will continue to hype the robotaxi future, but it will always remain the future.

Since my own car will never have the hardware I think is necessary for L4 autonomy (all around imaging radar, high res cameras & more compute), I'm OK with this.
 
FSD isn’t something I’d trust personally, due to the experiences I’ve had where it’s been confused at intersections near my house - wanting to make turns on red lights - and has spent a lot of time driving in the blind spots of other vehicles.

I get concerned about how trusting people are - on a recent road trip I called in a Plaid to the highway patrol because the driver was texting on his phone while his car drove erratically on whatever AP mode he was using.
 
I've used it just once, as I didn't get the update in my MYP until Wednesday. The first thing I experienced was about 5 disengagements on the first street out of my gated community. I put on FSD and it bailed within a second, asking me 'what went wrong?'. I think it was the bright afternoon Arizona sun, tar snakes on the pavement, and a lack of painted lane lines. It did much better on the freeway, except going through construction zones, where it suddenly dropped from 75 mph to 55 mph (the posted speed limit) while nobody else on the freeway slowed at all. Some turns were rather aggressively last-second, others were very cautious. Sometimes I took over because I didn't know what FSD was going to do and it was scaring me.

Mostly, though, FSD would bail out on me, BUT THE CRUISE CONTROL SPEED WOULD STAY SET! So, for instance, I turned on FSD on a street going 40 mph in a slight turn, FSD disengaged, and I was left in a car going 40 mph in a straight line through a corner. This one really freaks me out. If FSD is going to stop, should the car continue at speed, or disengage the throttle as well?
 
Everything you've brought up is a valid criticism of the system. [...] So spend some more time with it and see if you find it to be a useful tool for some of your driving.
I completely agree with this. The first time I used it I was nervous, disengaging all the time. As I continued to use it I felt I became one with it and understood what it could and could not do, and the rides became smoother and smoother. It is a great tool to have and really impressive at times.
 
I've been testing it and it shines sometimes, but the majority of the time not so much. I get it, so I'll continue to test, hoping to better the greater good. I wish that when FSD disables it would also disable cruise control, especially during street driving. It just seems odd that it stays on while a warning is quickly flashed on the screen, as if the driver has ample time to glance over and read a warning message that disappears that quickly.
 
It is largely a party trick, and a dangerous one at that. I did a 3,000 mile road trip this summer and "tested" it under all manner of scenarios and conditions. There are many many posts here with extensive and detailed feedback but it can be easily summarized.

Under "optimal" conditions it performs pretty well. As you start to add more and more variables that performance degrades, rapidly. In all scenarios on my 3K road trip Enhanced Autopilot performed better and in a less risky manner. In a crowded rush hour in a dense urban setting, like for instance the DC area or NYC you'd be taking a very dangerous bet relying on this vehicle to navigate the myriad of variables you'd encounter under those conditions.

I am not even certain the hardware most of the cars have are actually capable of true FSD. The newer hardware perhaps but it would really need a tremendous amount of software intelligence to go along with it to see anything even close to what was initially promised.
 
I too am a new Tesla owner; I just got my plates today. I did a road trip without the wife, who assumed she would never see me again based on the few times I showed her Autopilot. Everything ampra mentioned, I also observed, but he missed a few things which I'll add here.

The worst thing I experienced was passing 18-wheelers. Most of the time the pass is at about 1/2 mph faster, and it takes forever to get around the rig. Yes, you can goose it, but I was letting it do its thing to see what would happen. Several times the 18-wheeler would wobble in the right lane. I'm not talking about anything unusual happening: they're big trucks, they fill the lane without a whole lot to spare, so a little wobble is something we've all been living with forever. Not the Tesla. Out of the blue it slams on the brakes (not the regenerative ones, the disc ones) and starts the klaxon warnings. And if there's some aggressive pickup truck behind you (like there always seems to be), you know he's either cursing the hell out of you or smashing into your rear end. The latter never happened to me because I stomped the accelerator, and thank goodness we all know the rocket that fires you away from the tailgater is there. But had my wife been with me, I would not have been able to engage FSD for the rest of the drive. Which would have been a shame, because 98% of the time it's a better driver than I am, and I'm a pretty darn good driver. The patience and respectful lane changing is over-the-top good. Well, most of the time anyway. There were those 2% incidents of changing lanes into the emergency lane, or into the slow-truck lane up a grade, that I had to cancel or correct. It also feels a little unsure at complicated intersections when having to turn against oncoming traffic, and it errs on the side of pausing when it really needs to clear the oncoming lanes. There are some traffic circles it will amaze you on and others it will terrify you on. The problem is, you can't really tell what it's going to do. I'm going to take over on those, at least until the next update.

Here's one I'm hoping someone here can help me with. I did have one incident on this trip that would probably have been a bad fender bender, or maybe a full-on wreck, if I had been in my old Mazda. I was preparing for a left turn on a 4-lane highway. Oncoming traffic was approaching from a distant light that had changed, and I was trying to see where the turn was and where the Superchargers I was navigating to were. The next thing I knew, a severe swerve to the right and sharp braking, along with the klaxons blaring, scared the bejesus out of me. It was certainly not my intention to test any of the accident avoidance features of the Tesla, but it turns out they work extremely well. There was a blue something-or-other racing across the two lanes to my right and cutting the corner right in front of me. It all happened so suddenly I couldn't really piece it together, and I still don't know exactly what happened. I thought I would look at the dashcam later to piece it together. I have Dashcam set on Auto, but just to make sure I got this incident, I honked my horn, as that's a save trigger I knew I had set. Nothing of this incident recorded. What did record was a boring 20-minute stretch of freeway where nothing happened. I thought maybe that was because I had the rear camera on the monitor for a while to see what that was like, but I tried that again and can't repeat it. It records people walking by in Sentry mode sometimes and not others. How do you get this to record the things that forced you to take over from FSD, or incident-avoidance activations?
 
I too am a new Tesla owner, just got my plates today. [...] How do you get this to record things that forced you to take over from FSD or incident avoidance activations?
My wife felt the same way when I first took a test drive in a MYP. Her instructions were that I could only "accelerate" while alone, so the comment about never seeing you again was spot on. I've never seen such blue warnings, but I'm new. I've only seen them when my hands are doing "Elon Mode".

What is frustrating is that the "warnings" disappear rather quickly. Especially when testing FSD on HW4, I'm supposed to be able to read them in seconds, looking at the screen while attempting to avoid something. I just want to learn what I've done wrong or what triggered the system.

I will say that today I was pleasantly alerted when my MYP thought I would not stop in time while my eyes were not looking straight ahead; it beeped the warning signal and I immediately looked up again to see what the issue was.
 
My wife felt the same way when I first took a test drive in a MYP. [...] I just want to learn what I've done wrong or what triggered the system.
Yeah, a little creepy to know that it's watching you all the time. I did find that I could put sunglasses on and it didn't notice my looking around. Not that I don't normally keep my eyes on the road, of course.
 
Has anyone else experienced a situation where the Full Self-Driving software performed maneuvers that caused damage or injuries, with no time for the driver to intervene? It happened to me. Tesla drivers need to be aware that the driver assumes all financial liability, even if the damage was caused by the Full Self-Driving software, sensors, or cameras. You are personally risking damage, injuries, and financial liability to test out their software and driving systems. That is not stated in any disclaimer by Tesla. The software and sensors are not very reliable, and to claim otherwise is false advertising that encourages false expectations. I bet far fewer people would use FSD or consider it a worthwhile feature if they knew they're financially responsible for everything the software does.
 
Has anyone else experienced a situation where the Full Self-Driving software performed maneuvers that caused damage or injuries, with no time for the driver to intervene? [...]
It is known that the driver is in charge, and if the car is acting funny, the driver needs to take over. Usually we assume there would be enough time to intervene. How did it happen to you such that there was no chance? I have seen the car running red lights, thinking a solid red was blinking, and intervened a few times. The car has also tried to go straight while I was in the turning lane. Those types of interventions are the most critical to me.
 
Has anyone else experienced a situation where the Full Self-Driving software performed maneuvers that caused damage or injuries, with no time for the driver to intervene? [...]
So, what happened?
 
Has anyone else experienced situations where Full Self Driving software performed maneuvers that caused damages or injuries, with no time for driver to humanly intervene? It happened to me. Tesla drivers need to be aware that drivers assume all financial liability, even if the damages were caused by the Full Self Driving software, sensors or cameras.
I haven't personally, but there have been a few reports of FSD making dangerous exit/turn attempts on freeways and/or highways. I also saw a recent tweet about a Tesla with FSD engaged crashing into a tree, with passenger injuries.
I bet far fewer people would use FSD or consider it a worthwhile feature if they knew they're financially responsible for everything the software does.
I think TSLA disclaims all risk for FSD use. It might not cover negligence, although negligence might be difficult to prove given TSLA's lack of transparency. If these things start happening more frequently, TSLA will eventually find itself without an FSD customer (sucker) base.
 
Has anyone else experienced a situation where the Full Self-Driving software performed maneuvers that caused damage or injuries, with no time for the driver to intervene? [...]
I have had a very similar experience with my MYP when going home. I just do not trust FSD to take the turn without curb rash, and I know Tesla will not pay for any damage, as we are guinea pigs, so it's on us to take over at any time. Hence, I always take the wheel, like Carrie sings.

As we all know, we can see and process the real-life actions of worthless drivers and anticipate what they may do, while knowing the system is BETA and will be slow and unpredictable in responding to such actions. I've had my fair share of taking over because I did not trust it to do the right thing, and then been impressed when, comfortable that I could take control of the situation in an emergency, I let it play out and FSD did great.

For me, I figure I'm helping the greater good for all of us, as others are doing as well, so we can all benefit down the road.
 
I just picked up my 2023 Model Y Performance with FSD. I absolutely love manually driving the car, but I’m going to lose my license pretty quickly when the speeding tickets start to pile up! I was only given about 5 minutes of instruction on how to operate the car at delivery time - good thing I had watched so many hours of videos on where to find stuff while waiting for my car to be built.

There are still two areas of difficulty that I need to resolve. First, the little beauty just doesn’t understand what I want when using voice control. It takes me several tries to say something it recognizes. Things as simple as changing the fan output are tough. I thought “Fan to high” would work, but it doesn’t. Neither does “I’m hot.” And “Set fan to 7” is interpreted as “Set fan 27.” Ugh.

The other learning curve I’m sliding down is the use of FSD. While driving the 180 miles home from my closest Tesla store, FSD worked as advertised, when it worked. But I must have looked at the screen too long trying to find a control, and FSD tripped off simultaneously with the warning to PAY ATTENTION! It said I couldn’t use FSD again on this trip. I managed to get it back by pulling over, turning the car off, and starting over. A little while later, I touched the control to open the glove box and BANG, FSD kicked off instantly with the PAY ATTENTION! warning again, plus a warning that if it happened twice more, it would be locked out PERMANENTLY! Since it seems so easy to violate the attention rule, I’m pretty sure I won’t have any trouble inadvertently tripping FSD off a couple more times. If the car permanently disables my $12,000 software upgrade, I’m going to be seriously pissed.

Has everybody got this problem or is it new to the interior camera addition?
 
I've had lots of scary moments with FSD in my MYP, picked up in August 2023. Mostly it works fine on the freeway, but it gets annoying around other cars a lot. I've learned when to expect it not to work and just take over. It has a long way to go. But I've got it for 3 months free, so I've been trying it. I would never trust it on city streets. When it does disengage, a lot of the time it just launches you back into TACC, and you really do need to react instantly.
 
Has everybody got this problem or is it new to the interior camera addition?

My understanding is that the new cameras see a LOT better than the older ones. I consider this a downgrade. :) I am not at all into my car monitoring my eyes in real time, especially when, as you pointed out, it can be overly cautious to the point of absurdity at times.

Also, I don't think your FSD suspension is actually 'permanent'; it's more like 2 weeks, which is still super annoying on something you dropped $12K on - more 'nanny'-type stuff that very much offends my libertarian sensibilities.