Welcome to Tesla Motors Club

Autopilot

I have to say, after owning a Model 3 since June and using Autopilot practically every day on highways, we are a long way from full self-driving. Almost every day on my 40-mile commute to and from work, in an HOV lane, my car decelerates rapidly for no apparent reason. I report the bug every time; many others have reported the same thing. I also pay close attention to the surrounding traffic to try to identify the cause. The HOV lane has an extra-wide divider between it and the left lane (at least 3 feet) and a wall separating it from the opposite direction. In no instance that I can recall did it look like a car was about to enter from another lane or that traffic ahead was slowing.

On two other occasions, a broken-down car was pulled over on the left side of the HOV lane but partially in it, and my car did not know what to do: it could not cross the line to the right, and it couldn't pass the stopped car without doing so.

One other quirk: when a lane widens because of an entrance or exit, the car re-centers itself and sometimes cuts off a merging or exiting car. Again, the driver needs to take over.

I know that right now we need to stay engaged, and I fully accept that. My point is that there is still a lot of work to be done for highway driving; I can't imagine full Autopilot on city streets yet.
 
I imagine part of the reason the centering issue hasn't been 'solved' yet is that there needs to be a lot of NN training to help it properly decide whether to stick the car to the left lane marker or the right. This is, I imagine, further complicated by the fact that, despite the official stance and documentation stating that AP is for use on divided highways only (where 'stick to the lane line on the left' generally makes sense [in LHD-land]), many people use it on surface streets as well (where this is less likely to be the case).

Indeed, even a supposed 'stick to the left lane line' rule would be wrong in a case where you're in the left lane and a left on-ramp is merging in; there you'd want to stick to the right lane line.
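To make the ambiguity concrete, here's a toy sketch of why "just center in the lane" breaks down when the lane widens. Every name and threshold below is made up for illustration; this is obviously not Tesla's actual Autopilot logic:

```python
# Toy illustration of the lane-reference problem: when the lane widens
# at a ramp, naive centering drifts into the merge area. All names and
# thresholds here are invented for illustration only.

NORMAL_LANE_WIDTH_M = 3.7  # typical US freeway lane width

def pick_reference_line(left_offset_m, right_offset_m, left_is_continuous):
    """Decide which lane line to track when the lane widens.

    left_offset_m / right_offset_m: distance from car to each line.
    left_is_continuous: whether the left line has stayed unbroken,
    suggesting the widening comes from a ramp on the right.
    """
    lane_width = left_offset_m + right_offset_m
    if lane_width <= NORMAL_LANE_WIDTH_M * 1.2:
        return "center"  # normal-width lane: centering is fine
    # Lane is widening: hug whichever line is the "real" lane edge.
    return "left" if left_is_continuous else "right"

print(pick_reference_line(1.8, 1.9, True))   # normal width -> 'center'
print(pick_reference_line(1.8, 3.5, True))   # widening on right -> 'left'
```

Even this toy version shows the catch the post above describes: deciding which line is "continuous" is itself a perception problem, which is presumably where the NN training comes in.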
 
I've been thinking about getting it for driving 26 miles of mostly freeway each day with plenty of traffic. Do you think it's worth it? Does it reduce the stress of stop-and-go traffic?
 
On long stretches of highway it can reduce some fatigue by maintaining speed and keeping the car in the lane, but once things get congested I turn it off, since it just gets annoying. You can't fully trust it in stop-and-go traffic, so you may actually increase your stress level worrying whether it will really stop. I find it easier to just drive the car myself.
 
I agree completely. I trust the TACC, but the Autosteer function does something funky (i.e., unexpected, unusual, scary, or dangerous) every time I use it, and it is more work to babysit it than to just steer the car myself. I'd say Tesla is years away from full self-driving, and probably will never achieve it with the current Model 3 sensors and computer.
 
For reference, here's how I relate to this topic:
  • My car is on order, so I have about 15 minutes experience driving a Model 3. In that respect, I know jack squat.
  • I was a software developer for about 20 years. I still dabble.
  • Since then, for 12 years I've owned a driving school. (It's a step up, believe it or not.)
  • This means I'm betting my livelihood that almost everyone will need to learn to drive for at least another 20-30 years before I retire.
My take:
  • Self-driving cars will always be capable of crashing.
  • Humans will always be capable of crashing.
  • When humans crash, it's an everyday occurrence. If a self-driving car is in full control at the time of a crash, it's national news.
  • In my opinion, the goal SHOULD be that self-driving cars are less likely to crash than human drivers.
  • But that won't be enough, politically. Voters and politicians will be very slow to accept the idea that a product can have a brain fart which kills people. (Even though we already accept that fact about human drivers.)
  • Soon, cars will become far better than humans at identifying "big things" in their path, properly-labeled road signs, and the desired path of travel. They have radar, we don't. They will soon have 5G and IoT to recognize other vehicles so-equipped, which we don't. And their optics and spectral abilities may handle sun glare better than humans. (If not now, soon.) Electronic sensors definitely have some advantages over human eyes and ears.
  • BUT... Humans are amazingly good at identifying "what's gonna happen next". Computers can't get there until they recognize the "relevance" of their sensor data, in an "almost self-aware" kind of way. A few examples:
    • We see kids playing with a ball on a front lawn, and we slow down and move a bit to one side because we know that balls roll away, and that children are unpredictable.
    • We see a dome light on in a parked car, and even though we don't see anyone in the car, we prepare for the door to open.
    • We see a trucker's eyes as he looks in his rear-view mirror. Maybe we also notice a red glow from a signal reflecting from the rig, but the trailer signal is apparently not connected or burned out. We adjust or prepare for his turn or lane change.
    • We see "a small black something" in the roadway. Using our knowledge of wind, weight, and texture, we can tell the difference between a stray crowbar, a rag, or just a strip of tar where the road was repaired. If needed we avoid it, but we also avoid moving in a way that surprises other drivers.
    • We're behind a car with a mattress tied to the top, blowing in the wind. Or a pickup truck with a load of junk tied down with dental floss. We stay back or change lanes.
    • At an intersection, we see a stop sign that has turned sideways, or it was bent. Or it's missing entirely and the pole has nothing on it. We stop because we know the pole is there for a reason.
    • We see a police officer directing traffic around a hazard. We see his eye contact, gestures, facial expressions, and maybe we hear him.
I'm only guessing here, but in the recent news about the Tesla which hit a parked firetruck, it wouldn't surprise me if the firetruck had a white line painted along its side which the car mistook for a road marking. That's a mistake a human driver wouldn't have made.

Sooo... My apologies to Elon, but I'm as certain as I've ever been that the car I'm buying now will have driven its last mile loooong before I can "send my car to work" as a ridesharing service.

Although we're very close to cars being able to handle "every properly labeled street where nothing goes unexpectedly", that's just not gonna be enough for the real world, and for political acceptance and legal status.

And Tesla needs to stop calling it autopilot! There are too many people who think they can sleep, read, or text when it's on, and I think the term "autopilot" is part of the reason! True autopilot is for aircraft or boats, which are miles away from anything else when those features are used.
 

Well stated. You have articulated the reality of the situation very well. I applaud Elon and Tesla for having the vision and passion to advance the world toward autonomous driving, but I really think the gap between where we are and where they want to be is way too large. I agree: Tesla really should market the features as advanced driver aids instead of Autopilot or FSD. I actually think they would get more buyers, as many people are scared of the concept of autonomous driving.

The best part of owning a Tesla is how it drives. It is a quantum-leap improvement over the way an ICE-based vehicle drives. Add fuel savings, the Supercharger network, and OTA updates, and it is the best car on the market by far.
 

I love this post. Thanks for taking the time to write it and articulate some of the challenges in such a well-thought-out manner. I have said a few times that it's actually amazing how much "math" humans do when they drive, without realizing it.

If a car races up beside you on a freeway going significantly faster than you are, and you have a two-car-length space in front of you, an experienced human driver who is paying attention will anticipate that driver jumping into the gap and plan accordingly.
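For fun, here's a back-of-the-envelope version of that judgment written out as code. The numbers are pure guesses on my part; the point is just how simple the rule a human applies unconsciously might look once you write it down:

```python
# A rough sketch of the "will that car cut in?" judgment described
# above. The thresholds are invented for illustration, not taken from
# any real driver-assistance system.

CAR_LENGTH_M = 4.5  # rough length of a typical car

def likely_cut_in(gap_ahead_m, closing_speed_ms):
    """Guess whether a faster car alongside will jump into our gap.

    gap_ahead_m: free space between us and the car we're following (m).
    closing_speed_ms: how much faster the neighboring car is (m/s).
    """
    # A driver overtaking at speed needs somewhere to go; a gap of a
    # couple of car lengths in front of us is an inviting target.
    gap_in_car_lengths = gap_ahead_m / CAR_LENGTH_M
    return closing_speed_ms > 3.0 and gap_in_car_lengths >= 2.0

# ~2 car lengths of space, neighbor closing at 5 m/s (~11 mph faster):
print(likely_cut_in(9.5, 5.0))   # True: leave extra margin
print(likely_cut_in(9.5, 1.0))   # False: pacing us, probably staying put
```

Of course, a human also reads the other driver's lane position, head movement, and turn signal, which is exactly the kind of context the posts above argue is hard to capture.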

The biggest issue we have now is that human drivers don't pay attention like they used to, while a computer will, in theory, always be paying attention. Until almost all cars can "talk" to each other in a way they currently cannot, and there are sensors in the roads to provide feedback, I don't think a car will be able to "drive me to work, from garage to parking lot."

These driver aids have made great strides, and will continue to do so, but... well, you laid it out, and there are many more examples just like those, where humans anticipate things they don't even realize they are anticipating.