AP tried to crash my car TWICE this week!

Was it, really, though?
The driver is responsible for taking over when AP fails, and the OP did, but AP still failed on its own. The driver is responsible for taking over when AP fails (and failing to do so would indeed be the driver’s fault), but the driver is not at fault for AP’s failure itself... how can they be? AP made the mistake, and nothing the driver did could have stopped it (other than not using AP).

Incorrect. The driver is responsible for the total operation of the car, even when the car is assisting with autopilot. It is way more than simply taking over when AP fails. The driver is absolutely at fault for any/all failures of Autopilot.

We all joke around about the car taking control, but that is nothing but a pipe dream at this point. I do not expect those abilities any time soon (even though I shelled out the extra $4k for FSD).
 
The driver is certainly responsible for AP, but to call him/her at fault for AP failures does not sound correct to me. How can a driver be at fault for something they have absolutely no control over or definite insight into? If AP has a bug that manifests itself by surprise, is it the driver’s fault? It is the driver’s responsibility, sure, but fault?

The driver is responsible for controlling AP so that the overall driving event remains legal and safe, even with AP’s faults, that is true. But to call AP’s failures the driver’s fault does not ring true to me, except in circumstances where the driver for some other reason could be found to have caused AP to do it...
 
So about a week after getting the “Autopilot features limited” warning, I had a near accident using AP this week. The car tried to veer into a side barrier without any warning, and I was able to take quick control to avoid crashing. I didn’t get any take-over warning...

From what I understand, AP on all variants is currently not designed to work at those speeds and in the situations shown in the video. It is meant for highway speeds, on-ramp to off-ramp, on anything newer than AP1. The End.

You're going like 10 miles per hour. You're experimenting, and it's not going so well either. Be careful.
 
Yes, one can certainly argue that certain types of roads or speed ranges or similar features are not suited for AP, and the manual does list some limitations. That is fair.

Of course, people have had AP steer out of its lane on limited-access highways at the correct speed ranges too, so the issue itself probably will manifest anywhere?
 
issue itself probably will manifest anywhere?

Exactly, that's why you must monitor the wheel in whatever way one desires to do that. It's written everywhere in the manual, and by now if you don't know it, well, nobody can help that person.

The Get Out of Jail Free card here is, again, that this system is in beta. Heard it, living it, we must all embrace it.

Anyone fooling themselves otherwise is EXPERIMENTING.
 
OP did just that, though. They did not crash nor did they even veer off-course.

Discussing the failure of AP to keep the lane is still a worthy topic, no?
 
I am always nervous using AP near a barrier, and as a result in those situations I always hold the wheel with both hands. Ironically, that is when I get the nags ‘because’ I am using both hands.

I can’t see anything wrong in the second video, and nothing to be overly alarmed about in the first. The title of the thread is rather overdramatic.
 
The surprise when driving probably felt worse than it looks afterwards, hence the reaction. Luckily the driver did the right thing and stopped AP from doing its bad deeds, so we are looking at a nice video instead...
 
The driver is certainly responsible for AP, but to call him/her at fault for AP failures does not sound correct to me. How can a driver be at fault for something they have absolutely no control over or definite insight into? If AP has a bug that manifests itself by surprise, is it the driver’s fault? It is the driver’s responsibility, sure, but fault?

The driver is responsible for controlling AP so that the overall driving event remains legal and safe, even with AP’s faults, that is true. But to call AP’s failures the driver’s fault does not ring true to me, except in circumstances where the driver for some other reason could be found to have caused AP to do it...

An Autopilot failure, at best, might be considered a "contributing factor," but fault lies completely with the driver. If you don't think that is true, you should not be using the feature at all.
 
I think we have veered into the domain of pointless semantics. :) I will not consider it the driver’s fault if Tesla introduces a bug or fails to implement a feature in Autopilot, as long as Autopilot is used according to their instructions. I do consider the driver responsible for catching such failings to a very high degree (as long as the driver remains responsible for the drive), of course, perhaps excluding intentional malice or whatever a court might find to be Tesla’s responsibility as a manufacturer in general.

Autopilot’s bugs, failings and progress (forwards or backwards) are nevertheless worthy topics for discussion. Those are distinct from issues caused by the activity or inactivity of the driver, as they are an analysis of the technology itself rather than of the driver. If Autopilot makes a move for the ditch by itself, that is a topic for discussion.
 
Yup, I do not disagree. My main point is that you absolutely must remain alert, in some cases even more so, when using Autopilot. In fact, when NoA first came out, I thought it took far too much attention away from driving (it is better now). It will make bad recommendations from time to time.

Example: two days ago, during a 700-mile drive on a rural interstate with two lanes, using NoA set to Mad Max on version 48.12. I had just passed a semi near an exit, and Autopilot had just signaled me to leave the passing lane. It did not take into account that there was a vehicle on the entrance ramp also attempting to arrive in front of the truck. Basically this meant two vehicles were set to merge into the same spot from opposite sides of the target lane. Yet Autopilot did not even recognize (or at least display to me) that there was an extra lane to the right. So had I not noticed the car on the entrance ramp to the far right, my car could have merged into the car coming from the ramp.

This frustrates me, because I really want it to be better. Had the lane change been totally automatic (which we were all hoping for), the result could have been somewhat more exciting. In this case, though, I simply delayed acknowledging the lane-change advice. But I really wish it were better! I have another road trip set for tomorrow, and I will set NoA to "Average" rather than "Mad Max" and see if I can replicate the experience, however unlikely that is. I do know that "Average" extends the passing/merging distance a little.

Autopilot is really REALLY hard! Human drivers are still way better.
 
OP did just that, though. They did not crash nor did they even veer off-course.

Discussing the failure of AP to keep the lane is still a worthy topic, no?

Yes, for learning's sake. That is what this forum is really all about: helping others and ourselves at the dawn of the ownership experience and beyond. Start with the manual, though; that will help also. Just a suggestion.
 
That, and figuring out where Tesla’s products are tech-wise. Understanding Autopilot’s progress and weaknesses is an important topic in its own right, even for the experienced and the experts amongst us.
 
...to call him/her at fault for AP failures does not sound correct to me...

The failure of a competent human operator to appreciate Autopilot's many limitations (a euphemism for an incompetent operator) is what should be blamed, not the incompetent Autopilot itself.

For example, a toddler may first be able to utter the word "dada" but be unable to say "mamma". That's a known limitation of a toddler. Failure to utter "mamma" is an issue, but we should not take the kid in for vocal cord surgery.

Another troubling example is a babysitter who complains that the toddler purposefully crawled and fell down the stairs and should be punished, because it's the toddler's failure and not the babysitter's.

It's fine to learn why a toddler cannot utter "mamma" or cannot competently stand or walk, but to blame the toddler for its failures is not learning.

Autopilot is like a toddler, and the human driver is like a babysitter.

If a human can't take care of a baby, then don't take a babysitting assignment. If a human can't handle Autopilot, then don't buy it, and stop blaming!
 
@Tam I still do not think it is the babysitter's fault that the toddler stumbles. Yes, it would be their fault if the toddler fell down the stairs on their watch, but they are not at fault for the fact that the toddler sometimes falls and does unexpected things (like Autopilot does).

The babysitter is responsible for keeping the overall situation and the toddler safe, but they are not at fault for the mistakes the toddler makes. The only way to stop a toddler (or Autopilot) from making mistakes is to tie it down.

I mean, seriously. Babies learning to walk tumble (safely) all the time. The babysitter is not "at fault" for that. Autopilot sometimes makes errant moves toward the ditch. The driver is not at fault for that. They are responsible for catching it before something serious happens, though.
 
Ok, I managed to find the footage of the second incident.

I've edited the clip so it starts straightaway. Pay attention to how the car is trying to turn out of the lane.


Both incidents happened at similar road layouts.

[Screenshots of the two road layouts]

Any thoughts guys?
Yes... There is a problem with certain lane exits and with lane widenings, such as a given lane splitting in two. In the former case, use the right scroll wheel to slow the car down 5-10 mph. In the latter case, use the blinker to tell the car which direction you want it to go. I've had my car 7-8 weeks, and those are the only two instances where the car either needs help and/or my anxiety needs help. In the case of exits, slowing works like a champ. Remember, currently it can't read exit speed signs, so exits are taken at full speed, and the time it takes to make a mistake is less than the time it takes to figure out the best solution.
Also, and this is a human problem: when using NoA vs. plain lane keeping, remember that lane keeping WILL NOT take exits. In theory it will if you use the turn signal within 100 feet; I've yet to see that work, and putting the signal on 200-300 feet out doesn't work either. It needs more experimentation, perhaps speed reduction as well as the turn signal. Final thought: read the manual cover to cover.
 
...The driver is not at fault for that...

Maybe we are talking about the same thing: Autopilot is not advanced enough, and it does fail at times.

When I bought AP2, I knew exactly what I was getting into. I saw how AP1 progressed from nothing, gradually, to Autosteer.

I also learned along the way that owners got into crashes even with Autopilot.

When I bought my AP2 car, Autopilot only worked at a maximum speed of 35 MPH on the freeway, and the freeways around here are 70 MPH. That's a suicidal Autopilot speed!

I take the word "beta" seriously and use it accordingly.

The thread here does not appreciate what "beta" means, and that could be suicidal!

Yes, Autopilot has many limitations but if owners wait long enough, it will get better.

Even with the more advanced system, George Hotz says:

“Every self-driving car on the road today is worse than a human. Everyone, Waymo included.”

We have a choice of falsely believing that Autopilot is so advanced that mishaps cannot happen.

Or we can be prepared, knowing that the technology is not there just yet and that the human driver still has to do the hard work.
 
Incorrect. The driver is responsible for the total operation of the car, even when the car is assisting with autopilot. It is way more than simply taking over when AP fails. The driver is absolutely at fault for any/all failures of Autopilot.

We all joke around about the car taking control, but that is nothing but a pipe dream at this point. I do not expect those abilities any time soon (even though I shelled out the extra $4k for FSD).
I expect it within 18 months. Remember that initial video where the driver is hands-free? Now there is a second video, clearly not the same as the first one. They are close.