
Autopilot lane keeping still not available over 6 months after delivery

I don't know which category you fall under, but I remember that there were those that claimed Tesla's system would be no different than existing lane keeping in that it will nag you to touch the steering wheel at a set time interval.

Reading the review, that appears not to be the case here. It only tells you to touch the steering wheel if autopilot feels like it doesn't have enough data to continue. If it does, it simply keeps going (there is no set time interval).
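
To make the distinction concrete, here's a rough sketch of the two approaches in Python-style pseudocode. The function and variable names (and the numbers) are mine, not Tesla's -- the point is just the difference between a timer-based nag and a confidence-based one:

# Conventional lane keeping: nag on a fixed timer, regardless of conditions.
def timer_based_nag(seconds_since_last_wheel_touch, nag_interval=15):
    # Hypothetical 15-second interval; the real value varies by manufacturer.
    return seconds_since_last_wheel_touch > nag_interval

# What the review describes: nag only when confidence in the lane data drops.
def confidence_based_nag(lane_confidence, minimum_confidence=0.5):
    # No timer at all -- hands-off driving continues for as long as the
    # system believes it has enough data to hold the lane.
    return lane_confidence < minimum_confidence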

This is awesome and seems in line with Elon's general thinking. In my stupid Prius, I had to agree to some lawyer language before it would let me use the navigation system. And even then, all input functions are disabled when the car is moving over 5 MPH. Talk about being condescending and patronizing to your customers. I'm glad Tesla considers its drivers intelligent.
 
I don't know which category you fall under, but I remember that there were those that claimed Tesla's system would be no different than existing lane keeping in that it will nag you to touch the steering wheel at a set time interval.

Reading the review, that appears not to be the case here. It only tells you to touch the steering wheel if autopilot feels like it doesn't have enough data to continue. If it does, it simply keeps going (there is no set time interval).

I was referring to a different aspect (the car would eventually pull over safely when it couldn't continue on its own and you don't take over), but I fall into the camp that I felt Tesla wasn't going to be onerously nagging. I personally thought it would nag when it needed to and be quiet otherwise, and this seems to be how Tesla implemented things. We shall know for sure soon enough! :)
 
Yes, it is. And while coming to a slow stop in the middle of a highway is probably not a fantastic idea, it strikes me as better than having the car blindly make lane changes that might be difficult for other drivers to predict.

Really, though, this is why IMHO "Autopilot" is a bad and frankly kind of terrifying idea. If you have a software package that enables and encourages the driver to take her focus off the road, but that cannot reliably operate the car in many situations, that just seems like a software package that's not ready for prime time.

There's a reason Google and everyone else is spending a lot of time testing these systems before pushing them out for wide release.
 
Yes, it is. And while coming to a slow stop in the middle of a highway is probably not a fantastic idea, it strikes me as better than having the car blindly make lane changes that might be difficult for other drivers to predict.

Using the sensors to attempt to detect a safe place to lane change, coupled with the emergency flashers, is surely far safer than parking a car in the travel lane, which is massively likely to cause an accident involving either the Tesla or subsequent traffic. Lane changes aren't "blind" -- the car has a sensor suite expressly installed for that very purpose. While they aren't perfect, I think it is far better to pull off the road using them than to continue down the road or park in an active travel lane.

There are already many, many cars in production today that have "autopilot" functionality and every single one of them simply disengages and allows the car to wander off the road if you don't react in the situation described in the article.
 
Using the sensors to attempt to detect a safe place to lane change, coupled with the emergency flashers, is surely far safer than parking a car in the travel lane, which is massively likely to cause an accident involving either the Tesla or subsequent traffic. Lane changes aren't "blind" -- the car has a sensor suite expressly installed for that very purpose. While they aren't perfect, I think it is far better to pull off the road using them than to continue down the road or park in an active travel lane.

There are already many, many cars in production today that have "autopilot" functionality and every single one of them simply disengages and allows the car to wander off the road if you don't react in the situation described in the article.

They all scare me, frankly.

And these lane changes are "blind," for two reasons: a) because Tesla doesn't have enough sensors to really adopt this feature (e.g., a rear facing radar), and, more importantly b) because as Ingineer points out, this feature *specifically engages when the car doesn't have enough information to keep driving itself.*
 
They all scare me, frankly.

And these lane changes are "blind," for two reasons: a) because Tesla doesn't have enough sensors to really adopt this feature (e.g., a rear facing radar), and, more importantly b) because as Ingineer points out, this feature *specifically engages when the car doesn't have enough information to keep driving itself.*

No: this feature engages when the car doesn't have enough information to keep driving itself AND the driver is not responding or providing input. That's a big conditional.
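
Put another way, the failsafe is gated on both conditions. A minimal sketch of that decision, purely illustrative and not Tesla's actual logic:

def autopilot_failsafe(has_enough_data, driver_responded_to_alerts):
    # Illustrative only -- names and structure are mine, not Tesla's.
    if has_enough_data:
        return "keep driving"            # normal operation, no intervention
    if driver_responded_to_alerts:
        return "hand control to driver"  # the expected outcome of the alert
    return "pull over with hazards on"   # only when BOTH conditions fail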
 
Question is, if it doesn't have sufficient data to continue autopilot, how does it have enough to safely pull over?

I've test driven a bunch of cars with lane keeping. None safely pull over. They just shut off lane keeping. You have to take over. Which is no big deal because you're supposed to be watching the road.
The problem with nags that force you to grab the steering wheel is that almost 100% of the time you get one, the car locks back onto the lane lines in short order, and the system shuts off anyway if you don't grab the wheel.
Anyone who has driven a lot using lane keeping, like myself, will tell you that you will know when to take over before being nagged.
 
They all scare me, frankly.

And these lane changes are "blind," for two reasons: a) because Tesla doesn't have enough sensors to really adopt this feature (e.g., a rear facing radar), and, more importantly b) because as Ingineer points out, this feature *specifically engages when the car doesn't have enough information to keep driving itself.*

I understand your position, but rear-facing radar is only required to detect someone travelling at a significantly higher speed than your vehicle who is presumably passing you on the right on a three (or more) lane highway while you have your emergency flashers engaged. One would hope that is a vanishingly rare situation. If you were in the left lane, you would presumably just pull onto the left shoulder.

Obviously, it is unfortunate that you ask the car to undertake a fairly complex maneuver just when it thinks it has incomplete data, but it seems clearly much better than the alternatives. It's also likely that the failure mode is an inability to detect the lane markings, not the edges of the road or other vehicles. I think trying to park gracefully in this situation is an excellent ambition and I applaud Tesla for thinking it through and taking the action proposed. One hopes they would also contact emergency services if the driver remains unresponsive. This is a potential life-saving innovation in more ways than one. I believe that fatal accidents due to drivers falling asleep or suffering medical emergencies are fairly common.
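
For what it's worth, the sequence being described (plus the emergency-services step, which is my wish rather than anything Tesla has announced) would look roughly like this. The car interface and method names are hypothetical:

def pull_over_sequence(car):
    # Hypothetical sketch of the fallback behaviour; not Tesla's code.
    car.enable_hazard_lights()
    car.decelerate_gradually()
    if car.adjacent_lane_clear():        # ultrasonic check before any lane change
        car.move_toward_shoulder()
    car.come_to_stop()                   # worst case: a controlled stop in the current lane
    if not car.driver_responsive():
        car.contact_emergency_services() # wished for above, not a confirmed feature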
 
No: this feature engages when the car doesn't have enough information to keep driving itself AND the driver is not responding or providing input. That's a big conditional.


Right, and the conditional means that it is much less likely to happen--but we were talking about, when it does happen, whether it is fair to call these lane changes "blind"; the conditional is irrelevant to that question. The only thing that matters for that inquiry is whether the car has enough info to safely execute a lane change. I'm asserting that because one of the conditions precedent to it doing this is that it doesn't have enough info to keep driving, it almost by definition doesn't have enough info to make a safe lane change.

- - - Updated - - -

I understand your position, but rear-facing radar is only required to detect someone travelling at a significantly higher speed than your vehicle who is presumably passing you on the right on a three (or more) lane highway while you have your emergency flashers engaged. One would hope that is a vanishingly rare situation. If you were in the left lane, you would presumably just pull onto the left shoulder.

Obviously, it is unfortunate that you ask the car to undertake a fairly complex maneuver just when it thinks it has incomplete data, but it seems clearly much better than the alternatives. It's also likely that the failure mode is an inability to detect the lane markings, not the edges of the road or other vehicles. I think trying to park gracefully in this situation is an excellent ambition and I applaud Tesla for thinking it through and taking the action proposed. One hopes they would also contact emergency services if the driver remains unresponsive. This is a potential life-saving innovation in more ways than one. I believe that fatal accidents due to drivers falling asleep or suffering medical emergencies are fairly common.


This is where we differ--my guess is that a car that enables (and to some degree, encourages) you to take your attention off the road will lead to an increase in the number of incidents where the driver falls asleep behind the wheel--either because drivers will fall asleep more readily when not paying attention, or (more likely) because of the moral hazard that comes from trusting these systems.

That's an empirical question, I recognize, and one that should be studied. I might be wrong about it. But if I am right, then releasing an incomplete self-driving system is actually not a life-saving innovation--it's dangerous.
 
Right, and the conditional means that it is much less likely to happen--but we were talking about, when it does happen, whether it is fair to call these lane changes "blind"; the conditional is irrelevant to that question. The only thing that matters for that inquiry is whether the car has enough info to safely execute a lane change. I'm asserting that because one of the conditions precedent to it doing this is that it doesn't have enough info to keep driving, it almost by definition doesn't have enough info to make a safe lane change.

Well the way I see it there are two possible scenarios:

1. No autopilot, driver falls asleep. Hopefully not putting pressure on the accelerator or steering wheel, but in any case the car is out of control and will crash into something with almost 100% certainty.

2. Autopilot, driver falls asleep. Information to the auto-drive isn't perfect, but is likely enough to slow down and signal (with hazard lights) that something is wrong. It has enough info from the ultrasonics to not strike cars moving at similar speeds and probably enough info to at least glean the road direction and where it can pull over, even if it can't perfectly keep the lane anymore.

I'll take door #2; it is in all cases an improvement.

I would go so far as to say that if there were another way to detect unresponsive loss of consciousness, I would want AP to *automatically* engage to safely attempt to stop the car.
 
I just want to point out that I for one would rather have the Tesla algorithm if I were using AP and did fall asleep or have a medical emergency. The alternatives definitely seem to have far worse outcomes!

It will also be an extremely rare occurrence outside of us geeks testing it to see what it does (and in that case, we can do so safely).
 
This is where we differ--my guess is that a car that enables (and to some degree, encourages) you to take your attention off the road will lead to an increase in the number of incidents where the driver falls asleep behind the wheel--either because drivers will fall asleep more readily when not paying attention, or (more likely) because of the moral hazard that comes from trusting these systems.

This argument would seem to apply to cruise control or even heated cabins as well. Cars with self-steering capabilities today do not appear to have higher insurance rates, so the empirical data does not support your position. Volvo says that 53% of all fatal accidents in Sweden are caused by cars departing the road at speed -- an automated process that prevents you from inadvertently leaving the roadway seems likely to be a huge net benefit.

I'd like to see Tesla also implement something akin to emergency braking, but for steering: a system that detects an attempt to steer off the road and takes control from you to prevent it. I believe that is Volvo's approach. This should be independent of autopilot, much like emergency braking is independent of TACC today.
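
To illustrate the separation I have in mind: the check would run all the time and simply ignore whether autopilot is engaged, just as automatic emergency braking ignores whether TACC is on. A rough sketch with names of my own invention:

def emergency_steering_check(road_departure_imminent, autopilot_engaged):
    # The autopilot_engaged argument is deliberately ignored -- the whole
    # point is that this safety layer is independent of autopilot.
    if road_departure_imminent:
        return "counter-steer to keep the car on the road"
    return "no intervention"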
 
Maybe. We'll see. All I know is that people seem to drive a lot faster in AWD cars in the snow than in RWD cars, despite the fact that those cars aren't any better at stopping.

My guess is that autopilot will inspire similar risk-taking by people who believe that the technology makes the risk "safe," even though the technology only really is designed to deal with one very specific set of circumstances.
 
Well the way I see it there are two possible scenarios:

1. No autopilot, driver falls asleep. Hopefully not putting pressure on the accelerator or steering wheel, but in any case the car is out of control and will crash into something with almost 100% certainty.

2. Autopilot, driver falls asleep. Information to the auto-drive isn't perfect, but is likely enough to slow down and signal (with hazard lights) that something is wrong. It has enough info from the ultrasonics to not strike cars moving at similar speeds and probably enough info to at least glean the road direction and where it can pull over, even if it can't perfectly keep the lane anymore.

I'll take door #2; it is in all cases an improvement.

To be fair, JST's argument is that you are much less likely to fall asleep in the no-autopilot scenario. He fears that technology will lull you into inattention and you will be more likely to require the assistance. I agree that it will increase the incidence of that happening, but my experience on the roads today with texters and overtired drivers suggests to me that scenario #1 happens far too often. I'd rather take my chances with #2, particularly since I'm confident I can remain responsible and attentive.

- - - Updated - - -

Maybe. We'll see. All I know is that people seem to drive a lot faster in AWD cars in the snow than in RWD cars, despite the fact that those cars aren't any better at stopping.

My guess is that autopilot will inspire similar risk-taking by people who believe that the technology makes the risk "safe," even though the technology only really is designed to deal with one very specific set of circumstances.

I believe we can agree that the average person is an idiot and a very poor driver. What's more tragic is that half of them are even worse than that.
 
To be fair, JST's argument is that you are much less likely to fall asleep in the no-autopilot scenario. He fears that technology will lull you into inattention and you will be more likely to require the assistance. I agree that it will increase the incidence of that happening, but my experience on the roads today with texters and overtired drivers suggests to me that scenario #1 happens far too often. I'd rather take my chances with #2, particularly since I'm confident I can remain responsible and attentive.

I can accept that as a hypothetical. I doubt it will prove to be true. Especially if the scenario is just falling asleep due to road weariness or being inattentive. In that case, I'd hope the alarm would be sufficient to wake you up.
 
This argument would seem to apply to cruise control or even heated cabins as well. Cars with self-steering capabilities today do not appear to have higher insurance rates, so the empirical data does not support your position. Volvo says that 53% of all fatal accidents in Sweden are caused by cars departing the road at speed -- an automated process that prevents you from inadvertently leaving the roadway seems likely to be a huge net benefit.

I'd like to see Tesla also implement something akin to emergency braking, but for steering: a system that detects an attempt to steer off the road and takes control from you to prevent it. I believe that is Volvo's approach. This should be independent of autopilot, much like emergency braking is independent of TACC today.


But systems that make it harder to inadvertently depart from your lane are very different from what Tesla is talking about here. Tesla is proposing a system that isn't just a back-up; it's a system that affirmatively encourages you to pay less attention to driving. That's its key feature. That's the value add that Tesla is promising.

I'm not a huge proponent of cruise control, either, but there's a big difference between this and cruise control. People WILL use this for texting and checking Facebook, which isn't really something you can do with cruise control.
 
But systems that make it harder to inadvertently depart from your lane are very different from what Tesla is talking about here. Tesla is proposing a system that isn't just a back-up; it's a system that affirmatively encourages you to pay less attention to driving. That's its key feature. That's the value add that Tesla is promising.

I'm not a huge proponent of cruise control, either, but there's a big difference between this and cruise control. People WILL use this for texting and checking Facebook, which isn't really something you can do with cruise control.

They may indeed. They are doing that right now, too. What's protecting them in that case? Look, there will be incidents caused by inattentive use of autopilot, just like there are incidents caused by misunderstanding ABS, airbags, or turn signals. But the benefits of those systems far outweigh the outlier events where they contribute to injury or accident. I think it will be so with AP, you don't, and that's fine. Neither of us has any data :) Let's check back in 2 years.
 
They may indeed. They are doing that right now, too. What's protecting them in that case? Look, there will be incidents caused by inattentive use of autopilot, just like there are incidents caused by misunderstanding ABS, airbags, or turn signals. But the benefits of those systems far outweigh the outlier events where they contribute to injury or accident. I think it will be so with AP, you don't, and that's fine. Neither of us has any data :) Let's check back in 2 years.

Agreed.

If we survive. :eek: