Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another Model X crash, driver says autopilot was engaged

I think it depends on the circumstances. In this case I think 11 seconds is a suitably short period to suggest that Autopilot was likely a contributing factor. But if the driver had exited the highway, then crashed on surface streets 11 seconds later, then it probably wouldn't be AP-related.

When NHTSA says they are investigating whether driving aids were in use at the time, I don't think they're going to confine themselves to the moment of impact.

The 11 seconds was because the driver overrode the AP so it disengaged 11 seconds before the driver crashed the car. 11 seconds wasn't some Tesla magic number time interval. AP would have stayed engaged and safely come to a stop had the driver not taken over and accelerated. What more do you want the car to do? Read minds?
 

11 seconds is a damn long time, too. Try counting it out and imagine yourself driving.
 
Agreed. I think there is still one improvement which could be made, though, which I'll talk about below.

Firstly, I think that job number one is to prevent a driver from becoming distracted, or incapacitated, in the first place. Once a driver has entered this state, there are far fewer options and things can go south quickly. In that respect, the driver failed. Autopilot's beeps and prods are designed to help prevent this, so in that respect, Autopilot also failed.

That's not really fair to Autopilot, though, since no matter what it does, the driver could still fall asleep, or worse. With Level 2 autonomy there is little that can be done, but still, there is something. I imagine this conversation between two Autopilot engineers:

A: "So, what do we do if the driver falls asleep?"

B: "We nag him awake."

A: "And if that fails?"

B: "We reduce power to the car."

A: "Great. But... you don't think he'd wake up in a panic, note that the car is slowing down, do something rash and roll the car, do you?"

B: "Well... maybe."

A: "But he deserved it."

B: "Yes."

A: "And what if someone is having a stroke?"

B: "Don't have a stroke."

To prevent this kind of situation, when disengaging, Autopilot should not allow driver input until the car has completely stopped. It's not like Autopilot was having trouble with the road. It could have continued driving, and thus would have been able to stop safely. This prevents an incapacitated driver from immediately taking control of the car and doing something stupid when Autopilot decides to stop the car. After all, the driver is known to be incapacitated. They don't know exactly where they are, perhaps not even what lane they are in, who is in their blind spot, how fast they are going, etc. And they may be groggy and panicked to boot. Why should they be allowed to resume driving immediately? They can do so after the car has stopped.
Horrible idea. To cover one outlier case, you want to stop the car on a busy road when the driver has realized he needs to put his hands on the wheel and is ready to take over. Have you seen what happens when a car stops on the connector in Atlanta? Your proposal would cause way more accidents than it prevents.
 
I don't propose always locking out the driver; only when Autopilot is disengaging because the driver has not responded to alerts. In this case, the car was going to stop anyway, and thus, yes, it has to have the capability to safely stop. However, even if the stop were suboptimal, I don't think it's at all likely that an incapacitated driver will take any course of action safer than the auto-stop. Thus, the forced auto-stop will result in the fewest accidents.

If Autopilot disengages because it can't handle the current conditions, then, of course, the driver should be allowed to take control.
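The distinction being drawn here (hand control back immediately when the driver overrides or the system hits its limits, but finish the controlled stop when the driver has been unresponsive) can be sketched as a toy policy function. This is purely illustrative; the enum names and function are hypothetical and are not Tesla's actual logic:

```python
from enum import Enum, auto

class DisengageReason(Enum):
    DRIVER_OVERRIDE = auto()      # driver turned the wheel or pressed a pedal
    DRIVER_UNRESPONSIVE = auto()  # escalating nags went unanswered
    SYSTEM_LIMIT = auto()         # Autopilot can't handle current conditions

def allow_immediate_driver_control(reason: DisengageReason,
                                   vehicle_stopped: bool) -> bool:
    """Hypothetical policy from the post above: lock out driver input
    only while the car is auto-stopping because the driver ignored alerts."""
    if reason is DisengageReason.DRIVER_UNRESPONSIVE and not vehicle_stopped:
        return False  # finish the controlled stop before returning control
    return True       # override and system-limit handoffs return control at once
```

Under this sketch, the contested case in the thread (driver ignores nags, then grabs control mid-auto-stop) is the only one where input would be refused; every other disengagement behaves as it does today.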

You aren't making sense. AP will stay engaged as it comes to a stop unless the driver takes over. In most cases the driver is ignoring the warnings and will then realize he has to take over when the car starts to slow. The last thing I want to do is interfere with the driver wresting control of the vehicle.
 
... What more do you want the car to do? Read minds?
That does seem to be what a lot of naysayers want. I don't actually think they've thought it through to that logical conclusion, but it sure comes across that way. Ain't gonna happen.

So, a preemptive step would be to stimulate drivers so they won't nod off. Also not going to happen.

Or force the driver's attention to the road.
 
Do you hate cruise control too?

I just don't understand the comparisons with cruise control, or even TACC. If I let go of the wheel in cruise control, the car will drift. So it requires a certain level of attention. If I have Autosteer engaged and I let go of the wheel, the car will drive for miles just fine without my intervention. That is a whole different ballgame.

I fully admit that if I had an AP car, I would be much more tempted to pick up my phone and respond to a text in Autosteer, where I would not do that with only TACC engaged. My brain is going to assess the risks of that action differently if the car is steering for me vs. just modulating my speed. And if you think I am a small part of the population when I say that (because everyone else understands Autosteer except me), I would disagree. I think many people may make the same risk assessment, especially the folks who already text and drive now (I do not text and drive now, but I am human and occasionally read a text that comes in while driving. I am a terrible person, I know).

I am not a hater of Autopilot. I am looking forward to trying it out in my Model 3. I will be buying a CPO S soon; however, that will most likely not be an AP car. I just see some human factors issues that will arise (or already are arising) from Tesla's current implementation.
 
I just don't understand the comparisons with cruise control, or even TACC.
The comparisons are quite obvious.

Cruise control - Automatic pedal - can stop pushing the pedal
AP - Automatic pedal and steering - can stop pushing the pedal and stop turning the wheel.

It's a very similar ballgame. More complex, but extremely similar.

Maybe age has something to do with it. I remember when there was no automated anything. For goodness' sake, Palmdale, California wasn't even on the map until 20 years after cruise control was invented.
Cruise control was as scary as AP is now. Same inhibitions. Same concerns. Same, same, same.
 
I just don't understand the comparisons with cruise control, or even TACC. ...
@Az_Rael Picking up the phone for a call - no worries, you can Bluetooth your phone to the car and enjoy hands-free talking.
Texting - no matter whether you're on AP, TACC, cruise control or no control - that's an individual habit and preference. If you're so strong about not texting while driving, you wouldn't do it on AP either. I, on the other hand, would like to do everything but drive, and am very patiently waiting for fully autonomous driving so I can do what I love to do in the nearly 25,000 minutes I spend on the road each year.
 
I just don't understand the comparisons with cruise control, or even TACC. ...
Garlan really answered it. With dumb cruise control you get to take your feet off the pedals but of course it will run you right up the rear of a car. I think society is losing its risk tolerance. I remember people just assuming that cruise control was an aid to the driver and it was the driver's responsibility to use it properly.
 
The 11 seconds was because the driver overrode the AP so it disengaged 11 seconds before the driver crashed the car. ...

I'm not sure what you mean. My point is simply that 11 seconds is too short a period of time to conclude that Autopilot is not related to the accident.
 
I'm not sure what you mean. My point is simply that 11 seconds is too short a period of time to conclude that Autopilot is not related to the accident.

My point was that AP didn't disengage on its own. AP was commanded to disengage by the driver's actions. Furthermore, count slowly to 11. That's a long time after AP disengaged. Finally, had AP stayed engaged, the car would have come to a stop and not crashed. AP did everything it was supposed to do, including warning the driver about inattention. The driver took over, so he was in control. It then took deliberate action by the driver to crash the car. Had the driver continued slowing, he probably wouldn't have crashed. Instead, the driver caused the car to speed up. Then the car crashed.

Your post reads as if AP disconnected on its own and 11 seconds was too short a time for the driver to recover from a bad situation. None of that is correct. AP didn't disconnect on its own. The car was in fine shape and would not have crashed had AP stayed engaged. The driver had to increase speed and veer off the road in order for the car to crash.

Imagine a similar accident involving active cruise control. A car comes up on another car in front and starts to slow down. The driver touches the brake and disengages cruise control. The driver then presses down on the accelerator causing the car to slam into the rear of the car in front 11 seconds later. "Oh, cruise control caused the accident. It is the car manufacturer's fault!"
 
There was a case a while ago where some woman in an RV put on the cruise control and went into the back to make a sandwich. The obvious happened, and she sued the company. They then had to put the ridiculous warnings in the manual. There is an old saying that you can't fix stupid. Technology should not stop because of stupid people. We need to encourage responsibility. None of these accidents are the fault of Tesla.
 
There was a case a while ago where some woman in an RV put on the cruise control and went into the back to make a sandwich. ...
I couldn't have said it better.
 
There was a case a while ago where some woman in an RV put on the cruise control and went into the back to make a sandwich. ...
I agree with your conclusion.

The fact that there never was a Winnebago/CC lawsuit underscores the gullibility and hysteria that can (mis)guide our decision-making. In the context of this conversation, that is likely to work against Tesla.