Autopilot: Crashed at 40mph

...fine print of Tesla's isn't going to get them very far...

It's unfortunate, but it's the norm: almost everything comes with fine print or a footnote.

Some people claimed they never agreed to a balloon payment and were caught off guard. As far as they knew, they had signed for a loan, not for all those complications.

However, the legal system still enforces the fine print and footnotes, so those who didn't read or didn't understand them lost their homes anyway.

It was the same when I first heard that a Tesla could go 300 miles!

I was fooled because I hadn't read the details of how they achieved 300 miles!

It was the same when I heard I could get a Tesla for $49,900 after the federal tax credit. It's a little more complicated than that.

And so on.

I think the legal system really expects us to do our homework and find out where all that fine print is hidden.
 
Yeah, I'm thinking that not just some little icon, but the entire driver's instrument cluster ought to get a colored tint overlay, sort of like what we see in sci-fi shows when the shields are up. Maybe blue or green. And why not? Semi-autonomous driving is a big deal. Might as well show it off a bit more with a snazzier indication that it's on.

-- Ardie

Only problem with colored overlays is for colorblind people. My stepfather was red/green colorblind, and I saw a post here where a blue-colorblind person couldn't tell if AP was engaged or not with just the blue lines. It probably needs a separate demarcation, like an outline or an icon or something, in addition to a color change.

Maybe make the rainbow road graphics standard instead :)
 
Only problem with colored overlays is for colorblind people. My stepfather was red/green colorblind, and I saw a post here where a blue-colorblind person couldn't tell if AP was engaged or not with just the blue lines. It probably needs a separate demarcation, like an outline or an icon or something, in addition to a color change.

Maybe make the rainbow road graphics standard instead :)

If the entire cluster was tinted vs. black, that I could tell. And I'm the most colorblind person my ophthalmologist has ever seen... I could only see the grayscale sample in the colorblind test book :(

I can't easily make out the blue lines at the side of the road (except by watching them change), can't tell the color of the steering wheel icon, and am not even sure whether the speedometer (TACC) icon changes color. I assume it does?

Of course, Tesla hasn't done anything about the blue vs gray yet, so I'm not holding my breath.

The Ford Fusion changes the look of the icon when cruise control is off, enabled, or on, not just the color.
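
For what it's worth, the accessibility fix people are describing here is just redundant encoding: never signal state by hue alone. A minimal sketch of the idea in Python, with every name and state made up for illustration (this is not Tesla's actual UI code):

from enum import Enum

class ApState(Enum):
    OFF = 1
    AVAILABLE = 2
    ENGAGED = 3

# Each state gets a color AND a distinct glyph AND a text label, so a
# colorblind driver can rely on shape or text instead of hue alone.
INDICATORS = {
    ApState.OFF:       ("gray",  "wheel-outline", "AP OFF"),
    ApState.AVAILABLE: ("white", "wheel-dashed",  "AP READY"),
    ApState.ENGAGED:   ("blue",  "wheel-filled",  "AP ON"),
}

def render_cluster(state):
    color, glyph, label = INDICATORS[state]
    # A whole-cluster tint plus a label keeps the cue visible even when
    # individual colors are indistinguishable.
    return f"[{label}] glyph={glyph} cluster_tint={color}"

print(render_cluster(ApState.ENGAGED))  # [AP ON] glyph=wheel-filled cluster_tint=blue

The Fusion's off/enabled/on icon change is exactly this pattern: the shape carries the state even when the color doesn't.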
 
Sorry, I just don't get this premise. The system is just an ACC system with auto-steering at this point. The only advantage it has over other auto-steer systems is that it doesn't have a nag timer and it is smoother. However, it is quite far from completely driving the car: it doesn't know to change lanes to reach its destination, it doesn't react to traffic controls (traffic lights / stop signs), it can't make sharp turns, etc.


You don't need an ironclad attention span. It's the same as using cruise control. Treat it the same and I don't see the issue (and it seems the vast majority of people using AP are able to do so). There was another thread about how a lawyer tried to test the law by texting with AP active, and he got the ticket regardless.

Sadly, now we know you DO need an ironclad attention span. Or do you still think that's a fair statement?
 
Honestly, we should have a betting pool on when, not if, Tesla gets sued in relation to its autopilot.

It would have been over pretty quickly. Now we know somebody (Joshua Brown) died in his Tesla while on AP only a week or so before we were writing these comments. How did Tesla keep it quiet for a month and a half? Amazing. I'd say by the end of this year the case will be over and we'll see if the "Beta defense" works.

Updating this post. I have reconsidered. I'm not a lawyer, so I didn't think of this right away, but this will never go to court. It would be idiotic of Tesla to let it. If Joshua Brown's family files a suit, Tesla will offer them many millions of dollars to make it go away. The amount they would have to give them would be NOTHING compared to losing this case in court.
 
There are a few articles covering the same incident in Lebec, CA on 4/26/2016:

Another driver says Tesla’s autopilot failed to brake; Tesla says otherwise

Tesla Autopilot Misfires, Model S Crashes into Car at Speed on the I-5

Second Model S driver crashes and blames Tesla Autopilot for not stopping

Arianna Simpson was driving her Model S north from Los Angeles on I-5, cruising in Autopilot mode. "All of a sudden the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn't brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car."

The article reported that Tesla's log showed the autopilot mode was disengaged when she manually hit the brake.

If I understand correctly, in summary:

1) The driver blamed the crash on the "beta" automation, which did not slow the car down in time, so she had to apply the brake manually when it was too late.

2) Tesla's log shows that the automation was disengaged at the time of the crash: the car was in manual mode because the driver had manually applied the brake.
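
That log reading makes more sense once you know that a brake press itself disengages Autopilot. Here's a toy sketch of the logic in Python; the event names, timestamps, and log format are invented for illustration and are not Tesla's actual telemetry:

# Hypothetical event log: (seconds, event)
events = [
    (100.0, "autopilot_engaged"),
    (162.4, "forward_collision_warning"),
    (163.1, "brake_pressed"),        # the driver braking drops Autopilot out
    (163.1, "autopilot_disengaged"),
    (164.0, "impact"),
]

def mode_at(t, events):
    """Return the recorded driving mode at time t."""
    mode = "manual"
    for ts, name in events:
        if ts > t:
            break
        if name == "autopilot_engaged":
            mode = "autopilot"
        elif name == "autopilot_disengaged":
            mode = "manual"
    return mode

print(mode_at(164.0, events))  # "manual" -- yet Autopilot was driving until 163.1

So "the car was in manual mode at impact" can be literally true while the driver's account of Autopilot not braking beforehand is also true; the two claims don't actually contradict each other.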

[Photo: simpson-crash-1.jpeg]
 
It would have been over pretty quickly. Now we know somebody (Joshua Brown) died in his Tesla while on AP only a week or so before we were writing these comments. How did Tesla keep it quiet for a month and a half? Amazing. I'd say by the end of this year the case will be over and we'll see if the "Beta defense" works.

Updating this post. I have reconsidered. I'm not a lawyer, so I didn't think of this right away, but this will never go to court. It would be idiotic of Tesla to let it. If Joshua Brown's family files a suit, Tesla will offer them many millions of dollars to make it go away. The amount they would have to give them would be NOTHING compared to losing this case in court.
I would be willing to bet that, up to this point, only Tesla knew that Autopilot was engaged during the accident.
 
I am a pilot. I have flown over 25,000 hours, mostly in aircraft with AP, including the ability to autoland in weather where you cannot see the runway until the nose wheel comes down onto it. I think the Tesla AP should be treated the way professional pilots treat the autopilot: monitor it CLOSELY, especially in tight quarters, and don't wait to see what it will do until it is too late.
Many of the crew-caused accidents in modern automated aircraft have been attributed to pilots waiting too long to disconnect the automation and take control, or not fully understanding what mode of operation the autopilot system was engaged in. Since we do not receive extensive training on the Tesla AP system, we should be spring-loaded to the disconnect position whenever we see a threat developing.
When in doubt, disconnect. Don't wait until it is too late to apply the brakes. Tesla's cautions on Autopilot are very plainly stated and should be taken seriously. I love the Tesla, but I love my life even more.
 
Nope. No driver aid or automation capability can be used as an excuse for anything, because the driver is still responsible for everything.

Commercial airliners have been routinely flying with autopilot since the 1950s. Not once has the NTSB ever cited the autopilot as the cause of an accident because the autopilot by definition has no responsibility for flight safety. That responsibility rests solely on the pilot and no one else. Autopilot, flight management systems, GPS, instrumentation, glide slopes, etc. are all pilot aids.

Tesla's Autopilot functions identically. The purpose is to let the driver assume a supervisory role instead of an operational role. But the responsibility for safe driving doesn't change with the activation of Autopilot.

Couldn't agree more. But many drivers at fault will still play the blame game.
 
I am a pilot. I have flown over 25,000 hours, mostly in aircraft with AP, including the ability to autoland in weather where you cannot see the runway until the nose wheel comes down onto it. I think the Tesla AP should be treated the way professional pilots treat the autopilot: monitor it CLOSELY, especially in tight quarters, and don't wait to see what it will do until it is too late.
Many of the crew-caused accidents in modern automated aircraft have been attributed to pilots waiting too long to disconnect the automation and take control, or not fully understanding what mode of operation the autopilot system was engaged in. Since we do not receive extensive training on the Tesla AP system, we should be spring-loaded to the disconnect position whenever we see a threat developing.
When in doubt, disconnect. Don't wait until it is too late to apply the brakes. Tesla's cautions on Autopilot are very plainly stated and should be taken seriously. I love the Tesla, but I love my life even more.
And that one word, "professional," already makes this impossible. Tesla drivers simply are not professionals and never will be.

Honestly, coming from Germany, it already seems far too easy to get a license for a normal car in the US.
 
The main problem here is that the media is framing this accident completely incorrectly. This had nothing to do with Autosteer. If there was a technical problem, it is with some combination of TACC and AEB. It's important to note that many, many cars have both of these features. My 2010 Prius did. Yes, there may be a problem, but it's likely not unique to Tesla.

The primary factor here is that, if Autosteer were engaged, it is possible that the driver was not paying attention. I doubt anyone knows how engaged the driver was at this point in the investigation. If he were driving an un-automated vehicle, could he have prevented the accident? If AEB worked flawlessly, would it have slowed the car sufficiently to reduce the severity of the accident? Nobody knows the answers to these questions yet.
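
To make the TACC/AEB distinction concrete, here's a rough sketch of how a staged forward-collision pipeline commonly works; the thresholds are invented for illustration and are not Tesla's actual numbers. In most cars AEB is meant to shed speed late, not to guarantee a full stop:

def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until impact if neither car changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")   # not closing on the obstacle
    return gap_m / closing_speed_mps

def forward_collision_action(gap_m, closing_speed_mps):
    ttc = time_to_collision(gap_m, closing_speed_mps)
    if ttc < 0.8:    # too late to avoid: brake hard to reduce impact speed
        return "AEB: full braking"
    if ttc < 2.5:    # alert the driver and expect them to act
        return "FCW: audible/visual warning"
    return "no action"

# A stopped car revealed 30 m ahead while closing at 40 mph (~18 m/s):
print(forward_collision_action(30.0, 18.0))  # TTC ~1.7 s -> warning stage

Under these made-up thresholds, by the time AEB would trigger there is under 15 m left at 40 mph, so a reduced-speed collision may already be unavoidable. That is consistent with how these systems are commonly described: severity mitigation, not a guarantee of avoidance.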
 
I can tell you that there have been many, many times when the ACC and the auto-braking did not "see" the car in front of me in my BMW i3. It's not just a Tesla problem; it's a technology problem. These systems are advancing quickly now, but they still require oversight by the driver.
 
I can tell you that there have been many, many times when the ACC and the auto-braking did not "see" the car in front of me in my BMW i3. It's not just a Tesla problem; it's a technology problem. These systems are advancing quickly now, but they still require oversight by the driver.

But if the technology is functioning as designed, and the design has known and acknowledged limitations, how is it a technology "problem"? TACC, AEB, and Autosteer have limitations, but I don't think it's correct to characterize those as "flaws" or "problems". Those technologies are doing exactly what they're supposed to do. It's just that we wish they would do more.

The true "problem" is the human-machine interface. Humans have a lot of difficulty working in concert with these systems because there is lack of understanding and training, incorrect assumptions of capability, over-reliance on the capabilities, false sense of security, and over-confidence of your own fallback capabilities. In short, we are human and flawed, and these systems offer more opportunity for our own flaws to be expressed.
 
It would have been over pretty quickly. Now we know somebody (Joshua Brown) died in his Tesla while on AP only a week or so before we were writing these comments. How did Tesla keep it quiet for a month and a half? Amazing. I'd say by the end of this year the case will be over and we'll see if the "Beta defense" works.

Updating this post. I have reconsidered. I'm not a lawyer, so I didn't think of this right away, but this will never go to court. It would be idiotic of Tesla to let it. If Joshua Brown's family files a suit, Tesla will offer them many millions of dollars to make it go away. The amount they would have to give them would be NOTHING compared to losing this case in court.

When are you going to give up? We get it: you don't like Autopilot, you think Tesla should get sued, you question their PR, Elon's motives, etc. You shouldn't have to get on your knees to post here, but it's obvious you have an axe to grind. It's getting a little tired.
 
I am a pilot. I have flown over 25,000 hours, mostly in aircraft with AP, including the ability to autoland in weather where you cannot see the runway until the nose wheel comes down onto it. I think the Tesla AP should be treated the way professional pilots treat the autopilot: monitor it CLOSELY, especially in tight quarters, and don't wait to see what it will do until it is too late.
Many of the crew-caused accidents in modern automated aircraft have been attributed to pilots waiting too long to disconnect the automation and take control, or not fully understanding what mode of operation the autopilot system was engaged in. Since we do not receive extensive training on the Tesla AP system, we should be spring-loaded to the disconnect position whenever we see a threat developing.
When in doubt, disconnect. Don't wait until it is too late to apply the brakes. Tesla's cautions on Autopilot are very plainly stated and should be taken seriously. I love the Tesla, but I love my life even more.

Precisely. Being an avid armchair pilot enthusiast for many years, I fully agree with your post. Vigilance is necessary to ensure safety; be forewarned.
 
Updated comments from the driver in WSJ:

Tesla’s Autopilot Vexes Some Drivers, Even Its Fans

Arianna Simpson, a venture capitalist in San Francisco, said the Autopilot in her Model S “did absolutely nothing” when the car she was following on Interstate 5 near Los Angeles changed lanes, revealing another car parked on the highway.

Her Model S rammed into that car, she said, damaging both vehicles but causing no major injuries.

Tesla responded that the April crash was her fault because she hit the brakes right before the collision, disengaging Autopilot. Before that, the car sounded a collision warning as it should have, the car’s data show.

“So if you don’t brake, it’s your fault because you weren’t paying attention,” said Ms. Simpson, 25. “And if you do brake, it’s your fault because you were driving.”

She doesn’t expect to use Autopilot much once her Model S is repaired, partly because she thinks she would constantly second-guess the automated-driving system.
 
Keeping the following distance set at 5 or 6 might have helped the car slow down sooner, giving her a chance to react the way the car in front of her did. The data never lies; follow the facts. We don't know, but Tesla does know, how her car was configured. If she was following too closely at too high a rate of speed and was not fully aware of her surroundings or upcoming traffic (e.g., other cars or traffic lights signaling a slowdown or an obstruction in the road), then yes, it may have been her fault, not Autopilot's. It only does what it is told (and is programmed to be capable of doing).
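
Some back-of-the-envelope numbers on why that following-distance setting matters. The reaction time, the deceleration, and the assumption that each setting step maps roughly to a fixed time gap are all illustrative guesses, not Tesla's specs:

def stopping_distance_m(speed_mps, reaction_s=1.5, decel_mps2=7.0):
    """Distance covered while reacting, plus braking distance v^2/(2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

v = 29.0  # ~65 mph in m/s
print(round(stopping_distance_m(v), 1))  # ~103.6 m to come to a full stop

# If each following-distance step were very roughly half a second of gap,
# setting 1 at 65 mph would leave ~14.5 m of cushion and setting 6 about
# 87 m -- still less than a full stop from speed, but far more room for
# the car (or the driver) to start braking in time.
print(round(v * 0.5 * 1, 1), round(v * 0.5 * 6, 1))  # 14.5 87.0

Either way, the bigger the gap, the more of that 100-plus meters of stopping distance is already banked before anyone, or anything, touches the brakes.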