
Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier, killing the driver, last week near Mountain View, Calif., according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator designed to reduce the impact into a concrete lane divider. The crash attenuator had reportedly been destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
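As a rough sanity check of the arithmetic in the statement above: the figures are mutually consistent if the “3.7 times” claim is simply the ratio of the two fatality rates, and the “lives saved” estimate applies that same ratio to worldwide deaths. Those derivations are our assumptions, not something Tesla spells out.

```python
# Back-of-the-envelope check of the figures quoted in Tesla's statement.
# Assumption (ours, not Tesla's): "3.7 times less likely" is the ratio of the
# two fatality rates, and "lives saved" applies that ratio worldwide.

us_miles_per_fatality = 86e6        # all vehicles, all manufacturers
tesla_miles_per_fatality = 320e6    # vehicles equipped with Autopilot hardware

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"{ratio:.1f}x fewer fatalities per mile")       # ~3.7x

worldwide_deaths_per_year = 1.25e6
lives_saved = worldwide_deaths_per_year * (1 - 1 / ratio)
print(f"~{lives_saved:,.0f} lives saved per year")     # ~914,000, i.e. about 900,000
```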

Photo: @DeanCSmith/Twitter

 
The BMW 5 Series is a faster, sexier car and, as far as deaths go, is every bit as safe as the XC90. From 2012 through 2015, no deaths occurred in a 5 Series.
The common factor is that these cars are driven by the same demographic: 35-55-year-olds who make $250k+ per year. Very safe drivers.

Actually, at least 20 deaths occurred in 528i's in 2014 alone. Better recheck the source of your info. Mine is: Driver death rates
 
My car doesn't operate that way. It doesn't care at all about turn signal or navigation destination.
I've seen my wife simply use the lane change feature at a gore point, and it takes the turn. It would be weird to use a turn signal at that point, though, possibly confusing other drivers, but it would be normal beforehand.

And yes - destination does not matter with AP.

We have AP2.5 in a 2017 EAP S 75D
 
I don't have

I don't have either; I'm just a curious observer. I was only commenting that the best driver-awareness technology, no matter who it is from, would seem to be preferable if you are going to insist the driver needs to be aware and make them liable. If there is technology to do this and it's not used, I don't think “we said pay attention” is going to go far in a civil court, especially since the better the technology gets, the more it will lull people into ignoring the pay-attention warning.

Thanks for the reply. I guess I don't see SuperCruise as the best technology, for the reasons stated above, and that's just from a few articles reviewers have done on it. Some of the reviewers even chose it over Tesla's AP, which to me is laughable if you can only use it on a subset of roads. Who wants to spend that kind of money for less coverage and still have to watch the road? What I get is that they are trying to push readers towards one of the big auto guys, and if they were on par on coverage, okay, but they're not. As I see it, there isn't any system working well enough for all highway driving (where you would be doing hours of driving) such that the driver doesn't have to be aware, hands basically on the wheel (for abrupt situational changes), with a foot available to brake if necessary (since AEB doesn't cover all situations either). That said, I find it really irresponsible of manufacturers, car reviewers, and drivers--including Tesla owners--to show their cars driving in hands-free mode.
 
For the purposes of public safety, I am less concerned about coverage and more concerned about assigning liability. I do not think “read the manual” and “always pay attention” will be enough to shield automakers. The amount of time between ignoring the warning and an accident could be hours, weeks, months, etc. Without strong monitoring and verification of awareness, I see the issue as problematic. As for all the people doing dumb things, they share the road with us, and as more and more tech rolls out there will be more and more people napping with a hand on the wheel or doing other dopey things. I think the tech is available to prevent this without resorting to the hope that dumb people take personal responsibility to keep me or you safe. Tesla has been lucky that their system has not caused a third-party injury or fatality.
 
Supercruise is less likely to be in an accident. Part of that is because it tries to force the driver to pay more attention. But the biggest reason is that it's only on the CT6. Who the hell buys a CT6? :p And then it only works on very specific roads, and doesn't even work all the time on those roads. It'll be a while before it is really tested beyond its limits like we've seen with Autopilot.

Autopilot is one of the reasons I bought a Tesla, and its versatility was very attractive. However, like I said, I wish that safety had been just as high a priority alongside that versatility. They should've included as much proven tech (FLIR, lidar, multiple forward-facing cameras) as they could to help minimize the risk of killing the occupants and others.

I don't know how the Tesla NN works, but it seems like they could be accomplishing a lot with safety because of the number of vehicles on the road. Since so many Teslas have driven that one stretch of interstate where the accident was, they could map safe zones based on where previous Teslas have traveled. That gore area could then have been mapped out. It would help to keep the cars from driving off the shoulders of the roads or into barriers. The areas with the most Teslas would become the safest and most established.
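Purely to illustrate that idea, here is a minimal sketch of crowd-sourced “safe zone” mapping from fleet traces. The grid size, the traversal threshold, and every function name below are assumptions made up for the example; this is not how Tesla's NN or maps actually work.

```python
# Illustrative sketch only: aggregate fleet traces into a grid of "established"
# cells, then check whether a planned path stays inside well-travelled cells.
# All names and thresholds are assumptions, not Tesla's implementation.
from collections import Counter

GRID_M = 1.0          # grid cell size in metres (assumed)
MIN_TRAVERSALS = 50   # trips required before a cell counts as "established" (assumed)

def to_cell(x_m, y_m):
    """Quantise a local-frame position (metres) to a grid cell."""
    return (round(x_m / GRID_M), round(y_m / GRID_M))

def build_safe_map(fleet_traces):
    """Count how many distinct trips passed through each grid cell."""
    counts = Counter()
    for trace in fleet_traces:                        # one trace = one trip's (x, y) points
        for cell in {to_cell(x, y) for x, y in trace}:
            counts[cell] += 1
    return counts

def path_is_established(path, safe_map, min_traversals=MIN_TRAVERSALS):
    """True if every point on the planned path lies in a well-travelled cell."""
    return all(safe_map[to_cell(x, y)] >= min_traversals for x, y in path)

# A gore point would show up as cells that almost no trips pass through, so a
# path drifting into it would fail this check and could trigger extra caution.
```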
 
The key question is: would SuperCruise stay in the cr@ppy, mismarked Caltrans lane? Or would it also sometimes drift into the gore zone? And if so, would it stop or take other action before driving into the barrier?


 
I guess we will all find out when the final report comes out. I don't speculate.

I think speculation has a place and can make people aware of things they have not thought about in the past, like the fact that Tesla's AP really lulls you into a false sense of security. It did for me, anyway, and I found that scary because it worked so well, at least for the one week I had it in a loaner, and I can't wait to get it soon with my Model 3. But I am really concerned about my wife and kids using it, since I can easily see them doing other things and not paying proper attention. I will make them watch this video a few times before they drive it, which everyone should do before using AP so they understand its pretty severe limitations:


As I've said before, I can live with me killing me, but I can't live with AP killing me (pun intended), or worse, one of my family members. My speculative view is that Tesla's AP overall is making our roads safer, simply because the death rate from automobiles is 32,000 per year in the US, roughly 90 a day, and it is the leading cause of death for teenagers. The stats are similar proportionally in Canada. So when someone dies not paying attention on AP, I can't help but speculate how many lives were saved when others relied on AP and it prevented an accident that would have happened by human error were it not for AP. I'm glad my 3 will come with a front-facing camera, and I would like an electrical jolt to my kids (and me and my wife) if our eyes leave the road on AP. Tesla has only made the problem worse by calling it AP (it should be called driver's assist -- please, pilots, don't say it's a correct term, since it's for us lay people), plus Tesla releases self-driving videos that I find suspect but that get a ton of views, so most people think this is what AP can do when it can't.

We speculated why Joshua Brown hit that semi, in that he wasn't paying proper attention, and I'm sure that made a lot of people think about paying more attention while on AP and perhaps saved a life.
 
That is awful, but, as the video says, there is no way to know if Autopilot was being used or not, because Tesla hasn't been given access to the car. I find it very suspicious that the car did not react in any way to a clearly visible issue. My AP1 car swerves when trucks start to move towards the edge of the lane closest to me, even if they are still 100% in their lane. No AEB at all? Of course, this video truly begs the question: where was the driver? He had at least 5 seconds of a totally visible parked truck blocking his lane and he didn't even react a little? You would anticipate the car helping you, but the driver is first and primarily responsible for what he does with the car. This could even be someone intentionally crashing the car, based on the complete lack of any reaction to the circumstances. We don't know; that is speculation. But whatever he was doing, 5-plus seconds of no eyes on the road at all while driving at highway speeds in the rain? That is a death wish.
 
That is awful, but, as the video says, there is no way to know if Autopilot was being used or not because Tesla hasn't been given access to the car.

I wouldn't give my vehicle to Tesla either if one of my family members was killed in it. Any lawyer would tell you that only happens after litigation is commenced and the vehicle is allowed to be inspected on terms in a Court Order, such as no destructive testing, and having the Plaintiff's experts present and approving each part of any analysis by Tesla.

In any event, it was on AP during the crash in that video, as the lawsuit has now determined according to the press and Tesla has not denied:

Tesla fatal crash case in China: company admits the car in “autopilot” when accident happened

If it wasn't on AP, Tesla would issue a press release. There is none. Do you need any more evidence?

My AP1 car swerves when trucks start to move towards the edge of the lane closest to me even if they are still 100% in their lane.

And there's that false sense of security I warned about.
 
Good additional info. Works for me on the use of Autopilot.

False sense of security? I don't get it. The warnings couldn't be more clear. If you don't know that when you are driving a car, you are driving that car, you shouldn't be behind the wheel. Let's face it, people die in cars every day and that will never change regardless of how much automation we employ. You can blame the vehicle, and it clearly didn't perform as expected, but the driver is primary, period. I'd put it 90-10 on the driver if I were adjudicating it - but that's just one opinion.

As for legal proceedings, you have stated a layman's understanding of the general idea of how things work in many of the states of the US. This wreck happened in China and the suit is in China, so unless you know Chinese P.I. law, you are speculating, again. You didn't mention that a wreck like that here would be investigated by the NHTSA, and they would have a primary right to do that, regardless of any involvement by lawyers.
 
False sense of security? I don't get it. The warnings couldn't be more clear.

The warnings couldn't be more lame in my opinion, starting with the name. I think it borders on criminally negligent for Tesla to keep calling it AP. Then Tesla shows people viral videos on how its vehicles can drive all by themselves.

And you did not believe your AP could hit a parked car until I proved it to you. So the warnings can't be that good if you thought your car couldn't do what I showed you it did. What, Tesla didn't warn you in their warnings that "couldn't be more clear"?

If you don't know that when you are driving a car, you are driving that car, you shouldn't be behind the wheel.

When a momentary lapse of attention can end your life, the warning should require watching the video I posted three times so it sinks in. Anyone driving my AP 3 will be doing that before I hand it over. I don't discount loved ones' lives in such a cavalier manner, as if they deserved it, like you do. Your coldness astounds me.


You can blame the vehicle, and it clearly didn't perform as expected, but, the driver is primary, period.

I placed no blame. I made comments. If you read my posts in other fatality threads here, I said that, at law, the driver will likely be found to be at fault, but over 90% of these cases settle. Tesla has insurance, legal proceedings are costly (to say the least) and uncertain, and Tesla doesn't want a precedent set, so they settle with no admission of liability and an NDA. To date, no case has gone to trial on this issue, and there are no surprises there.

As for legal proceedings, you have stated a layman's understanding of the general idea of how things work in many of the states of the US.

Okay. I've only practiced insurance defence for over 25 years, often referring to US case-law because it's the same common-law and similar statutes, but I guess you know better.
 
Have you practiced law for one moment in CHINA? If not, all this talk about what the legal system will do is pure speculation - you should know that more than most as a practicing attorney.

I have seen my car go around many stopped objects and cars changing lanes into mine while on AP(1), but had it not done so, I would absolutely have taken over, which is the expectation of ALL Tesla drivers. But, as I said, if Tesla agrees AP was on, I don't doubt it, and I said it did not operate as expected, and that is quite concerning.

Momentary lapse of attention? You and I have entirely different definitions of that. Start counting the moment you see the parked truck in the video. At a minimum 5 seconds passes from when it is visible till impact. If this car was going 75 (a typical speed limit in China for highways according to Wikipedia) then this driver had 550 feet or more in which to act and did not slow, did not swerve, did nothing.
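A quick check of that arithmetic, assuming the 75 mph figure above (the speed is an assumption from the post, not a verified fact about the crash):

```python
# Distance covered in 5 seconds at an assumed 75 mph.
speed_mph = 75
ft_per_sec = speed_mph * 5280 / 3600   # = 110 ft/s
distance_ft = ft_per_sec * 5           # = 550 ft
print(ft_per_sec, distance_ft)         # 110.0 550.0
```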

If you were defending a truck driver who did not have his eyes on the road for >5 seconds and >550 feet, would you not be concerned that he would be found to be grossly negligent? Is this not a complete disregard for the health, safety and welfare of himself and all those around him? I feel very sorry for the kid and his family, but this was not a split-second, momentary lapse of judgement.

Depending on the legal standards in China, Tesla may well settle the case or not. I have no way of knowing the burden of proof or any other crucial information, including how much value they place on life. But if their tort system resembles the one in, say, Mexico, then a very small sum of money would be considered full compensation. I do know that the Chinese, including their government, routinely disregard international law as to intellectual property, so one has to wonder what the respect for the rule of law is in China.
 
At a minimum 5 seconds passes from when it is visible till impact.

That's a momentary lapse of attention to me. It's a "very brief period of time", which is the definition of "moment". I've seen so many good people make much worse mistakes. Your heartlessness for the deceased and their loved ones is, well, better not to say, as my mother taught me.

As to all your Chinese law ramblings, the car was on AP and we have the video. We also have other fatalities investigated in the US, and I've read the resulting reports; have you? The recommendations are that Tesla's warnings and nags are not sufficient, contrary to what you claim, which resulted in Tesla taking action in firmware updates.

Anyway, I've made my point. I hope some people took it to heart, and those who have a heart I am certain will.
 
Doesn't the latest Tesla AP have this, or is that just in the Model 3 and not the S/X?

In typical Tesla fashion, they have added the camera hardware...software to follow any day now...

I don't think anyone really knows what that interior camera will be used for on the Model 3. Apparently it is already on and capable of sending screenshots of you to Tesla after your airbags deploy (according to VeryGreen) but Tesla has not mentioned it or even asked for consent for it to be on. I think the assumption was it would be used for eye tracking, but it could just as easily be used as CYA for Tesla to prove drivers aren't paying attention in AP accidents.
 