
Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier killing the driver last week near Mountain View, Calif., according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company attributes the severity of the crash to the absence of a crash attenuator designed to reduce the impact into a concrete lane divider. The attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
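As a quick sanity check, the headline numbers in the statement are internally consistent. A back-of-the-envelope sketch using only the figures quoted above (and without adjusting for fleet or driver demographics):

```python
# Figures taken verbatim from Tesla's statement above.
us_miles_per_fatality = 86e6            # all vehicles, all manufacturers
tesla_ap_hw_miles_per_fatality = 320e6  # vehicles with Autopilot hardware

ratio = tesla_ap_hw_miles_per_fatality / us_miles_per_fatality
print(f"Relative fatality rate: {ratio:.1f}x")  # ~3.7x, matching the statement

# "about 1.25 million automotive deaths worldwide" scaled by the same ratio:
worldwide_deaths = 1.25e6
print(f"Implied lives saved per year: {worldwide_deaths - worldwide_deaths / ratio:,.0f}")
# ~914,000, in line with the statement's "about 900,000 lives saved per year"
```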

Photo: @DeanCSmith/Twitter

 
So, it may seem unusual, given my calling out of Tesla for things in the past, that I've been somewhat defending them on this particular accident with friends and others where the conversation has come up, essentially noting that the driver just should have been paying attention and that's the end of the discussion.

While that particular sentiment still mostly stands (pay attention, people)... on a recent 1,000+ mile trip, my longest using AP2, I noticed some key differences between how AP1 and AP2 handle certain situations. One in particular, I'm reasonably certain, was a factor in this crash.

Some notes:

AP1 senses cars in multiple lanes simultaneously, and also senses multiple cars in each lane when possible. AP2 seems to do a variant of this internally, but doesn't display the adjacent lanes.

Additionally, AP1 has a well-defined follow-the-leader state. Autosteer on AP1 can follow a lead car almost indefinitely, regardless of lane markings. AP2 appears to be able to do this to a certain extent, but at nowhere near the level AP1 is capable of. (As an example, I can lock AP1 onto a car entering my housing development, which has no lane markings at all, and it will follow them until they reach my house or turn off the road without issue. I've yet to get AP2 to even engage in this situation, let alone actually follow another vehicle.)

Finally, AP1 takes all lanes of traffic into consideration when making decisions about the driving path. Say you're following a vehicle in the left lane of a highway, alongside a lane to the right, and there are indistinct markings that would diverge to the left (as was the case with this fatal crash). AP1 will weigh the position of the lead car and adjacent cars against the suspected lane markings, and in this case it will always continue on the path the lead car is taking (or cars detected in an adjacent lane, if there's no lead car) unless the lead car is also diverging from the other lane. AP2, however, doesn't seem to weigh the lead car's position or the position of vehicles in other lanes as heavily when making these decisions, and will continue to follow the lane marking it thinks is correct despite the position of other cars. (In the complete absence of lane markings, AP2 will tend to follow a lead car, but not nearly as accurately as AP1.)

This was VERY obvious to me on my long trip through sections I've driven with AP1 dozens of times, including long-term construction areas with terrible lane markings (old markings, poor markings, no markings, etc.), where AP1 would do commendably well most of the time, but AP2 didn't know its head from its tail.
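To make the difference concrete, here's a minimal, purely hypothetical sketch of the kind of cue weighting described above. The weights, names, and structure are illustrative guesses on my part, not anything recovered from Tesla's software:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cue:
    lateral_offset_m: float  # lane-center offset suggested by this cue
    confidence: float        # 0..1 from the perception system

def blend_path(marking: Cue, lead: Optional[Cue], adjacent: list[Cue],
               lead_weight: float) -> float:
    """Weighted blend of lane markings against observed traffic.
    A large lead_weight models the AP1-like behavior described above
    (traffic outweighs ambiguous markings); lead_weight = 0 models the
    AP2-like behavior (markings win regardless of traffic)."""
    cues = [(marking.lateral_offset_m, marking.confidence)]
    if lead is not None:
        cues.append((lead.lateral_offset_m, lead_weight * lead.confidence))
    cues += [(c.lateral_offset_m, 0.5 * c.confidence) for c in adjacent]
    total = sum(w for _, w in cues)
    return sum(x * w for x, w in cues) / total if total else 0.0

# Faded gore-point markings diverging left, lead car continuing straight:
faded_divergence = Cue(lateral_offset_m=-1.5, confidence=0.3)
lead_car = Cue(lateral_offset_m=0.0, confidence=0.9)
print(blend_path(faded_divergence, lead_car, [], lead_weight=2.0))  # ~-0.21 m: stays with traffic
print(blend_path(faded_divergence, lead_car, [], lead_weight=0.0))  # -1.5 m: follows the faded line
```

With a low-confidence diverging line and a high-confidence lead car, the AP1-style blend barely deviates, while the AP2-style blend steers fully toward the faded marking.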

All of that said... here's where I'm probably going to catch a lot of flak.

I think Tesla should probably be held accountable on this accident.
I firmly believe, given my extensive experience with the systems both as a user and hacker, that if the Model X involved in this accident had been equipped with AP1 and not AP2... this accident would not have occurred.

And herein lies the issue. Tesla knowingly released a system that was, and still is, inferior to its predecessor in almost every conceivable functionality. This is a system with potentially fatal safety deficiencies compared to the already existing system. Instead of continuing with the existing system, they threw all of that out the window and let new customers operate a less capable system in order to move their autonomous development in-house for greater control, and thus greater profit, despite the new system being provably less functional than the one that had existed for years. They let their need to get away from Mobileye trump customer satisfaction and, ultimately, customer safety.

In summary, I feel like Tesla should be held fully responsible for this accident and any other accident that could have been prevented had the vehicle been equipped with AP1 instead of AP2. They released a system that was incomplete and incapable compared to the system that was already available and sold, with the main point being that the older system would have been extremely unlikely to have encountered the same issue that led to this fatal accident. Instead they rushed out hardware with incomplete software to new customers, knowing full well that it was not as capable as the original.

As an enthusiast and overall supporter, I'm sure I and most others could forgive Tesla for a temporary lack of feature parity between AP1 and AP2... but it's been well over a year (18 months now?) without AP2 even coming close to AP1 parity, let alone "EAP". And now that lack of parity has led to a death, and Tesla should be ashamed of themselves.

Edit: clarified some points, fixed some typos and punctuation errors.
 
In summary, I feel like Tesla should be held fully responsible for this accident and any other accident that could have been prevented had the vehicle been equipped with AP1 instead of AP2. They released a system that was incomplete and incapable compared to the system that was already available and sold, with the main point being that the older system would have been extremely unlikely to have encountered the same issue that led to this fatal accident. Instead they rushed out hardware with incomplete software to new customers, knowing full well that it was not as capable as the original.

You have a more unusual perspective than most, since you have both AP1 and AP2 cars. For those whose first Tesla was an AP2 car, does the comparison to AP1 have any relevance? Would the argument be valid if AP1 had never existed?

The removal of AP1/Mobileye was not Tesla's unilateral choice. Are drivers worse off with some level of assist vs. zero assist until it is good enough? Is a non-AP car really a safer option than an AP2 car? Are all car companies liable that do not have AP1 installed? (The SawStop argument.) Is an AEB system that only covers 50% of use cases less safe than no system, which covers 0%?

I firmly believe, as a software engineer, that if Caltrans had kept the lines painted (and there was no lead car entering the gore point; this has not been clarified), the accident would not have happened. The AP2 behavior was deterministic and in line with what I would expect from it.
I also believe, if the barrier had been reset, the accident would not have been fatal.
I also am 99.999% confident the driver could have pressed the brake or turned the wheel to prevent the collision.
 
All of that said... here's where I'm probably going to catch a lot of flak.
Edit: clarified some points, fixed some typos and punctuation errors.

Yes, I disagree with your reasoning but recognize I may be wrong. What I want is for Tesla to "fix" this bug/feature, whether that's requiring EAP to function only when you have GPS and navigation on and then coordinating navigation with road positioning and EAP vision, or however they do it. We've now seen at least two other videos consistent with the crash.
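A minimal sketch of the kind of gating being suggested here; the function, signal names, and threshold are hypothetical, just to pin down the idea:

```python
def autosteer_allowed(gps_fix_ok: bool, nav_route_active: bool,
                      lane_confidence: float,
                      route_agrees_with_lane: bool) -> bool:
    """Hypothetical gating rule: only allow Autosteer when GPS and an
    active navigation route are available, vision is confident about the
    lane, and the route geometry agrees with what vision is tracking
    (e.g., at a gore point)."""
    if not (gps_fix_ok and nav_route_active):
        return False
    if lane_confidence < 0.5:  # illustrative threshold, not a real value
        return False
    return route_agrees_with_lane

# At a gore point where the map says "continue in lane" but vision is
# tracking a diverging line, the disagreement forces a hand-back:
print(autosteer_allowed(True, True, 0.8, route_agrees_with_lane=False))  # False
```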
 
The guy had 5 full seconds to avoid driving into the wall, at whatever speed he hit it.

Sorry- this is driver error - not car error.

Unless he had full self-driving installed, he is fully responsible for killing himself. Hands off the wheel - he was a chronic scofflaw about not keeping them on the wheel.

Seems more likely than not we will find the guy was emailing, texting, or using the web while he was driving.

Stupid is as stupid does and I have zero sympathy for idiots.
 
Imminent collision means the system has detected that, relative to its motion and the road conditions, the vehicle is on a collision course with an object. In my view, at the six-second mark the autopilot could have detected and predicted that, at the current speed and road conditions, it was not able to operate safely. At that point in time (the six-second mark), the vehicle is most likely NOT yet on an "unavoidable" collision course, and the driver may only have the next few seconds to take action. After those precious few seconds have passed, the autopilot can conclude with a high degree of confidence that the collision is unavoidable. This is the moment I suggested the autopilot would take final action to protect the occupants.
The 'driver' is supposed to be paying attention.

The 'driver' is legally in charge of the operation of the vehicle.

The 'driver,' if he was paying attention, had plenty of time to avoid the collision with the K rail.

Ergo, the driver was NOT paying attention. End of story.
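For reference, the "five seconds and 150 meters" from Tesla's statement pin down the closing speed, and a simple back-of-the-envelope calculation shows the margin an attentive driver had. The reaction-time figure below is a textbook assumption, not from the source:

```python
distance_m = 150.0  # unobstructed view of the barrier (per Tesla's statement)
time_s = 5.0        # time available (per Tesla's statement)

closing_speed = distance_m / time_s  # 30 m/s
print(f"{closing_speed:.0f} m/s = {closing_speed * 3.6:.0f} km/h "
      f"= {closing_speed * 2.237:.0f} mph")  # ~108 km/h, ~67 mph

# Assuming a commonly cited ~1.5 s perception-reaction time, an attentive
# driver still had roughly 3.5 s and 105 m in which to brake or steer away.
reaction_s = 1.5
print(f"Margin after reacting: {time_s - reaction_s:.1f} s, "
      f"{distance_m - closing_speed * reaction_s:.0f} m")
```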
 
. . . I have been driving for 50 years and nothing like this has ever happened to me. I'm thinking that the autopilot could benefit from linking with the navigator. That way, I could tell the car where we are going before I leave. You know, pre-flight planning. I also have an aircraft pilot's licence.
I'm hopeful that the autopilot and auto-steering will one day make driving much safer.

Kevin Burns
Well of course 'nothing like this ever happened to you' - you've never owned a vehicle that could partially drive itself before. Would be unusual for a non-Tesla to suddenly decide to drive for you.

I'm thinking that the 'paying attention to the road' part of using Tesla AP means that if it tries to take you someplace you do not want to go, you need to take control and direct the vehicle where you want it to go.

I have a really nice autopilot in my airplane - that does not prevent me from saying: 'what is it doing now' when I, as the operator, improperly configure it or the navigator I am using.
 
In summary, I feel like Tesla should be held fully responsible for this accident and any other accident that could have been prevented had the vehicle been equipped with AP1 instead of AP2.

I'll respectfully disagree, based on the following:
  • The driver received his X in late 2017, meaning he never used AP1. That's an important point here: he would not have had any AP1 expectations. AP1's behavior is immaterial; he was only familiar with AP2's behavior. He was a software engineer, so I'm sure he was not oblivious to AP2's behavior.

  • According to his family, he'd complained about AP in this specific location multiple times. If you had that experience, when approaching that very same area, would you activate AP and then allow yourself to be distracted? (Rhetorical, I know you wouldn't.)

  • And lastly, while I probably trust your interpretation of how AP2 is making decisions more than most, I don't believe you actually know how much AP2 is taking into consideration. You did only say 'it seems', and I appreciate your carefulness with words.
Others might conclude that if Tesla offered NO Autopilot, the accident wouldn't have happened. I've noticed some AP2 behaviors that exceed AP1 (such as slowing down slightly for tighter curves, or a fork in a local road here where AP1 consistently takes the wrong road and AP2 does not). Different products, but again, the driver had zero expectations based on AP1 behavior, because his X would never have been outfitted with AP1.

I've carefully not pointed fingers at the driver, at Tesla, or at Caltrans. We're all searching for answers, but I think until we have all the data, we can't reasonably conclude that one party should be held 'fully responsible'.
 
The problem is that this 300-or-whatever-percent-safer number is based on a demographic of drivers who are already that much safer than average without Autopilot. The average Model S/X owner is middle-aged and makes $290k per year. No teenagers, elderly drivers, or drunks are driving those cars. I use the Volvo XC90 as a comparison. Volvo has more of those on the road than Model Xs, yet most years there are no deaths reported for XC90 drivers. The XC90 doesn't have Tesla Autopilot, so how does it achieve such a standard? It is all demographics. If Tesla made a $20k car that teenagers were buying, the accidents would pour in whether it had Autopilot or not. Actually, I think it would be worse to give a teenager Autopilot, because you know how irresponsibly it would be used.

Who do you think the Volvo demographic is? Volvo is known, and always has been, for safety over looks, and has always attracted those most interested in safety. Your exact argument for why you apparently think Teslas aren't safe (even though crash-test data proves you dead wrong) is the same argument for Volvos, except that their cars are also slower, unsexy, not progressive, and not a temptation to push to the limits, unlike Teslas.
 
@wk057 I was thinking the same thing as Bonnie mentioned in her post about the driver being new to Tesla and only having experienced AP2. Many of us are in that same situation. I get that people who have been with Tesla longer will naturally compare the two, and that you guys are always looking for those points of difference.

So, not having come from AP1: are you saying that since AP1 would consistently follow the car in front (and it sounds like you think that's a better thing), he would have been better off with that system? And what if he was on AP1 and the guy in front wasn't paying attention and went into the barrier? I assume if the second driver wasn't paying attention he would have followed the crashing vehicle into the same spot, though maybe braked before hitting the car that hit the barrier. Not seeing how, in theory, that's any better.

I don't feel any of the systems out there can be totally relied on, which is why the driver is still in charge of supervising the assist.
 
Because reading is hard, here is what wk057 said:

"In summary, I feel like Tesla should be held fully responsible for this accident and any other accident that could have been prevented had the vehicle been equipped with AP1 instead of AP2. They released a system that was incomplete and incapable compared to the system that was already available and sold, with the main point being that the older system would have been extremely unlikely to have encountered the same issue that lead to this fatal accident. Instead they rushed out hardware with incomplete software to new customers, knowing full well that it was not as capable as the original."

He didn't say "Tesla is at fault because the driver had a vast comparative knowledge of Tesla's systems."

He didn't say "the driver isn't at fault."

He didn't say "Caltrans isn't at fault."

He seems to be saying, and please correct me if I'm wrong, that Tesla knowingly reduced the safety of their cars and that, to him, is unacceptable. It's not about the driver's state of mind. It's about Tesla knowingly putting customers at greater risk in order to save money.
 
He seems to be saying, and please correct me if I'm wrong, that Tesla knowingly reduced the safety of their cars and that, to him, is unacceptable. It's not about the driver's state of mind. It's about Tesla knowingly putting customers at greater risk in order to save money.

And I would disagree with that statement. Tesla did not choose to drop Mobileye when they did. Their plan was to run in parallel, but Mobileye parted ways early. It was never about saving money (to my knowledge).
 
Since Volvos were brought up: regarding Volvos and their reputation for safety, they don't always do so well either. Take this accident with a Volvo that just happened (3/22/18) in Indianapolis: Victim identified in deadly accident on Indianapolis' north side. Check out the several car photos included with the article. Not sure what model it was, or whether it had any driver-assist features or AEB at the time, but the car is pretty much demolished (the video indicates that responders had to use the jaws of life to extract the passengers). The front-seat passenger died; the driver and another passenger were hospitalized. The driver was trying to pass another car at the time and hit a light pole. I think Volvo has advertised that they haven't had any deadly accidents with their cars, but this one would make that statement incorrect now.

There was also this Volvo one-car rollover accident in Mashpee, Massachusetts, just this past month, although the driver wasn't seriously hurt: Driver Not Seriously Injured In Rollover Crash. I couldn't find any more info on this accident to say how it happened.

This Time Magazine article and video from 2015 illustrate how not to sell people on the XC60's driver-assist features. Running over your potential customers won't get you many buyers. I can't believe any salesperson would even attempt that, but it was an epic fail all the same. We have seen many news accounts of cars plowing into standing or slow-walking pedestrians on the street.

I do wonder if Volvo will still accept all liability for accidents when its cars reach full autonomous mode. It's a claim their CEO made in 2015. Will other manufacturers hold themselves to the same liability for their cars?

Unlike when a Tesla crashes and the Tesla name is widely reported, Volvo's name isn't widely reported or commented on, even in the deadly crash. I really had to search to find these few recent examples, and the media doesn't seem interested in commenting on the car's safety features when accidents do happen, which I find odd.
 
My gosh - how can any of this be Tesla's fault for whatever reason?

When you sit in the drivers seat, you are the driver.

You are responsible for the safe operation of the vehicle.

Not cameras and software.

I've been in the car operating AP2 in a situation where you have a gore point like this: the car doesn't know which direction to go. It's not a mind reader, even with a destination entered.

ALL you have to do is put on your turn signal to tell the car which way to go, and the car will turn. Most of the time.

Or simply steer the vehicle in the direction you want it to go. If you have a gore point with lines going in two different directions, how do you expect the vehicle to choose?

We ONLY use AP in situations where the vehicle is going straight and narrow with no deviations, and any time the vehicle needs to change direction WE are in control: we do NOT let the vehicle take exits by itself. Your job as driver is to minimize the chances of the car killing you. As others have noted, AP is like a beginning driver: it can easily maintain speed and distance in a straight line; ask it to do more and all of a sudden you start asking for problems.
 
My bet is AP2 swerved at the last moment, leaving the driver with no time to react.
Now, how exactly would that be a problem if you pay attention and hold the steering wheel as you're supposed to? It literally disengages immediately when the user and AP disagree, before the car moves much at all.

Use the system like it says in the manual. It's a Level 2 system; it needs to be babysat all the time. It's not AP's fault, even if it drives straight into a concrete wall. You were supposed to prevent it from doing that.
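As a minimal sketch of that disengage-on-disagreement behavior, assuming a simple torque-threshold model (the threshold and names are illustrative, not Tesla's actual values):

```python
def autosteer_stays_engaged(driver_torque_nm: float,
                            override_threshold_nm: float = 1.5) -> bool:
    """Hypothetical Level 2 override model: steering torque from the
    driver above a small threshold immediately disengages Autosteer,
    so a held wheel wins any disagreement before the car deviates far."""
    return abs(driver_torque_nm) < override_threshold_nm

print(autosteer_stays_engaged(driver_torque_nm=3.0))  # False: a firm tug hands control back
print(autosteer_stays_engaged(driver_torque_nm=0.2))  # True: resting hands, AP keeps steering
```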
 