Tesla Sued for Deadly Crash on Autopilot

The family of a man who died last year after his Tesla Model X crashed on a California highway while operating on Autopilot is suing Tesla.

The suit alleges wrongful death and negligence stemming from failures and false promises regarding the Autopilot driver-assistance system.

The incident took place on March 23 on a busy stretch of Highway 101 when Apple engineer Walter Huang’s vehicle drifted out of its lane and crashed into a concrete rail. The car’s battery erupted into flames.

The National Transportation Safety Board later reported that the car had accelerated from 62 mph to 70 mph four seconds before the crash.

In a March 2018 blog post, Tesla defended Autopilot, arguing that the system was not responsible for the crash.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in the post. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

Tesla said the reason this crash was so severe is that the crash attenuator, a highway safety barrier designed to reduce the force of an impact with the concrete lane divider, had either been removed or crushed in a prior accident and never replaced.

 
Yes, but that car was in the data set of non-AP passages. Would you like to take a guess at the difference in sample size?
No, don't care. Of course it's a larger sample size. Your line of argument is irrelevant to two issues:

1. The protective portion was left unreplaced by the only entity that could replace it, and it was in that state of disrepair for months if not years. I researched it back when this incident occurred but no longer remember the exact time period. Also, how other countries treat this situation is similarly irrelevant. Had the crash attenuator been properly replaced, the driver would likely have survived.

2. The driver is ultimately responsible, and was unfortunately not paying attention at the time of the accident.
 
Do you have any numbers on the accidents avoided because of using autopilot?
Please don't deflect. For every AP car passing the crash site, hundreds if not thousands of cars without it did the same.

Also, the first time an AP car managed to get by unhurt, it should have logged the change and earmarked that spot as increased risk, as if it were a cartoon bomb: take extra care. AP obviously wasn't aware of the missing absorber despite likely many passages. It's Cali, right? AP may not even have been aware of where each lane goes and where the lanes sit relative to the surroundings. I feel that if you drove the same route with a Waymo car, it would need many more passages before eventually hitting that lane divider, and likely it never would, because it sees the world differently than AP does.
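To make the "earmark it" idea concrete, here's a rough sketch of what a fleet-shared hazard flag could look like. To be clear, this is not how Tesla actually does it (or evidence that they do it at all); the class, coordinates and radius are all made up for illustration.

```python
# Hypothetical sketch of the "earmark a risky spot" idea above -- not Tesla's
# actual architecture, just an illustration of a fleet-shared hazard map.
from dataclasses import dataclass, field
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

@dataclass
class HazardMap:
    """Fleet-shared list of locations where something went wrong before."""
    flags: list = field(default_factory=list)  # (lat, lon, reason)

    def report(self, lat, lon, reason):
        self.flags.append((lat, lon, reason))

    def extra_caution_needed(self, lat, lon, radius_m=200):
        """True if the car is within radius_m of any previously flagged spot."""
        return any(haversine_m(lat, lon, f[0], f[1]) <= radius_m for f in self.flags)

# Usage: one car reports the damaged attenuator at the US-101 gore point
# (coordinates approximate, made up for the example); later cars passing
# nearby would tighten their behavior.
hazard_map = HazardMap()
hazard_map.report(37.4105, -122.0750, "crash attenuator missing / prior collision")

if hazard_map.extra_caution_needed(37.4108, -122.0752):
    print("Flagged area ahead: reduce speed, tighten driver-attention nags")
```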
 
I believe they have a separate lawsuit for that.
Not that it really matters, but that's not how it works. When there is joint (possible) liability, the plaintiff drags all the potential defendants in, they begin pointing fingers at each other (even at the plaintiff), and eventually fault is assigned as a ratio. Whether the outcome comes out ⅓/⅓/⅓, or 1%/1%/98%, or whatever, is yet to be determined. People shouldn't get their shorts all bunched up over this. It happens every day.

A coworker just finished up his own experience: getting deliberately hit by an unlicensed, undocumented driver with a criminal record to boot. A few days after the fact, a passenger of the plaintiff was concocted too. My coworker's insurance company, with no intestinal fortitude, rolled over after nearly two years and paid them a nominal fee to go away. Then people wonder why their rates are so high. The insurance lobby is one of the largest in the nation. Fake collisions... nice work if you can get it. My coworker has since been introduced to the world of dash cams.
 
The car veered out of its lane. How did this happen on Autopilot? What made the car deviate from its path? Why didn't the car brake when approaching the barrier, rather than accelerate?
The only answer we usually get is that a million miles with AP engaged result in fewer accidents than with it disengaged, despite there being limited overlap in the road miles covered by the two data sets.
It seems a lot needs to go wrong for such an accident to happen, but a lot of miles are being logged every day. Any bug that is allowed to persist will eventually cause a casualty. With more cars on the road, it may happen sooner, unless machine learning starts to kick in.
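Just to illustrate why the "fewer accidents per million AP miles" answer doesn't settle anything when the two data sets cover different roads, here's a toy comparison. The accident rates and mileage splits below are entirely made up, not real Tesla data.

```python
# Illustrative numbers only -- a sketch of why the aggregate "AP vs. non-AP
# accidents per mile" comparison can mislead when the two data sets cover
# different road mixes.

# Hypothetical accident rates (accidents per million miles) by road type.
rates = {"highway": 1.0, "city": 3.0}

# Assume AP miles are almost all highway, while non-AP miles are mixed.
ap_miles     = {"highway": 95, "city":  5}   # millions of miles
non_ap_miles = {"highway": 40, "city": 60}

def aggregate_rate(miles, rates):
    """Accidents per million miles, pooled across road types."""
    accidents = sum(miles[r] * rates[r] for r in miles)
    return accidents / sum(miles.values())

print(f"AP aggregate rate:     {aggregate_rate(ap_miles, rates):.2f} per M miles")
print(f"non-AP aggregate rate: {aggregate_rate(non_ap_miles, rates):.2f} per M miles")
# AP looks ~2x safer in the pooled numbers even though, per road type,
# both groups were given the *same* underlying accident rate.
```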
 
The car veered out of its lane. How did this happen on Autopilot? What made the car deviate from its path? Why didn't the car brake when approaching the barrier, rather than accelerate?

Long thread on this, but it appears there were poor/faded lane markings, the car latched to the wrong lane line or was just confused, and the car is not designed to stop for things going 0 mph (barriers) from highway speeds.

Several (crazy?) people were able to recreate the behavior of latching onto the wrong lines in a gore point situation.

Moral of the story: never let your guard down while using AP

Edit: one example showing the behavior in the crash area. Model X Crash on US-101 (Mountain View, CA)
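For anyone wondering how a system can simply not brake for a solid barrier: below is a very simplified, generic sketch of the stationary-object filtering that radar-based cruise systems commonly apply at highway speed. This is emphatically not Tesla's code or their actual logic; the function name and margin are invented for illustration.

```python
# A generic, simplified illustration of the "doesn't stop for 0 mph objects"
# point above. Many radar-based cruise systems discard stationary returns at
# highway speed to avoid phantom braking on signs, bridges and roadside
# clutter; this is NOT Tesla's code, just a sketch of the general idea.

EGO_SPEED_MPS = 31.0  # ~70 mph

def relevant_for_braking(target_range_rate_mps, ego_speed_mps=EGO_SPEED_MPS,
                         stationary_margin_mps=2.0):
    """Return True if a radar target should be considered for braking.

    target_range_rate_mps: closing speed measured by the radar (negative =
    approaching). A stationary object closes at roughly -ego_speed_mps.
    """
    # Absolute speed of the target over the ground (approximately).
    target_ground_speed = ego_speed_mps + target_range_rate_mps
    # Filter: ignore targets that are (nearly) stationary relative to the road.
    return abs(target_ground_speed) > stationary_margin_mps

# A slower lead car (closing at 5 m/s) is tracked; a fixed barrier
# (closing at full ego speed) is filtered out and never triggers braking.
print(relevant_for_braking(-5.0))            # True  -> lead vehicle, react
print(relevant_for_braking(-EGO_SPEED_MPS))  # False -> stationary barrier, ignored
```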
 
rather than accelerate.
Did it really accelerate beyond its set speed? Obviously it didn't detect the concrete barrier, otherwise we wouldn't be talking about this. There are a lot of things it doesn't slow down for. It's not FSD; it's like the cruise control installed in most cars. It's a driver-assistance system. It will kill you if you let it.
 
Edit: one example showing the behavior in the crash area. Model X Crash on US-101 (Mountain View, CA)
So much for machine learning. A Tesla crashed there and, judging from the footage, it wants to do it again.
Please, people, stop talking about neural networks and self-learning until we see it actually, consistently, learning. And if Tesla did push an update for that section of road, it took them long enough that the next random person, who would likely have been unaware, could have died the same way.

Also, let's take this in.
A well-publicized accident happened there, and AP was not disabled at that spot so Tesla could take the time to figure out the specifics. It could be a consistent error, and only a matter of time before it claims another life.

With this information in hand, the family might have a case against Tesla. Tesla is simply not doing its utmost and, from what outsiders can see, is hiding behind offensively skewed accident statistics. And now they are making AP standard before their testing and correction protocols can really be vetted.
 
From what I know, the driver had himself to blame for not paying attention. Tesla knew full well this was a risk, as it was causing accidents, even deadly ones.

How do you know this to be the case? It only takes a couple of seconds to swerve into that barrier. Is it possible that in those 2+ seconds the guy was checking his rear or side view mirrors to get over?
There are a lot of things that distract a driver from the road, and many of those things are responsible acts. We can't simply assume the guy was playing with his phone. Maybe he was messing with the radio. Are you an irresponsible driver if you change radio stations? Point is, we can't assume.
 
You'll find that an accident takes less time than that to develop. Try it enough times with a faulty version of AP and you'll know, or at least your loved ones will. Real-time monitoring is the way to go.

Also, wasn't it much less? People were changing or sleeping on the highway. Why was the nag changed?

Nags are definitely less than a minute apart. The people who fell asleep are the ones who kept their hands on the wheel (go back and review the one driver who fell asleep on the highway). They purposely tried to defeat the system, so it's all on them.
 
I don't see how Tesla could be found "at fault" when obviously the man in the car was not prepared to take over in the case of an emergency. Tesla tells everyone ALL THE TIME that you MUST be ready to take over if there is need. Apparently there WAS need for the driver to take over and he wasn't paying attention and crashed. It happens. Auto-pilot is NOT full self driving and everyone KNOWS it. I don't see how this is Tesla's fault, at all.
 
It’s time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! This case better get thrown out.

If this is the same person who was complaining about the shortcomings of Autopilot, why wasn’t he paying attention? I’d think he’d be more apt to do so.

Couldn’t agree with you more. Every time you activate Autopilot, it tells you to keep two hands on the steering wheel and to be ready to take over at any time. It’s an assistive driving system at this point. The driver is totally responsible. He was obviously not paying attention and couldn’t react in time.

This family just wants to make money out of this. I hope our legal system works, because these suits are just getting ridiculous. Whenever anyone gets into an accident in a Tesla, it’s automatically Tesla’s fault.
 
Did it really accelerate beyond its set speed? Obviously it didn't detect the concrete barrier, otherwise we wouldn't be talking about this. There are a lot of things it doesn't slow down for. It's not FSD; it's like the cruise control installed in most cars. It's a driver-assistance system. It will kill you if you let it.
Exactly. This driver treated Autopilot as if it were self-driving because it is so good. It’s his fault for not paying attention. If anything, Tesla and the city should counter-sue this family, because this driver could have killed other people by not paying attention and not following the rules.

The driver literally was not paying attention. He didn’t react in time. End of story.

Tesla should counter-sue so people stop this harassment. It’s ridiculous.