Blog: Tesla Sued for Deadly Crash on Autopilot

The family of a man who died last year after his Tesla Model X crashed on a California highway while operating on Autopilot is suing Tesla.

The suit alleges wrongful death and negligence stemming from failures and false promises regarding the Autopilot driver-assistance system.

The incident took place on March 23, 2018, on a busy stretch of Highway 101, when Apple engineer Walter Huang’s vehicle drifted out of its lane and crashed into a concrete barrier. The car’s battery erupted into flames.

The National Transportation Safety Board reported later that the car had accelerated from 62 mph to 70 mph four seconds before the crash.

Tesla published a blog post in March 2018 defending Autopilot, arguing that the system was not responsible for the crash.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in the post. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”
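As a quick sanity check of those figures (a minimal sketch; the 82-day window from January 1 to the March 23 crash is an assumption, since the post doesn't state the exact cutoff), the numbers are at least self-consistent:

```python
# Consistency check of Tesla's figures, assuming the "roughly 20,000 times"
# count runs from January 1 through the March 23 crash (2018, not a leap year).
trips_since_new_year = 20_000
days_elapsed = 31 + 28 + 23        # Jan + Feb + Mar 1-23 = 82 days (assumed window)

trips_per_day = trips_since_new_year / days_elapsed
print(f"{trips_per_day:.0f} trips/day")   # -> 244, consistent with "over 200" per day
```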

Tesla said the reason this crash was so severe is that the crash attenuator, a highway safety barrier designed to reduce the force of an impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced.

 
AP with all its supposed redundancy and machine learning accelerated into it.

Facts not in evidence.

As per the NTSB:
The National Transportation Safety Board reported later that the car had accelerated from 62 mph to 70 mph four seconds before the crash.

That means the acceleration event was completed ~400 feet prior to the impact.

That's not "accelerating into" an object when it's still over a football field's length away.
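For a rough check of that figure (a back-of-envelope sketch, assuming the car held roughly 70 mph for the full 4 seconds):

```python
# Back-of-envelope check of the "~400 feet" claim, assuming a constant
# ~70 mph over the final 4 seconds reported by the NTSB.
MPH_TO_FPS = 5280 / 3600           # 1 mph = ~1.467 ft/s

speed_fps = 70 * MPH_TO_FPS        # ~102.7 ft/s at 70 mph
distance_ft = speed_fps * 4        # ground covered in those 4 seconds

print(f"{distance_ft:.0f} ft")     # -> 411 ft; a football field is 300 ft
                                   # between the goal lines, 360 ft overall
```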
 
So much for machine learning. A Tesla crashed there, and judging from the footage, it wants to do it again.
Please, people, stop talking about neural networks and self-learning until we see the system actually, consistently, learning. And if Tesla did push an update for that section of road, it took them long enough that the next random driver, who would likely have been unaware, could have died the same way.

Also, let's take this in.
A well-publicized accident happened there, and AP was not disabled on that stretch so Tesla could take the time to figure out the specifics. It could be a consistent error, and only a matter of time before it claimed another life.

With this information in hand, the family might have a case against Tesla. They are simply not doing their utmost and, from what outsiders can see, are hiding behind offensively skewed accident statistics. And now they are making AP standard before their testing and correction protocols can be properly vetted.

Do you understand that this is not a self-driving car? Autopilot is merely a driver-assistance system. The driver needs to stay attentive with both hands on the wheel. How is this unclear to you?

Only a person who doesn't drive a Tesla speaks like that. What car do you drive?
 
People's expectation: if you drive a Tesla, you should not get into an accident, and if you do, the car is at fault. That is literally what a family member who doesn't drive one (yet) stated. People's expectations are not in line with reality. Just because Tesla is rolling out FSD in the future doesn't mean the car is self-driving right now. Even then, it will probably require the driver's attention due to current regulations. These lawsuits need to stop. It's totally ridiculous.
 

Some people expect the Automatic Emergency Braking or Forward Collision Warning systems to work properly.
Do those require driver attention?
 

And they are 100% correct.

What's the purpose of a car's safety mechanisms if they do not work?
 
Safety mechanisms are there to help you, not replace you as the driver. It's 100% the driver's fault if he was not holding the steering wheel or not paying attention. Autopilot works very well, and people rely on it as if they can fall asleep. That is not what is advertised, and every time you activate it, it tells you to keep both hands on the wheel and be ready to take over at any time.

But if the driver hit the brake and it didn't work, then it would be Tesla at fault.

Does that make sense? You wouldn't say the same about BMW, Lexus, or Toyota, so why Tesla?
 
What % do they advertise?
They advertise that AEB can prevent 0% of collisions, just minimize the impact.

Tesla Owner's Manual said:
Warning: Automatic Emergency Braking is not designed to prevent a collision. At best, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death.
 
I can't believe we are playing these word games.

What % of the time is AEB advertised to 'minimize impact'?
What % of the time does the frontal collision warning system work?
I have to assume they have done extensive testing to find out what the tolerances/successes/issues/failures are.
 

There aren't any word games. I don't think Tesla's AEB is very good, but to answer your questions:

  • What % of the time is AEB advertised to 'minimize impact'? 0% of the time; it merely attempts to brake.
  • What % of the time does the forward collision warning system work? I don't have a percentage, but it does seem pretty low.
  • I have to assume they have done extensive testing to find out what the tolerances/successes/issues/failures are. Yes, we are all beta testing now.
They even flat-out say you can die by depending on it.
 
Do you understand that this is not a self-driving car? Autopilot is merely a driver-assistance system. The driver needs to stay attentive with both hands on the wheel. How is this unclear to you?

Only a person who doesn't drive a Tesla speaks like that. What car do you drive?
Don't be that guy, please. You're making the whole BEV scene look terribly immature with that kind of response.

You're telling me that, with an army of software engineers in California, it's too much to ask to expect someone to make note of a crash that happened on AP and then take some measures to prevent a recurrence, at least in that location?
Tesla knows full well these cars are just robots behaving as programmed, all while selling the world on their machine learning.
Well, they don't back up the lie with common sense and manpower.

How can you excuse a company with so much control over its fleet's software for NOT taking any measures? They knew full well about the crash the instant it happened. They call to check on you when the car senses an impact, especially when the airbags go off. But never once does someone DO anything, just in case there's a 1%-of-1%-of-1% chance that MAYBE it was the software version's fault, or that a road situation trips up more than a statistically small number of cars.

Gross negligence, if you ask me. And how can we expect these accidents to stop happening even after Level 5 has been achieved? When (not if) one happens, is this the kind of response we are to expect from Tesla? Zero response, just them (or some AI communications rep) pulling up iffy stats showing FSD is still safer than the average Tesla driver? Is that a comforting response from the AI dev team, to you? Just let nature run its course until the next scheduled (and then delayed) update?
 

Something tells me you're not a big believer in personal responsibility... Tesla is 0% at fault for this crash and the driver is 100% at fault, case closed.

Jeff
 

Just like the pilots were at fault in the Boeing 737 MAX crash! It's their fault they couldn't predict and prevent the plane's shitty control system from flying it into the ground.
 

Completely and totally different thing entirely... What a horrible false equivalence... Come on... Where do I start? Let's see: one is a car, the other is a plane. That's a pretty good start... One gives you instant visibility of the ground and potential impact objects; the other severely limits what pilots can actually see as it relates to altitude and whatnot... Lastly, one system is advertised as beta, with instructions to pay attention at all times; the other was slipped in with essentially zero notification or training and with major known (and undisclosed) flaws...

Again... Come on...

Jeff
 
Ok, labeling something as beta absolves Tesla of any responsibility. Got it!