Welcome to Tesla Motors Club

Tesla wins first US Autopilot trial involving fatal crash

I think there are very irresponsible drivers who think they can drink and drive, putting autopilot on to help them. They are too drunk to make sure AP is on. It is quite possible AP comes off and they are too inebriated to even figure that out. Tragic - that too with kids in the car.
Thankfully the CEO of the company isn't on his own social media network giving drunk drivers a further sense of comfort about drinking and using Autopilot, on the grounds that it will, ironically, almost certainly save their lives.
 
  • Funny
Reactions: EVNow
OTOH, if the claims against large companies are the same as those against non-wealthy individuals, they will care very little about safety and just write off "small" claims as a cost of doing business. Even with large exposure, we have seen multiple cases of large companies ignoring safety to save a few bucks, or, in the cases of Uber and Cruise, doing things that are clearly not safe.

I think there are very irresponsible drivers who think they can drink and drive, putting autopilot on to help them. They are too drunk to make sure AP is on. It is quite possible AP comes off and they are too inebriated to even figure that out. Tragic - that too with kids in the car.
You can't legislate for stupid. 😆
 
  • Like
Reactions: sleepydoc
Actually, that's not a correct quote. The actual quote is "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road."

It's "May" and not "Will". Normally I wouldn't nitpick this, but you did specifically say it was a quote.
...and it dramatically changes the meaning of the sentence.
 
  • Like
Reactions: pilotSteve
In the 90's, when the Greyhound bus line was sporting forward and blind-spot radars, the meager goal was giving bus drivers a split second of advance warning for improved outcomes, as well as data for accident reconstruction. Of course, the systems back then didn't control steering and such, but Greyhound expected fewer lawsuits. And with those old systems there wasn't much talk about forward-warning system design snafus or failures, let alone NN training data challenges.

The $10k to $15k folks paid TSLA sets up a target for ongoing lawsuits over negligence. And TSLA claims the moon while disclaiming liability to the fullest extent of the law. Of course these lawsuit results heavily depend on the local jury pool. But even then, one wonders if juries will be as accommodating/forgiving, especially when otherwise innocent drivers become roadway victims of modern driving conveniences. And I won't be surprised if disgruntled TSLA employees become expert witnesses.

I'm still waiting to hear anything about that multi car pile-up on the SF bridge/under/overpass.

Probably no surprise if the public never hears about cases TSLA loses.
I've seen footage of the SF bridge pileup and it's not at all clear what was going on. Even if it was a case of phantom braking, what was the driver doing/thinking? There are a lot of unanswered questions. Without more data it's impossible to do anything more than guess.
 
The most recent model years that rely exclusively on Vision are at the highest risk for future lawsuits. I hope the new radar in the S and X is working well and on track to come to, and be active in, the 3, Y and Cybertruck. It was a mistake to jump away from the old radar so quickly without having a replacement ready. Vision has too many fail points. It's very possible that the usefulness of these features becomes limited by this decision as autonomy progresses.
 
Actually, that's not a correct quote. The actual quote is "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road."

It's "May" and not "Will". Normally I wouldn't nitpick this, but you did specifically say it was a quote.
Correct, my apologies.

But, in my defense, the car has, on every release so far, done something that would have bent metal or, if allowed to go its own way, would have required other drivers to dodge. One mistake anywhere by people not running FSD would have resulted in a crash.

I will admit it does the scare-the-pants act a lot less than it used to.
 
  • Like
I've said this many times - AP/FSD Beta will cause accidents, and it will kill people. People need to accept that. The point is that it should cause FEWER accidents and kill FEWER people than humans do. Last year 42,000 people died in car crashes in the US. If we can save any of those people by using ADAS systems, it's worth it.

The issue is human nature. Why is it more acceptable to us if another human kills a friend or loved one in a car accident vs a computer killing them? We want computer systems to be infallible and cause 0 accidents, but that's not reasonable.
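The rate comparison implied above only works if deaths are normalized by miles driven, since humans and ADAS systems cover vastly different mileage. A minimal sketch of that normalization, where the ~42,000 deaths figure comes from the post above and every other number (total US miles, ADAS deaths and miles) is a hypothetical placeholder:

```python
# Sketch: comparing human vs ADAS fatality rates per mile driven.
# Only the ~42,000 US road-deaths figure comes from the post above;
# all other numbers are illustrative assumptions, not measured data.

def fatalities_per_100m_miles(deaths: float, miles: float) -> float:
    """Deaths normalized per 100 million vehicle miles traveled."""
    return deaths / (miles / 100_000_000)

human_deaths = 42_000   # cited in the post (US, one year)
human_miles = 3.2e12    # assumed annual US vehicle miles traveled
adas_deaths = 10        # hypothetical ADAS-involved deaths
adas_miles = 2e9        # hypothetical miles driven under ADAS

human_rate = fatalities_per_100m_miles(human_deaths, human_miles)
adas_rate = fatalities_per_100m_miles(adas_deaths, adas_miles)

print(f"Human baseline: {human_rate:.2f} deaths / 100M miles")
print(f"ADAS (assumed): {adas_rate:.2f} deaths / 100M miles")
# Caveat: the comparison only means anything if both rates cover
# similar roads and conditions -- highway-heavy ADAS miles bias
# the ADAS rate downward relative to the all-roads human baseline.
```

With these placeholder inputs the human baseline works out to about 1.31 deaths per 100M miles, which is why raw death counts alone can't settle the "is it safer" question.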
Well said. Like anything in our society, there is always a large group of people opposed to something, and they will latch on to any excuse in an attempt to destroy it. FSD is one of the hot topics at the moment; EVs and green energy are others.

“FSD kills people”
“EV batteries are dirty to make and they are charged from coal!”
“Wind turbine blades can’t be recycled and they kill birds!”
Etc

The lesson here being that in the last few hundred years, technology may have advanced but humans certainly haven’t.
 
I've said this many times - AP/FSD Beta will cause accidents, and it will kill people. People need to accept that. The point is that it should cause FEWER accidents and kill FEWER people than humans do. Last year 42,000 people died in car crashes in the US. If we can save any of those people by using ADAS systems, it's worth it.

The issue is human nature. Why is it more acceptable to us if another human kills a friend or loved one in a car accident vs a computer killing them? We want computer systems to be infallible and cause 0 accidents, but that's not reasonable.
That, and a major injury is basically hell on earth... meaning the rest of your time on earth is hell.
 
I think it was either this or reduce production - which meant putting Tesla in financial distress again. Having gone through that a couple of times, I can understand why Elon chose this path.
No, there was another option - tesla could have shipped the cars without the sensors and installed them once available. They chose the cheap option rather than the right one.
 
  • Funny
  • Like
Reactions: EVNow and Kimmi
That, and a major injury is basically hell on earth... meaning the rest of your time on earth is hell.
Are you saying you'd prefer to die vs living with a major injury? That's a personal preference, impossible to calculate.
Also hard to calculate how many lives would be saved vs major injuries w/o death.
The assumption would be if there were less deaths, there would also be less major injuries, not more.
 
I think it was either this or reduce production - which meant putting Tesla in financial distress again. Having gone through that a couple of times, I can understand why Elon chose this path.
Based on the recent book it wasn’t going to affect production afaik. Elon wanted to reduce cost and said the radars were causing trouble. His engineers argued with him and said removing them was not a good idea, but Elon, of course, overrode them and made the decision which iirc at least one engineer left the company over. In March this year, it was reported that Tesla crashes have increased since removing radar functionality from AutoPilot.

Personal safety was foregone because Elon wanted to save a few bucks. If radar was a problem in software, they could have kept the hardware, but lessened its use until that was solved. Either by replacement hardware or by software updates.

Elon should have listened to his engineers and trusted them to solve the problem. Instead he went full in on Vision and now people’s lives are being affected by that decision.
 
No, there was another option - tesla could have shipped the cars without the sensors and installed them once available. They chose the cheap option rather than the right one.
Tesla didn't go bankrupt like so many other new auto companies in the last 100 years because they chose the "cheap" option.

Elon should have listened to his engineers and trusted them to solve the problem.
As an engineer that has argued with management to take the safer option on several occasions, I can say that if Silicon Valley execs listened to engineers all the time, the pace of innovation would be much slower ;)
 
As an engineer that has argued with management to take the safer option on several occasions, I can say that if Silicon Valley execs listened to engineers all the time, the pace of innovation would be much slower ;)
The problem with that argument in this case is that Autopilot reliability and safety have gone backwards, not forwards, since removing radar. Innovation has been slowed by the removal of radar.

You don’t seem to have any concern about the consequences of that decision. People are literally getting hurt because of it.

Innovation shouldn’t come at the cost of lives. While that may seem dramatic, that’s what is at stake with automated driving systems. The longer Tesla goes on selling cars that only support Vision, the more likely it is for Tesla to begin losing cases where Autopilot will be at fault and Tesla will be held accountable.
 
  • Like
Reactions: sleepydoc
Do we have any hard data-- as opposed to anecdotal data-- supporting this assertion?
You could Google it. I'm not the researcher. So, no, *I* don't have hard data. I have personal experience going from version 10.x.x software to current. And yeah, my personal experience is anecdotal. But there are people following this with actual data. Blame "the media" however you choose. If my concerns are invalid, it will be proven over time. There's no way to say for certain if Vision will eventually work as some expect. But it's pretty obvious that cameras won't work well at night, in weather, and when otherwise obstructed.


 
Tesla has a history of removing hardware, or making major software changes before they're ready, and then testing them on the Beta fleet. Radar was one of those things. I personally had radar issues with my MY, so I saw improvement in PBs when radar was disabled for Tesla Vision. But that's not everyone's experience. I do acknowledge that once radar was disabled (and/or removed from new vehicles), the software wasn't quite ready for it. My personal experience on FSD Beta back in late 2021 was pretty nail-biting. It could barely make a right turn, and lefts would frequently fail mid-turn. However, as the FSD Beta software has improved, we'll likely see fewer accidents. The big issue is that the FSD Beta stack needs to take over AP/EAP so those cars benefit from the new code.
 
Tesla didn't go bankrupt like so many other new auto companies in the last 100 years because they chose the "cheap" option.


As an engineer that has argued with management to take the safer option on several occasions, I can say that if Silicon Valley execs listened to engineers all the time, the pace of innovation would be much slower ;)
Innovation is creating a new or better solution to a problem. Tesla did neither.
 
  • Like
Reactions: KerrySkates