
Fatal autopilot crash, NHTSA investigating...

This is really sad.

Scary thing is that the truck driver continued on driving, and so did the Tesla for a short while, and it only stopped because the AP camera on the top windshield got destroyed.

Imagine if something goes through the windshield killing the driver, but all AP systems remain operational - the car would just keep on driving...

Elon's comment about the radar confusing the trailer with an overhead street sign doesn't make any sense. The purpose of radar is to measure the distance to an object, and it should have clearly distinguished a truck a few feet off the ground from a street sign 20 feet above the road.
 
...if he had a dashcam and it was smashed then the card could be anywhere from the road to the pole, in the weeds, etc.

If the dashcam was smashed to that extent, then it's unlikely the memory card could have been written to (i.e. video file saved to flash memory) before the dashcam was destroyed. I'm not sure at what interval / granularity the flash memory is written/saved, or whether a partially written but interrupted video file will be readable/recoverable. If the critical section of video only existed in the dashcam's RAM at the time of impact, then that section would be lost forever. Perhaps someone more knowledgeable about this can give their thoughts.
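
For what it's worth, here is a minimal sketch of how that loss can happen, assuming a dashcam that buffers each clip in RAM and only writes it to the card when the segment closes. The segment length and write behavior are assumptions for illustration; actual dashcam firmware varies.

```python
import os, time

# Hypothetical dashcam loop: frames are buffered in RAM and only written
# out (and fsync'd) when a segment closes. Anything captured after the
# last completed segment exists only in RAM and is lost if the unit is
# destroyed mid-segment. The segment length is a made-up assumption.
SEGMENT_SECONDS = 60

def record_loop(get_frame, out_dir="recordings"):
    os.makedirs(out_dir, exist_ok=True)
    while True:
        buffer = bytearray()                   # current clip lives in RAM
        start = time.monotonic()
        while time.monotonic() - start < SEGMENT_SECONDS:
            buffer += get_frame()              # append encoded frames
        path = os.path.join(out_dir, f"clip_{int(start)}.bin")
        with open(path, "wb") as f:            # segment closes: only now does it reach flash
            f.write(buffer)
            f.flush()
            os.fsync(f.fileno())
```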

Another remote possibility (this is pure conspiracy-theory speculation; I put the chance of this near zero, so I'm emphatically not suggesting it's true) is that the truck driver saw the dashcam, realized it contained evidence, and removed it from the scene. Given the Tesla driver's enthusiasm and track record of posting video, it is somewhat unusual that the dashcam wasn't in the car at the time of the crash. Of course, this possibility can be trivially ruled out if the dashcam is later found at his home or elsewhere.
 
You cherry-picked my post there. At least reply to the whole thing. Everything failed in this scenario. Truck driver shouldn't have been there. AP absolutely failed in this scenario. And yes, most responsible party here is the driver unfortunately. To defend AP 100% here is absurd. It did NOT function properly.

This is not meant to be snippy but rather to clarify a point. If you design brakes on a car to stop from some speed in 200' but there is then an accident where the brakes would have had to stop from the same speed in 150', did the brakes fail? If an aircraft has a designated minimum runway requirement of 4000' and it runs off the end of a 3000' runway, did the plane fail?

The emergency braking system on the Tesla is designed to prevent rear-end accidents. Lane keep assist worked fine in this accident. Basically you have advanced cruise control, which is designed to slow down when following a vehicle, failing to react to a lateral intrusion. Did it fail to do what it was designed to do? I don't think the system was ever designed to handle this scenario. The radar has many limitations in this scenario and Mobileye has said the camera isn't designed to handle this.

AP is just lane keep assist merged with smart cruise control. It seems to me people want a lock designed for a child's diary to protect their bicycle from theft. I have never seen Tesla claim the car could react in this situation.

I may not be saying things properly, but as an engineer I am sensitive to people insisting things work outside of the design envelope. We have many things in our lives that will have bad results in circumstances outside of the design environment. I doubt automatic emergency braking will prevent you from running into a chain link fence. However, if it prevents rear-end accidents it is still a good and useful thing.
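
To put rough numbers on the 200' vs. 150' braking example above, here's a quick back-of-the-envelope calculation; the 60 mph speed is an assumption chosen only to illustrate how much more deceleration the shorter stop demands.

```python
# Back-of-the-envelope check of the "200 ft vs. 150 ft" braking example.
# The 60 mph speed is an assumption; the point is just that a shorter
# stop demands proportionally more deceleration than the design allows.
v = 60 * 0.44704                 # 60 mph in m/s (assumed speed)
FT = 0.3048                      # feet to meters

def required_decel(distance_ft: float) -> float:
    """Constant deceleration (m/s^2) needed to stop from v in this distance."""
    d = distance_ft * FT
    return v**2 / (2 * d)

a_design = required_decel(200)   # what the brakes were sized for
a_needed = required_decel(150)   # what this hypothetical accident requires
print(f"designed for: {a_design:.1f} m/s^2, needed: {a_needed:.1f} m/s^2 "
      f"({a_needed/a_design:.2f}x the design capability)")
```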
 
This is really sad.

Scary thing is that the truck driver continued on driving, and so did the Tesla for a short while, and it only stopped because the AP camera on the top windshield got destroyed.

Imagine if something goes through the windshield killing the driver, but all AP systems remain operational - the car would just keep on driving...

Elon's comment about the radar confusing the trailer with an overhead street sign doesn't make any sense. The purpose of radar is to measure the distance to an object, and it should have clearly distinguished a truck a few feet off the ground from a street sign 20 feet above the road.

If the driver doesn't touch the wheel the car will eventually stop on its own. Tesla gives warnings to touch the wheel and then slows down if there is no reaction from the driver.

The problem is that the radar used isn't a scanned beam (phased-array radar) but a fixed antenna. Take the max distance you want to "see." Call this distance Lmax. You then ignore all returns with a greater delay. Now set the max height you want to read at that distance. Call this Hmax. That sets the beam's vertical dispersion angle. At distance Lmax/2 you will only see objects up to a height of Hmax/2. The radar is in the bumper area, so you quickly get to where you can't see very high.

The next issue is distinguishing what you see. In an earlier post I used the example of a blue sheet of paper on a blue wall. The road generates some return with a Doppler shift determined by the car's speed. A stopped vehicle will have the same Doppler shift, as will a laterally moving vehicle. If the object is a large wall, the increased reflection strength can be used to signal "hey, there is something big ahead," but something porous (a chain link fence) or a high trailer, where most of the signal goes under the truck, is much more difficult.
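
Here is a minimal sketch of the geometry described above, assuming a bumper-height radar with a fixed vertical fan beam; Lmax, Hmax, and the mount height are made-up numbers for illustration, not Tesla's actual radar specs.

```python
import math

# Illustrative numbers only -- not Tesla's actual radar specification.
L_MAX = 150.0      # max range we care about, meters
H_MAX = 3.0        # max height we want to cover at L_MAX, meters
MOUNT_H = 0.5      # radar mounted in the bumper area, meters above road

# A fixed (non-scanning) fan beam: the vertical half-angle is set so the
# top of the beam just reaches H_MAX at L_MAX.
half_angle = math.atan((H_MAX - MOUNT_H) / L_MAX)

def visible_height(range_m: float) -> float:
    """Highest point above the road the beam covers at a given range."""
    return MOUNT_H + range_m * math.tan(half_angle)

for r in (150, 75, 40, 20):
    print(f"at {r:>3} m the beam tops out around {visible_height(r):.2f} m")
# At half the range the beam only reaches roughly half the height, which
# is the point being made: a bumper-mounted fixed beam covers less and
# less of a tall trailer as the car gets closer to it.
```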
 
This is great. I do think LIDAR may be a part of the solution. I believe LIDAR also has more issues in fog. It will also be interesting to see what happens when all of those Teslas with LIDAR jammers react to the LIDAR signal :)

Stereoscopic vision systems should be a big improvement but they have issues with fog and nighttime.

I'm just trying to point out that there is no panacea. As different technologies get cheaper we will see more integration, and capabilities will increase.

EDITED to remove a comment when I read the details and found out the LIDAR unit mentioned is a scanning unit. Very nice.
 
A small portable DVD player was found in the Tesla; no dashcam was found or mounted:

DVD player found in Tesla Model S in May 7 crash -Fla officials
----------
The Florida Highway Patrol said on Friday that it found an aftermarket digital video disc (DVD) player in the wreckage of a Tesla Motors Inc Model S involved in a fatal May 7 crash.

"There was a portable DVD player in the vehicle," said Sergeant Kim Montes of the Florida Highway Patrol in a telephone interview.

She said there was no camera found, mounted on the dash or of any kind, in the wreckage.
----------
 
From the Guardian in the UK:

"Driver in first known fatal self-driving car crash was also driving so fast that ‘he went so fast through my trailer I didn’t see him’, the truck driver involved said"

"The Tesla driver killed in the first known fatal crash involving a self-driving car may have been watching a Harry Potter movie at the time of the collision in Florida, according to a truck driver involved in the crash."

"[Truck driver] Baressi, who did not immediately respond to requests for comment, said the Harry Potter movie “was still playing ..."

Tesla driver killed while using autopilot was watching Harry Potter, witness says

(I can't vouch for the veracity of any of this, but my condolences to the family for their tragic loss irrespective of the circumstances)
 
And, for the record, McDonalds didn't change the temp of the coffee even after the lawsuit. Just like autopilot will (and should) continue to be available for people to use - with caution. If you don't use caution then unfortunate things can happen.

Yes, but that's not a good thing. That's not how we want corporate America to behave.

Instead of McDonalds making the coffee safer, they just went and made themselves immune to liability. But probably just as many people burn themselves nowadays; fewer people sue, because their lawyers know they can't win.

Is that really all we want Tesla to strive for? To give better warnings to make themselves more immune from litigation?
 
If a driver becomes unconscious, the car would detect the lack of responsiveness (lack of touch on steering wheel) within a minute or two and slow down by itself.

That would depend on several factors.

* The road in question
* The firmware in the car (autopilot version)
* The time between his last input and losing consciousness.

I've seen videos where someone used autopilot for 30 minutes with no input. I've seen video where that was more like 30 seconds.

Until Tesla gives us a statement about how long before the accident the car got its last input from the driver, we have to consider the possibility that he wasn't able to prevent the accident because he wasn't conscious/awake/alive.

Maybe autopilot got input from him 30 seconds before the crash, maybe it was 5 minutes. We just don't know.
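
Purely as an illustration of the kind of logic being discussed, here is a hypothetical hands-on-the-wheel timeout sketch. Every threshold in it is invented; as noted above, the real behavior depends on the firmware version and the road.

```python
from dataclasses import dataclass

# Purely hypothetical numbers -- the real thresholds depend on the
# firmware version, the road type, and speed, as the post notes.
WARN_AFTER_S = 60.0      # visual "hold steering wheel" nag
CHIME_AFTER_S = 75.0     # audible warning
SLOW_AFTER_S = 90.0      # begin slowing the car

@dataclass
class HandsOnMonitor:
    last_input_s: float = 0.0   # seconds since last detected wheel torque

    def tick(self, dt: float, torque_detected: bool) -> str:
        """Advance the timer by dt seconds and return the current state."""
        if torque_detected:
            self.last_input_s = 0.0
            return "normal"
        self.last_input_s += dt
        if self.last_input_s >= SLOW_AFTER_S:
            return "slow_to_stop"
        if self.last_input_s >= CHIME_AFTER_S:
            return "audible_warning"
        if self.last_input_s >= WARN_AFTER_S:
            return "visual_warning"
        return "normal"
```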
 
I was just watching the news where this crash has become a big deal. As I watched I wondered if the headline is correct. Is this really the first self driving car fatality? AP is lane keep assist combined with active cruise control. Have there been any Mercedes fatalities where both systems were active? How about the 2015 or newer Hyundai Genesis? Since lane keep assist performed just fine, the issue was with active cruise control and automatic emergency braking. Have there been any deaths in a vehicle with automatic emergency braking? I suspect the answer is a "yes."
 
Er...? The existing metric is one fatality (so far) in 130 million AP-enabled miles, and you're speculating that the metric would be worse if the drivers didn't exercise safe behavior by choosing not to use it in unsafe conditions? What exactly does that prove?
The comparison is not very useful and even possibly misleading.

For it to be more useful, it should compare AP-driven miles ONLY to miles that AP can safely handle, instead of comparing them to all vehicle miles traveled. As most here know, AP has many limitations that make it unsafe to use in certain areas/conditions.

If AP were always turned on with its current limitations, it would surely have a much worse accident rate.
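
To make the selection-bias argument concrete, here is some rough arithmetic. The 130 million AP-mile figure is from this thread; the roughly 94-million-miles-per-US-fatality figure is the average Tesla's blog quoted alongside it; the 3x "AP-eligible miles are safer anyway" factor is a pure assumption to show the shape of the argument, not real data.

```python
# Illustrative arithmetic for the selection-bias point above.
ap_miles = 130e6               # AP-enabled miles with one fatality (from the thread)
us_miles_per_fatality = 94e6   # rough overall US average quoted by Tesla's blog

naive_ratio = ap_miles / us_miles_per_fatality
print(f"naive comparison: AP looks {naive_ratio:.2f}x better")

# But AP is (supposed to be) used only on the easiest miles: divided
# highways, good weather, clear lane markings. If those miles were,
# say, 3x safer than the average mile even without AP, the fair
# baseline changes. The 3x factor is an assumption, not real data.
assumed_safety_factor = 3.0
eligible_miles_per_fatality = us_miles_per_fatality * assumed_safety_factor
fair_ratio = ap_miles / eligible_miles_per_fatality
print(f"against AP-eligible miles only: {fair_ratio:.2f}x")
```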
 
The car gives "safety" features for free. So surely, REGARDLESS of autopilot being engaged, the car should stop - right? If you tried to drive into a brick wall the car would see it and slow down - right?

Tesla's blog admitted that both the system and driver failed to activate the brakes.

It is still much safer with Autopilot if you follow the instructions, which means staying alert and understanding its limitations.

So let's talk about the driving-into-a-brick-wall scenario.

Owners need to follow the instructions.

If you wire your household blender to a 12V battery and it doesn't run, it is not a design flaw that the blender only works with 120V.

If you do the same with 240V, then it is not a design flaw that the blender goes up in smoke.

The same goes for Tesla's free Automatic Emergency Braking feature.

Please read the manual: it says very clearly that its specification is not to avoid a crash. It says you should not rely on it to avoid a collision, because if you do, you might die.

Owners can send Tesla a wishlist of desired features for future designs.

But:

When operating your car, you should not operate based on your expectations of what the system should and should not do. It is not about "should" when your life is at stake. It is about what the system can actually do.


 
After all the info that has come out, and having had a few days to think about it, I gotta think the simple answer is true here. He probably figured he was pretty safe watching a Harry Potter DVD with no cars around him, when one of the very few scenarios that would ultimately lead to his death happened to occur.

The one-in-a-million scenario was that a semi truck would turn in front of him at the exact moment his Model S would go under it, during a very specific window of a few seconds when he wasn't watching the road.

One day soon the technology will exist that will slow a car just enough for someone to survive this.
 
Umm, it appears that about 45 days passed between the collision and the blog posting by Tesla. Something must have happened to make them go public. I am fairly certain that notifications were made a long time ago.
There is a lot of available info and data that has not been offered by Tesla or the investigating authorities, which IMHO renders most of what is being posted here pure speculation.

It appears there was some sort of gag order or something about this. The news didn't hear a peep about this from anyone from May 7 to yesterday. The cops who investigated the crash didn't talk, the family didn't talk, the truck driver didn't talk, and nobody involved in the investigation talked. What I find amazing is so many people knew about it for close to two months and nobody said anything until the NHTSA released their preliminary report.
 
Basically you have advanced cruise control, which is designed to slow down when following a vehicle, failing to react to a lateral intrusion. Did it fail to do what it was designed to do? I don't think the system was ever designed to handle this scenario.

TACC in other cars is not just about following a vehicle. It will notice and act when someone suddenly shows up in your lane, or if you are coming over a hill and suddenly see a stalled/slow moving car in the lane.

Besides, I don't think vehicles entering a road from a side street are purely lateral intrusions - it isn't as if they drive 100% laterally and then make a 90 degree turn into your lane.

If TACC is not designed to notice when someone is coming off a side street into your lane, it should be disabled, unless one thinks that it is really rare to have cars enter a road from a side street.
 
TACC in other cars is not just about following a vehicle. It will notice and act when someone suddenly shows up in your lane, or if you are coming over a hill and suddenly see a stalled/slow moving car in the lane.

Besides, I don't think vehicles entering a road from a side street are purely lateral intrusions - it isn't as if they drive 100% laterally and then make a 90 degree turn into your lane.

If TACC is not designed to notice when someone is coming off a side street into your lane, it should be disabled, unless one thinks that it is really rare to have cars enter a road from a side street.

Why would you disable a safety feature just because it might do nothing in a particular scenario? That's nuts. It isn't like it brakes when it shouldn't. It failed to act, leaving that to the driver.

Mercedes Distronic has the same issues, and the manual mentions stopped vehicles being an issue. As for someone moving into your lane, TACC does just fine. It does fine with a slow-moving vehicle up ahead.
 
Drivin - Here are some of the Distronic Plus warnings. I bolded the one about stationary vehicles.

WARNING

DISTRONIC PLUS does not react to:
• people or animals
• stationary obstacles on the road, e.g. stopped or parked vehicles
• oncoming and crossing traffic

As a result, DISTRONIC PLUS may neither give warnings nor intervene in such situations. There is a risk of an accident.

Always pay careful attention to the traffic situation and be ready to brake.

WARNING

DISTRONIC PLUS cannot always clearly identify other road users and complex traffic situations. In such cases, DISTRONIC PLUS may:
• give an unnecessary warning and then brake the vehicle
• neither give a warning nor intervene
• accelerate unexpectedly

There is a risk of an accident.

Continue to drive carefully and be ready to brake, in particular when warned to do so by DISTRONIC PLUS.

WARNING

DISTRONIC PLUS brakes your vehicle with up to 40% of the maximum braking force. If this braking force is insufficient, DISTRONIC PLUS warns you visually and audibly. There is a risk of an accident.

In such cases, apply the brakes yourself and try to take evasive action.
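
To get a feel for what "up to 40% of the maximum braking force" means in practice, here is some rough arithmetic; the 0.9 g peak deceleration and 65 mph speed are assumptions for illustration, not values from the Mercedes manual.

```python
# Rough stopping-distance arithmetic for the "40% of maximum braking
# force" warning above. The 0.9 g peak deceleration and 65 mph speed
# are assumptions, not values from the manual.
G = 9.81                       # m/s^2
v = 65 * 0.44704               # 65 mph in m/s
a_full = 0.9 * G               # assumed full-braking deceleration
a_limited = 0.4 * a_full       # what a 40%-of-max system can deliver

d_full = v**2 / (2 * a_full)
d_limited = v**2 / (2 * a_limited)
print(f"full braking:   {d_full:.0f} m")
print(f"40% of maximum: {d_limited:.0f} m  ({d_limited/d_full:.1f}x longer)")
```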