Model Y Driving with FSD Accident

All,

This is my first time posting on this website, and I just want to share an experience with you all. My wife was driving our Model Y with FSD engaged on local roads.
  • The vehicle was at location A in the figure below (the left of two lanes on Spring Cypress Rd).
  • The vehicle was supposed to make a right turn onto Louetta Rd, toward position C.
  • Because there was another vehicle on its right, my Model Y could not change from the left lane to the right lane before the turn.
  • However, per the navigation route, my Model Y needed to make its right turn at that corner, even though it should not have, because only the right lane is allowed to turn right.
  • At the corner, FSD was thinking hard about what to do before the turn. After the other vehicle passed, it decided to make the right turn.
  • After making the right turn at very low speed, it sped up and drove into a community sign (position B in the figure), causing the collision. The computer thought there was a lane to drive in, but there is not.
The FSD bug, as I see it: the computer should plan ahead and change from the left lane to the right lane well before the point where it has to make the right turn; it cannot change lanes at the last minute before the turn. Beyond that, the vehicle failed to see the road edge and a big community sign right in front of it.
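Just to illustrate the kind of look-ahead logic I mean, here is a toy sketch I put together. It is not Tesla's code; the buffer distance, function, and field names are all made up:

```python
# Toy sketch of look-ahead lane planning (illustrative only, not Tesla code).
# Idea: commit to the turn lane well before the turn, and if that is no longer
# possible, skip the turn and reroute instead of forcing it.

LANE_CHANGE_BUFFER_M = 300  # assumed distance by which the merge should be done

def plan_lane(current_lane: str, turn_lane: str,
              distance_to_turn_m: float, turn_lane_clear: bool) -> str:
    """Return the action the planner should take right now."""
    if current_lane == turn_lane:
        return "stay"                    # already in the correct lane
    if distance_to_turn_m > LANE_CHANGE_BUFFER_M:
        return "start_merging_early"     # plenty of room: begin the lane change now
    if turn_lane_clear:
        return "change_lane_now"         # late, but still possible
    return "skip_turn_and_reroute"       # blocked at the last minute: do NOT turn

# The situation in this accident: left lane, right turn coming up, right lane blocked.
print(plan_lane("left", "right", distance_to_turn_m=40, turn_lane_clear=False))
# -> "skip_turn_and_reroute"
```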

Question: the vehicle was smoking for the first few seconds, but it stopped after that. Both front airbags deployed. As you can see from the pictures below, do you know how Tesla Insurance will handle a case like this? We are in communication with the insurance and the estimate center, but it just takes time, especially with the holiday season coming up.

[Attachments: figure marking locations A, B, and C, and photos of the damage to the vehicle]
 
Thanks for sharing the FSD experience so others can take caution. Smoke and deployed airbags sound like a total write-off to Tesla Insurance. For an average person (with no test-pilot training), I wouldn't recommend letting FSD drive you.

Sorry for your loss.
 
The FSD bug, as I see it: the computer should plan ahead and change from the left lane to the right lane well before the point where it has to make the right turn; it cannot change lanes at the last minute before the turn.
My question is: once the driver understood that FSD had made a wrong decision and/or that the probable course of events wasn't going to end well, why did they leave FSD in control?
 
My question is: once the driver understood that FSD had made a wrong decision and/or that the probable course of events wasn't going to end well, why did they leave FSD in control?
Many consumers are not trained test pilots. Test pilots are trained with the understanding that their aircraft are not yet safe for public use.

Consumers have even heard that FSD will achieve Level 4 or 5 this year, 2023, of which only 23 days remain, so what could go wrong?


Disinformation trains consumers to rely even more on FSD technology to save them in accident scenarios like the one in this thread.
 
Did the dashcam system save any video? Very curious how Tesla handles this. I've heard horror stories about Tesla insurance.

There's no need for dashcam footage because there are no other drivers in this accident to blame. If you have comprehensive coverage, Tesla Insurance will pay, no questions asked. If you don't, then you get nothing from Tesla Insurance.

If you try to use the dashcam to put the blame on FSD, well, good luck. Juries have blamed drivers, not the software.

 
There's no need for dashcam footage because there are no other drivers in this accident to blame. If you have comprehensive coverage, Tesla Insurance will pay, no questions asked.

If you try to use the dashcam to put the blame on FSD, well, good luck. Juries have blamed drivers, not the software.


I was interested in seeing what happened, not for insurance.

This wouldn't be a comp claim, it'd be collision.
 
My question is: once the driver understood that FSD had made a wrong decision and/or that the probable course of events wasn't going to end well, why did they leave FSD in control?
It happened while turning the corner; the driver was confused about what FSD was going to do next while FSD was thinking very hard. The driver was also thinking hard about what to do. The vehicle was turning at a very low speed, around 4-5 mph; once the turn was completed, it sped up to 40 mph within a few seconds. Once it turned, I was told it was impossible for the driver to change lanes on Louetta Rd.

The driver was watching what FSD was doing carefully, but failed to take control back in such a short time.

Question: why couldn't Tesla's front sensors detect the big stone wall ahead of it sooner after the turn? The car drove right into the stone wall, causing the collision. If the sensors could see obstacles at a longer distance in front of the car, the collision could have been prevented. Is this a hardware design issue on Tesla's part??
 
I was interested in seeing what happened, not for insurance.

This wouldn't be a comp claim, it'd be collision.
Once the collision happened, the computer shut down and the display went black. There was no way to open the glove box and take the USB drive out.

I tried to open the door to get into the vehicle for the USB drive, but the vehicle was completely shut down with the doors closed.

I guess the cameras are not recording anything anymore, am I correct?
 
There's no need for dashcam footage because there are no other drivers in this accident to blame. If you have comprehensive coverage, Tesla Insurance will pay, no questions asked. If you don't, then you get nothing from Tesla Insurance.

If you try to use the dashcam to put the blame on FSD, well, good luck. Juries have blamed drivers, not the software.



We do have comprehensive coverage from Tesla Insurance. We are not too concerned about whether the insurance will pay or not. We are just concerned about how Tesla Insurance will handle it, how long it will take, and how badly it will affect our insurance history.

Obviously this was an error by the software, but I have to admit the driver should take priority over the software. As explained in my other replies, the driver just could not take control of the vehicle back in such a short time.
 
I was interested in seeing what happened, not for insurance.

This wouldn't be a comp claim, it'd be collision.

It happened while turning the corner; the driver was confused about what FSD was going to do next while FSD was thinking very hard. The driver was also thinking hard about what to do. The vehicle was turning at a very low speed, around 4-5 mph; once the turn was completed, it sped up to 40 mph within a few seconds. Once it turned, I was told it was impossible for the driver to change lanes on Louetta Rd.

The driver was watching what FSD was doing carefully, but failed to take control back in such a short time.

Question: why couldn't Tesla's front sensors detect the big stone wall ahead of it sooner after the turn? The car drove right into the stone wall, causing the collision. If the sensors could see obstacles at a longer distance in front of the car, the collision could have been prevented. Is this a hardware design issue on Tesla's part??

What happened is that Tesla's FSD technology is an unfinished product. And people keep asking: why?

Imagine you are sleeping soundly in your bedroom, and suddenly rainwater pours down and wakes you up. Why? Because the house is not finished. You moved into and slept in a house whose roof was never completed: there's no roof over the spot above your bed!

Same with FSD. Why didn't it stop? Because no one wrote the necessary code to make it stop. No roof = wet. No code = collision.
 
Obviously this was an error by the software, but I have to admit the driver should take priority over the software.

As long as you don't go after anyone from Tesla with these kinds of statements and focus on the insurance part, you will probably be fine getting the car fixed / totaled / whatever. If you start in on "this is FSD's fault" or try to get Tesla to take the blame instead of the driver, then you won't have an easy time getting this resolved.

It's always the driver's fault, regardless of what happened / happens.
 
Once the collision happened, the computer shut down and the display went black. There was no way to open the glove box and take the USB drive out.
That's expected. The power disconnects to prevent electrocution of first responders.
I tried to open the door to get into the vehicle for the USB drive, but the vehicle was completely shut down with the doors closed.
You might want to follow the instructions for jump-starting the 12V battery to get your doors and glove compartment open.
I guess the cameras are not recording anything anymore, am I correct?

Yes. Once the power is disconnected, the dashcam shuts down.
 
It happened while turning the corner; the driver was confused about what FSD was going to do next while FSD was thinking very hard.
It's a traffic light, and the car was in the wrong lane for a right turn. That was the right moment to disengage.

I'm also interested to understand how you know a neural network is thinking very hard :)

Question: why couldn't Tesla's front sensors detect the big stone wall ahead of it sooner after the turn?

As far as I understand, only the camera feeds are used by FSD. Beyond that, the neural network model consumed all the information available, and I would guess that being in the wrong lane, having a car on the right obstructing the right lane, and trying to stay clear of it put the model in a scenario for which it is not well trained, or one with conflicting signals.
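To make "conflicting signals" a bit more concrete, here is a toy scoring example I put together. It is not Tesla's planner; all of the objectives, weights, and numbers are invented:

```python
# Toy example: a planner that scores candidate maneuvers against competing
# objectives. If perception misjudges how "drivable" the area past the curb is,
# the scoring can flip toward an unsafe maneuver. Everything here is made up.

WEIGHTS = {"follows_route": 8.0, "clear_of_traffic": 3.0, "drivable": 10.0}

def score(features: dict) -> float:
    """Higher score = more attractive maneuver."""
    return sum(WEIGHTS[k] * v for k, v in features.items())

stay_in_left_lane = {"follows_route": 0.0, "clear_of_traffic": 1.0, "drivable": 1.0}

# Same turn maneuver, two perception outcomes for the area beyond the corner:
turn_correct   = {"follows_route": 1.0, "clear_of_traffic": 1.0, "drivable": 0.0}
turn_misjudged = {"follows_route": 1.0, "clear_of_traffic": 1.0, "drivable": 0.9}

print(score(stay_in_left_lane))  # 13.0
print(score(turn_correct))       # 11.0 -> planner would stay in the lane
print(score(turn_misjudged))     # 20.0 -> planner turns toward a non-drivable area
```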
 
What happened is that Tesla's FSD technology is an unfinished product. And people keep asking: why?

Imagine you are sleeping soundly in your bedroom, and suddenly rainwater pours down and wakes you up. Why? Because the house is not finished. You moved into and slept in a house whose roof was never completed: there's no roof over the spot above your bed!

Same with FSD. Why didn't it stop? Because no one wrote the necessary code to make it stop. No roof = wet. No code = collision.

It's a traffic light, and the car was in the wrong lane for a right turn. That was the right moment to disengage.

I'm also interested to understand how you know a neural network is thinking very hard :)



As far as I understand, only the camera feeds are used by FSD. Beyond that, the neural network model consumed all the information available, and I would guess that being in the wrong lane, having a car on the right obstructing the right lane, and trying to stay clear of it put the model in a scenario for which it is not well trained, or one with conflicting signals.
The main way to tell that the vehicle was thinking very hard is from the car's behavior: it kept trying to turn left and right. I think my wife missed that critical moment to disengage it because the machine was confused. The human was supposed to take control back right away. I believe that was our fault. I used to disengage FSD whenever I encountered such a situation, but my wife didn't. It is sad.
 
There's no need for dashcam footage because there are no other drivers in this accident to blame. If you have comprehensive coverage, Tesla Insurance will pay, no questions asked. If you don't, then you get nothing from Tesla Insurance.

If you try to use the dashcam to put the blame on FSD, well, good luck. Juries have blamed drivers, not the software.


I believe our Tesla Insurance will pay for all the loss after our deductible; we are not too concerned about that.

The only concern is how Tesla will handle the case: will the car be repaired or totaled? And will the accident damage our insurance record? I guess it will. We just don't know how much the insurance will increase because of the collision.

By the way, the police also gave my wife a ticket requiring her to appear in court. I guess she has to admit full fault in the accident for not controlling the speed before the collision, am I correct? This is our first time dealing with something like this in this country, and we have no clue what to do...
 
The main way to tell that the vehicle was thinking very hard is from the car's behavior: it kept trying to turn left and right. I think my wife missed that critical moment to disengage it because the machine was confused. The human was supposed to take control back right away. I believe that was our fault. I used to disengage FSD whenever I encountered such a situation, but my wife didn't. It is sad.
Many drivers are conned into putting their faith in FSD technology. They didn't believe that it could still get them into trouble: collisions, injuries, and fatalities.

When the system is about to do something wrong, drivers still have faith and think the technology will bail them out at the last second, so they wait until the collision happens.

To prevent this scenario, Waymo has to spend time writing code for all of these scenarios. They also do high-definition pre-mapping of the route (the same way Tesla did for its 2016 demo). Waymo also uses multiple sensor types, while Tesla has removed sensors. Waymo's sensors are up to date, including 4D imaging radar, not the cheap radar Tesla used...

What that means is, a Waymo car would know all the chess moves at the start of the trip: which lane to take (the Tesla took the wrong lane), how to make the right turn to avoid this kind of collision...

Tesla doesn't believe in Waymo's approach.
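A simplified sketch of what "knowing the chess moves ahead of time" from an HD map could look like (purely illustrative on my part; the map contents, field names, and distances are invented, not Waymo's or Tesla's actual data):

```python
# Illustrative only: with a pre-built HD map, the required lane for every
# maneuver on the route is known before the trip starts, so lane changes can
# be scheduled early instead of decided at the corner. All data is invented.

HD_MAP = {
    # road segment -> the next maneuver, the lane it requires, and where it happens
    "Spring Cypress Rd": {"maneuver": "right turn onto Louetta Rd",
                          "required_lane": "right",
                          "maneuver_at_m": 500},
}

def lane_plan(route):
    """Produce lane instructions for the whole route before driving it."""
    plan = []
    for segment in route:
        info = HD_MAP[segment]
        plan.append(f"On {segment}: be in the {info['required_lane']} lane "
                    f"at least 300 m before the {info['maneuver']} "
                    f"(maneuver at {info['maneuver_at_m']} m).")
    return plan

for step in lane_plan(["Spring Cypress Rd"]):
    print(step)
```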
 
I believe our Tesla Insurance will pay for all the loss after our deductible; we are not too concerned about that.
Correct.
The only concern is how Tesla will handle the case: will the car be repaired or totaled?
My guess, given the smoke and the airbags: totaled.
And will the accident damage our insurance record? I guess it will.
Correct.
We just don't know how much the insurance will increase because of the collision.
You can call them. In the meantime, I would create a budget for a potential premium increase.
By the way, the police also gave my wife a ticket requiring her to appear in court. I guess she has to admit full fault in the accident for not controlling the speed before the collision, am I correct? This is our first time dealing with something like this in this country, and we have no clue what to do...
Just tell the truth. The driver relied on FSD to drive and to prevent collisions for her, but after the collision it is now clear that was a mistake. In hindsight, the driver should never have let FSD drive her.
 
Many drivers are conned into putting their faith in FSD technology. They didn't believe that it could still get them into trouble: collisions, injuries, and fatalities.

When the system is about to do something wrong, drivers still have faith and think the technology will bail them out at the last second, so they wait until the collision happens.

To prevent this scenario, Waymo has to spend time writing code for all of these scenarios. They also do high-definition pre-mapping of the route (the same way Tesla did for its 2016 demo). Waymo also uses multiple sensor types, while Tesla has removed sensors. Waymo's sensors are up to date, including 4D imaging radar, not the cheap radar Tesla used...

What that means is, a Waymo car would know all the chess moves at the start of the trip: which lane to take (the Tesla took the wrong lane), how to make the right turn to avoid this kind of collision...

Tesla doesn't believe in Waymo's approach.
Thanks for the insight!

This makes me believe Tesla is really cheap, especially when it comes to safety. Why do Elon and his company keep saying the vehicle is especially designed for safety? Shame on us blind Elon followers...
 
Thanks for the insight!

This makes me believe Tesla is really cheap, especially when it comes to safety. Why do Elon and his company keep saying the vehicle is especially designed for safety? Shame on us blind Elon followers...
Was she injured?

Teslas are very good at protecting passengers. The best way to protect the passengers is for the car to absorb the impact, as shown in your pictures. That's why they are lauded in crash tests: not because the car is fine, but because the occupants are.