
Autopilot/FSD Pedestrian Accident

In the linked article below a pedestrian was struck & killed while walking along a rural highway. The suspected driver has a Model X. He claims "that 'he doesn't remember' hitting the woman with his Tesla, but if he did, would have been driving on autopilot and checking emails". The man's attorney "explained it is probable his car would've been using Tesla's full self-driving capability". Talking with investigators, he "maintained that he doesn't remember hitting Cathy Donovan with his Tesla, but if he did, he would have been alone in his Tesla driving on 'autopilot,' not paying attention to the road, while doing things like checking work emails."
What a terrible example of how to use AP or FSD Beta. These are the people who ruin it for everyone else. I don't know if any footage or logs from the car are still available, because the accident happened many months ago. None of the media articles I've seen reference log data from Tesla regarding the vehicle's location or the use of AP around the time of the accident.
 
I should hope that I would remember if my car hit a pedestrian. I'm thinking that intoxicating substances were involved - FSD or not.
 
It sucks any way one looks at it. People are too distracted these days. States pass hands-free laws and we still see people playing with their phones while they drive.
 
It amazes me how many people I see on their phone while driving cars that do not have anything even remotely close to what our Teslas have. Tesla and autopilot are not the problem. People are. I’m convinced, if everybody owned a Tesla, there would be fewer deaths overall.
 
As a resident of Minneapolis, I can say this has been in the news here. I’ve actually read that article and have several thoughts:
- FSD and AP will disconnect if you’re on your phone
- FSD will also disconnect if you’re distracted
- FSD and Teslas in general recognize and brake for pedestrians
- it would be impossible not to notice when your car hit a pedestrian hard enough to kill them.
- it appears the owner of the car actually repaired and tried to conceal the damage so he knew he had hit something.
- even if one takes his statement at face value, he’s admitting to vehicular manslaughter (killing someone by operating a vehicle in a negligent fashion.) I’m actually shocked that his lawyer would make such a statement or allow him to make such a statement.

In aggregate, my conclusion is that he was either drunk/impaired or simply distracted and now is trying to blame Autopilot. It’s a strange claim, since it doesn’t absolve him of any guilt whatsoever but maybe he thinks ‘checking emails while using autopilot’ is more acceptable than driving drunk?
 
You don't accept the tales spun by someone who would hit and run a pedestrian.

However, a Model X might not have a cabin camera, so using a cell phone on Autopilot would be plausible, though still not excusable.
 
It is true that the X has had a cabin camera since 2021. The X in question is a 2022 & thus should have a cabin camera. However, the camera could have been covered if he was not using FSD. I believe that NoA & basic autopilot can still be used with a covered cabin camera & only FSD Beta requires it be uncovered. I'm also very surprised that the attorney would make such a statement. It's possibly meant to be a distraction to cast blame on the car.

What media coverage has mentioned that he repaired the car? I don't recall reading that in any of the local coverage I've seen.
 
One of the earlier articles (I think in the Star Tribune) talked about the car being repaired.

Also, it's possible for any, and maybe even all, of my assumptions to be false (he could have been hiding the cell phone while he was using it so the camera couldn't see, etc.), but the combination becomes increasingly unlikely.

I think you're right, it's an attempt at distraction. The problem is it's legally meaningless and only serves to incriminate. Any halfway decent lawyer should know this.
 
I'm an attorney with some experience in criminal law. Not my full-time gig, though. That said, I think the idea is that the lawyer knows the client will ultimately be found to have been the person driving the car that killed this victim. He now needs to create reasonable doubt as to his client's guilt, or at least steer toward a lesser charge. This is a solid way to begin that process. "Oh, yeah, he was in the driver's seat. But he had Tesla AutoPilot Full Self Driving turned on. He had no idea he would be responsible if something bad happened. After all, it's "Autopilot!""
 
So the warnings in the manual saying it isn't fully autonomous and that you need to pay attention as well as the warning when you initiate the system as well as the constant "pay attention to the road" warnings don't mean anything?

He admitted he wasn't paying attention when he was supposed to, admitted breaking the law by checking his email, and admitted he wasn't paying attention well enough to know when he hit someone, so I'm having a hard time understanding how this would create reasonable doubt.
 
I think those excuses firm up that the death was unintentional.
 
@dansev can comment on this better than I can, but the question isn't really one of intent. I've seen nothing to indicate it was in any way intentional. Rather, it's a question of negligence. Vehicular homicide requires an element of gross negligence: was the driver operating the vehicle in a grossly negligent manner when he caused the death?

My guess is the lawyer will try to argue that Autopilot, not the driver, was doing the driving, and so failing to pay attention, checking email, etc. doesn't constitute gross negligence. That's a very large stretch, IMO.
 
FWIW, I spoke with a lawyer at work about the article. She doesn't practice criminal law but she had actually seen the article, discussed it with another lawyer and they had the exact same thoughts - that the statement does more to admit liability than anything else.

Also, after re-reading the article, it's not clear whether the driver's lawyer was involved in making the statement or not. It actually seems like the man made the statement on his own. I'm guessing he made the statement when investigators followed up with him, then later hired a lawyer (who's now trying to figure out how to undo his client's self-incrimination!)
 
I think the claim is trying to establish that he didn't know he hit someone. If he knew he hit someone and then fled the scene, presumably that would bring extra charges beyond those for hitting someone in the first place.
 
The repair, to me, is the most significant factor. If he did do this, it shows knowledge of the event and an attempt to hide it.
To me, that speaks volumes.
Ladies and gentlemen of the jury, yes, of course he had his car repaired. As we have told you, he knew he hit something. But he had no idea what he hit because he was on his phone. And he was on his phone because he was in a Tesla, and had engaged the car's "Autopilot" feature, which is part of their "Full Self Driving" suite. Yes, my client was indeed in the car that hit that poor victim. But my client was not driving! Tesla's software was driving. And therefore, you cannot convict him of vehicular manslaughter!
 
Ladies and gentlemen of the jury, yes, Mr. Dansev may claim he had no idea that he hit something, but then why did he lie to the investigators about the repairs when asked? And we have heard the testimony of two separate witnesses who saw the victim go all the way up onto the hood of his car before she fell, fatally wounded by his inattention, yet he has no explanation for why he failed to notice her.

He also admits that he has seen the warnings that are shown every time he engages autopilot stating he needs to pay attention to the road and had the system disconnect because he failed to pay attention. He also admits that he has access to the manual but ‘might not have read the numerous warnings’ contained in this manual. Furthermore, MN state law prohibits use of electronic devices while driving whether driving assist aids are in use or not.

Yes, when you look at the sum of his actions it is clear that there is no other explanation than he was grossly negligent while driving his 5000 pound car along a rural highway with pedestrians, mowing down the victim who was doing nothing more than enjoying the evening while walking her dogs!
 
Remember, it's proof "beyond a reasonable doubt" (generally considered to be >90%), not "more probable than not" (51%). That's a high bar. Anyway, I'm not claiming to support any of these ideas, just playing devil's advocate! 👿