Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla owner who falsely claimed car was in autopilot before Melbourne crash pleads guilty


Don't lie about your Tesla, I suppose, is the lesson - but does Tesla collect and store the sort of data presented to court on their servers (which presumably they had to hand over under subpoena)?

Not a lawyer, but if the data is stored locally in the car, then hypothetically the driver should not be compelled to hand it over (self-incrimination)?
 

Don't lie about your Tesla, I suppose, is the lesson - but does Tesla collect and store the sort of data presented to court on their servers (which presumably they had to hand over under subpoena)?

Not a lawyer, but if the data is stored locally in the car, then hypothetically the driver should not be compelled to hand it over (self-incrimination)?
It can be subpoenaed and retrieved.
 
Even if it were true that the car was on Autopilot, that only raises questions like "why did you think it was safe to pass a tram on Autopilot?" and "were you paying attention to the road?". It is still dangerous driving in either case, particularly as we don't have anything like Full Self Driving in Australia yet.
 
Just the go-to excuse for anyone who does something stupid in a Tesla. Even if they were doing twice the speed limit or three times the blood alcohol limit, they somehow think claiming it was on Autopilot is a reasonable excuse, despite the fact that it's easily provable. Nine times out of ten they weren't, and the other one time makes no difference, as they are still 100% responsible.
 
Somewhat interesting that the driver didn't even press the brake despite the warnings sounding.

The habit in a Tesla needs to be to lift off at the first sign of danger or alerts.
(Well, other than phantom braking, in which case you need to hit the pedal.)
 
Just the go-to excuse for anyone who does something stupid in a Tesla. Even if they were doing twice the speed limit or three times the blood alcohol limit, they somehow think claiming it was on Autopilot is a reasonable excuse, despite the fact that it's easily provable. Nine times out of ten they weren't, and the other one time makes no difference, as they are still 100% responsible.
But it can never be "my" fault.
 
Just the go-to excuse for anyone who does something stupid in a Tesla. Even if they were doing twice the speed limit or three times the blood alcohol limit, they somehow think claiming it was on Autopilot is a reasonable excuse, despite the fact that it's easily provable. Nine times out of ten they weren't, and the other one time makes no difference, as they are still 100% responsible.

I am just waiting for the accidents where the driver blames (whether rightly or wrongly) Full Self Driving.

Which in turn will be used as evidence by some that Tesla obviously isn't ready for Full Self Driving yet.
 
I am just waiting for the accidents where the driver blames (whether rightly or wrongly) Full Self Driving.
It seems they wanted to get on the front foot of this - hence they've started calling it "Full Self Driving (Supervised)" for the recent wide release. It's very clear, even to the most intentionally obtuse journalists, what that means. My guess is it won't be called just "Full Self Driving" until the accident rate is so low that they are happy to take responsibility (and there is a legal framework to allow them to do so).
 
I am just waiting for the accidents where the driver blames (whether rightly or wrongly) Full Self Driving.

Which in turn will be used as evidence by some that Tesla obviously isn't ready for Full Self Driving yet.
My recent Calais had automatic parking where it would do everything.
I'm pretty sure it would still have been my fault if it had hit anything while autonomously parking.
Funny how the media never brings that up :)
 
I am just waiting for the accidents where the driver blames (whether rightly or wrongly) Full Self Driving.

Which in turn will be used as evidence by some that Tesla obviously isn't ready for Full Self Driving yet.
Waiting? There have already been a few, one of which was settled a few weeks ago.

Almost everything that has been through the courts involves very old versions of FSD, which were more like Autopilot.

But there's really no evidence needed to prove that FSD isn't ready yet. No one says it is - not even Tesla, not even Elon.
There are still a number of versions in the queue before it will ever be thought of as ready.