Elon Musk said it is like owning an elevator: when something goes wrong, the owner has insurance to cover it.
He said that if there's an FSD mishap, your insurance would cover it first, and if the fault lies with Tesla's system, Tesla would pick up the bill.
So he did address financial coverage by Tesla if necessary, but he did not cover who would go to jail.
I think when something goes wrong, the owner of a property would be liable whether or not the owner was present at the property.
The money part is easy to cover, but if someone needs to be held liable in a criminal case, it's usually the owner of the property first, and then it's up to the owner to pass the buck to the manufacturer.
But manufacturers seldom go to jail (for example, the passenger deaths on the Boeing 737 Max, a plane that both Boeing and the FAA continued to declare safe for days after the rest of the world had grounded it).
Good post! Do you have a source for Elon saying that whatever your insurance didn't cover, Tesla would pick up? I believe you, but I hadn't heard that.
Also, staying with the financial side of things for a minute: say your (full) self-driving car is at fault for an accident while under FSD. Your insurance covers it. OK... now your insurance wants to raise your rates going forward. So that's an ongoing cost that doesn't concern Tesla but will concern owners.
The whole liability thing is going to get tricky soon, well before FSD. As soon as "confirm-less" / no-stalk NoA turns on, it will get muddy, because it's still considered driver assist and I'd imagine there will be an (additional) legal disclaimer saying that if you turn this option on, you take the risk, you know it is in beta, you have to be responsible at all times, etc.
But it will get muddy because you can be holding the wheel and looking ahead at traffic like a good driver should, and all of a sudden the car could make an AP-initiated lane change. If that is a bad lane change and you sideswipe another car, Tesla could say that you have all the responsibility because you agreed to the legal disclaimer. Still, it seems a bit muddy to me: active cruise control has existed for quite some time (even if it is usually less advanced than Tesla's), and even now lane assist is becoming more common (and keeping hands on the wheel is somewhat intuitive, to allow easy take-over if something goes wrong), but auto-changing lanes w/o confirmation is a brand new beast!
The car decides that it needs to change lanes. You are driving along, holding the wheel and looking out at traffic, not inside the car at the IC. The car moves left (or right) to change lanes. You are momentarily surprised and immediately check your rearview mirror, trying to stay engaged, but in the second you need to react and check the mirror, too late, boom: you give the car that AP didn't "see" a kiss and you have an accident. Is it really your fault? What could you have done in the scenario above to avoid this? I mean, other than turning NoA and confirmation-less lane changes off. Which you totally could. But then Tesla doesn't get the additional billions of miles of data it needs to progress the technology further. So it gets muddy in my mind. And of course FSD takes this to another level.
If it were a situation where someone has to go to jail (and of course we all hope we are never involved in that kind of accident, whether on AP, FSD, or just driving manually ourselves), yeah, that is an interesting point. I assume it would be the driver? But even that brings up issues: the owner is the one who accepted the AP legal disclaimers. Let's say I am the owner. I accept the AP disclaimer and take full responsibility for NoA auto lane changes. I loan the car to a friend who is familiar with Tesla and Autopilot. S/he clicks Nav on AP and uses it. It causes an accident during a lane change. Worst case, it is a major accident and someone needs to go to jail. Does my friend go to jail? After all, s/he was driving. Or do I go to jail, even though I was sitting at home? Probably the legalese says I am responsible for telling any other drivers about the limitations of Nav on AP. But can my legal acceptance, even if I did pass along all the same information, make them legally responsible, too? Hmmm.
When Kirsten Korosec from TechCrunch asked about the problem with calling it "full self-driving capability," Elon Musk said:
"I think where we're very clear with you know when you buy the car it's meant by full self driving it means its feature complete. But feature complete requiring supervision and then as we get more - we really need billions of miles if not maybe 10 billion sort of miles or kilometers on that order collectively from the fleet then in our opinion probably at that point supervision is not required but that will still be up to regulators to agree. So we're just very quickly there's really three steps. This being:
1) feature complete self-driving, but requiring supervision,
2) feature complete but not requiring supervision, and
3) feature complete, not requiring supervision, and regulators agree."
Yes, the above three phases are exactly correct. But in general, Tesla and Elon haven't communicated these steps clearly/often enough.
But as I stated in my long example above, the "requiring supervision" part can get muddy, as there will be times when the FSD system does something quickly enough (sometimes necessarily so) that no amount of "supervision" would realistically allow the human to intervene.