
CA DMV "revisiting" approach towards Tesla FSD

Tesla can collect disengagement data from FSD Beta cars automatically and then organize and send that data to the CA DMV. There would be no need for drivers to fill out surveys.
How would they possibly know why the driver disengaged? Sure, they could guess on some/most of the events, but I assume the CA DMV requires it to be accurate. (And that would take a lot of man-hours, which I assume they wouldn't want to commit to.) Which means the driver would have to fill out a survey for each disengagement.
 
How would they possibly know why the driver disengaged? Sure, they could guess on some/most of the events, but I assume the CA DMV requires it to be accurate. (And that would take a lot of man-hours, which I assume they wouldn't want to commit to.) Which means the driver would have to fill out a survey for each disengagement.

If you read the CA DMV reports, the only disengagement cause that is provided is stuff like "perception error" or "undesirable path by the vehicle". It's all stuff that Tesla could get from the driving data and the video clips that are uploaded when a driver hits the camera button or disengages. So, I don't think there would be any need for drivers to fill out disengagement reports.
 
If you read the CA DMV reports, the only disengagement cause that is provided is stuff like "perception error" or "undesirable path by the vehicle". It's all stuff that Tesla could get from the driving data and the video clips that are uploaded when a driver hits the camera button or disengages. So, I don't think there would be any need for drivers to fill out disengagement reports.
Again, that would mean a human would need to watch every disengagement video and guess as to why the person disengaged. The 2020 disengagement report has 349 different entries under "DESCRIPTION OF FACTS CAUSING DISENGAGEMENT".

For example, how would they know if a disengagement was "due to operator discomfort" or if you were just gripping the steering wheel a little too firmly?

Maybe you bumped the steering wheel while reaching for something elsewhere in the car. Maybe you decided you wanted to go somewhere else. Maybe you hit the brakes just because. Maybe you wanted to slow down to rubberneck at an accident on the other side of the road. Maybe you saw a friend and wanted to flag them down. Maybe you heard a loud noise and panicked. Maybe a wasp flew in your window and stung you. Maybe the passenger reached over and grabbed the wheel because they wanted to scare you.

The only way to get accurate information is to have the driver report it.
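To make that concrete, here's a rough sketch of the split being argued about. The field names and categories are my own invention for illustration, not Tesla's actual telemetry schema: a car can plausibly log a lot about a takeover on its own, but the one thing the DMV report asks for (the cause) is exactly the field only the driver can fill in.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisengagementEvent:
    # Fields a car could plausibly log automatically at the moment of takeover
    timestamp: float          # Unix time of the event
    latitude: float
    longitude: float
    speed_mph: float
    trigger: str              # e.g. "steering_override", "brake_press", "stalk_cancel"
    video_clip_id: str        # clip uploaded around the event

    # The field the DMV report actually asks about, which only the human
    # in the seat can answer reliably
    driver_reported_cause: Optional[str] = None  # e.g. "undesirable path", "bumped wheel", "wasp"


def needs_survey(event: DisengagementEvent) -> bool:
    """A brake press tells you *that* the driver took over, not *why*;
    the cause stays unknown until the driver fills it in."""
    return event.driver_reported_cause is None
```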
 
I don't think anything will come of it because GM is essentially doing the same thing with Ultra Cruise.

Now they could limit what it can do without human input, but I don't think they'll put the brakes on the entire thing.

I think we need better Federal/State guidelines on just how far a manufacturer of an L2 system can go with it. What's the limit of liability for the human behind the wheel who has to prevent it from doing something dumb, whether it's autosteer on the street or unconfirmed lane changes on the freeway?
 
If Tesla ends up having to disable FSD Beta on our Model Y due to our California location, I hope they'll make it easy to turn it back on whenever we leave California!

I think the FSD Beta program is great because it allows Tesla to constantly gather data from a wider range of scenarios. For example, I regularly contribute data and feedback from snowy mountains and out-of-the-way desert neighborhoods. To do this, I essentially had to pass Tesla's safety test (the "safety score"), and I must continue to remain engaged and alert. Seems pretty legitimate to me!
 
the DMV probably has the authority to regulate L2 systems anyway
I think this might be the source of confusion, as legislators probably intended to regulate safety for autonomous vehicles while not affecting existing non-autonomous vehicles. If you're referring to California's "Testing of Autonomous Vehicles" regulation, it explicitly excludes "systems that provide driver assistance… but are not capable of… performing the dynamic driving task on a sustained basis without the constant control or active monitoring of a natural person."

I suppose CA DMV can try to prove that FSD Beta is capable enough to be considered autonomous?
 
I think this might be the source of confusion, as legislators probably intended to regulate safety for autonomous vehicles while not affecting existing non-autonomous vehicles. If you're referring to California's "Testing of Autonomous Vehicles" regulation, it explicitly excludes "systems that provide driver assistance… but are not capable of… performing the dynamic driving task on a sustained basis without the constant control or active monitoring of a natural person."

I suppose CA DMV can try to prove that FSD Beta is capable enough to be considered autonomous?
I'm not saying they have the authority to regulate L2 systems under those rules. I was just speculating that they might have the authority; I don't know.
I think FSD Beta is as capable as Uber's system was in 2016.
You keep ignoring this part:
The level of a driving automation system feature corresponds to the feature’s production design intent. This applies regardless of whether the vehicle on which it is equipped is a production vehicle already deployed in commerce, or a test vehicle that has yet to be deployed. As such, it is incorrect to classify a level 4 design-intended ADS feature equipped on a test vehicle as level 2 simply because on-road testing requires a test driver to supervise the feature while engaged, and to intervene if necessary to maintain safe operation.
If CA DMV thinks FSD Beta is an "autonomous driving system" and not "ADAS" - they should drive it without intervention for a few hours in LA/SF.

Otherwise just tell their legacy auto masters that they can't do their dirty work for them.
If you look at the disengagement reports there are plenty of companies with disengagement rates on par with Tesla. Here's Cruise back in 2016:
[Attached image: Cruise's 2016 CA DMV disengagement figures]
 
If CA DMV thinks FSD Beta is an "autonomous driving system" and not "ADAS" - they should drive it without intervention for a few hours in LA/SF.

If you look at the CA DMV report from 2020, there are companies testing autonomous driving with just 2 miles per disengagement and other companies testing autonomous driving with 28,000 miles per disengagement. The CA DMV considers both systems to be autonomous driving systems. So I don't think Tesla's bad disengagement rate would be an excuse for it not being autonomous.
 
If you look at the disengagement reports there are plenty of companies with disengagement rates on par with Tesla. Here's Cruise back in 2016:
[Attached image: Cruise's 2016 CA DMV disengagement figures]

Not just that, but there are companies with worse disengagement rates than Tesla, and the CA DMV still counts them as autonomous. For example, in 2020 Valeo reported a disengagement rate of 0.49 miles per disengagement, and Ridecell reported 0.78 miles per disengagement.
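For anyone skimming the filings, the published rate is just total autonomous miles divided by the number of disengagements. A trivial sketch, with made-up totals rather than numbers from any actual report:

```python
def miles_per_disengagement(autonomous_miles: float, disengagements: int) -> float:
    """Rate as reported in the CA DMV filings: miles driven in autonomous
    mode divided by the count of disengagements over the same period."""
    return autonomous_miles / disengagements if disengagements else float("inf")

# Made-up totals for illustration only (not from any company's filing):
print(miles_per_disengagement(980, 2000))    # 0.49 mi per disengagement
print(miles_per_disengagement(28_000, 1))    # 28,000 mi per disengagement
```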
 
You keep ignoring this part: "it is incorrect to classify a level 4 design-intended ADS feature equipped on a test vehicle as level 2 simply because on-road testing requires a test driver to supervise the feature while engaged"
I thought this was already addressed when I asked whether the name matters. Even if "FSD Beta" weren't a separate name with a separate toggle, does this mean Autopilot and Autosteer become Level 4/5 because Tesla has plans for those automation levels? What if "FSD Beta" were called "Level 2 FSD Beta" instead, or maybe even more clearly "This Is Not Autonomous Mode Beta (Completely Separate From Future Autonomous Full Self-Driving Capabilities)", would it matter? Yes, it's blurrier for Tesla because they have the capability to update software (and replace hardware if necessary), but theoretically any Level 2 Mobileye-based system could be captured by this regulation because Mobileye has plans for Level 4.

But I brought up the "driver assistance not capable" clause because even if "FSD Beta" is considered Level 4, that is only one requirement and is insufficient on its own to apply California's Autonomous Vehicle regulations. Sure, Koopman can point to "design intent" since he assisted with the SAE document, but did he draft the bill, and does he understand how it's used in law?
 
I thought this was already addressed when I asked whether the name matters. Even if "FSD Beta" weren't a separate name with a separate toggle, does this mean Autopilot and Autosteer become Level 4/5 because Tesla has plans for those automation levels? What if "FSD Beta" were called "Level 2 FSD Beta" instead, or maybe even more clearly "This Is Not Autonomous Mode Beta (Completely Separate From Future Autonomous Full Self-Driving Capabilities)", would it matter? Yes, it's blurrier for Tesla because they have the capability to update software (and replace hardware if necessary), but theoretically any Level 2 Mobileye-based system could be captured by this regulation because Mobileye has plans for Level 4.

But I brought up the "driver assistance not capable" clause because even if "FSD Beta" is considered Level 4, that is only one requirement and is insufficient on its own to apply California's Autonomous Vehicle regulations. Sure, Koopman can point to "design intent" since he assisted with the SAE document, but did he draft the bill, and does he understand how it's used in law?
No, because those systems are not beta versions of L4 systems.
I don't think the name matters as anyone who uses FSD Beta can tell it's a beta version of Tesla's robotaxi software that is planned to be released later this year. Testing it is exactly the same thing as what every other AV company in California is doing.
The SAE J3016 definitions are literally included by reference in the regulations. I'm pretty sure he understands this:
(2) For the purposes of this article, an “autonomous test vehicle” is equipped with technology that makes it capable of operation that meets the definition of Levels 3, 4, or 5 of the SAE International's Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles, standard J3016 (SEP2016), which is hereby incorporated by reference.
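To make the two competing readings concrete, here is a toy sketch (my own simplification for illustration, not the DMV's actual test and certainly not legal advice) of how that definition interacts with the J3016 "design intent" note quoted earlier:

```python
def is_autonomous_test_vehicle(claimed_level: int,
                               design_intent_level: int,
                               classify_by_design_intent: bool) -> bool:
    """Toy reading of CA's "autonomous test vehicle" definition, which
    incorporates SAE J3016 Levels 3-5 by reference. The whole dispute
    reduces to which level you plug in: the level Tesla assigns today
    (2, per its letters to the DMV) or the level implied by production
    design intent (4/5, per the J3016 note)."""
    level = design_intent_level if classify_by_design_intent else claimed_level
    return level >= 3


# Same feature, two readings:
print(is_autonomous_test_vehicle(2, 4, classify_by_design_intent=False))  # False (Tesla's position)
print(is_autonomous_test_vehicle(2, 4, classify_by_design_intent=True))   # True (design-intent reading)
```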
 
If you look at the disengagement reports there are plenty of companies with disengagement rates on par with Tesla. Here's Cruise back in 2016:
Not just that, but there are companies with worse disengagement rates than Tesla, and the CA DMV still counts them as autonomous. For example, in 2020 Valeo reported a disengagement rate of 0.49 miles per disengagement, and Ridecell reported 0.78 miles per disengagement.
But those companies claim they are testing autonomous driving!

And Tesla says they are not.

If I run my 100 meters in 15 seconds and say I've no interest in competing in the Olympics, you'd better believe me. If someone else runs their 100 meters in 16 seconds but claims they are training for the Olympics, you can humor them with a chuckle. There is enough stupid VC money going around.

PS: The last point can't be stressed enough. For some companies, showing they are "testing on public streets" is a major milestone that they want to hit even if the disengagement rate is ten per mile. Funding would be tied to achieving such milestones.
 
But those companies claim they are testing autonomous driving!

And Tesla says they are not.

If I run my 100 meters in 15 seconds and say I've no interest in competing in the Olympics, you'd better believe me. If someone else runs their 100 meters in 16 seconds but claims they are training for the Olympics, you can humor them with a chuckle. There is enough stupid VC money going around.

PS: The last point can't be stressed enough. For some companies, showing they are "testing on public streets" is a major milestone that they want to hit even if the disengagement rate is ten per mile. Funding would be tied to achieving such milestones.
Uber also claimed they were not. Maybe those companies looked at that precedent when they interpreted the regulations?
And again, there is no real difference between other companies' AV testing and FSD Beta testing.
 
But those companies claim they are testing autonomous driving!

And Tesla says they are not.

By that standard, everybody could just declare that they are not testing autonomous driving and not have to follow any regulations.

That is the whole point of this issue with the CA DMV. Sure, Tesla says they are not testing autonomous driving, but are they lying? Are they really testing autonomous driving? Because if they really are testing autonomous driving, then they should comply. It should not matter what they say. Companies should not get to declare whether they are testing autonomous driving or not; the CA DMV should decide.
 
Uber also claimed they were not. Maybe those companies looked at that precedent when they interpreted the regulations?
Tesla probably looked too. How many companies have existing driver assist features that get updated in existing vehicles?

Adjacent to the requirement you've highlighted, "(2) … Levels 3, 4, 5…", there's one just before it: "(1) … does not include vehicles equipped with… driver assistance… not capable of, singularly or in combination… dynamic driving task"

And surprise surprise, in Tesla's communication to the CA DMV:

… City Streets will continue to be an SAE Level 2, advanced driver-assistance feature…​
… City Streets (and all other existing FSD features), because the vehicle is not capable of performing the entire DDT…​
… neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous…​
… the driver maintains responsibility for this part of the dynamic driving task…​

Maybe the lawyer should have been more explicit and copy/pasted that part of the law?
 
Tesla probably looked too. How many companies have existing driver assist features that get updated in existing vehicles?

Adjacent to the requirement you've highlighted, "(2) … Levels 3, 4, 5…", there's one just before it: "(1) … does not include vehicles equipped with… driver assistance… not capable of, singularly or in combination… dynamic driving task"

And surprise surprise, in Tesla's communication to the CA DMV:

… City Streets will continue to be an SAE Level 2, advanced driver-assistance feature…​
… City Streets (and all other existing FSD features), because the vehicle is not capable of performing the entire DDT…​
… neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous…​
… the driver maintains responsibility for this part of the dynamic driving task…​

Maybe the lawyer should have been more explicit and copy/pasted that part of the law?
Yes, those are the same claims that Uber made. How is what Tesla is doing different? Why do you think there are autonomous vehicle testing rules?
 
By that standard, everybody could just declare that they are not testing autonomous driving and not have to follow any regulations.

That is the whole point of this issue with the CA DMV. Sure, Tesla says they are not testing autonomous driving, but are they lying? Are they really testing autonomous driving? Because if they really are testing autonomous driving, then they should comply. It should not matter what they say. Companies should not get to declare whether they are testing autonomous driving or not; the CA DMV should decide.

Where is the line drawn?

On the freeway, NoA is advertised as being able to handle automatic lane changes without confirmation, interchanges, merging, and taking the off-ramp. This all falls nicely under L2 without much debate.

Both FSD Beta, and Ultra Cruise are an attempt to have driver aids on City Streets.

What limitations do manufacturers need to put on an L2 system so it isn't seen as an autonomous system operating under the disguise of being L2?

With FSD Beta, I'd say the real problem isn't that consumers are beta testing an autonomous driving system; the problem is that consumers are beta testing a really sucky L2 system and accepting all the liability from it.

If FSD Beta didn't suck so badly and had some pretty sensible limitations (like no uncontrolled lefts), then the CA DMV wouldn't have much issue with it.