We know without a doubt they are testing things using the employee program in CA and there's no way you can develop a system without your engineers actually testing it on public roads. Can't believe the CA DMV will let them get away with not reporting disengagements.
California DMV autonomous vehicle testing regulations (for Testing with a Driver)
dmv.ca.gov/portal/wcm/connect/a6ea01e0-072f-4f93-aa6c-e12b844443cc/DriverlessAV_Adopted_Regulatory_Text.pdf
Article 3.7 – Testing of Autonomous Vehicles
§ 227.02. Definitions.
As used in this article, the following definitions apply:
(a) “Autonomous mode” is the status of vehicle operation where technology that is a combination of hardware and software, remote and/or on-board, performs the dynamic driving task, with or without a natural person actively supervising the autonomous technology’s performance of the dynamic driving task. An autonomous vehicle is operating or driving in autonomous mode when it is operated or driven with the autonomous technology engaged.
(b) “Autonomous test vehicle” is a vehicle that has been equipped with technology that is a combination of both hardware and software that, when engaged, performs the dynamic driving task, but requires a human test driver or a remote operator to continuously supervise the vehicle’s performance of the dynamic driving task.
(1) An autonomous test vehicle does not include vehicles equipped with one or more systems that provide driver assistance and/or enhance safety benefits but are not capable of, singularly or in combination, performing the dynamic driving task on a sustained basis without the constant control or active monitoring of a natural person.
(2) For the purposes of this article, an “autonomous test vehicle” is equipped with technology that makes it capable of operation that meets the definition of Levels 3, 4, or 5 of the SAE International’s Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles, standard J3016 (SEP2016), which is hereby incorporated by reference.
------
Which means that until Tesla configures "FSD" to target Level 3 or above (i.e. a system that does not need constant monitoring by a natural person), it is still being tested as an L2 ADAS, which requires no special permit, no trained drivers, and no disengagement reporting. So they are not "being let get away with" anything.
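To make the permit logic concrete, here is a minimal sketch that encodes the decision rule quoted from Article 3.7 above. The SAE level numbers come from J3016 as cited in the regulation; the function name and structure are my own illustration, not any DMV or Tesla API.

```python
def requires_ca_testing_permit(sae_level: int) -> bool:
    """Decision rule per the Article 3.7 definitions quoted above:
    only vehicles capable of SAE Level 3, 4, or 5 operation count as
    'autonomous test vehicles' subject to the permit and
    disengagement-reporting regime; L0-L2 systems, which require the
    constant control or active monitoring of a natural person, do not.
    """
    if not 0 <= sae_level <= 5:
        raise ValueError("SAE J3016 defines driving automation levels 0-5")
    return sae_level >= 3

# An L2 ADAS (driver must constantly monitor) needs no permit;
# an L4 test vehicle does.
print(requires_ca_testing_permit(2))  # False
print(requires_ca_testing_permit(4))  # True
```

This is why the L2 classification matters so much: the permit obligation turns entirely on the claimed capability level, not on what the software does in practice.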
Waymo and Uber are different in that they aim to start at L4, so they have no L2 system on which to test new features piecemeal.
It is also practically guaranteed at this stage that Tesla's pseudo-FSD will remain at L2 (with nags) for a considerable time after its official release, measured in years rather than months. That release will presumably come in the latter half of this year, in conjunction with HW3.
I just don't see any other way for Tesla to deploy this feature without risking being sued into oblivion: without driver intervention, it would most probably make a spate of fatal decisions during its first week of use across a large fleet.
If Musk thinks NoAP in its current state equals "highway FSD", then what else can one logically conclude but that nags are a more or less permanent feature?