Meant to say monotonically! The training data set (if organized properly) will primarily grow, never truly shrink.
So for example, if a Tesla had captured this weird star-shaped road marking in the UK last year:
and sent it back to the mothership, and if it was deemed important enough to become part of the training data, then it would be part of the "collective memory" from that point on, probably labeled with a "slow down and disengage immediately" attribute or similar.
(At least that's how I speculate Tesla uses disengagement events.)
What does that weird star marking mean? Is it just telling people to be careful because a speed bump, a zebra crossing (it's for pedestrians!), and perhaps a cycle path converge in that area? I mean, I drive in the UK, have never seen anything like it before, and it doesn't appear to be in the Highway Code. So if a Tesla can work that one out, it has one-upped normal drivers.