Recent content by diplomat33

  1. diplomat33

    Monolithic versus Compound AI System

    Koopman has a neat little video that explains the long tail. I think it raises the question: how much of the long tail actually needs to be solved for AVs to be safe/good enough? In other words, when does an edge case become so rare that we don't care anymore? Koopman makes the point that as...
  2. diplomat33

    Monolithic versus Compound AI System

    The discussion of bias vs. variance was very interesting. I learned some new things. Their example of the long tail was especially interesting: it would take a fleet of 1M cars about 3 years to finish the last "9" in the march of 9s (see the rough sketch after this list). They acknowledge that nobody...
  3. diplomat33

    Waymo

    That's because Tesla FSD is supervised, so there is a safety driver to prevent a collision. I've had phantom braking that would have resulted in a collision had I not immediately intervened.
  4. diplomat33

    Waymo

    Yes, that is the other theory. If true, then the answer would be to train the perception stack to understand this case. We've seen research examples from Waymo that deal with this type of composite-object case (one object inside or connected to another object). So they are aware of the...
  5. diplomat33

    Waymo

    Waymo will also drive manually just to collect data for training, not necessarily for mapping. That is what they did on their "road trip" to Bellevue and DC, for example. So the driving in Atlanta may have been that. And AFAIK, Waymo has not said anything about moving away from HD maps. I am...
  6. diplomat33

    Waymo

    Yes, I think this is another example that supports my thesis. If Waymo were driving "on rails" on the HD map, we would expect it to just stick dead center in the lane. It ping-pongs in the lane because the ML planner is given more freedom to maneuver. In this case, it looks like...
  7. diplomat33

    Waymo

    Makes perfect sense. NHTSA is just doing its job. When you have repeated incidents of Waymo AVs violating traffic laws, an investigation is warranted. Hopefully, we get some more info.
  8. diplomat33

    Waymo

    It is possible that the Waymo or the other car tried to change lanes at the last second to avoid the impact but was not able to because of the other car's high speed. It is impossible to say for sure without more details. Agree 100%. Yes, the other car was speeding, so the blame is...
  9. diplomat33

    Cruise

    Cruise begins supervised autonomous driving in Phoenix:
  10. diplomat33

    Waymo

    Waymo accident with a black Toyota sedan on the 10 Freeway. No details yet, but from the look of the damage on the Waymo, it appears that the Waymo was hit from behind by the Toyota.
  11. diplomat33

    Wiki Consumer AV - Status Tracking Thread

    The video I posted is an independent video from the Zeekr owner. It is not a video by Mobileye.
  12. diplomat33

    Wiki Consumer AV - Status Tracking Thread

    I believe they are deployed in early access to some Zeekr owners.
  13. diplomat33

    Elon: FSD Beta tweets

    I am not claiming the stat is 100% accurate. But since Tesla does not release official disengagement data, teslafsdtracker is the best we have. And it at least gives us a ballpark. And I agree that people may be over-reporting issues as safety disengagements when they are not. So what do you...
  14. diplomat33

    Elon: FSD Beta tweets

    It is based on how people report the disengagement to the website.
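
A rough back-of-the-envelope check of the fleet arithmetic mentioned in item 2 above. This is a minimal sketch: the required-miles figure, the per-car annual mileage, and the function name are illustrative assumptions chosen to reproduce the "1M cars, about 3 years" order of magnitude quoted in the post, not figures from the discussion itself.

```python
# Back-of-the-envelope: how long a fleet takes to log enough miles for
# one more "9" of reliability. All inputs are illustrative assumptions.

def years_to_collect(required_miles: float,
                     fleet_size: int,
                     miles_per_car_per_year: float) -> float:
    """Years for the whole fleet to accumulate `required_miles`."""
    fleet_miles_per_year = fleet_size * miles_per_car_per_year
    return required_miles / fleet_miles_per_year

required = 45e9        # miles assumed needed for the final "9" (assumption)
fleet = 1_000_000      # cars, as in the quoted example
per_car = 15_000       # average miles per car per year (assumption)

print(f"{years_to_collect(required, fleet, per_car):.1f} years")  # -> 3.0
```

With these assumed inputs the timescale works out to roughly 3 years. If each added "9" needs roughly 10x more miles (the usual framing of the march of 9s), the same formula shows why the final nines dominate the data-collection time.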