
Recent content by diplomat33

  1. diplomat33

    Monolithic versus Compound AI System

    Yes, the blog is wrong about the latest GPT model not being E2E. But I think you are missing the blog's argument. The blog is not arguing that E2E is incapable or cannot generalize. The argument is about safety: AVs are safety-critical applications, while ChatGPT is not. So yes...
  2. diplomat33

    Monolithic versus Compound AI System

    Shashua talks about the ability of AI to "reason" outside of its training data. He says we are not quite there yet. As I understand it, the way FSD E2E training works is that sensor input is mapped to a control output. Simple examples might be something like "red light" mapped to "stop"...
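    The input-to-output mapping described above can be sketched as a toy lookup table. All scenario names and labels here are invented for illustration; a real E2E network learns a continuous mapping and interpolates between training examples, which may or may not generalize to unseen cases.

    ```python
    # Hypothetical training pairs: perception summary -> control action.
    # A real system maps raw sensor input, not labeled strings.
    training_data = {
        "red_light": "stop",
        "green_light": "go",
        "pedestrian_ahead": "stop",
    }

    def e2e_policy(observation):
        """Return the action learned for this observation.

        A pure lookup has no answer outside the training distribution;
        a trained network would interpolate instead, with no guarantee
        the interpolation is safe.
        """
        return training_data.get(observation, "undefined")

    print(e2e_policy("red_light"))     # stop
    print(e2e_policy("flooded_road"))  # undefined: never seen in training
    ```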
  3. diplomat33

    Monolithic versus Compound AI System

    Koopman has a neat little video that explains the long tail. I think it raises the question: how much of the long tail actually needs to be solved for AVs to be safe/good enough? In other words, when does an edge case become so rare that we don't care anymore? Koopman makes the point that as...
  4. diplomat33

    Monolithic versus Compound AI System

    The discussion of bias vs. variance was very interesting. I learned some new things. I thought their example of the long tail was especially interesting. In their example, it would take a fleet of 1M cars about 3 years to finish the last "9" in the march of 9s. They acknowledge that nobody...
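    The march-of-9s scaling behind that example can be sketched with hypothetical numbers: each additional nine of reliability makes the residual failures roughly 10x rarer, so the fleet needs roughly 10x the miles just to encounter (and learn from) them. The starting rate below is invented for illustration, not a figure from the discussion.

    ```python
    # Hypothetical miles per residual failure at the first nine.
    base_miles = 1e4

    for extra_nines in range(5):
        # Each nine multiplies the miles needed to see one failure by ~10.
        miles_per_failure = base_miles * 10 ** extra_nines
        print(f"nine #{extra_nines + 1}: "
              f"~{miles_per_failure:,.0f} miles per residual failure")
    ```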
  5. diplomat33

    Waymo

    That's because Tesla FSD is supervised, so there is a safety driver to prevent a collision. I've had phantom braking that would have resulted in a collision had I not immediately intervened.
  6. diplomat33

    Waymo

    Yes, that is the other theory. If true, then the answer would be to train the perception stack to understand this case. We've seen research examples from Waymo that deal with this type of composite-object case (one object inside of or connected to another object). So they are aware of the...
  7. diplomat33

    Waymo

    Waymo will also drive manually just to collect data for training, not necessarily for mapping. That is what they did on their "road trip" to Bellevue and DC, for example. So the driving in Atlanta may have been that. And AFAIK, Waymo has not said anything about moving away from HD maps. I am...
  8. diplomat33

    Waymo

    Yes, I think this is another example that supports my thesis. If Waymo were driving "on rails" on the HD map, we would expect it to just stick dead center in the lane. The fact that it ping pongs in the lane is because the ML planner is given more freedom to maneuver. In this case, it looks like...
  9. diplomat33

    Waymo

    Makes perfect sense. NHTSA is just doing its job. When there are repeated incidents of Waymo AVs violating traffic laws, an investigation is warranted. Hopefully, we get some more info.
  10. diplomat33

    Waymo

    It is possible that the Waymo or the other car tried to change lanes at the last second to avoid the impact but was not able to because of the other car's high speed. It is impossible to say for sure without more details. Agree 100%. Yes, the other car was speeding, so the blame is...
  11. diplomat33

    Cruise

    Cruise begins supervised autonomous driving in Phoenix:
  12. diplomat33

    Waymo

    Waymo accident with a black Toyota sedan on the 10 Freeway. No details yet but from the look of the damage on the Waymo, it appears that the Waymo was hit from behind by the Toyota.
  13. diplomat33

    Wiki Consumer AV - Status Tracking Thread

    The video I posted is an independent video from the Zeekr owner. It is not a video by Mobileye.
  14. diplomat33

    Wiki Consumer AV - Status Tracking Thread

    I believe they are deployed in early access to some Zeekr owners.