That! On point. So for example, I don't think FSD will hallucinate an entire bus that is not really there. More likely, the hallucinations will be in prediction-planning.
Rules will need to change to accommodate the LLM/AV/FSD, just like how street rules were changed when automobiles became prevalent on the roads. The point is not "ML is bad and rules are good". This is a super hard engineering problem. In my view you need to use all techniques and technology to the fullest, so that they complement each other, if you want any kind of functional system with some amount of reasonable safety guarantees. My position is that in life-or-death applications there are no magic shortcuts. You need to limit the ODD (primarily geography and speed) to be able to validate the system.
The benefit of rules is that they can be reviewed and understood. The benefit of ML is that it is likely, to some extent, to handle the cases for which you cannot write rules.
Rules are great for interpreting the actual traffic laws, whereas ML is great for interpreting behavior, for example. Rules are great, imho, as a kind of general safety net around the ML.
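The "safety net" idea can be sketched in a few lines. This is purely illustrative (the `Proposal` type, thresholds, and fallback policy are all hypothetical, not from any real AV stack): an ML planner proposes an action, and a small set of human-auditable rules can veto it and substitute a conservative fallback.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    speed_mps: float        # proposed speed (m/s)
    gap_to_lead_m: float    # predicted gap to the lead vehicle (m)

# Hypothetical, reviewable constraints -- the kind you can audit line by line.
SPEED_LIMIT_MPS = 29.0      # ~65 mph ODD speed cap
MIN_GAP_M = 10.0            # minimum acceptable following gap

def rule_check(p: Proposal) -> bool:
    """Human-reviewable constraints applied to the ML output."""
    return p.speed_mps <= SPEED_LIMIT_MPS and p.gap_to_lead_m >= MIN_GAP_M

def plan(ml_proposal: Proposal) -> Proposal:
    # Accept the ML plan only if it passes every rule;
    # otherwise fall back to a conservative action (slow down).
    if rule_check(ml_proposal):
        return ml_proposal
    return Proposal(speed_mps=min(ml_proposal.speed_mps, 10.0),
                    gap_to_lead_m=ml_proposal.gap_to_lead_m)
```

The point of the structure is that a hallucinated plan (e.g. one that closes the gap to the lead car) gets rejected by code that a human can actually review, regardless of what the model learned.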
There comes a time when you get to situations that are extremely rare and that machine learning cannot handle. To have a driverless deployment you need to carefully think through your stack to make sure that even if machine learning does not solve everything 100%, your full self-driving product does.
If you are to release a driverless service you need to also answer the question of how to make sure the whole thing is fully robust, and I believe that adds a level of complexity that Tesla is yet to tackle.
It's impossible to regression test "the world" for every point release that might cause horrible accidents because of some weird bug. So I don't see wide-ODD happening anytime soon regardless of approach.
2023LRY, v12.3.6. Not a hallucination: I'm on a five-lane crowded freeway with everyone traveling fast. I'm in the middle lane, with cars in lanes on both sides. All of a sudden my car moves over from the center of its lane to its inner right edge. Then two lane-splitting motorcycles pass between me and the car to my left. I never heard nor saw them until they were right next to me. This happened a second time for two motorcycle cops following them. FSD may have saved several lives.

My understanding is that yes, FSD can "hallucinate", just like any large model. I am sure Tesla does their best to minimize hallucinations with quality data and the right training, but the risk is always there. One question is whether the hallucination is safety critical or not. If it is not safety critical, then you can ignore it. You only need to worry about hallucinations that are safety critical.
The risk of safety critical hallucinations is one reason why FSD requires supervision. It is also why many experts argue that a pure vision end-to-end model cannot achieve the 99.99999% reliability needed to remove driver supervision and why some heuristic code is needed to serve as a guardrail against hallucinations.
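To get an intuition for what a figure like 99.99999% means, here is a back-of-the-envelope calculation (the one-decision-per-second rate is an assumption for illustration only, not a real FSD number):

```python
# 99.99999% reliability => roughly a 1e-7 failure probability per decision.
reliability = 0.9999999
failure_rate = 1 - reliability

# Hypothetical assumption: one safety-relevant decision per second of driving.
decisions_per_second = 1

seconds_between_failures = 1 / (failure_rate * decisions_per_second)
days_between_failures = seconds_between_failures / 86400  # seconds per day

# One safety-critical failure every ~116 days of continuous driving.
print(round(days_between_failures, 1))
```

Even at that reliability, a large fleet driving around the clock would still see regular failures, which is why the reliability bar for removing supervision is so punishing.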
Had the same sort of thing happen when pulling off a freeway. The car saw someone being an idiot and passing us on the right when mom and I didn't notice. The car steered to the left to avoid him. We didn't even have FSD or Advanced Autopilot at the time.
I, too, find that phantom braking now rarely occurs, although the car sometimes slows as it approaches bicycle images painted on the street as though they were obstacles.

There are a few forms of FSD hallucinations. One might be relying on inaccurate map data and attempting to drive over newly constructed medians, curbs, or roadway hardware. Or stopping on bridges when there might be intersections below. Chuck had a case related to two traffic lights, although it was recently resolved. And phantom braking, which is mostly resolved. Maybe even dry wipes qualify.
How could the car steer to the left if neither FSD nor Autopilot was on?
Not sure, but it did. I was surprised too.