Can FSD hallucinate?

The point is not "ML is bad and rules are good". This is a super hard engineering problem. In my view you need to use every technique and technology to the fullest, so that they complement each other, if you want any kind of functional system with reasonable safety guarantees. My position is that in life-or-death applications there are no magic shortcuts. You need to limit the ODD (primarily geo and speed) to be able to validate the system.

The benefit of rules is that they can be reviewed and understood. The benefit of ML is that it is likely, to some extent, to handle the cases for which you cannot write rules.

Rules are great for interpreting the actual traffic laws, whereas ML is great for interpreting behavior, for example. Rules are great, imho, as a kind of general safety net around the ML.
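To make the "safety net" idea concrete, here is a minimal sketch in Python. Everything in it (the planner interfaces, the Trajectory fields, the thresholds) is hypothetical illustration, not Tesla's actual stack:

```python
# Minimal sketch: hand-written rules wrapped around an ML planner.
# All names and thresholds are hypothetical, not any real FSD code.

from dataclasses import dataclass

@dataclass
class Trajectory:
    speed_mps: float         # commanded speed
    min_gap_m: float         # predicted closest distance to any road user

SPEED_LIMIT_MPS = 29.0       # e.g. a 65 mph zone
MIN_SAFE_GAP_M = 2.0         # hard floor, no matter what the model proposes

def passes_rules(t: Trajectory) -> bool:
    """Reviewable, auditable rules: the part humans can actually inspect."""
    return t.speed_mps <= SPEED_LIMIT_MPS and t.min_gap_m >= MIN_SAFE_GAP_M

def plan(ml_planner, fallback_planner) -> Trajectory:
    candidate = ml_planner.propose()      # ML covers the cases rules can't describe
    if passes_rules(candidate):
        return candidate
    return fallback_planner.propose()     # safety net: slow in lane, alert the driver
```

The division of labor mirrors the point above: the ML proposes, the rules veto.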

There comes a time when you get to situations that are so extremely rare that machine learning cannot handle them. To have a driverless deployment you need to carefully think through your stack to make sure that even if machine learning does not solve everything 100%, your full self-driving product does.

If you are to release a driverless service you also need to answer the question of how to make sure the whole thing is fully robust, and I believe that adds a level of complexity that Tesla is yet to tackle.

It's impossible to regression test "the world" for every point release that might cause horrible accidents because of some weird bug. So I don't see a wide ODD happening anytime soon, regardless of approach.
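For a sense of what per-release scenario regression looks like, a rough sketch (names hypothetical): the suite can only ever replay a finite, curated set of scenes, which is exactly why it can't stand in for "the world".

```python
# Hypothetical sketch of per-release scenario regression. A finite replay
# suite passing says nothing about the unbounded open world, per the above.

def run_regression(planner, scenarios) -> list[str]:
    failures = []
    for scenario in scenarios:                    # curated, logged scenes only
        action = planner.infer(scenario.observation)
        if not scenario.is_acceptable(action):    # per-scene behavior oracle
            failures.append(scenario.name)
    return failures                               # empty list != safe in the wild
```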
Rules will need to change to accommodate the LLM/AV/FSD, just like how street rules were changed when automobiles became prevalent on the roads.

 
There are certainly changes I’d like to see for AV (rolling stops, for instance, with strict limitations).

It’s not clear this MUST happen for AV adoption though. It’s more for comfort and fitting in with other traffic.
 
My understanding is that yes, FSD can "hallucinate", just like any large model. I am sure Tesla does its best to minimize hallucinations with quality data and the right training, but the risk is always there. One question is whether a hallucination is safety critical or not. If it is not safety critical, then you can ignore it. You only need to worry about hallucinations that are safety critical.

The risk of safety critical hallucinations is one reason why FSD requires supervision. It is also why many experts argue that a pure vision end-to-end model cannot achieve the 99.99999% reliability needed to remove driver supervision and why some heuristic code is needed to serve as a guardrail against hallucinations.
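As an illustration of what such a heuristic guardrail might look like, here is a hedged sketch in Python; the heuristics, field names, and thresholds are invented for the example and don't reflect Tesla's code:

```python
# Hypothetical guardrail: veto only safety-critical outputs, tolerate benign ones.
# All names and thresholds are invented for illustration.

def is_safety_critical(action, scene) -> bool:
    """Hand-written heuristics encoding hard physical limits."""
    if action.decel_g > 0.5 and not scene.obstacle_ahead:
        return True   # hard braking with nothing in front: phantom braking
    if abs(action.steer_rate_deg_s) > 30 and scene.vehicle_in_blind_spot:
        return True   # aggressive swerve into an occupied lane
    return False

def guarded_step(model, scene, safe_fallback):
    action = model.infer(scene)               # end-to-end net may hallucinate
    if is_safety_critical(action, scene):
        return safe_fallback(scene)           # ease off, hold lane, alert driver
    return action                             # benign quirks are left alone
```

This matches the point above: you only spend the veto on hallucinations that actually matter.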
2023LRY v12.3.6. Not a hallucination: I'm on a five-lane crowded freeway with everyone traveling fast. I'm in the middle lane, with cars in the lanes on both sides. All of a sudden my car moves over from the center of its lane to its inner right edge. Then two lane-splitting motorcycles pass between me and the car to my left. I never heard nor saw them until they were right next to me. This happened a second time for two motorcycle cops following them. FSD may have saved several lives.
 
Had the same sort of thing happen when pulling off of a freeway. The car saw someone being an idiot and passing us on the right when mom and I didn't notice. The car steered to the left to avoid him. We didn't even have FSD or Advanced Autopilot at the time.
 
I was recently on a freeway with a 75 mph speed limit. V12.3.6 enabled, in the #1 lane, traveling maybe less than 1 mph faster than a vehicle in the adjacent middle lane. Wide open lanes otherwise, and FSD decided it needed to change lanes right then, with the vehicle in my blind spot. Very awkward and not humanlike.

In the last two days, on a surface street, FSD didn't properly plan its lane selection and started a right lane change nearly bumper to bumper with the lead vehicle in the right lane. My front bumper, their back bumper. Not safe. Not humanlike.

There's much variability in the quality of FSD's decision making. What's important isn't that it can do it correctly once in a while; rather, it needs to be correct nearly all the time. The team has a lot more wood to chop.
 
There are a few forms of FSD hallucinations. One might be relying on inaccurate map data and attempting to drive over newly constructed medians, curbs, or roadway hardware. Another is stopping on bridges when there might be intersections below. Chuck had a case involving two traffic lights, although it was recently resolved. And phantom braking, which is mostly resolved. Maybe even dry wipes qualify.
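The map-data failure mode suggests a simple consistency check; here's a hypothetical sketch (not how Tesla actually fuses map and vision data):

```python
# Hypothetical cross-check of stale map priors against live perception, so the
# planner doesn't drive over a newly built median the map doesn't know about.

CLEAR_THRESHOLD = 0.9    # invented confidence cutoff

def is_drivable(map_says_drivable: bool, camera_clear_confidence: float) -> bool:
    """When map and cameras disagree, trust the fresher camera evidence."""
    if map_says_drivable and camera_clear_confidence < CLEAR_THRESHOLD:
        return False     # map likely outdated: new median, curb, or hardware
    return map_says_drivable

# The new-median case: the map predates construction, cameras see a raised curb.
assert is_drivable(True, 0.3) is False
```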
 
I, too, find that phantom braking now rarely occurs, although the car sometimes slows as it approaches bicycle images painted on the street as though they were obstacles.