In order to argue that radar + vision is safer, you need to demonstrate that the benefit of the radar is greater than the loss in vision processing capability from losing compute cycles to process the radar.
Would love to see your math on this.
Wait... Tesla designed the FSD computer. Are you saying that, knowing at design time it would need enough cycles to process both radar and vision, they somehow failed to build a system capable of doing both without starving either?
I'd really love to see your math on THAT.
I don't believe Tesla would remove a $100 device just to save a few bucks. A possible explanation is that radar input was confusing rather than helping the NN's vision-based conclusions, and it may have taken this long to figure out because it took time to assemble enough evidence from all the corner cases, which are by definition rare. The benefit of radar in normal cases isn't enough to justify the trouble it causes in corner cases. Pure speculation of course, based on my personal NNN (Natural Neural Network) trying to figure out Tesla's decisions on their NN. LOL
In that situation, why not just have the NNs ignore the radar's input EXCEPT when it's reporting things the cameras can't possibly see?
The cameras know when it's a clear day, and that the return the radar is sending back is a bounce off an overpass that the road gradient makes seem like it's in their path, while the cameras make clear it isn't, so that input can be ignored. But by the same token:
The cameras know when they can't see beyond the car in front of them... so if radar reports a stopped or unexpectedly braking vehicle ahead of the one blocking their view, they use that info. Otherwise it's not needed.
The cameras know when fog or snow is obscuring vision... so if radar reports objects beyond what the cameras can see in fog, they use that info. Otherwise it's not needed.
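The gating idea above can be sketched as a simple decision rule. This is purely illustrative (not Tesla's actual fusion logic); the `CameraState` and `RadarReturn` types and their fields are hypothetical names invented for the sketch:

```python
# Illustrative sketch only, NOT Tesla's actual sensor-fusion logic:
# accept a radar return only when cameras report they cannot cover that region.
from dataclasses import dataclass

@dataclass
class CameraState:               # hypothetical camera self-assessment
    visibility_clear: bool       # clear day, unobstructed view
    view_blocked_ahead: bool     # lead vehicle blocks line of sight
    weather_obscured: bool       # fog/snow/rain degrading the cameras

@dataclass
class RadarReturn:               # hypothetical radar detection
    range_m: float
    beyond_camera_view: bool     # target lies past what cameras can resolve

def use_radar_return(cam: CameraState, radar: RadarReturn) -> bool:
    """Gate radar: trust it only where the cameras admit blindness."""
    if cam.visibility_clear and not radar.beyond_camera_view:
        # Clear view and the target is visible to cameras: likely a
        # spurious bounce (e.g. off an overpass), so drop the return.
        return False
    if cam.view_blocked_ahead and radar.beyond_camera_view:
        # Radar sees past the lead vehicle; cameras cannot.
        return True
    if cam.weather_obscured and radar.beyond_camera_view:
        # Fog/snow: radar extends the effective sensing range.
        return True
    return False
```

Under this rule, radar stays a secondary sensor: it only contributes in the two cases the post names (occluded lead vehicle, weather-obscured vision) and is ignored whenever the cameras already cover the scene.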
I agree radar should not be a PRIMARY sensor as it was originally on AP2. But it absolutely provides info vision can't as a secondary one.
Misses the point. Heavy rain, which you described as requiring slowing down or even pulling over, is not an issue with a properly treated windshield, with or without AutoPilot, radar, or FSD.
If I can see every detail of the road and traffic in the heaviest of rain, so can a Tesla with cameras only.
Edit:
@Knightshade also missed the point...
Except I just quoted you Tesla themselves saying that when you made that video the primary sensor AP was using was...radar.
So it wouldn't have cared if the camera could see perfectly or not.
They've since moved to vision being primary... (and now suggest they'll be moving to vision being the ONLY one, apart from the ultrasonics)
TODAY, in fairly moderate rain, I sometimes get NoA disabling itself because of the impact the rain has on the cameras, which aren't being cleared.