Just wow. This will be part of a future legal case, for sure. Desperation much?
Smart. Have you seen how much ambulance rides cost? This is a much better business than robotaxis.
> Given that 12.3.4 initially followed the road to the right, it seems to have understood that it didn't need to stop:
> View attachment 1037821
> But then it realized navigation says to go left/straight, and I suppose by that point, it was already past the stop sign?
This, and other examples, are a good reason to think that eventually signs will need to be held to some sort of "machine-recognizable" standards to make self-driving cars safer.
> Musk wants to use as little map data as possible (that's the goal, per his statements), even though map data currently dictates what FSD has to do. I do feel V11 follows map data much more strictly than V12. So I'll be curious to see whether V12 highway will fix some of the missed turns. Perhaps this is why they are not bothering to pour resources into map data sourcing? Who knows, maybe they will find that stupid and change direction, or V14 will no longer need map data.
It's not possible to drive well without map data - humans do that from memory (or GPS / Nav). You need to know which lane to be in even before turning if the next turn is a short distance away.
> This, and other examples, are a good reason to think that eventually signs will need to be held to some sort of "machine-recognizable" standards to make self-driving cars safer.
That's not happening anytime soon - not in 20 years.
> It's not possible to drive well without map data - humans do that from memory (or GPS / Nav). You need to know which lane to be in even before turning if the next turn is a short distance away.
I see it make plenty of weird lane choices even when the nav map shows correctly. For instance, there is a freeway onramp near my house with a two-lane left turn at a light at the top of the ramp. Twice (on two different revs of FSD, I think) my car signaled for the left turn, then simply chose the third lane from the left (which goes straight) instead of getting into one of the turn lanes with the other cars I'd been leading and following to that point. I took over each time before it could attempt to turn there, but I do wonder what it was going to do. The nav map showed two turn lanes with cars, and me in the straight lane next to them. There is another straight lane to the right as well.
Here is an interesting example. Near my home, when FSD makes a right turn, it incorrectly drives on the broad shoulder instead of the single lane. If it knew from the map data that there is a single lane, and gave that primary importance, it wouldn't do it.
> What FSDS results are you referring to?
The various responses by users/owners on Twitter and other places.
> In the absence of map data, or with incorrect map data, FSD needs to be able to read the lane markings and act appropriately. That's something it doesn't do now, and it causes a lot of incorrect behavior.
I have seen it do that. It is a problem when there are ambiguous lane markings, such as a highway ramp lane marked with a solid white line, where FSD is wondering how to get to that lane.
> It's not possible to drive well without map data - humans do that from memory (or GPS / Nav). You need to know which lane to be in even before turning if the next turn is a short distance away.
I think what Musk mainly means is that one day FSD will be so good at planning that they can finally allow it to overrule map data if there's a disagreement. Currently, map data still rules over FSD. It may just be a weight/slider of how much FSD V11 and V12 can overrule map data.
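The "weight/slider" idea above can be made concrete with a small sketch. Everything here is hypothetical illustration - the function name, the per-lane scores, and the blending rule are my assumptions, not anything known about Tesla's planner:

```python
# Hypothetical sketch of blending a map prior with live perception via a
# tunable weight. All names and numbers are illustrative, not Tesla's
# actual implementation.

def choose_lane(map_scores, perception_scores, map_weight=0.7):
    """Return the lane index with the highest blended score.

    map_scores / perception_scores: per-lane confidences in [0, 1].
    map_weight: 1.0 means map data always rules; 0.0 means perception rules.
    """
    blended = [
        map_weight * m + (1 - map_weight) * p
        for m, p in zip(map_scores, perception_scores)
    ]
    return max(range(len(blended)), key=blended.__getitem__)

# Map says lane 0 is the turn lane, but perception strongly favors lane 2.
map_scores = [0.9, 0.4, 0.1]
perception_scores = [0.2, 0.3, 0.95]

print(choose_lane(map_scores, perception_scores, map_weight=0.9))  # 0: map dominates
print(choose_lane(map_scores, perception_scores, map_weight=0.2))  # 2: perception dominates
```

Sliding `map_weight` between the two extremes captures the V11-vs-V12 difference the poster describes: a high weight behaves like V11 (map data rules), a low weight lets the planner overrule bad map data.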
> I have seen it do that. It is a problem when there are ambiguous lane markings, such as a highway ramp lane marked with a solid white line, where FSD is wondering how to get to that lane.
I think the issue is that because it's using NNs and training data, and not explicitly hard-wired code, we don't know exactly what FSD is doing.
> I think the issue is that because it's using NNs and training data, and not explicitly hard-wired code, we don't know exactly what FSD is doing.
It is algorithmic. If it had to match video clips in real time, we would be dead by now.
I don't know anything about AI, so somebody with more knowledge can correct me, but what I think FSD is doing is taking the real-time data, matching it against its database of training video, and figuring out which specific videos match the current situation. Once FSD gets that match, it takes the appropriate action. Maybe in certain scenarios there is conflicting data, and that's where it gets confused: instead of turning left it goes straight, even though the lane markings show a left arrow.
My only regression with 12 vs 11 is on dirt roads or unmarked roads/lanes. Today I was visiting a friend in rural Ohio, and FSD's route planning was to turn off a rural road without a center line and take a right into his driveway. It slowed to under 5 mph, but instead of slowly turning into his driveway, it attempted to turn into a 3-foot ditch, overshooting his driveway. It would have taken literally several seconds, and I had plenty of time to think about intervening or not as the passenger front tire crept slowly within a foot of the ditch. At that point I realized it was likely to continue driving into the ditch, so I intervened and reported it. Bummer.
> It is algorithmic. If it had to match video clips in real time, we would be dead by now.
I'm not saying FSD is matching the video clips in real time; the training video clips are processed and transformed into vectors and placed in a database that can be matched against the input vectors taken from the current drive.
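The retrieval idea described above - embed clips offline, then match the current scene's vector against the stored ones - can be sketched in a few lines. This is only an illustration of the poster's mental model, not a description of Tesla's actual pipeline; the embeddings and action labels are made up:

```python
# Toy vector-retrieval sketch: offline, training clips are embedded as
# vectors; at drive time, the current scene's embedding is matched to its
# nearest stored neighbor and that clip's action is reused.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "database": clip embedding -> action taken in that clip.
clip_db = {
    (0.9, 0.1, 0.0): "turn_left",
    (0.1, 0.9, 0.0): "go_straight",
    (0.0, 0.2, 0.9): "turn_right",
}

def nearest_action(scene_embedding):
    """Return the action of the most similar stored clip."""
    best = max(clip_db, key=lambda v: cosine_similarity(v, scene_embedding))
    return clip_db[best]

print(nearest_action((0.8, 0.3, 0.1)))  # closest to the "turn_left" clip
```

Note that this is still a lookup, just over vectors rather than raw video - which is exactly the distinction the next posts pick at: retrieval alone doesn't generalize to scenes unlike anything in the database.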
As they say of people with photographic memory, it does not equate to intelligence.
Intelligence comes from being able to interpolate and extrapolate from the limited dataset you have been trained on. You don't need to remember anything.
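The memorization-vs-generalization point can be shown with a toy example (entirely illustrative - a lookup table versus a fitted model on the same tiny dataset):

```python
# Toy illustration: a lookup table ("photographic memory") only answers
# what it has seen, while a model fitted to the same limited data can
# interpolate and extrapolate.

# Training data: y = 2x, observed at only a few points.
data = {1: 2, 2: 4, 3: 6}

def lookup(x):
    """Pure memorization: returns None for anything unseen."""
    return data.get(x)

# Least-squares slope through the origin, fitted to the same data.
slope = sum(x * y for x, y in data.items()) / sum(x * x for x in data)

def model(x):
    """Generalization: predicts for inputs it was never shown."""
    return slope * x

print(lookup(2.5))  # None - never memorized
print(model(2.5))   # 5.0  - interpolated
print(model(10))    # 20.0 - extrapolated
```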
> So are you asking if Waymo is responding to users saying FSD V12 is great? No, they do not respond to users on Twitter. Why would they? Probably the closest to a response was at SXSW, when Tekedra said that Tesla FSD is L2/L3, but they don't focus on Tesla FSD because Tesla is not doing L4.
What I was seeking is their internal take on these reports - are they worried? Intrigued? Laughing? Not concerned?
> So are you asking if Waymo is responding to users saying FSD V12 is great? No, they do not respond to users on Twitter. Why would they? Probably the closest to a response was at SXSW: Tekedra said that Tesla FSD is an L2/L3 hybrid. They have said that Tesla is doing a driver assist while they are doing full autonomy. So Waymo does not see Tesla FSD as a threat, if that is what you are asking.
Haha, that's because Waymo sees losing multiple billions a year as a threat. Tesla is the last thing they are worried about.