Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
I don't have a good technical understanding of it either, but we've seen ~3 hours of driving from V12, and in none of those videos did it freak out or do something nonsensical. Perhaps you have a different view.

Obviously 3 hours is nothing when it needs to drive thousands without intervention.
I only watched 7 minutes and saw it go straight in a right turn only lane.
 
do something nonsensical.
It tried to pass someone for no good reason, to their great displeasure. And it had highly mysterious behavior at stop signs. I have watched a total of about 10 minutes (with a few repeats of that material). I guess I watched a bit more of an early video but was only half watching. That was the one where it had terrible anticipation of the cyclist coming up the hill which caused the lead car to stop.
 
Semantics: The study of meanings in a language. (from the dictionary.)

You (above): What the word means has not just major engineering but also legal significance.

You (to me): It's NOT about semantics!


Hilariously, the problem here was that people were dismissing any noteworthy difference in meaning between one term and another term that actually has a substantially different meaning.

So it seems the real issue was people screaming SEMANTICS while not being interested in the very thing you define semantics as :)


You don't understand the DDT. The DDT (dynamic driving task) covers driving tasks like lane keeping, making turns, detecting objects, reading signs, obeying stop signs and traffic lights, going around double-parked vehicles, etc. Waymo can perform the complete DDT since it can do all driving tasks in its ODD. But perfection is not a requirement for L4. Are you trolling or just clueless?


Seems like both at this point? Especially since he's the same guy who four hours ago was jumping up and down about how SAE levels don't guarantee perfect performance, and now he's pointing to an accident as evidence of a connection between SAE level and performance.
 
  • Like
Reactions: diplomat33
Do people think even 11.x would have performed better than these autonomous Waymo Jaguar I-Paces in this case of hitting a tow truck pulling an angled pickup? And even if not current 12.x, do people think end-to-end still as driver assist would be unable to learn how to not hit the tow truck? Seems like a nice example of how "lower" can outperform "higher."

I wonder if Jaguar has considered repackaging basically all of Waymo's hardware and software for the I-Pace and selling it as a driver assist with a different user interface? Jaguar could be extremely clear that it's intended for use where the driver must pay attention at all times, and the software could remind the driver of that on every activation, add nags, etc. While Waymo has a design intent of higher automation, Jaguar could sell a package with a design intent of lower automation, maybe because they don't want to take on the liability of running into tow trucks.
 
Do people think even 11.x would have performed better than these autonomous Waymo Jaguar I-Paces in this case of hitting a tow truck pulling an angled pickup?

Not a chance.

And even if not current 12.x, do people think end-to-end still as driver assist would be unable to learn how to not hit the tow truck? Seems like a nice example of how "lower" can outperform "higher."

Theoretically, E2E could probably learn how to not hit the tow truck, but it would require specific data to train on for that scenario.

And Waymo outperforms FSD Beta on every performance metric. So even if FSD Beta were better than Waymo in this one case (highly doubtful), it would still be behind Waymo in 99.9999% of other cases.

I wonder if Jaguar has considered repackaging basically all of Waymo's hardware and software for the I-Pace and selling it as a driver assist with a different user interface? Jaguar could be extremely clear that it's intended for use where the driver must pay attention at all times, and the software could remind the driver of that on every activation, add nags, etc. While Waymo has a design intent of higher automation, Jaguar could sell a package with a design intent of lower automation, maybe because they don't want to take on the liability of running into tow trucks.

I doubt Waymo would go for it since they reject L2. It's why they are focused on L4 only. But I've argued that Waymo should license a driver assist system based on the Waymo Driver. It would be a great way for Waymo to generate revenue to help fund their robotaxi efforts. And all the cars with the Waymo driver assist could collect more data too to help Waymo.
 
Do people think even 11.x would have performed better than these autonomous Waymo Jaguar I-Paces in this case of hitting a tow truck pulling an angled pickup? And even if not current 12.x, do people think end-to-end still as driver assist would be unable to learn how to not hit the tow truck? Seems like a nice example of how "lower" can outperform "higher."

I wonder if Jaguar has considered repackaging basically all of Waymo's hardware and software for the I-Pace and selling it as a driver assist with a different user interface? Jaguar could be extremely clear that it's intended for use where the driver must pay attention at all times, and the software could remind the driver of that on every activation, add nags, etc. While Waymo has a design intent of higher automation, Jaguar could sell a package with a design intent of lower automation, maybe because they don't want to take on the liability of running into tow trucks.
Waymo claims they have no interest in supervised driving automation.
 
I only watched 7 minutes and saw it go straight in a right turn only lane.
I wondered about this because, in CA, you can cross single white lines. However:

CHP spokeswoman Roxanne Anderson says that in nine out of 10 occasions, crossing the line is against the law because adjacent signs and striping on the pavement take precedence and make what otherwise might be a legal maneuver a no-no.

Once a motorist in a turning or exit lane passes a sign saying “exit only” or drives over a turn arrow, that motorist may not legally veer back into the other lane, Anderson said. Because such signs and arrows are present almost every time there’s a white line, the correct procedure is usually clear, …
 
Not a chance.
Not sure what you are saying. But specifically I think it is pretty likely FSD Beta 11.x would not have hit this pickup being towed. That would be better handling than Waymo in this specific situation.

FSDb 11.x:
It is fairly good at not hitting other vehicles on the road no matter how they present. Usually.

FSDb might have slowed down or gotten confused or whatever.

But I doubt it would have hit it. Has occupancy network, etc.

It certainly can hit other towed items. I just don’t think on this one it would.
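For anyone wondering what "has occupancy network" buys you here, a toy 2-D sketch of the underlying idea (everything hypothetical — the real thing is a learned 3-D network, not a point-in-cell lookup): mark grid cells containing sensed points as occupied, then reject any planned path through an occupied cell. Even a towed pickup at a weird angle still puts points into occupied cells.

```python
# Toy 2-D occupancy grid, loosely inspired by the "occupancy network" idea.
# All coordinates and points below are made up for illustration.

def build_grid(points, cell=1.0):
    """Map sensed (x, y) points to the set of occupied grid cells."""
    return {(int(x // cell), int(y // cell)) for x, y in points}

def path_is_clear(path, grid, cell=1.0):
    """A planned path is clear if none of its samples fall in an occupied cell."""
    return all((int(x // cell), int(y // cell)) not in grid for x, y in path)

# Hypothetical sensor returns from an angled towed truck ahead of the ego car:
truck_points = [(5.0, 0.2), (5.5, -0.3), (6.0, 0.1)]
grid = build_grid(truck_points)

straight_path = [(float(i), 0.0) for i in range(10)]  # drives through the truck
swerve_path = [(float(i), 2.0) for i in range(10)]    # goes around it
```

The point is only that this kind of check keys off where stuff physically is, not off classifying or predicting the object correctly.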

Anyway, I don’t have a lot of doubt that Waymo knew exactly where the vehicle was. It seemed to know exactly where it was. Unfortunately it was (inadvertently) programmed to run into the vehicle in that sort of situation. That is how I understand it anyway. Just a bug. That is why they fixed it! It may well be more complicated than that but it doesn’t matter; it’s a bug. Could happen to anyone. Probably someone just missed a minus sign or a negative somewhere, converting divergence to convergence. Those things can be confusing.

Probably FSD Beta 11.x does not have that specific bug. It has other undiscovered bugs instead.
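To illustrate the kind of sign slip being speculated about above (purely hypothetical, obviously nothing like Waymo's actual code): one flipped comparison turns a closing-distance check into its exact opposite.

```python
def closing(ego_vel, other_vel, rel_pos):
    """True if the gap between ego and the other vehicle is shrinking.

    rel_pos is the vector from ego to the other vehicle; the gap shrinks
    when the relative velocity has a negative component along rel_pos.
    """
    rel_vel = (other_vel[0] - ego_vel[0], other_vel[1] - ego_vel[1])
    dot = rel_vel[0] * rel_pos[0] + rel_vel[1] * rel_pos[1]
    return dot < 0

def closing_buggy(ego_vel, other_vel, rel_pos):
    # Same math with one flipped sign: a converging vehicle now reads as diverging.
    rel_vel = (other_vel[0] - ego_vel[0], other_vel[1] - ego_vel[1])
    dot = rel_vel[0] * rel_pos[0] + rel_vel[1] * rel_pos[1]
    return dot > 0

# Ego doing 10 m/s toward a stopped vehicle 20 m ahead (made-up numbers):
ego_v, other_v, rel_p = (10.0, 0.0), (0.0, 0.0), (20.0, 0.0)
```

The correct version flags this as converging; the buggy one says everything is fine.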
 
Not sure what you are saying. But specifically I think it is pretty likely FSD Beta 11.x would not have hit this pickup being towed. That would be better handling than Waymo in this specific situation.

FSDb 11.x:
It is fairly good at not hitting other vehicles on the road no matter how they present. Usually.

FSDb might have slowed down or gotten confused or whatever.

But I doubt it would have hit it. Has occupancy network, etc.

Maybe it would not have hit the truck, but FSD Beta likely would have acted confused, braked too hard, swerved, etc. So yeah, FSD Beta might not have hit the truck, but only because it drives worse than Waymo, and the driving-worse part happened to prevent a collision. That's not really a great argument IMO.

Unfortunately it was (inadvertently) programmed to run into the vehicle in that sort of situation. That is how I understand it anyway. Just a bug.

No. The Waymo was not programmed to run into the vehicle. On the contrary, the error is that it mistakenly thought that it was not going to hit the truck. Here is what Waymo said:

"We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle."


So the prediction stack incorrectly determined that the truck would move out of the way and therefore there would not be a collision.
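Waymo's wording suggests the predictor kept trusting the towed pickup's own orientation. A minimal constant-velocity sketch of that failure mode (hypothetical numbers, not Waymo's model): extrapolating along the vehicle's apparent heading instead of the tow truck's direction of travel sends the predicted path off to the side.

```python
import math

def predict(x, y, speed, heading_rad, dt):
    """Constant-velocity extrapolation along an assumed heading (toy model)."""
    return (x + speed * math.cos(heading_rad) * dt,
            y + speed * math.sin(heading_rad) * dt)

# Suppose the towed pickup sits angled ~30 degrees off the tow truck's travel
# direction. Its true motion follows the tow truck (heading 0 rad); a predictor
# that trusts the pickup's own orientation drifts meters sideways per second.
actual = predict(0.0, 0.0, 10.0, 0.0, 1.0)                # stays on the road
assumed = predict(0.0, 0.0, 10.0, math.radians(30), 1.0)  # veers off to the side
```

After one second the predicted position is already ~5 m laterally off the true one, which is consistent with the AV concluding the truck would clear its path when it wouldn't.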
 
With all these people putting Waymo up on a pedestal, it's gonna be funny the day they announce they're scaling down their services. I'll be around to witness the silence or the crying.

Waymo will scale up. Mark my words. It will be funny when V12 is still not driverless after 5 years and Waymo has scaled to even more cities. Heck, Waymo's next expansion area includes Tesla HQ. So Waymo will offer driverless rides to Tesla HQ before Tesla is driverless.
 
The Waymo was not programmed to run into the vehicle.
Real world results differ!

On the contrary, the error is that it mistakenly thought that it was not going to hit the truck. Here is what Waymo said:

So the prediction stack incorrectly determined that the truck would move out of the way and therefore there would not be a collision.
Now THIS is a semantic argument.

It is not possible to determine from the verbiage who is closer to correct. The prediction stack could have incorrectly determined the trajectory due to a sign error. They don't say why it did this.

This is just a bug.
 