
Waymo

That is a pretty crazy move. Waymo performed it safely, but I am not sure it was wise. Ideally, I think you want to drive in a way that is predictable for other drivers. So if you start to make a right turn, you should complete the right turn as expected, not suddenly make a left turn in the middle of the intersection when the oncoming traffic has a green light to go.
Only problem is the Waymo was turning left, not right. Stopping a right turn and continuing straight isn't bad. Stopping a left turn 90% of the way through and deciding to go straight... well, I think Waymo would much prefer the car to get stuck than to drive head-on towards traffic while on the wrong side of the road...
 
Looks like their collisions are about the same: 71 in 2022, 50 in 2023, and 14 so far this year (×3 ≈ 42 annualized). Though I didn't look to see how many were with a safety driver driving (safety drivers seem to get into dumb low-speed collisions).
But "failure rate" <> "collision". That weird way of aborting the turn won't show up in any publicly available stat...
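For what it's worth, the "×3" in that post is just linear annualization. A quick sketch, assuming the 14 collisions cover roughly the first four months of the year (my assumption; the post doesn't say):

```python
# Back-of-envelope annualization of the collision counts quoted above.
collisions = {2022: 71, 2023: 50}
ytd_2024 = 14       # collisions reported so far this year
months_elapsed = 4  # assumption: data runs through roughly end of April

annualized_2024 = ytd_2024 * 12 / months_elapsed  # 14 * 3 = 42
for year, n in collisions.items():
    print(f"{year}: {n} collisions")
print(f"2024 annualized: {annualized_2024:.0f} collisions")
```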
 
Only problem is the Waymo was turning left, not right. Stopping a right turn and continuing straight isn't bad. Stopping a left turn 90% of the way through and deciding to go straight... well, I think Waymo would much prefer the car to get stuck than to drive head-on towards traffic while on the wrong side of the road
The video doesn't give the whole story - we don't really know why the Waymo initially changed its mind - maybe it got stuck in the intersection because of traffic ahead of it, and decided to find a safe solution rather than blocking the intersection.

A human driver may have done exactly the same thing, in which case 10 points to the Waymo.

However, a good human driver would have presumably ensured the path into the left turn street was clear before proceeding into the intersection - and so should have Waymo, in which case -10 points.

What do we get more annoyed about? Waymo blocking traffic due to a dumb decision, or Waymo rescuing itself from a dumb decision? Neither is good; better not to make the dumb decision in the first place. But then we would get annoyed about it driving too slowly, being too slow about decisions, and holding up traffic. A certain amount of prediction is involved in efficient driving by humans; we don't always get it right, and if we want these machines to drive with us and around us, we need them to drive at least partially like a human.

A fine line, I guess, that will be constantly finessed.

But if a human had done what this Waymo did, I don't think it would have been newsworthy.
 
Right, nor would the incident of driving on the wrong side of the road, nor the one of driving in the bus lane and making an illegal left turn, if the car didn't disengage.
Right - that's the issue with "zero disengagement" drives on FSD too. If it does something obviously not right, but you don't disengage, is it really a "zero disengagement" drive?
 
If Waymo is starting to switch over to ML/AI and replace rule-based code, and you get smoother rides, doesn't that mean less predictability, since it won't follow strict rules as much? Sorta the way V12 does now.

So maybe some of Waymo's current problems are a conflict between the newer ML/AI code and the legacy rule-based code.
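If that's the case, a hybrid stack might look very roughly like this. A minimal sketch, purely illustrative and not Waymo's actual architecture (every name here is hypothetical): a learned planner proposes a trajectory, and a legacy rule layer can veto it and force a conservative fallback. A mid-maneuver veto is exactly the kind of place an abrupt "change of mind" like the aborted turn could come from.

```python
# Purely illustrative hybrid planner -- NOT Waymo's actual architecture.
# Every class and function name here is hypothetical.
from dataclasses import dataclass

@dataclass
class Trajectory:
    path: list   # sequence of (x, y) waypoints
    source: str  # which layer produced it

def ml_planner(scene: dict) -> Trajectory:
    # Learned policy: smooth and human-like, but offers no hard guarantees.
    return Trajectory(path=scene["proposed_path"], source="ml")

def rule_checks(traj: Trajectory, scene: dict) -> bool:
    # Legacy rule layer: hard constraints the learned output must satisfy.
    return not scene["blocks_oncoming_lane"] and not scene["blocks_intersection"]

def plan(scene: dict) -> Trajectory:
    traj = ml_planner(scene)
    if rule_checks(traj, scene):
        return traj
    # The two layers disagree: fall back to a conservative rule-based
    # maneuver, even if the ML proposal was already partly executed.
    return Trajectory(path=scene["fallback_path"], source="rules")

# Example: the learned proposal would block the oncoming lane, so the
# rule layer vetoes it mid-maneuver.
scene = {
    "proposed_path": [(0, 0), (5, 5)],
    "fallback_path": [(0, 0), (2, 0)],
    "blocks_oncoming_lane": True,
    "blocks_intersection": False,
}
print(plan(scene).source)  # -> "rules"
```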
 
The video doesn't give the whole story - we don't really know why the Waymo initially changed its mind - maybe it got stuck in the intersection because of traffic ahead of it, and decided to find a safe solution rather than blocking the intersection.
You can see in the video that the intersection had no traffic in the path of the Waymo.
[Attached screenshot: the intersection, with no traffic in the Waymo's path]
 
Right - that's the issue with "zero disengagement" drives on FSD too. If it does something obviously not right, but you don't disengage, is it really a "zero disengagement" drive?

Yeah, I've been arguing this for a bit.

I could have FSD hit a hive of Africanized bees and kill 10 puppies, but if I didn't disengage FSD, it doesn't matter for the stats. Same with Waymo and Cruise for all of these incidents.
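To make that concrete: a metric that only counts driver takeovers scores a clean drive and a disastrous-but-uninterrupted drive identically. A toy sketch, with invented field names:

```python
# Toy illustration: "zero disengagement" counts takeovers, not outcomes.
# All field names here are invented for the example.
drives = [
    {"id": 1, "disengagements": 0, "incidents": []},
    {"id": 2, "disengagements": 0, "incidents": ["drove into a beehive"]},
]

zero_disengagement = [d for d in drives if d["disengagements"] == 0]
print(len(zero_disengagement))  # 2 -- both drives count, incident or not
```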
 
It looks like Waymo is programmed to be aggressive, impatient, and wrong.

It wanted to turn left but was blocked in front by cars waiting in the median. Instead of waiting for those to clear out, it went ahead and turned left into the wrong-way lane and could have had a head-on collision if there had been fast, inattentive drivers coming the correct way in that lane.

 
It looks like Waymo is programmed to be aggressive, impatient, and wrong.

It wanted to turn left but was blocked in front by cars waiting in the median. Instead of waiting for those to clear out, it went ahead and turned left into the wrong-way lane and could have had a head-on collision if there had been fast, inattentive drivers coming the correct way in that lane.

Not sure if impatient is fair. They're driving in LA; they could literally be waiting there for hours. I'm wondering how far it drove the wrong direction and whether it knew no one was coming. Technically, you're always driving against traffic when making a turn like that.
 
Not sure if impatient is fair. They're driving in LA; they could literally be waiting there for hours. I'm wondering how far it drove the wrong direction and whether it knew no one was coming. Technically, you're always driving against traffic when making a turn like that.
1. It's not LA, it's Tempe, AZ.
2. Then it should turn right and reroute to the destination.

This incident is not a good example for robots.
 