Waymo

Very little was actually actionable for an AV in this example. An AV should always be looking out for surrounding vehicles and people. If it sees emergency vehicles, it should already know to slow down. Whether the disabled car is on its side as claimed (it wasn't), upside down, or right side up is not germane. It's an obstruction in the road that must be avoided. It could be an alien spaceship or a pile of bricks for that matter.

It's nice that an LLM can decipher a photograph. But this wasn't a particularly good example of how an LLM might enhance an AV's capabilities.

Perhaps the other example of reading parking signs was a better example.
 
Waymo issued software "recall" after two collisions:

Waymo is voluntarily recalling the software that powers its robotaxi fleet after two vehicles crashed into the same towed pickup truck in Phoenix, Arizona, in December. It’s the company’s first recall.

Waymo chief safety officer Mauricio Peña described the crashes as “minor” in a blog post, and said neither vehicle was carrying passengers at the time. There were no injuries. He also said Waymo’s ride-hailing service — which is live in Phoenix, San Francisco, Los Angeles, and Austin — “is not and has not been interrupted by this update.” The company declined to share video of the crashes with TechCrunch.

Waymo said it developed, tested, and validated a fix to the software that it started deploying to its fleet on December 20. All of its robotaxis received that software update by January 12.

The incidents:

The crashes that prompted the recall both happened on December 11. Peña wrote that one of Waymo’s vehicles came upon a backward-facing pickup truck being “improperly towed.” The truck was “persistently angled across a center turn lane and a traffic lane.” Peña said the robotaxi “incorrectly predicted the future motion of the towed vehicle” because of this mismatch between the orientation of the tow truck and the pickup, and made contact. The company told TechCrunch this caused minor damage to the front left bumper.

The tow truck did not stop, though, according to Peña, and just a few minutes later another Waymo robotaxi made contact with the same pickup truck being towed. The company told TechCrunch this caused minor damage to the front left bumper and a sensor. (The tow truck stopped after the second crash.)

 
Seems weird that they would ignore the radar/lidar information showing something was in the path just because their planner thinks it might move out of the way by the time the Waymo got there. (Assuming of course that the radar/lidar were properly returning that it was there.)
This is one of the primary reasons I jump all over people who claim that sensors are the answer: that Tesla can solve their problems if they just put radar and lidar all over their cars. Waymo/Cruise are loaded with sensors and still crash. Turns out that NNs and planning are hard problems to solve.
 
Seems weird that they would ignore the radar/lidar information showing something was in the path just because their planner thinks it might move out of the way by the time the Waymo got there. (Assuming of course that the radar/lidar were properly returning that it was there.)

Well, if the prediction stack was confident that the truck would move out of the way in time, there is no reason why the Waymo would not proceed since it thinks the path will be clear. But presumably, when the software detected an imminent collision, it applied automatic emergency braking which mitigated the collision.

But that is why it was a software issue that required a fix, since the prediction was making a mistake that would definitely result in a collision. It was clearly a reproducible error, since it happened twice in a row.
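To illustrate the kind of mistake involved (a hypothetical sketch with made-up numbers, not Waymo's actual stack), consider a prediction module that extrapolates a vehicle's future position from its perceived heading. A pickup towed at an angle points one way but travels another, so a heading-based forecast says the lane will be clear when it won't be:

```python
import math

def predict_position(x, y, heading_rad, speed, dt):
    """Naive constant-velocity prediction that assumes a vehicle
    travels in the direction its body is pointing."""
    return (x + speed * math.cos(heading_rad) * dt,
            y + speed * math.sin(heading_rad) * dt)

# Hypothetical numbers: a pickup towed at a 45-degree angle across the lane.
# Its body points 45 degrees off-lane, but it actually travels straight
# down the lane (heading 0) along with the tow truck.
dt, speed = 2.0, 10.0  # seconds, m/s
predicted = predict_position(0.0, 0.0, math.pi / 4, speed, dt)
actual = predict_position(0.0, 0.0, 0.0, speed, dt)

# The heading-based forecast expects the pickup to leave the lane;
# in reality it stays in it, so a "path will be clear" plan is wrong.
error = math.dist(predicted, actual)
```

With the orientation 45 degrees off the true direction of travel, the predicted position is roughly 15 m away from the actual one after only two seconds, which is consistent with the reported "incorrectly predicted the future motion of the towed vehicle."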
 
This is one of the primary reasons I jump all over people who claim that sensors are the answer: that Tesla can solve their problems if they just put radar and lidar all over their cars. Waymo/Cruise are loaded with sensors and still crash. Turns out that NNs and planning are hard problems to solve.

That is a strawman. Nobody says that if you put radar and lidar on cars, it will automatically solve all collisions. Radar and lidar are sensors that, like cameras, can help the car perceive the world. If used properly, radar and lidar can help make perception more reliable. Of course, you still need effective software. And yes, even with excellent perception, you still need to solve prediction and planning problems. No sensor automatically solves autonomous driving. Autonomous driving requires hardware and software.
 
Good Lord Diplo! How long have you been on TMC? 😁
The number of posts from people who demand radar and lidar be added to solve PBs and give us L4...

There is a difference between saying radar and lidar are needed for L4 and saying radar and lidar will solve all collisions. You don't need to solve all collisions to do L4. I certainly believe that Tesla needs radar and lidar to do reliable L4. But I don't believe that adding radar and lidar would automatically solve all collisions.
 
There is a difference between saying radar and lidar are needed for L4 and saying radar and lidar will solve all collisions. You don't need to solve all collisions to do L4. I certainly believe that Tesla needs radar and lidar to do reliable L4. But I don't believe that adding radar and lidar would automatically solve all collisions.
I couldn't agree with you more, which is why I find it funny when people keep saying Tesla needs to add radar and lidar to solve their problems. The sensors alone won't solve it.
 
My concern with Waymo right now is the glacial rollout... they seem to do at most one city a year. Sure, you could attribute this to caution, or perhaps finances (Alphabet are not as generous as they once were). But I wonder if the entire infrastructure setup is rather more onerous than they realized (HD maps, bureaucracy, setup of local manual take-over support staff, etc.). I'm not clear what Waymo's business model is atm (or even if they have one).

I share that concern. I would personally like to see Waymo scale faster. It seems to me that if they had more vehicles on the road, it would allow them to encounter more edge cases faster, and speed up development. And they could put safety drivers back in initially to intervene when the Waymo Driver encounters a new or difficult edge case. So I think safety could be addressed. Of course, there would be a huge cost to this. Deploying test cars in large numbers with safety drivers would cost billions. They are already losing billions just with the fleet of a few hundred vehicles they have now. So perhaps they can't afford to deploy a fleet of say 10k or 100k cars.

Companies like Tesla or Mobileye have the advantage of getting data from large fleets without having to maintain the fleet of cars. In Tesla's case, they produce and sell the cars directly to the consumer and collect the data. In Mobileye's case, they sell the hardware and software, let OEMs produce and sell the cars to consumers, and just collect the data. But Waymo has to spend a lot of money retrofitting and maintaining the cars and paying the safety drivers, with little revenue in return beyond ride fares. That is why I think Waymo should follow Mobileye's business model and get out of the robotaxi business. I think Mobileye was smart to give up on managing their own robotaxi service. They recognize that the up-front cost is too high. Instead, Mobileye focuses on developing the tech and making money licensing the hardware and software in various autonomous driving products to the OEMs.

I think Waymo should do the same. Waymo has custom hardware (sensors and compute) they could sell to OEMs. They would not need to sell the whole package: they could sell just the cameras to OEMs that want vision-only ADAS, or just the long-range radar to OEMs that want a good front radar for their ADAS. And the Waymo Driver is very capable. I feel like Waymo could sell the Waymo Driver software to OEMs that want good perception/prediction/planning for their ADAS. And Waymo could offer various ADS products similar to Mobileye's SuperVision/Chauffeur/Drive. So Waymo could offer a basic L2 system, a vision-only eyes-on system, or a more expensive eyes-off highway system, depending on what OEMs are interested in. It seems to be working for Mobileye, as they have gotten design wins from dozens of interested OEMs so far.

Adopting Mobileye's business model, Waymo could make money selling hardware and software, collect data from a much larger fleet "for free," and use their existing test fleet and engineering staff to continue to develop their autonomous driving. Frankly, I am not really seeing any downside.
 

HIGHLIGHTS / LOWLIGHTS:

1:13 Narrow alley with pedestrians and cyclist
7:58 Multi-point turn for dead end
13:49 Unprotected left turn with blocked intersection
15:37 Extremely tight turn + hard brake for unknown reason
17:44 Highly contested lane change + aggressive oncoming driver

FULL LIST:

0:18 Pull out
0:54 Stop for map error?
1:13 Narrow alley with pedestrians and cyclist
1:47 Pedestrians in street and oncoming skateboarder
2:15 Tourist violates my personal space
4:28 Nudge for pedestrians in street
5:41 Right on red
6:15 Entering parking lot
6:55 Permit required for drop-off
7:58 Multi-point turn for dead end
8:53 Indecisive route
9:50 Pedestrian crossing at parking lot exit
10:12 Awkward trajectory over traffic control spikes
12:24 Slow for midblock crosswalk
12:38 Last minute lane change
13:49 Unprotected left turn with blocked intersection
15:00 Narrow alley with children crossing
15:30 Narrow alley with cyclist
15:37 Extremely tight turn
15:53 Hard brake for unknown reason
17:44 Highly contested lane change to follow route
18:38 Unprotected left turn with assertive oncoming vehicle
19:34 Awkward trajectory in roundabout
19:48 Shared turn lane
20:09 Pull over
 
Well, if the prediction stack was confident that the truck would move out of the way in time, there is no reason why the Waymo would not proceed since it thinks the path will be clear. But presumably, when the software detected an imminent collision, it applied automatic emergency braking which mitigated the collision.

But that is why it was a software issue that required a fix, since the prediction was making a mistake that would definitely result in a collision. It was clearly a reproducible error, since it happened twice in a row.
It sounds like another case of aggressive programming, similar to Cruise's pedestrians-in-crosswalk prediction (by accelerating toward the pedestrians, they would be scared and run away from the Cruise's path).

In this case, the tow truck ahead was oriented forward, away from Waymo, and the towed pickup truck was also oriented away, pointing its head toward another lane, away from Waymo.

Even a dumb cruise control wouldn't hit those 2 obstacles regardless of how both obstacles were oriented.
 
Even a dumb cruise control wouldn't hit those 2 obstacles regardless of how both obstacles were oriented.

It is not a fair comparison, because a driverless car and a "dumb cruise control" have very different jobs. You are comparing a system that needs advanced perception, prediction, and planning to a system that just needs basic perception and driving control. The Waymo can drive you all around SF on its own; "dumb cruise control" cannot do that. The "dumb cruise control" has only one simple job: maintain a safe distance from the object in front. A cruise control only has to brake to avoid hitting stuff, that's it.

But that does not work for a driverless car, because it has a much bigger, more complex job. It has to be able to navigate streets, maneuver around obstacles like double-parked cars, obey traffic laws, yield to pedestrians and cyclists, do unprotected turns, navigate parking lots, and much more. In that context, simply braking to avoid hitting stuff is not enough. There are instances where you have to be assertive to go around a stopped car, or accelerate to get out of the way of traffic. The job of a robotaxi is more complicated because it needs to drive confidently and assertively while also not hitting stuff.

In order to do that, robotaxis need advanced perception, prediction, and planning. They need to be able to predict where objects will be in the future so that they can maneuver confidently and not just stop in the middle of the road every time there is an object in front of them. That would be terrible driving. And keep in mind that Waymo has excellent prediction. The vast majority of the time, it is able to maneuver safely without hitting objects, in ways that dumb cruise control could never do since that is not its function. But there was this specific edge case where the prediction was wrong. Hence why Waymo needed to issue a software fix.
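A minimal, hypothetical sketch of the difference (invented names and thresholds, not any real system's logic): a brake-only controller reacts to the gap that exists right now, while a prediction-based planner decides using the gap it expects at arrival time, keeping emergency braking only as a last resort when the prediction turns out wrong:

```python
def brake_only(gap_now, min_gap=10.0):
    """Brake-only follower: reacts to the current gap and nothing else."""
    return "brake" if gap_now < min_gap else "proceed"

def predictive_planner(gap_now, obstacle_speed, ego_speed, horizon, min_gap=10.0):
    """Plan against the gap predicted at arrival time. This permits
    assertive driving past obstacles expected to clear, but if the
    prediction is wrong (e.g. an improperly towed truck that never
    leaves the lane), only a last-resort emergency brake remains."""
    predicted_gap = gap_now + (obstacle_speed - ego_speed) * horizon
    if predicted_gap > min_gap:
        return "proceed"
    if gap_now < min_gap:
        return "emergency_brake"
    return "slow"

# An obstacle predicted to pull away: the predictive planner proceeds
# assertively where a brake-only controller would already be braking.
decision = predictive_planner(gap_now=8.0, obstacle_speed=15.0,
                              ego_speed=10.0, horizon=3.0)
```

The assertive behavior comes entirely from trusting the predicted gap; when that trust is misplaced, as in the towed-truck incidents, the same logic drives toward an obstacle that never clears.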

It sounds like another case of aggressive programming, similar to Cruise's pedestrians-in-crosswalk prediction (by accelerating toward the pedestrians, they would be scared and run away from the Cruise's path).

Waymo is not aggressive around pedestrians like Cruise is. Also, Cruise does not accelerate to scare pedestrians out of its path (lol), it accelerates because the prediction says it is "safe" to do so.
 
Interesting we've redefined "dumb cruise control" as adaptive cruise control with lane keeping. I always thought dumb cruise control was keeping a set speed while the human steered. I'll adjust my definition 😁

Good point. I kind of assumed @Tam was referring to adaptive cruise control, since adaptive cruise control could maybe avoid the collision as he suggests. Actual dumb cruise control would certainly hit the towed truck, since it is only programmed to keep a fixed speed and cannot avoid collisions. Maybe @Tam meant to say Automatic Emergency Braking, not dumb cruise control, as AEB is specifically designed to avoid or mitigate collisions?

I would also add that some adaptive cruise control would also not register the towed truck (old radar ignores stopped objects) and thus would also hit the towed truck.
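On the point about older radars ignoring stopped objects: legacy automotive radars commonly discard returns whose Doppler-derived ground speed is near zero, because stationary returns are overwhelmingly roadside clutter (signs, manhole covers, bridges). A hedged sketch of that filtering logic, with hypothetical names and thresholds:

```python
def keeps_target(range_rate, ego_speed, min_ground_speed=1.0):
    """Hypothetical legacy-radar clutter filter: estimate the target's
    speed over the ground and drop anything that is not moving. A
    stopped vehicle in the lane is dropped for the same reason signs
    and bridges are."""
    # range_rate is negative when the target is closing on us; a
    # stationary object closes at exactly -ego_speed.
    ground_speed = range_rate + ego_speed
    return abs(ground_speed) > min_ground_speed

ego_speed = 25.0  # m/s
lead_car = keeps_target(-5.0, ego_speed)      # lead car doing 20 m/s: kept
towed_truck = keeps_target(-25.0, ego_speed)  # stationary in lane: dropped
```

This is why an ACC built on such a radar can drive straight into a stopped vehicle: the sensor reports it, but the tracker throws it away as clutter before the controller ever sees it.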
 
Interesting we've redefined "dumb cruise control" as adaptive cruise control with lane keeping. I always thought dumb cruise control was keeping a set speed while the human steered. I'll adjust my definition 😁
You are right that I used the incorrect cruise terminology.

I meant Adaptive Cruise Control that's programmed to adjust the speed, including stop and go, to avoid colliding with an obstacle in front.
 