Welcome to Tesla Motors Club

Autonomous Car Progress

The butthurt from the anti-Tesla FSD'ers is going to be so intense when fanbois are uploading hour-long clips with no driver interventions...

that basically match Waymo and Cruise curated clips.

They will cry foul. "That was cherrypicked!" they will say.

Except deep down they know the Cruise and Waymo clips are cherrypicked too.

I don't really understand why there is so much contention.

The biggest challenge self-driving cars face is not competition, but the established way of doing things and the public's uneasiness about autonomous cars.

Any kind of L4 driving even in a geofenced area like what Waymo does is a win to me.
Any kind of L4 driving in a more complicated geofenced area like what Cruise does is a win to me.

Any time autonomous cars are mixed with human drivers, there will be unexpected outcomes. Anything from the human driver getting angry at the autonomous car to the autonomous car not knowing how to deal with a human doing something weird. Geofencing helps cut down on the unexpected, but it doesn't remove it.

I fail to understand why geofencing is a big deal. There is no question that I'm more nervous driving in areas I'm not familiar with. I've made countless driving mistakes in my life from failing to recognize some road situation and doing the wrong thing. Why did I get away with doing the wrong thing for so long without ever crashing? Probably luck, and probably the fact that there weren't enough cars on the road for it to matter. Even driving manually, I would benefit from more up-to-date information on recent construction changes. This summer, Apple and Google disagreed about where a road was closed due to a fire, and both were wrong. In fact, neither one could give me proper directions around it, so I had to find my own way around manually.

We need to accept the fact that autonomous cars are going to be geofenced for the foreseeable future. And it won't just be geofencing; they will also be limited to certain weather conditions.

Over time the geofenced areas will grow. Geofencing allows each region to give its okay and to be prepared before allowing it.

The key difference between Tesla and Waymo/Cruise is their approach and who their intended customer is.

Tesla is much more interesting to me as someone who wants to own a self-driving car. I don't think I'll ever be allowed to own one, but I admire that Tesla is attempting it. As an owner I'm going to ride this ride out till the bitter or sweet end. I think it will fail due to lack of redundancy of the sensors, but I also hope I'm wrong.

Tesla is also facing the much tougher battle: getting the public to accept autonomous driving at a safety threshold lower than the one Waymo/Cruise are trying to set.

Whether it's 10x safer or 2x safer, the overall reality is that it's safer. But the public isn't ready even for 10x safer, let alone 2x safer.

Tesla is going to do this by having millions of people driving around in L2 but L4-capable vehicles. These millions of people will then put pressure on regulators to allow the switch to be turned on.

Tesla is brute forcing autonomous driving. Hopefully other car manufacturers will follow with Mobileye solutions to help out.

The war is not with each other.
 
Exactly, they cannot do it, because they need the maps!

Nobody can do reliable FSD without maps. I would remind you that Tesla cannot do reliable FSD without maps either. Tesla has an FSD disengagement rate of 1 per 15 miles. With maps, Waymo has an FSD disengagement rate of 1 per 11,000 miles. Think about that! Waymo's FSD with HD maps is roughly 733x more reliable than Tesla's FSD without HD maps!
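The "733x" figure is just simple division of the two claimed rates. A quick sketch of the arithmetic (both mileage figures are this post's claims, not official statistics):

```python
# Back-of-the-envelope check of the reliability ratio claimed above.
# Both figures are the post's claims, not official statistics.
tesla_miles_per_disengagement = 15
waymo_miles_per_disengagement = 11_000

ratio = waymo_miles_per_disengagement / tesla_miles_per_disengagement
print(f"Waymo: ~{ratio:.0f}x more miles per disengagement")  # ~733x
```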
 
The butthurt from the anti-Tesla FSD'ers is going to be so intense when fanbois are uploading hour-long clips with no driver interventions...

that basically match Waymo and Cruise curated clips.

They will cry foul. "That was cherrypicked!" they will say.

Except deep down they know the Cruise and Waymo clips are cherrypicked too.
This is silly. I fully expect Tesla to be able to drive an hour without intervention by the end of next year.
I am in full agreement about the value of one-hour unedited video clips: they tell you absolutely nothing useful about reliability. They give some idea of the implemented features.
Now, when someone uploads a continuous 20,000 hour video clip I will be impressed. That's the benchmark, not an hour.
Driverless operation is all that matters in the end. We'll see how Waymo and Cruise do.
 
Whether it's 10x safer or 2x safer, the overall reality is that it's safer. But the public isn't ready even for 10x safer, let alone 2x safer.
Sad but true. Tesla won't be allowed to save lives early on.

... put pressure on regulators to allow for the switch to be turned on.
People hate the idea of robots killing people. Not until it has almost no chance of killing anyone will people start to put pressure on lawmakers. That's at least 5 years away, maybe 10.
 
Tesla has a FSD disengagement rate of 1 per 15 miles.

You are just making that up from one example drive. That isn't a reliable or relevant statistic. Using that logic, we have a different example of a drive longer than 20 miles with 0 disengagements, so it must be an infinite number of miles per disengagement. :rolleyes:

With maps, Waymo has a FSD disengagement rate of 1 per 11,000 miles.

It seems like just earlier today @Bladerskb said their disengagement rate was 1 per 1,000,000 miles, now they are 90x worse. My how they have fallen. :eek:

Of course, there is still the issue that the CA disengagement numbers are worthless, since each company gets to decide under what conditions a disengagement counts. One may count a situation that another won't. So you can't compare them across companies and come to any useful conclusion.
 
You are just making that up from one example drive. That isn't a reliable or relevant statistic. Using that logic, we have a different example of a drive longer than 20 miles with 0 disengagements, so it must be an infinite number of miles per disengagement. :rolleyes:



It seems like just minutes ago @Bladerskb said their disengagement rate was 1 per 1,000,000 miles, now they are 90x worse. My how they have fallen. :eek:

Of course, there is still the issue that the CA disengagement numbers are worthless, since each company gets to decide under what conditions a disengagement counts. One may count a situation that another won't. So you can't compare them across companies and come to any useful conclusion.
He said one accident per million miles (humans have one police-reported accident every 500k miles).
Disengagement rate becomes meaningless once it gets to 1 per 10,000 miles. Just because there was a disengagement doesn't mean it was necessary. True safety is determined by going back and simulating the counterfactual.
Sad but true. Tesla won't be allowed to save lives early on.
I think we're getting a little bit ahead of ourselves here! Let's wait until a one accident per 500k miles performance is plausible (unless there are people who think this build is capable of that? :eek:).
 
He said one accident per million miles (humans do one police reported accident every 500k miles).

Nope, he was very clearly comparing disengagement rates:

Moving the goal post? What? Everything we told you came true. You are looking at ~10 miles per disengagement versus Waymo's ~1 million miles per disengagement.
 
... That's an especially bad disengagement. If the driver had not intervened promptly, it could have been a serious accident.
I wouldn't use those words, although they're true. I'm more inclined to think of it as an expensive video game, and this is just part of the roller coaster ride.

When I was a wee lad, I remember we described my Mom's driving as a roller coaster ride.
 
You are just making that up from one example drive. That isn't a reliable, or relevant, statistic. Using that logic we have a different example of a drive longer than 20 miles with 0 disengagements. So it must be an infinite number of miles per disengagement. :rolleyes:

Ok. Let's add your data point. Now we have a total of 80 miles and 4 disengagements. That's a total disengagement rate of 1 per 20 miles. Even adding in other trips that were better, I doubt that Tesla's disengagement rate would be anywhere near Waymo's.

Now, I realize that it is still a tiny sample size, with a large error, but we don't have a lot of data so far. Maybe Tesla could release a disengagement report over millions of miles so that we can get a more accurate disengagement rate from a good sample size?
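Pooling drives means dividing total miles by total disengagements, not averaging per-drive rates. A minimal sketch of the arithmetic above (the 60-mile figure is an assumption, reconstructed to match the earlier claim of ~1 disengagement per 15 miles and the 80-mile total):

```python
# Pooled disengagement rate: total miles / total disengagements.
# Drive data is hypothetical, reconstructed from the thread's example:
# one ~60-mile drive with 4 disengagements (~1 per 15 miles) plus a
# 20-mile drive with none.
drives = [
    (60, 4),
    (20, 0),
]

total_miles = sum(miles for miles, _ in drives)
total_disengagements = sum(d for _, d in drives)
rate = total_miles / total_disengagements
print(f"1 disengagement per {rate:.0f} miles")  # 1 per 20 miles
```

Note that averaging the per-drive rates instead would be misleading here, since the zero-disengagement drive has no finite miles-per-disengagement figure at all.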

Of course, there is still the issue that the CA disengagement numbers are worthless, since each company gets to decide under what conditions a disengagement counts. One may count a situation that another won't. So you can't compare them across companies and come to any useful conclusion.

I realize that disengagement rates are not a perfect metric. But I use the CA DMV disengagement rates because they are the most official numbers we have to quantify FSD reliability.
 
This is silly. I fully expect Tesla to be able to drive an hour without intervention by the end of next year.
I am in full agreement about the value of one-hour unedited video clips: they tell you absolutely nothing useful about reliability. They give some idea of the implemented features.
Now, when someone uploads a continuous 20,000 hour video clip I will be impressed. That's the benchmark, not an hour.
Driverless operation is all that matters in the end. We'll see how Waymo and Cruise do.


Most of the espoused superiority of Waymo / Cruise here is based on those clips.

Obviously I agree that those clips don't really mean much.

No one has "won" the race to autonomous vehicles.

The "race" is when a massive amount of AVs are deployed on the road.

Waymo / Cruise are ahead in some areas, but behind in others vs. Tesla.
 
Now, I realize that it is still a tiny sample size, with a large error but we don't have a lot of data so far. Maybe Tesla could release a disengagement report over millions of miles so that we can get a more accurate disengagement rate over a good sample size?

The problem is that Tesla can't really report meaningful numbers for disengagements with the data they are collecting because they aren't asking the driver why they disengaged FSD. (Maybe they do on their builds for employees, but we don't know about that.)

I realize that disengagement rates are not a perfect metric. But I use the CA DMV disengagement rates because they are the most official numbers we have to quantify FSD reliability.

But using a completely worthless number just because it is the best we have doesn't make it more valuable.
 
Most of the espoused superiority of Waymo / Cruise here is based on those clips.

Obviously I agree that those clips don't really mean much.

No one has "won" the race to autonomous vehicles.

The "race" is when a massive amount of AVs are deployed on the road.

Waymo / Cruise are ahead in some areas, but behind in others vs. Tesla.
But those clips are far superior to anything demonstrated by Tesla so far. Obviously, if Waymo One were as bad as the current FSD build there would be clips of that on the internet; there aren't.
Waymo has deployed driverless vehicles and Cruise claims that they will shortly. Until Tesla does that there really isn't much to compare.
 
The problem is that Tesla can't really report meaningful numbers for disengagements with the data they are collecting because they aren't asking the driver why they disengaged FSD. (Maybe they do on their builds for employees, but we don't know about that.)

I was under the impression that our cars send back short video clips every time we disengage AP. So presumably, Tesla could analyze those video clips to figure out why the driver disengaged.

But regardless, Tesla must have some way of determining when FSD is safe enough to remove driver supervision. Tesla must have data to determine FSD safety. Tesla could share that.
 
Sad but true. Tesla won't be allowed to save lives early on.

People hate the idea of robots killing people. So until it has almost no chance of killing, then people will start to put pressure on law makers. At least 5 years away maybe 10.
It's funny that you keep claiming that Waymo is too safe when we have no idea how safe it is yet. I suppose Cruise and Waymo could present a problem for Tesla if regulators insist on using them as the benchmark, even though a lower level of safety would still be far better than a human. I'm a little confused about why you think public pressure is necessary when you live in the center of autonomous vehicle testing, where Waymo and Cruise both have permits to test without a safety driver.