Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
I rarely encounter those. I actually don't remember anything like that in the last 5 years of my driving.
It is interesting how varied driving experiences can be. In some parts of the country, encountering school buses is a near-daily event.

I would think all those encounters would have to be in the NN training material for V12 so maybe soon we’ll get some comments.

Thanks for your quick response🙂
 
It is interesting how varied driving experiences can be. In some parts of the country, encountering school buses is a near-daily event.

I would think all those encounters would have to be in the NN training material for V12 so maybe soon we’ll get some comments.

Thanks for your quick response🙂
Agree. Where I live, two things are common. First, school buses with flashing lights, and other drivers flashing their headlights or waving to tell me to go; the latter happens every day. And once winter has passed, it's police or road crews using hand gestures to manage traffic, daily or multiple times a day. It will be fun to see how that progresses.
 
Are there any non-CA based YouTube stars with v12?
Seems like this person started in California and is headed toward New York, stopping by Texas and Indiana (roundabouts) as part of their "Road Rally." Here are some videos from Texas: Georgetown part 1 and part 2, as well as Austin. There was a pretty sketchy situation where it seems like 12.2.1 did not react at all to a truck reversing from a parking spot into the road:


[Attachment: 12.2.1 reversing truck.jpg]


Maybe end-to-end knew it didn't need to, but I would think there haven't been as many examples of needing to react to reversing vehicles.
 
Maybe end-to-end knew it didn't need to,
It's impossible for it to know that. That's not how this works.

This was obviously unsafe, and action on the part of FSD or the driver was required.

A hint of how FSD, even after eliminating all those bad human behaviors and all the resultant accidents (drunk, sleepy, etc.), could still dramatically increase accident rates.

This sort of thing happens all the time, and people sometimes actually back into the road, but in most cases there is no accident, because humans are involved. Humans are pretty amazing.
 
Seems like this person started in California and is headed toward New York, stopping by Texas and Indiana (roundabouts) as part of their "Road Rally." Here are some videos from Texas: Georgetown part 1 and part 2, as well as Austin. There was a pretty sketchy situation where it seems like 12.2.1 did not react at all to a truck reversing from a parking spot into the road:




Maybe end-to-end knew it didn't need to, but I would think there haven't been as many examples of needing to react to reversing vehicles.
I've seen many cases of FSD missing hitting something by just inches. People always react as if this were dangerous, but the fact is: the car did NOT HIT anything. Granted, the tolerances it sometimes accepts are uncomfortable for humans, and that needs to be adjusted, but that doesn't make the situation unsafe. I've never seen a case where the FSD car actually hits something. (Except, well, the recent parking incident. But when that happened, Tesla stopped the FSD release until they solved the issue that caused the accident. They don't do that often, which probably indicates that actual FSD-caused collisions are exceedingly rare.)
 
I've seen many cases of FSD missing hitting something by just inches. People always react as if this were dangerous, but the fact is: the car did NOT HIT anything. Granted, the tolerances it sometimes accepts are uncomfortable for humans, and that needs to be adjusted, but that doesn't make the situation unsafe. I've never seen a case where the FSD car actually hits something. (Except, well, the recent parking incident. But when that happened, Tesla stopped the FSD release until they solved the issue that caused the accident. They don't do that often, which probably indicates that actual FSD-caused collisions are exceedingly rare.)
Safety is not determined by the outcome but by the odds of the action achieving a good outcome. Just because it didn’t hit anything doesn’t mean it was safe.

Example - I tear down a residential road with kids playing in the street going 60 mph and don’t hit anyone. Was it safe?

I’ve often wondered how many disconnects are because we don’t trust FSD and if it would do fine if we just let it. The problem is, as the driver supervising the software, it’s my job to make sure it’s safe and not let it go so far that I can’t prevent an accident, so I’ll continue to disconnect.
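The odds-versus-outcome point can be made concrete with a toy expected-risk calculation. A minimal Python sketch (all probabilities and costs below are invented for illustration; they are not real FSD statistics):

```python
# Toy sketch of "safety = odds, not outcome." The numbers are made up
# purely to illustrate the argument, not measured from any real system.

def expected_harm(p_collision: float, severity: float) -> float:
    """Expected cost of an action: probability of a bad outcome times its severity."""
    return p_collision * severity

# Two ways to handle the reversing-truck scenario, both ending without a crash:
close_pass = expected_harm(p_collision=0.05, severity=100.0)   # passes within inches
wide_pass = expected_harm(p_collision=0.001, severity=100.0)   # leaves a wide margin

# Identical outcomes ("the car did NOT HIT anything"), but very different
# levels of safety: the close pass carried far more expected harm.
assert close_pass > wide_pass
```

The same logic applies to the 60 mph example: most such runs end with no one hurt, but the expected harm of the action is what makes it unsafe.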
 
Safety is not determined by the outcome but by the odds of the action achieving a good outcome. Just because it didn’t hit anything doesn’t mean it was safe.

Example - I tear down a residential road with kids playing in the street going 60 mph and don’t hit anyone. Was it safe?

I’ve often wondered how many disconnects are because we don’t trust FSD and if it would do fine if we just let it. The problem is, as the driver supervising the software, it’s my job to make sure it’s safe and not let it go so far that I can’t prevent an accident, so I’ll continue to disconnect.
The problem with your position is that there is no objective standard for "is it safe?" It's just someone's opinion, and everyone has one.
 
The problem with your position is that there is no objective standard for "is it safe?" It's just someone's opinion, and everyone has one.
We don't need an objective standard. We just need a majority opinion, which is what training data is supposed to provide. In truth, that data should provide a significant majority for moving farther away, given that few drivers would blithely let a reversing, large, lifted truck get that close.
 
v12.2.1 was eager to show off its U-turn skills and ignored a no-U-turn sign yesterday. Disengaged.

It also wanted to impress me with its skill at negotiating with other drivers to change lanes. Last week, when making a left turn to go to a restaurant on the right side of the perpendicular road, 300 ft from the intersection, it switched from the rightmost left-turn lane to the leftmost left-turn lane because that lane had no cars. After the turn, it quickly signaled and changed lanes to turn into the restaurant's parking lot. The driver behind slowed down and let it go smoothly. People would honk like crazy if I drove like this myself.

It also tried to cut people off by moving from the leftmost turn lane to the rightmost turn lane when it was already near the intersection and there was only one car in front. Surprisingly, the driver in the rightmost lane stopped and waited for it to go when the red light changed to green. I disengaged, though. That's naughty.

Besides those not-so-good behaviors, I saw V12 do some good things:

1. Waited for two parents pushing two strollers at the intersection. It did not run over the babies.

2. Patiently stopped and waited to let a pickup truck change lanes when it saw the left-turn signal blinking in slow traffic.

3. Easily moved to the left to pass a car stopped in the middle of the road.
 
v12.2.1 was eager to show off its U-turn skills and ignored a no-U-turn sign yesterday. Disengaged.

It also wanted to impress me with its skill at negotiating with other drivers to change lanes. Last week, when making a left turn to go to a restaurant on the right side of the perpendicular road, 300 ft from the intersection, it switched from the rightmost left-turn lane to the leftmost left-turn lane because that lane had no cars. After the turn, it quickly signaled and changed lanes to turn into the restaurant's parking lot. The driver behind slowed down and let it go smoothly. People would honk like crazy if I drove like this myself.

It also tried to cut people off by moving from the leftmost turn lane to the rightmost turn lane when it was already near the intersection and there was only one car in front. Surprisingly, the driver in the rightmost lane stopped and waited for it to go when the red light changed to green. I disengaged, though. That's naughty.

Besides those not-so-good behaviors, I saw V12 do some good things:

1. Waited for two parents pushing two strollers at the intersection. It did not run over the babies.

2. Patiently stopped and waited to let a pickup truck change lanes when it saw the left-turn signal blinking in slow traffic.

3. Easily moved to the left to pass a car stopped in the middle of the road.
Sounds like it watched Pushing Tin a few too many times...
 
Granted, the tolerances it sometimes accepts are uncomfortable for humans, and that needs to be adjusted, but that doesn't make the situation unsafe.
All public FSD releases to date have had substantially slower reaction time than human drivers.

So that makes tighter tolerances even more dangerous.

I've never seen a case where the FSD car actually hits something.

There have been accidents. (Very old version of FSD.) Whether or not there have been collisions in this sort of scenario is not clear. Very few accident reports would make it into public forums, and it is likely the SGO (standing general order) also misses some. There are ~400k FSD Beta users as I recall.
And how do you know it's impossible?
It would have to have vehicle-to-vehicle communication.

Even the best drivers don’t know when the truck will move. So nothing in the training data, even if it contained all of historical human behavior, would help here.

I don’t think telepathy is an emergent behavior.
 
The problem with your position is that there is no objective standard for "is it safe?" It's just someone's opinion, and everyone has one.
So your standard would be “well, nobody died, did they?”

Of course there’s no objective standard - there can’t be. The standard is based on reasonable judgement, just like many other things in life.

Think about my example - would you consider driving 60 mph down a street full of kids safe? Why not? No one got hurt, right? If it’s not safe, what speed is?