FSD Beta Videos (and questions for FSD Beta drivers)

Should beta testers make sure to disengage to avoid running red lights? This is one of the trickier intersections in San Francisco, crossing Market St on 3rd, where the street curves just before the intersection and nearby intersections have their own lights. It doesn't look like the driver pressed the accelerator, so it's just FSD beta choosing to enter the intersection (and crosswalk) even though a red light was detected and visualized?

run red.jpg
 
Visualization says stopping; the driver must have overridden it with the accelerator.
 
At the point the light turned yellow, the car was right at the edge of the intersection and going 15 mph, so I think the car determined that braking to a full stop would have left it in a bad position:

Screen Shot 2020-11-17 at 11.50.13 AM.png
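As a rough back-of-envelope check of that reasoning (nothing from Tesla's actual planner; the deceleration and reaction-time numbers below are just assumptions), here's how much room a comfortable stop from 15 mph would need:

```python
# Rough dilemma-zone check: how much room does a comfortable stop from 15 mph need?
# The 3 m/s^2 deceleration and 0.5 s reaction delay are illustrative assumptions.

MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph: float, decel_mps2: float = 3.0,
                        reaction_s: float = 0.5) -> float:
    """Distance covered during the reaction delay plus a constant-deceleration stop."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

print(f"Stopping from 15 mph needs ~{stopping_distance_m(15.0):.1f} m")
# ~11 m with these numbers, so if the stop line is only a metre or two ahead,
# braking to 0 would strand the car in the crosswalk, i.e. the "bad position" above.
```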
 
Another pet peeve I have noticed in almost all the videos: the turn signals keep turning on and off. They may flash 3x, go off for a couple of seconds, flash 1x, go off, flash 4x, go off, and so on. FSD needs to commit to on or off rather than constantly switching back and forth. It must be a little disconcerting to see the car in front of you doing this with its signals. Very UN-human-like behavior.

When in doubt, just cut them on and LEAVE them on. :eek: Now that is WAY more human-like. :D
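For what it's worth, that kind of flicker could be suppressed with something as simple as a minimum-hold latch on the signal decision. Purely an illustrative sketch (the class name, thresholds, and behavior below are made up; this is not how Autopilot actually decides):

```python
# Illustrative only: latch the turn-signal decision so it can switch on at once
# but must stay on for a minimum time before it is allowed to switch off again.

class SignalLatch:
    def __init__(self, min_on_s: float = 3.0):
        self.min_on_s = min_on_s   # invented comfort threshold
        self.on = False
        self.on_since = 0.0

    def update(self, raw_intent: bool, now_s: float) -> bool:
        """Filter a flickery per-frame signal intent into a steady on/off state."""
        if raw_intent:
            self.on = True
            self.on_since = now_s  # refresh the hold window while intent persists
        elif self.on and (now_s - self.on_since) >= self.min_on_s:
            self.on = False
        return self.on

# A raw intent that toggles every half second comes out as a steady "on".
latch = SignalLatch()
for i in range(10):
    print(i * 0.5, latch.update(i % 2 == 0, i * 0.5))
```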
 
At the point the light turned yellow
Ha ha. ;) I didn't notice earlier, but the vehicle to the right is a Zoox vehicle!

Here's the frame where it decided to start braking, presumably because it detected the yellow light; unfortunately, from the recording camera's angle we can't actually see when it turned from green to yellow. The crosswalk signal going across Market did change from the countdown to a solid stop, and I believe the traffic light turns yellow when the countdown reaches "0", so Autopilot should have been able to see the yellow at that point, while going 11 mph with at least a car length to stop before the crosswalk.

zoox stopping.jpg


The Zoox vehicle ends up completely in the crosswalk, blocking pedestrians trying to cross in both directions, so I'm not really sure that's the better outcome.
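To put a rough number on the 11 mph claim above (again just back-of-envelope; the car length and comfort threshold are assumptions, not anything measured from the video):

```python
# Quick sanity check: from ~11 mph, is one car length (~4.7 m, roughly a Model 3)
# enough room for an easy stop? All figures here are assumptions.

MPH_TO_MPS = 0.44704

def required_decel_mps2(speed_mph: float, distance_m: float) -> float:
    """Constant deceleration needed to stop within distance_m, ignoring reaction delay."""
    v = speed_mph * MPH_TO_MPS
    return v ** 2 / (2 * distance_m)

print(f"~{required_decel_mps2(11.0, 4.7):.1f} m/s^2 needed")
# ~2.6 m/s^2, gentler than the ~3 m/s^2 often treated as a comfortable braking limit,
# so a car length really does look like enough room at that speed.
```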
 
Drive 2 | Busy Parking Lots, Highways, Dirt Roads, and More! - 25:06 - Dirty Tesla
The car did fairly well but had a couple of worrisome moments. The worst was oversteering on a right turn and nearly hitting the curb, which the driver stopped. The car also needs to be adjusted so it doesn't treat pedestrians and parked cars like slalom gates; it should move over earlier rather than swerving at the last moment. I've seen that behavior in other videos too, and while it isn't hitting anything, that kind of movement could scare other people on the road.
 
A few observations from a BETA tester of AP1, self-park, summon, etc., in a Model S and then a Founders X.
Your collective videos, data sharing, and discussions are great, and they advance Tesla's goal of reaching full FSD faster than they could in their labs alone. Don't think for a parsec that your discussions and your vehicles' reactions are being missed by Tesla. Their engineers are intensely interested in this data.

There is a secondary safety thread permeating this discussion. PLEASE remember this is a BETA system, and YOU are still responsible for any results caused by the software/hardware and by YOUR failure to counteract an anomaly. The catastrophic death of a BETA AP1 driver who was watching a DVD with his hands off the wheel as his Model S crashed under a semi, along with drivers documenting their "less than intelligent" escapades, was collectively responsible for the imposition of the nanny nag, to the point that it is now very challenging for AP1 users to use it productively. Be responsible, keep your hands ON the wheel, and ENJOY THE RIDE. It's why you bought a TESLA!
 
Extremely well said!
 
Screenshot 2020-11-18 105233.jpg

There are problems with this recent drive up Mount Umunhum by Tesla Owners Silicon Valley (see the link in the previous post). The biggest problem is not that it unexpectedly crossed the double yellow line while approaching a curve, in the fog, on a wet road. The biggest problem (in my opinion) is that the driver has a young child in the back seat, as he has had for numerous FSD beta test drives.

There are those who see no problem with taking the child along, but I think risk management should be considered. Our first thought is obviously for the safety and well-being of the child (and the father). Beyond that, think about what would happen to the whole FSD program in the event of an accident, regardless of the cause. The press would run with the story and ask, very loudly, why a carefully selected beta test driver had a young child in the car while running a test under these conditions.

I know some say this is acceptable, and in most cases it would be; however, the risk/reward ratio is out of whack, and even a slight chance of a bad event is not acceptable given the huge negative consequences. What are your thoughts?
 
The child will be fine; I'm worried about the other people who don't have 6,000 pounds of metal around them.
This guy is the most useless beta tester. He never presses the report button!