> Gotcha. I'm talking about the driver pressing the accelerator and the car refusing to accelerate. That never happens, right?

It happened like I said.
> I had to edit my initial post as I honestly can't tell if the audio is early. Something seems off though.

Having scrubbed back and forth, I think I agree .. the audio alarm kicks off at 0:11, and the dash warning appears at 0:13. In my experience these are usually simultaneous. You can hear the brake pedal being applied too, but it's tricky to see when the car starts slowing. I'd say the car doesn't see the dummy until 0:13, which makes the reaction time of the car pretty good, imho.
> The question still stands. I find a lot of Uber / Lyft drivers jerky and non-anticipatory too!

I've never had a bad Uber or Lyft driver (sample size about 20, maybe). Can't ever remember being jerked around or my wife being anxious.
> I've never had a bad Uber or Lyft driver (sample size about 20, maybe). Can't ever remember being jerked around or my wife being anxious.
>
> Never tried Waymo.

I'd compare FSD to an inexperienced, anxious driver. But it might still pass the driving test ...
> It works so well that it really makes me want it to be just one step better, so I can use it without looking out the window or holding the steering wheel.
>
> I didn't feel that way before, until it started getting good.
>
> I wonder, since FSD (Supervised) has replaced Enhanced Autopilot, whether they might release a Level 3 FSD (Unsupervised) for $15k. So it would be: free Autopilot, $8,000 for supervised FSD, $15,000 for unsupervised. I hope they don't do this.

I would imagine that is the intent .. though a lot of people are NOT going to be pleased, since what is now "FSD (Supervised)" was once positioned as much more L5.
> I've never had a bad Uber or Lyft driver (sample size about 20, maybe). Can't ever remember being jerked around or my wife being anxious.
>
> Never tried Waymo.

Have you ridden in the backseat with someone else driving V12 FSD? It's an interesting experience and provides a different perspective. Of course, if you just sit in the backseat and try to critique the driver and FSD, it's not that worthwhile.
> I'm gonna edit this one as I can't tell if the audio is early...
>
> Here's an interesting test, although the video is terribly overexposed. The results seem impressive, but I can't tell if something is awry.
>
> Is the audio track early relative to the mannequin's initial appearance, the vehicle's initial response, and the UI alert? Her grunting sounds as if it's from heavy deceleration, yet the displayed mph didn't change until later. Or is the grunting from the shock of a mannequin pulling out in front?
>
> Unfortunately it's a washed-out, poor-resolution video to draw much of a conclusion from.

Can this be a fair test? Can the mannequin represent a human? FSD is trained to recognize humans by head direction and hand and leg movements. Is the mannequin recognized by FSD as an object or as a human?
> Can this be a fair test? Can the mannequin represent a human? FSD is trained to recognize humans by head direction and hand and leg movements. Is the mannequin recognized by FSD as an object or as a human?

Well, a "fair" test would use a real human (even a child!) .. and there you are drifting into an ethical nightmare. But even if it wasn't seen as a human, it stopped for the obstacle, which seems to be a good outcome.
> Of course it has no trade-in value. Why would you expect otherwise? Since it's software, Tesla has an infinite inventory of FSD upgrades at zero cost to them, so how could it have any trade-in value? You might as well try to sell your copy of Windows back to Microsoft.

Why would it matter that it's software? If it has no resale value, I suppose it should have no first-sale value either.
> First, love SFSD V12. Driving every day for multiple drives, with very minimal disengagements, if any. Today, the usual good right-hand pass on a single-lane road around a left-turning driver, with enough room on the shoulder. By my house the main road is a single lane, and SFSD usually runs the T intersection there perfectly. This time the short-term memory loss kicked in: once it passed on the right, it stayed in that part of the road thinking it was in a lane, as the shoulder line was a substantial painted line. But come on, you just went to the right, get back in the main road. I disengaged.

Sorry, we should always state the version: v12.3.4.
> Well, a "fair" test would use a real human (even a child!) .. and there you are drifting into an ethical nightmare. But even if it wasn't seen as a human, it stopped for the obstacle, which seems to be a good outcome.

There are two possibilities for the car's reactions:
> Does anyone else in this thread have two Teslas? If so, are you noticing this: I have two 2020 Teslas, an AWD 3 and a MYP. Both have been on identical versions of V12.x, now both on V12.3.4. The Y is much, much better at FSD. I have no idea why, but I very rarely have a safety-related intervention with the Y, while with the 3 it's a much more common experience. I have re-calibrated the cameras on the 3, no help. I swear, it's like entirely different versions of FSDS on each.

We have four (the truck doesn't work yet), but all three of the Model S and X vehicles drive differently on FSD; two of them have the same hardware (HW4) and exhibit different issues in different places. Since v12 I've often wondered whether the training videos coming from all the different models are "interpreted" differently by each model, and whether the slight change in camera position and height causes miscalculations (a Model X thinks it's a Model 3), and that's why there are more curb-rashing issues than with v11 and prior.
> I've never had a bad Uber or Lyft driver (sample size about 20, maybe). Can't ever remember being jerked around or my wife being anxious.
>
> Never tried Waymo.

I have had some pretty terrible Uber drivers.
> There are two possibilities for the car's reactions:
>
> 1. If it's a child, then FSD needs to save the child by braking or swerving the car. These actions may cause injuries to the passengers in the car (being rear-ended, for example).
> 2. If it's a small object that cannot cause a severe accident, then FSD may just run over the object instead of using emergency braking that may cause injuries to the passengers.
>
> If the tester uses a mannequin that can walk like a human (a robot with human makeup), then the test result will be better.

While Tesla have not said so explicitly, it's pretty clear that the car takes into account the probability of being rear-ended when it handles emergency braking (remember, it always knows the distance to the car behind, and the software can handle the decision-making process in a minute fraction of a second).
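The trade-off these posts describe, weighing what the obstacle is against the risk of being rear-ended, can be sketched as a tiny decision function. This is purely a hypothetical illustration, not Tesla's actual logic: the `choose_reaction` function, the classifier labels, and the 2-second threshold are all invented for this sketch.

```python
# Hypothetical sketch of the braking trade-off discussed above.
# NOT Tesla's actual algorithm; names and thresholds are invented.

def choose_reaction(obstacle: str, rear_gap_m: float, rear_closing_mps: float) -> str:
    """Pick a response to an obstacle ahead.

    obstacle: assumed classifier output, "human" or "small_object".
    rear_gap_m: distance to the car behind, in meters.
    rear_closing_mps: rate at which that car is closing the gap, in m/s.
    """
    # Rough proxy for rear-end risk: seconds until the follower reaches us.
    # A follower that is not closing poses no rear-end threat.
    if rear_closing_mps <= 0:
        seconds_to_impact = float("inf")
    else:
        seconds_to_impact = rear_gap_m / rear_closing_mps

    if obstacle == "human":
        # Protecting a person outweighs the rear-end risk: always brake hard.
        return "emergency_brake"

    # For a small object, hard braking is only worth it when the car behind
    # has time to react (the 2-second threshold is an arbitrary stand-in).
    if seconds_to_impact > 2.0:
        return "emergency_brake"
    return "run_over_object"
```

For example, with a tailgater 5 m back closing at 10 m/s, the sketch still brakes for a human but rolls over a small object, mirroring the two possibilities listed in the quoted post.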
> I use FSD at night on the highway all the time. No problems for me. Rain or shine doesn't matter.
>
> Model Y (HW4)

The key word here is HW4. I think it has infrared capability on the side rearward-facing cameras. My HW3 Model 3 continuously thinks one of those cameras is blocked or blinded.
> I have been trying to watch other Tesla drivers lately to see if they are using FSDS, and I have yet to confirm anyone using it. Got me thinking about the "human nature/perception" conundrum. Sitting behind the wheel seems to completely change human perception of what is happening and what we SHOULD be doing. I suspect this would be the case even in an L4 car. It may be that humans will continue to drive, no matter how bad/dangerous, as long as they have a steering wheel and sit behind it. All you can see is what you would do, rather than accepting what the system is doing, as you would from the passenger seat.

Literally just yesterday I was following a Tesla through my neighborhood which was taking FOREVER at every stop sign, over and over. Saw it do a multi-step creep and a hesitant right turn, followed by what some might term a jackrabbit start - certainly seemed out of place given the other behavior.
> Remember phantom braking? Sure it was a problem, but everyone was panicking about "It was lucky there was no-one behind me when it braked so hard" .. what was missed here was that it only braked that hard because there WAS no-one behind the car.

But I couldn't trust it not to do that when there was someone behind me. And I wasn't about to put myself into that situation to find out if I could trust it.