My eyes are pretty sensitive, so I wear sunglasses just about any time I'm not wearing my yellow night-vision glasses. But I can see being in a mood for hands-free and just sacrificing the sunglasses for that period. OTOH, keeping some torque on the yoke has never been all that much of an inconvenience.

It's an unfortunate but probably necessary requirement as long as FSD is L2 supervised.

When the idea of removing the wheel-torque attention sensor for the cabin camera was first mentioned, Reddit was flooded with comments like "Sweet. I'll just wear dark sunglasses and take a nap."

FSD is accessible to enough people now that Tesla needs to treat its customers as if they're adversarial, instead of being able to trust us.

That being said, the release notes don't say anything about eyeglasses, so you might be able to get away with something like yellow-tinted driving glasses.
 
If the cabin camera can't see your eyes, it makes sense to default back to the steering nag. I think a lot of people saw the tweet about the steering-nag removal and got super excited, thinking it was absolute instead of only under the right conditions. Assuming it works as intended, this is a big step forward in my opinion, and I will happily not wear sunglasses, or find some with only a light tint, so the car can stare deeply into my eyes just like in a Jane Austen novel.

The question ultimately will be how sensitive the cabin camera is and how quickly it triggers a nag based on where you are looking. While I haven't personally experienced it, I have seen many others complain that it only takes a few seconds of looking at the screen to trigger a nag. What I wonder, though, is whether it triggers much faster when it sees your hand going toward the screen, rather than just your eyes looking at the screen for the traffic visualization or something else. If so, this is where pressing the mic button and using voice commands may work better. Certainly something to test.
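Purely as a thought experiment, here's a rough sketch of the kind of threshold logic I'm imagining. This is a guess, not Tesla's actual implementation: the ~5-second limit is based on owner reports in this thread, and the hand-detection speedup is pure speculation.

```python
# Hypothetical gaze-timer nag logic -- a guess, not Tesla's implementation.
from dataclasses import dataclass

@dataclass
class GazeState:
    eyes_off_road_s: float   # continuous seconds the gaze has been off the road
    hand_near_screen: bool   # e.g., cabin camera sees a hand reaching for the screen

OFF_ROAD_LIMIT_S = 5.0       # assumed ~5 s, per owner reports of screen-gaze nags
HAND_FACTOR = 0.5            # speculative: nag twice as fast when reaching for the screen

def should_nag(state: GazeState) -> bool:
    limit = OFF_ROAD_LIMIT_S * (HAND_FACTOR if state.hand_near_screen else 1.0)
    return state.eyes_off_road_s > limit

# Looking at the screen for 3 s while reaching toward it would already nag:
print(should_nag(GazeState(eyes_off_road_s=3.0, hand_near_screen=True)))   # True
print(should_nag(GazeState(eyes_off_road_s=3.0, hand_near_screen=False)))  # False
```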

The rate of new releases and the speed of progress on FSD, however, are extremely exciting.
I was kinda hoping that the IR-based sensor could still determine where you are looking without introducing the no-sunglasses limitation.
I would keep my hands on the wheel regardless, but this would remove the infrequent remaining nags, in my case.

For me, the most relaxing drive would be one with my hand on the wheel without any deliberate force, while looking roughly toward the front, nag-free of course.
 
That's unfortunate. Don't most people wear sunglasses during the day while driving?

Looks good, except for the sunglass requirement...
Well, it’s still a huge improvement and if the camera can’t see your eyes behind sunglasses there’s no way for it to effectively assess gaze.

Face ID on iPhones can see through at least some sunglasses. Not sure if they're using a different wavelength of IR or some other method.
 
Even on the current version 2024.3.25, I have noticed that the cabin camera is reacting more to screen usage, whether on the dash screen or the phone. I can essentially trigger a flashing blue or RED "pay attention" alert just by searching for a Supercharger on the dash screen or looking at my phone for ~5 seconds.
As you're not supposed to use your phone for ANY seconds, the system is doing its job.
 
If the cabin camera can't see your eyes, it makes sense to default back to the steering nag.
I had the FSDS one-month trial. I always wear a baseball cap (the cap is much better than the pull-down sun visor) and polarized sunglasses. In that whole month the car never gave me a single warning. But of course, I played by the rules and kept my hands on the yoke.

As for tracking eyeball pupils, I don't think the system is doing that (I'm only guessing, like the rest of us). This can be tested by keeping your head straight and moving your eyes off the road. It's more likely (and even easier with a cap) that the system looks at head direction.
 
It's an unfortunate but probably necessary requirement as long as FSD is L2 supervised.

When the idea of removing the wheel-torque attention sensor for the cabin camera was first mentioned, Reddit was flooded with comments like "Sweet. I'll just wear dark sunglasses and take a nap."

FSD is accessible to enough people now that Tesla needs to treat its customers as if they're adversarial, instead of being able to trust us.

That being said, the release notes don't say anything about eyeglasses, so you might be able to get away with something like yellow-tinted driving glasses.
Exactly. There are already enough idiots doing stupid stuff. They've proved beyond all doubt that Tesla can't trust the general public to be smart. (Like we didn't already know that!) On top of this, Tesla gets blamed for others' stupidity. Yes, this is why we can't have nice things.
 
I had the FSDS one-month trial. I always wear a baseball cap (the cap is much better than the pull-down sun visor) and polarized sunglasses. In that whole month the car never gave me a single warning. But of course, I played by the rules and kept my hands on the yoke.

As for tracking eyeball pupils, I don't think the system is doing that (I'm only guessing, like the rest of us). This can be tested by keeping your head straight and moving your eyes off the road. It's more likely (and even easier with a cap) that the system looks at head direction.
We know exactly what they are looking for. With the older system, without sunglasses, they were looking at your eyes, but with sunglasses, it was the direction of your head. There's no reason to guess; there are videos of what the system is looking for.
 
It feels like the v12.4 'over the weekend' release happened by the skin of their teeth. Hopefully it lives up to expectations, as everything TSLA is heavily dependent on FSD success these days.

We've been waiting for the v12 e2e magic to materialize for a while now. Aside from the usual sycophants, if v12.4 doesn't demonstrate significant improvements, it might finally be time to stick a fork in the TSLA approach. 🎤 (drop)
I'm sad that it's not on the 2024.14 branch, which my MYLR is running.

By the time 12.4 is pushed to 2024.14, 12.5 will probably be out soon.

Maybe I should skip 12.4 altogether and wait for 12.5 instead.
 
The problem with OpenStreetMap is shown below. See if you can find it.

[Attached image: IMG_0109.jpeg]
 
If the cabin camera can't see your eyes, it makes sense to default back to the steering nag. I think a lot of people saw the tweet about the steering-nag removal and got super excited, thinking it was absolute instead of only under the right conditions. Assuming it works as intended, this is a big step forward in my opinion, and I will happily not wear sunglasses, or find some with only a light tint, so the car can stare deeply into my eyes just like in a Jane Austen novel.

The question ultimately will be how sensitive the cabin camera is and how quickly it triggers a nag based on where you are looking. While I haven't personally experienced it, I have seen many others complain that it only takes a few seconds of looking at the screen to trigger a nag. What I wonder, though, is whether it triggers much faster when it sees your hand going toward the screen, rather than just your eyes looking at the screen for the traffic visualization or something else. If so, this is where pressing the mic button and using voice commands may work better. Certainly something to test.

The rate of new releases and the speed of progress on FSD, however, are extremely exciting.
"have seen many others complain that it only takes a few seconds of looking at the screen to trigger a nag."
That's correct. I like to look at the ego car and other cars on the screen, but I cannot look for longer than 5 or 10 seconds. If I look longer than that, I will see a camera nag. Also, the nag happens more at night.
 
Regarding the video quality, I think it's more likely that HW4 video output down-rezzed (aka down-sampled, aka decimated) to HW3 resolution will be of somewhat higher quality than native HW3 video. I haven't seen any direct comparisons, but:
  • There are theoretical and practical benefits from the original image being captured at a higher resolution, i.e. a higher spatial sampling frequency. I wouldn't put too much weight on that without testing and without knowing more about the level of anti-aliasing applied in both hardware setups (see the sketch after this list).
  • Aside from pixel resolution, I believe the new cameras can operate at a higher frame rate. I don't know if they are simply being run at the HW3 rate now, or if the video is being processed to normalize the rate.
  • Perhaps the larger effect, in the Tesla HW3-to-HW4 case, is that the newer-generation cameras supposedly do a better job of suppressing sensor-overload artifacts from bright sun in the frame. It's also likely that the low-light performance and noise floor are superior.
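To illustrate the anti-aliasing point above: naive decimation (just dropping pixels) aliases high-frequency detail, while a proper down-sample low-pass filters first. A minimal sketch with made-up resolutions (the 2560-to-1280 numbers are illustrative, not Tesla's actual pipeline):

```python
# Anti-aliased down-sampling vs. naive decimation -- illustrative only.
import numpy as np
from PIL import Image

def naive_decimate(frame: np.ndarray, factor: int) -> np.ndarray:
    """Drop pixels with no low-pass filtering; high frequencies alias."""
    return frame[::factor, ::factor]

def antialiased_downsample(frame: np.ndarray, size: tuple[int, int]) -> np.ndarray:
    """Lanczos resampling low-pass filters before decimating."""
    img = Image.fromarray(frame)
    return np.asarray(img.resize(size, Image.Resampling.LANCZOS))

# A synthetic high-frequency test pattern stands in for a camera frame.
x = np.linspace(0, 60 * np.pi, 2560)
frame = ((np.sin(x)[None, :] * np.sin(x)[:, None] * 0.5 + 0.5) * 255).astype(np.uint8)

aliased = naive_decimate(frame, 2)                   # 1280x1280, moire-prone
clean = antialiased_downsample(frame, (1280, 1280))  # same size, filtered first
```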

I agree that the video down-sampling is not the whole story behind Elon's comment. Aside from the question of video input quality, I think he means that clips from both HW3 and HW4 are used to train the same model, and then both in-car computers run the same inference model, not taking any real advantage of the more powerful computer in the HW4 vehicles.

While in general there can be a loss of performance when one computer runs another's native code through an emulation layer, I don't think that itself is a significant problem here. I would be shocked if the HW4 computer had not been engineered to run HW3 code directly at full performance; I think it was essentially a mandatory design requirement for the rollout of functional HW4 vehicles in 2023. Perhaps "compatibility mode" would have been a clearer term.

So I don't think the "emulation" represents a loss of quality in video or in compute - more likely some modest actual gain in the former. Elon's comment should mostly be taken to mean that HW4 hasn't achieved its full potential, because they haven't yet deployed a version customized for it.
I'm curious how they're stitching the HW4 feeds into the NN in an emulated fashion. My understanding is that HW4 has one fewer front camera: HW3 had three forward cameras (one wide-angle, one standard, and one narrow-focus), while HW4 has two front-facing cameras (not sure if both are standard, or if one is wide-angle). There are three camera positions on HW4, but several articles have come out saying the third camera is a dummy placeholder and non-operational.

Since v12 is trained on three stitched front cameras, how is HW4 being "emulated" to match that config?
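One unconfirmed possibility that gets floated: synthesize the missing narrow-FOV view by center-cropping one of the higher-resolution HW4 front cameras and scaling it to the expected frame size. A toy sketch of that idea; the zoom factor, resolutions, and output size are all made up, and this is speculation, not Tesla's known pipeline:

```python
# Speculative: fake a narrow-FOV camera by center-cropping a higher-res frame.
import numpy as np

def synthesize_narrow(std_frame: np.ndarray, zoom: float = 2.0,
                      out_hw: tuple[int, int] = (960, 1280)) -> np.ndarray:
    """Crop the central 1/zoom of the frame, then resize by index mapping."""
    h, w = std_frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = std_frame[top:top + ch, left:left + cw]
    # Nearest-neighbor index mapping keeps the sketch dependency-free.
    rows = np.arange(out_hw[0]) * ch // out_hw[0]
    cols = np.arange(out_hw[1]) * cw // out_hw[1]
    return crop[rows][:, cols]

# e.g. a (made-up) 1876x2896 HW4-style frame yields a 960x1280 "narrow" view:
narrow = synthesize_narrow(np.zeros((1876, 2896, 3), dtype=np.uint8))
```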
 
I'm getting popcorn to watch the cage match between Tesla, with fewer nags, and NHTSA, who's pissed about there not being enough driver-attention confirmation.
I don't see those release notes as fewer nags at all. They're being stricter on camera nags, and if you don't satisfy them, the wheel nag comes back. They're also removing the ability to use sunglasses as a workaround.