Welcome to Tesla Motors Club

Cabin camera covered with tape and now I can't use my $5,000 Autopilot option. version 2024.3.25

I bought my Model 3 new in 2018 and paid $5,000 for the autopilot option. I had been using AP all the time until I installed the 2024.3.25 update. I use a piece of tape to cover the networked cabin camera because I am tired of living in a surveillance state (I'm not paranoid, just tired). I am really upset because I was under the impression that the $5,000 I spent on the AP option gave me ownership of the option in my car that I paid for and continue to pay for.

I'm thinking this is similar to the "Right to Repair" court case that was brought against John Deere. I would like to continue using the option I paid for or get my money back. Does anyone else feel the same? Are there any workarounds or software reversions? Are there any legal remedies?
 
The only thing I can say is that you are perhaps a bit paranoid, because the in-cabin camera is not sending data out of the car (unless you use it to monitor the in-cabin view yourself). So there is no surveillance going on... at least supposedly. If you think there is, then that starts to sound like paranoia.

That aside, I suspect that legally the $5,000 you paid for the Autopilot option was in fact a perpetual license to use that system. You don't actually "own" the option. Unfortunately, that's the way the world is headed these days with any kind of software. And because the license is subject to whatever terms & conditions the licensor sees fit (even if they change, as your continued use implies that you agree to the amended terms), I suspect you would not have much luck battling this legally. I'm not saying I agree with the practice, just being pragmatic about it.
 
Actually, your problem is likely less with Tesla and more with the NHTSA. The NHTSA wants drivers to have their eyes out the window and paying attention. Other car companies have instituted IR eyeball tracking for that function; Tesla's system has a regular camera doing pretty much the same thing, along with some folderol with torque on the steering wheel.

I don’t know if you’ve been watching the recent news, but the NHTSA wants proof that this is actually working, with actual numbers on the alerts the car is making. They’re insinuating that Tesla’s methods aren’t working and there may be deaths (i.e., idiots not paying attention and reading books or playing games etc) as a result.

It doesn’t help that there are idiots around who use your taping scheme as well as weighted defeat devices.

FWIW, Tesla states pretty plainly that the video used for the eyeball tracking doesn't leave the car... but I think there might be an exception for when airbags get deployed.
 


"I was under the impression that the $5,000 I spent on the AP option gave me ownership of the option in my car that I paid for and continue to pay for."
Nope, you only paid for the rights to use it. You do NOT own the software.

Can you access your car via the Tesla app?

You own a Tesla, one of the most connected vehicles in the world.

The software that implemented the change was a part of a recall.
 

[Attachment: Screenshot 2024-05-07 at 2.39.24 PM.png]
 
You are not paranoid, it is frustrating. And didn’t Tesla get into some hot water some months ago because employees WERE watching the cameras and passing around some compromising photos?
As it happens, I was recently reading Tesla's statements on what data they collect and what they do with it. A friend of the SO's bought a MY last week (yea! We got a referral!), so I was very carefully going over the buttons and settings on the car. Especially since, as she was on the receiving end of the referral, she was getting three months of FSDS with the car. (FWIW, the car came with 12.3.3.)

Big, interesting point: A very short statement that Tesla does not sell any information collected from the cameras to anybody. Or give it to their 5,000 nearest buddies or something. Very direct statement: Let me look it up:
-------------------------
Your Tesla generates vehicle, diagnostic, infotainment system, and Autopilot data. To protect your privacy from the moment you take delivery, Tesla does not associate the vehicle data generated by your driving with your identity or account by default. As a result, no one but you would have knowledge of your activities, location or a history of where you’ve been. Your in-vehicle experiences are also protected. From features such as voice commands, to surfing the web on your touchscreen, your information is kept private and secure, ensuring the infotainment data collected is not linked to your identity or account.
Tesla vehicles are equipped with a camera suite designed from the ground up to protect your privacy while providing advanced features such as Autopilot, Smart Summon, and Autopark. To recognize things like lane lines, street signs and traffic light positions, Autopilot data from the camera suite is processed directly without leaving your vehicle by default. In order for camera recordings for fleet learning to be shared with Tesla, your consent for Data Sharing is required and can be controlled through the vehicle’s touchscreen at any time. Even if you choose to opt-in, unless we receive the data as a result of a safety event (a vehicle collision or airbag deployment) — camera recordings remain anonymous and are not linked to you or your vehicle.
Additionally, from Powerwall to Solar Roof, your Energy products are designed to protect your privacy. Tesla aims to collect a minimum amount of personal data necessary for displaying your in-app energy experience, providing services to you, and for improving your energy products. We are also committed to only share your personal data when needed to operate or service your product, or we will ask for your permission.
--------------------------
Heck, my ISP's so-called Privacy (well, non-Privacy) notice is about ten times bigger than the above. Note that data gets processed locally only, unless you give explicit permission for it to be sent.

Now, with respect to that hot water: People on FSD or just driving around are politely asked if they would, please, send Important Video Clips to Tesla, with the intent of improving the car's software. You don't have to do this. Many people do. And they expected that data to be securely handled, as was promised.

Turns out that access to the database of clips wasn't controlled as well as the people who wrote those statements thought. Word, within Tesla, got out that there were some "fun" clips in there and some Tesla internal idiots started sharing the better ones around.

Finally, somebody with sense realized what was going on, reported it, and, internally, all heck broke loose. Don't know if people got fired, but the databases were thoroughly secured with access strictly on a need-to-know basis, which is what it should have been in the first place. And Tesla confessed and did a mea culpa.

The usual click-baiters made up headlines saying that "OMG Tesla Data Is Totally Insecure!" because, well, that's what click-baiters and anti-Tesla types do. There have been zero reports that the video clips made it out of the company; and I haven't heard of any lawsuits. Yet.

Compare the above with GM's and Ford's selling of user data to anybody with a nickel, to the point of congressional hearings and the like.
 
Tesla might know. I get that. (One can turn that off, though.)

Did you see the line above where location data is kept separate from the identity data, and neither leaves the company?
Yes, and I'm not really concerned either way. My point is that if the company or an employee of the company violates their privacy policy, it would seem that they do have the ability to associate your car with your account and its location, which would be more concerning than having video of me driving.
 
The camera is used for surveillance, but not the way you're suggesting. It's an integral part of the AP system and is used to ensure the driver is paying attention to the road while AP is in use.
I'd like to follow up on this statement for the OP's benefit.

First, I don't think you're paranoid. When we get used to and dependent on our technology, it's primed for abuse. A statement that a company "would never" use/send/sell your data applies only to the moment in time when that statement was made. Things change. The question you'd have to ask is, "do they have the ability to abuse said data?" But the bigger question I ask is, "is the company warrant-proof?" They may not want to provide your data to outside entities (i.e., the government), but that doesn't mean they can't be compelled to provide that data.

Second, on the statement made by zoomer0056: I agree that the cabin camera is an integral part of the AP system. Let's take your argument that you own the AP software (which I agree with). Say you bought the AP software with the vehicle and you then completely disconnect the vehicle from any communication networks, so you essentially have a 'stand-alone' Tesla. I believe it would still disable AP if the cabin camera were blocked. There is likely some pattern (people) recognition algorithm running as part of the code, and a blank image would cause it to fail that check. Your car doesn't have to be connected to a network for that to work. You still have/own the AP software; it just fails that check and won't enable.

Joe
 
I believe it still would disable AP if the cabin camera were blocked. It could be there is some pattern (people) recognition algorithm running as part of the code and a blank screen would cause it to fail that check. Your car doesn't have to be connected to a network for that to work. You still have/own the AP software, it just fails that check and won't enable it.
That's correct. There is an algorithm running on the car that performs pattern recognition to figure out driver attentiveness.
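To illustrate the point being made here, this is a minimal sketch of how a purely local gate like that could work. This is not Tesla's actual code; every name and threshold below is invented. The idea is just that a taped-over lens produces a near-uniform image, which a simple variance check can flag without any network connection.

```python
# Hypothetical sketch (NOT Tesla's actual implementation) of an
# on-board driver-monitoring gate that refuses to enable Autopilot
# when the cabin camera appears occluded. All names, signatures,
# and thresholds here are invented for illustration.
import statistics

def camera_occluded(frame, variance_threshold=5.0):
    """A taped-over lens yields a near-uniform frame, i.e. very low
    pixel variance, regardless of cabin lighting."""
    return statistics.pvariance(frame) < variance_threshold

def autopilot_available(frame, driver_detected):
    # Both checks run entirely on-device; no connectivity required.
    if camera_occluded(frame):
        return False
    return driver_detected

# Toy 1-D "frames": a blocked camera gives flat pixel values,
# a clear view gives varied ones.
blocked_frame = [10] * 100        # uniform: tape over the lens
clear_frame = list(range(100))    # varied: unobstructed view

print(autopilot_available(blocked_frame, driver_detected=True))  # False
print(autopilot_available(clear_frame, driver_detected=True))    # True
```

The point of the sketch is that the software you "own" is still present and running; the occlusion check is simply a local precondition that fails, which matches the behavior the OP observed after the update.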

Here's a 23-minute video from 2021 by greentheonly showing the signals coming from the onboard software as he goes on a drive at night. It gives a sense of the difficulty Tesla has with driver monitoring in a car without infrared lighting.