
Model 3 has dedicated rain sensor, a bad sign

They may have been doing high-speed, high-acceleration runs prior, which would explain the high consumption. That also may have heated the battery, which would explain the slow charge rate. These are mules, so they're going to be out stressing them to try and get them to break, not tiptoeing through the tulips.
The charge screens don't base their range gain estimates on your driving average. They use the same static number the rated range estimator does.
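To put the "static number" point concretely, here is a tiny sketch of how a rated-range gain would be computed from one fixed consumption constant rather than from your driving average. The constant and function name are placeholders for illustration, not Tesla's actual figures or code:

```python
# Illustrative only: rated-range math uses one fixed Wh/mi constant,
# independent of how efficiently you have actually been driving.
RATED_WH_PER_MILE = 290  # placeholder constant, not an official Tesla figure

def rated_miles_added(energy_added_kwh: float) -> float:
    """Convert energy added during charging into 'rated miles' added."""
    return energy_added_kwh * 1000 / RATED_WH_PER_MILE

print(round(rated_miles_added(20), 1))  # 20 kWh -> ~69 rated miles, however you drove
```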

I think the earlier suggestion that the car was getting far less than 70 kW initially due to stall pairing is more likely.
 
The tiny sensor on AP2 cars that people think is a rain sensor is rather small for sensing rain (raindrops may not fall on it), but most importantly, if it really were such a sensor, why on earth would Tesla not have delivered this functionality in the last 8+ months? Unless of course the sensor is useless, or Tesla cannot get their developers to get it done somehow (if it is a dedicated rain sensor, how difficult would it be? And if they cannot get that done in 8 months, how on earth are they going to do FSD in the next year or even two?).

We have both an AP1 and an AP2 car. The AP2 car's light sensing sucks (lights turn on even in bright sunny weather) and rain sensing does not exist at all. The AP1 lights work well, and its rain sensing isn't perfect but it's there.
 
This article, Rain sensor - Wikipedia, shows a picture of a rain sensor:
[photo of a rain sensor]

And explains that they work by reflecting infrared off the glass/environment interface:
[diagram of the rain-sensor operating principle]
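For what it's worth, the principle in that diagram is simple enough to sketch in a few lines. Everything below is illustrative (names and thresholds are invented, not any real sensor's spec), but it captures the idea: a dry glass/air interface reflects most of the infrared back to the detector, while water on the outside lets light escape and the reading drops:

```python
# Illustrative sketch of the optical rain-sensor principle described above.
DRY_BASELINE = 1.0     # normalized IR reflectance with a dry windshield
RAIN_THRESHOLD = 0.85  # below this fraction of baseline, assume water on the glass

def wiper_needed(ir_reflectance: float) -> bool:
    """Return True when reflected IR drops enough to suggest water on the windshield."""
    return (ir_reflectance / DRY_BASELINE) < RAIN_THRESHOLD

print(wiper_needed(0.70))  # True  -> water is letting the IR escape
print(wiper_needed(0.95))  # False -> windshield is dry
```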

It seems to me that doing rain sensing with a camera is difficult because of focal length. Perhaps the distortion, as the view changes due to motion, might be detectable?
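One way a camera might get at this (purely a guess on my part, not how Tesla does it) is that droplets stick to the windshield: they stay put between frames while the road scene moves, and they are badly out of focus. A crude OpenCV sketch of that idea, with made-up thresholds:

```python
# Rough illustration, not Tesla's method: flag pixels that are both static
# between frames and locally blurry, which hints at water stuck on the glass.
import cv2
import numpy as np

def droplet_score(prev_gray: np.ndarray, curr_gray: np.ndarray) -> float:
    """Crude score in [0, 1]: fraction of pixels that look static and defocused."""
    motion = cv2.absdiff(prev_gray, curr_gray)             # how much each pixel changed
    static_mask = motion < 5                               # barely moved between frames
    sharpness = np.abs(cv2.Laplacian(curr_gray, cv2.CV_64F))
    blurry_mask = sharpness < 3                            # low local contrast = defocused
    # A real system would also need to mask out featureless regions like sky,
    # which would otherwise score as "static and blurry" too.
    return float((static_mask & blurry_mask).mean())

# Feed consecutive grayscale frames while the car is moving; a rising score
# would suggest droplets accumulating on the windshield.
```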
 
It seems to me that doing rain sensing with a camera is difficult because of focal length. Perhaps the distortion, as the view changes due to motion, might be detectable?
Exactly. If you've ever worn glasses in the rain, you'll know how easy it is not to see the droplets on the lenses. Unfortunately Elon doesn't wear glasses or live somewhere it rains much, hence we are where we are with no rain sensor on AP2.
 
Have you noticed the headlights come on after running the wipers for AP2 cars?

That doesn't happen with our pre-AP S P85.
I know this is getting slightly tangential, but I do have this behavior with my pre-AP S85 (early 2013 flavor).

That is: if the wipers trigger (either from rain sensor or manually initiated), the headlights do automatically come on when in "Auto" mode.
 
I'm not a photographer or an AI vision expert, but I think lens flare can be detected.

Even if it could be done (remember, in this case you need to detect rain on the windshield, not rain falling in the background or the fact that you are driving towards a water fountain), at what cost in terms of processing power? I think it's possible that whoever made the decision to replace the rain sensor with the cameras did not do a full cost-benefit analysis. Save $x on an infrared sensor, in exchange for how much computer-vision processing power?
 
Even if it could be done (remember, in this case you need to detect rain on the windshield, not rain falling in the background or the fact that you are driving towards a water fountain), at what cost in terms of processing power? I think it's possible that whoever made the decision to replace the rain sensor with the cameras did not do a full cost-benefit analysis. Save $x on an infrared sensor, in exchange for how much computer-vision processing power?

Most likely the ability to detect such distortion is also useful for driving, meaning that during training a network would likely learn to identify it anyway (and forcing it to may actually improve learning generally). It's not uncommon to have a network perform double duty on closely related tasks, as the combined network can do better on both tasks than two separate networks trained on each task independently.
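For anyone curious what "double duty" looks like in practice, here is a minimal PyTorch sketch of the idea: one shared backbone feeding both a driving-related head and a rain head. The architecture and sizes are invented for illustration and have nothing to do with Tesla's actual network:

```python
# Minimal multi-task sketch: a shared feature extractor with two heads,
# so the rain task piggybacks on features the driving task needs anyway.
import torch
import torch.nn as nn

class SharedVisionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(                 # shared by both tasks
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.drive_head = nn.Linear(32, 10)            # e.g. lane/object logits
        self.rain_head = nn.Linear(32, 1)              # single "is it raining" score

    def forward(self, x):
        features = self.backbone(x)
        return self.drive_head(features), torch.sigmoid(self.rain_head(features))

model = SharedVisionNet()
drive_out, rain_out = model(torch.randn(1, 3, 64, 64))  # dummy camera frame
# Training both heads at once is the "double duty" described in the post above.
```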
 
Most likely the ability to detect such distortion is also useful for driving, meaning that during training a network would likely learn to identify it anyway (and forcing it to may actually improve learning generally). It's not uncommon to have a network perform double duty on closely related tasks, as the combined network can do better on both tasks than two separate networks trained on each task independently.
So the reason we don't have it yet is simply that it hasn't rained much in California since the AP2 introduction? The network is up and running, learning, yet there are no auto wipers and even the auto headlights on/off sucks. I recently noticed our AP2 car turns the lights on even in perfectly sunny weather (it was after 6 pm, so maybe Tesla hard-coded when the lights should go on, but didn't take into account that the sun goes down much later in the summer).
 