Forgive me if this has already been discussed... I'm curious as to what most people have their cars set to for the default charge rate at home.
I've got dual chargers but I scale mine back to 20amps to reduce line losses...
Curious about that and whether there is an "optimal" amperage for charging. I primarily charge from a NEMA 14-50 outlet and have been using the default 40A setting to minimize charging time. Recently I've been experimenting with dialing it back as the UMC cable gets quite warm while drawing 40A continuously (something I'm sure everyone else has discovered). Using 32A or less seems to solve that problem. So, assuming there is plenty of time overnight, is 20A going to be more efficient than 30A in terms of minimizing both energy draw and energy loss? The UMC manual says that the outlet must be rated for at least 15A, so that's probably the lower bound, but I always thought that was supposed to be much less efficient than a higher setting.
Is that really more efficient? I thought charging at a higher rate lost less. Also, slower charging means running the battery cooling longer, etc.
The lower bound is 5 amps... electrically, lower is ALWAYS more efficient for the wiring; I'm not sure about the chargers themselves... there might be a 'sweet spot' where 20 amps is more efficient than 10. Line losses are current² × wire resistance, so the power lost to heat grows with the square of the current. At 80 amps you lose (80/20)² = 16x as much power in the wiring as at 20 amps.
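To make the I²R point concrete, here's a quick sketch. The resistance value is a made-up placeholder (real wiring resistance depends on gauge and run length); only the ratios between currents matter:

```python
R = 0.05  # ohms; hypothetical total wiring resistance, for illustration only

def line_loss_watts(amps, r=R):
    """Instantaneous resistive loss in the wiring: P = I^2 * R."""
    return amps ** 2 * r

for amps in (20, 40, 80):
    print(f"{amps} A -> {line_loss_watts(amps)} W lost in the wiring")

# Quadrupling the current (20 A -> 80 A) means 16x the heating:
print(line_loss_watts(80) / line_loss_watts(20))  # 16.0
```

Whatever the actual resistance is, doubling the current always quadruples the instantaneous loss.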
I've also heard rumors that certain charge levels are better for the battery than others, but that's likely temperature-dependent and I've never seen anything official.
Lower can also mean you don't run the battery cooling AT ALL... as far as I'm aware, battery cooling only runs when it's needed. I could be wrong... the pump might circulate coolant continuously, but the fans certainly don't run when I charge @ 20 amps unless it's REALLY hot outside.
...there's not much variation in charging efficiency when charging at or above 240V at 32A, but energy use rises noticeably at lower power levels.
Would love to see this data on the S! It exists for the Roadster and the results were...
Tesla Roadster Charging Rates and Efficiency - Tom Saxton's Blog
I have 80 Amps available, but I charge at 56 Amps: that halves the resistive power everywhere, and cuts the total resistive losses for the charge by a factor of sqrt(2). Using more than 40 Amps also exercises both chargers in the car.
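The arithmetic behind the 56 A choice, for anyone checking: loss power scales with I², but charge time scales with 1/I, so the total energy lost over a full charge scales linearly with I. A tiny sketch (the scaling argument above, nothing Tesla-specific):

```python
def relative_total_loss(amps, ref_amps):
    """Total resistive energy lost over a full charge, relative to ref_amps.
    Loss power ~ I^2, charge time ~ 1/I, so total energy lost ~ I."""
    return amps / ref_amps

# 56 A vs 80 A: instantaneous loss power is (56/80)^2, about half...
print((56 / 80) ** 2)                 # ~0.49
# ...but the charge takes 80/56 ~ sqrt(2) longer, so total losses
# only drop to 1/sqrt(2) of the 80 A figure:
print(relative_total_loss(56, 80))    # 0.7
```

56 was presumably picked because 80/sqrt(2) ≈ 56.6, giving exactly half the instantaneous heating.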
Hmmm... that's a good idea... the car doesn't split it? If I charge @20 amps it's 20/0 not 10/10?
If you have dual chargers, the car uses one module up to 40 Amps and then splits the load from 41 to 80 Amps.
Edited to correct my error.