
Displayed Range and Seasonality

Do you mean they reset the BMS range algo? I heard that SvCs don't normally do this on customer cars.

I was missing all energy history - trip counters, energy graph, etc. It was wiped. I can't confirm a reset was done, but it falls in the same place as the divergence of temp vs. range (which then comes back together about a month later).
 
In a quest to find tighter correlation, I did some more work. This time, I pulled only temperature samples recorded within 30 seconds of the 100% SOC range samples I had already pulled.

[Attachment 101429: 100% SOC range vs. temperature correlation chart]

The correlation is now tight, with only two explainable areas of disconnect: 3/2014, just after the new pack had been installed; and 8/2015, just after my car's energy data had been reset due to the accident repairs from colliding with a deer.

I haven't done the thorough analysis you have, but I live in Canada, where temperatures have recently dropped as we head into winter. After doing a range charge and switching my range display from energy to distance, I noticed the biggest range drop I've seen since I've owned the car (I got it after last winter).

Now, my car sleeps and charges in a heated garage, so the difference in temperature during charging wouldn't be dramatic, but I wondered if perhaps rated range takes into account recent temperature readings from when the car was out and about. It would be logical when you think about it: if range drops alongside temperature, why not factor it in when estimating range?

I also wondered if displayed rated range gets affected by tires and wheels. I noticed the change in range shortly after I had my winter tires put on.
 
I apologize in advance if people on this thread find that anything I am about to say is just stating the obvious, but I want to share my view on range seasonality.

I have never kept the battery meter on miles of range because I don't know how they calculate the number and thus don't trust it. I keep the meter on percent SOC. So instead of watching the range vary seasonally, I watch Wh/mi of consumption vary seasonally. The capacity of the battery does not vary seasonally or with temperature, with one caveat I'll get to. You can figure out the capacity of your battery: use the trip meter, or even better the app on the v7.0 IP that resets automatically for each leg driven. Multiply miles driven for a leg by Wh/mi for the leg and divide by 1000 to get kWh consumed, and then divide by the change in % SOC between the beginning and end of the leg. So if you start to drive at 66% SOC, end at 57, drive 18.8 miles using 362 Wh/mi, which multiplies to 6.81 kWh, and divide by the .09 change in SOC, you get 76 kWh, which is what 100% capacity means for your pack (the numbers are from a drive I did yesterday). Because the SOC meter rounds to the nearest percent, the rounding errors throw off the result, so you have to average multiple legs. Measure each individual drive so the calculation isn't thrown off by vampire losses while parked.
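For anyone who wants to script this instead of doing it by hand, here is a minimal Python sketch of that arithmetic (the function name is just for illustration; the numbers are the ones from the leg above):

```python
def pack_capacity_kwh(miles, wh_per_mi, soc_start_pct, soc_end_pct):
    """Estimate usable pack energy at 100% SOC from one driving leg."""
    kwh_used = miles * wh_per_mi / 1000             # energy consumed on the leg
    soc_used = (soc_start_pct - soc_end_pct) / 100  # fraction of the pack used
    return kwh_used / soc_used

# The leg above: 18.8 mi at 362 Wh/mi, SOC dropping from 66% to 57%
print(round(pack_capacity_kwh(18.8, 362, 66, 57)))  # ~76 kWh
```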

The conclusion is that 100% on my pack is about 73 kWh consistently, cold weather and hot. I have a P85D and the difference between the rated 85 kWh and the 73 kWh I get is headroom, anti-bricking margins, etc. This does not change seasonally. It would be almost exactly consistent if SOC was measured in terms of charge (integrated current) rather than energy (kWh of integrated power): that is the caveat I mentioned above. But pack capacity is consistent enough that pretty much all the seasonal change in range is due to energy consumed per mile of driving, which is Wh/mi on the display and is roughly 300 for me in summer and closer to 400 in winter. On that basis, winter loss of range for me looks to be roughly 1-300/400=25%.

Why is more energy consumed per mile in the cold? I don't think it's motor oil or transmission fluid viscosity! (since there isn't any) I doubt other frictional or rolling-resistance losses are significantly higher, although air drag is a tiny bit higher due to higher air density in the cold. I believe the main differences are 1) energy for heating the battery and 2) loss of regen to partly replenish the pack when it's cold. I think I saw on the forum somewhere that the MS has a roughly 5 kW pack heater. If you drive 50 mph while this is on, it adds 100 Wh/mi to consumption (in an hour it consumes 5 kWh while you travel 50 mi, so that's 0.1 kWh/mi = 100 Wh/mi). That's the difference between 300 Wh/mi and 400 Wh/mi right there.
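That 100 Wh/mi figure is just the constant load divided by speed, so the same arithmetic works for any load and speed. A quick sketch (the 5 kW heater figure is the forum estimate mentioned above, not an official spec):

```python
def extra_wh_per_mile(load_kw, speed_mph):
    """Added consumption from a constant electrical load while cruising."""
    return load_kw * 1000 / speed_mph  # kW * 1000 Wh/kWh / (mi/h) = Wh/mi

print(extra_wh_per_mile(5, 50))  # 100.0 Wh/mi
print(extra_wh_per_mile(5, 70))  # ~71 Wh/mi at highway speed
```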

If you drive short legs and park in the cold in between, you probably use the heater most of the time you are driving, and range is reduced by the full 25% I calculated. If you drive a single long leg, the heater is probably on for only the first part of the trip, and likewise for the loss of regen, so cold-weather range will fall by less. For the rest of the trip the battery is warm and it's as if you are driving in the summer. Driving after being parked at a given cold temperature probably incurs a specific amount of energy -- a once-per-trip penalty (in kWh or % SOC) for warming the pack. At some point this winter I might take experimental drives to measure the relationship between temperature and this energy penalty.
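To make the short-leg vs. long-leg point concrete, here is a toy model with a fixed once-per-trip warm-up cost spread over the length of the leg (the 2 kWh penalty and 300 Wh/mi baseline are made-up placeholders, not measurements):

```python
def effective_wh_per_mile(leg_miles, base_wh_per_mi=300, warmup_kwh=2.0):
    """Toy model: a fixed per-trip battery-warming cost amortized over the leg."""
    return base_wh_per_mi + warmup_kwh * 1000 / leg_miles

for miles in (5, 20, 100):
    print(miles, round(effective_wh_per_mile(miles)))
# 5 mi -> 700 Wh/mi, 20 mi -> 400 Wh/mi, 100 mi -> 320 Wh/mi
```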

That's the story as I understand it.
 
Why is more energy consumed per mile in the cold? I don't think it's motor oil or transmission fluid viscosity! (since there isn't any)
There is certainly fluid in the reduction gear set, and although there is less than in a traditional drivetrain, it's not going to be zero, which will account for some of it. Cold air is denser, and at highway speeds most of the power is used pushing air (I suspect air density is the major player). If you haven't adjusted your tire pressures to compensate for the lower temperatures, that will add a lot. The heating certainly adds a lot, although if you preheat and time charging to end when you are about to start driving, its effects can be minimized. On days with rain or snow-covered roads, it's easy to use 10-20% more just from the increased rolling resistance.
 
I have never kept the battery meter on miles of range because I don't know how they calculate the number and thus don't trust it. I keep the meter on percent SOC. So instead of watching the range vary seasonally, I watch Wh/mi of consumption vary seasonally. The capacity of the battery does not vary seasonally or with temperature, with one caveat I'll get to. You can figure out the capacity of your battery: use the trip meter, or even better the app on the v7.0 IP that resets automatically for each leg driven. Multiply miles driven for a leg by Wh/mi for the leg and divide by 1000 to get kWh consumed, and then divide by the change in % SOC between the beginning and end of the leg. So if you start to drive at 66% SOC, end at 57, drive 18.8 miles using 362 Wh/mi, which multiplies to 6.81 kWh, and divide by the .09 change in SOC, you get 76 kWh, which is what 100% capacity means for your pack (the numbers are from a drive I did yesterday). Because the SOC meter rounds to the nearest percent, the rounding errors throw off the result, so you have to average multiple legs. Measure each individual drive so the calculation isn't thrown off by vampire losses while parked.

Thanks for this. Helpful calculation... I just did it for my last leg and the result is slightly above 69 kWh for my P85D... So I hope averaging out several legs will bring this up; otherwise, it looks like I have seen a big loss in just 8 months.
 
Thanks for this. Helpful calculation... I just did it for my last leg and the result is slightly above 69 kWh for my P85D... So I hope averaging out several legs will bring this up; otherwise, it looks like I have seen a big loss in just 8 months.
I have had individual legs come in as low as 64 and as high as 80. The problem is that if there is only, say, a five percentage point difference in the SOC at the beginning and end of the leg, the car's rounding error when the meter reports the SOC can throw off the calculated capacity by 20% either way, which is a lot. What I do is keep each leg as a line in a spreadsheet. The way to do the average is to sum up the SOC used (in percentage points) for all the legs and divide that into the summed-up kWh used for all the legs. The more legs are summed up, the more the rounding errors wash out.
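A quick sketch of that pooled average in Python, assuming each leg is logged as (miles, Wh/mi, starting SOC %, ending SOC %) — the legs below are made-up examples, not an actual log:

```python
legs = [
    (18.8, 362, 66, 57),
    (42.0, 295, 90, 74),
    (7.5, 410, 57, 52),
]

total_kwh = sum(mi * wh / 1000 for mi, wh, s0, s1 in legs)
total_soc = sum((s0 - s1) / 100 for mi, wh, s0, s1 in legs)

# Summing before dividing lets the +/-1% SOC rounding errors wash out
print(round(total_kwh / total_soc, 1))  # ~74 kWh for these sample legs
```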
 
I just created a sample spreadsheet to log this information. I added additional columns for NOTES and TEMPERATURE, as there may be some correlation to notice in the future.
Thanks, ARTinCT
By the way, I just looked at the energy app on the touchscreen of my car in the garage. The chart said 336 Wh/mi of average consumption over 30 miles and predicted 190 miles range. The SOC meter said 84%. Wh per 100% charge = (Wh/mi) * (mi @ 84%) / .84 = 336 * 190 / .84 = 76000 Wh = 76 kWh. In other words, Tesla is assuming my pack contains 76 usable kWh at 100% SOC. This is the standard number I have seen on a company graphic, and is apparently the way they calculate projected range. If our packs hold a little less, then our range will be a little less.
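In other words, the energy-app numbers let you back out the full-pack energy Tesla is assuming. As a quick check, with the same numbers as above:

```python
wh_per_mi = 336      # average consumption over the last 30 miles
projected_mi = 190   # projected range shown on the consumption tab
soc = 0.84           # current state of charge

assumed_full_pack_kwh = wh_per_mi * projected_mi / soc / 1000
print(round(assumed_full_pack_kwh))  # 76 kWh
```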
 
There is certainly fluid in the reduction gear set, and although there is less than in a traditional drivetrain, it's not going to be zero, which will account for some of it. Cold air is denser, and at highway speeds most of the power is used pushing air (I suspect air density is the major player). If you haven't adjusted your tire pressures to compensate for the lower temperatures, that will add a lot. The heating certainly adds a lot, although if you preheat and time charging to end when you are about to start driving, its effects can be minimized. On days with rain or snow-covered roads, it's easy to use 10-20% more just from the increased rolling resistance.

So since total drivetrain losses are probably only 5 - 10%, the difference in drivetrain losses at different temps due to lubrication viscosity is probably only a percent or two.

Yes, I keep my tires inflated properly, don't you?

Your point about snow covered roads is certainly a good one, when those circumstances obtain.

I had considered the difference between winter cabin heat and summer air conditioning. That probably doesn't contribute much to the winter/summer difference.

Then there's air drag, which is your prime candidate. I thought it was tiny. So (having the compulsion to burn half a day) I worked it through, checked it in various ways, and ended up doing the calculation in Wolfram Mathematica 10, a very versatile technical computing system which also includes scientific data like standard atmospheric density.

Along the way I discovered something that floored me, which is that the Wh/mi "consumption" our cars display works out to have units of force (length * mass / time^2), the same as drag itself. One Wh/mi "consumption" (after subtracting battery heating, drivetrain losses and the like) = 2.237 Newtons of propulsive force at the contact patches of the tires. Because of the unit conversion features in Mathematica I could use mph and degrees Fahrenheit for the inputs to the drag calculation. I got frontal area and Cd from a June 2014 Car and Driver article.

The conclusion is that driving at 60 mph at 80 degrees Fahrenheit costs 106 Wh/mi of drag, and at 32 degrees it is 10.4 Wh/mi higher. So while I was wrong to say "tiny", and I shouldn't have said "For the rest of the trip the battery is warm and it's as if you are driving in the summer" because there is a little more drag, etc., I believe the main factor accounting for most of the nearly 100 Wh/mi difference is battery heating and regen as I said, with drag only a tenth of that.

Just for fun, here is the (cleaned up) Mathematica code:


[Attachment 103454: screenshot of the Mathematica drag calculation]
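For those without Mathematica, here is a rough Python equivalent of the calculation described above. The Cd of 0.24 and frontal area of 25.2 sq ft are commonly published Model S figures and stand in for whatever exact values the Car and Driver article gave; air density comes from the ideal gas law at sea-level pressure. It reproduces the ~106 Wh/mi and ~10.4 Wh/mi numbers quoted above:

```python
MPH_TO_MS = 0.44704
FT2_TO_M2 = 0.09290304
N_PER_WH_PER_MI = 3600.0 / 1609.344  # 1 Wh/mi of consumption = ~2.237 N of force

CD = 0.24            # drag coefficient (assumed, commonly cited for the Model S)
FRONTAL_FT2 = 25.2   # frontal area in square feet (assumed)

def air_density(temp_f, pressure_pa=101325.0):
    """Dry-air density from the ideal gas law at sea-level pressure, kg/m^3."""
    temp_k = (temp_f - 32) * 5 / 9 + 273.15
    return pressure_pa / (287.05 * temp_k)

def drag_wh_per_mile(speed_mph, temp_f):
    """Aerodynamic drag force expressed in the car's Wh/mi units."""
    v = speed_mph * MPH_TO_MS
    area = FRONTAL_FT2 * FT2_TO_M2
    force_n = 0.5 * air_density(temp_f) * v ** 2 * CD * area
    return force_n / N_PER_WH_PER_MI

warm = drag_wh_per_mile(60, 80)
cold = drag_wh_per_mile(60, 32)
print(round(warm, 1), round(cold - warm, 1))  # ~106.4 Wh/mi, ~10.4 Wh/mi extra in the cold
```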
 
Sillydriver, I don't think that you explained the original finding that the rated miles seem to vary based on the outside temperature as detected at the end of charging. I think the original point was to determine what calculation Tesla is using that results in the varying rated range. I do think that you have demonstrated that the SOC is fixed and does not contribute to this variation.
 
So since total drivetrain losses are probably only 5 - 10%, the difference in drivetrain losses at different temps due to lubrication viscosity is probably only a percent or two.

Yes, I keep my tires inflated properly, don't you?

I'm pretty fanatical about that, but many folks aren't.


I had considered the difference between winter cabin heat and summer air conditioning. That probably doesn't contribute much to the winter/summer difference.

To me it seems there is quite a bit of difference between A/C and heating power. The reason is that the A/C is variable speed, while the heat is only on or off.

Then there's air drag, which is your prime candidate. I thought it was tiny. So (having the compulsion to burn half a day) I worked it through, checked it in various ways and ended up doing the calculation in Wolfram Mathematica 10, a very versatile technical computing system which also includes scientific data like standard atmospheric density. Along the way I discovered something that floored me, which is that the Wh/mi "consumption" our cars display works out to have units of force (length * mass / time^2), the same as drag itself. One Wh/mi "consumption" (after subtracting battery heating, drivetrain losses and the like) = 2.237 Newtons of propulsive force at the contact patches of the tires. Because of the unit conversion features in Mathematica I could use mph and degrees Fahrenheit for the inputs to the drag calculation. I got frontal area and Cd from a June 2014 Car and Driver article. The conclusion is that driving at 60 mph at 80 degrees Fahrenheit costs 106 Wh/mi of drag, and at 32 degrees it is 10.4 Wh/mi higher. So while I was wrong to say "tiny", and I shouldn't have said "For the rest of the trip the battery is warm and it's as if you are driving in the summer" because there is a little more drag, etc.,

My guess here is that you've never piloted a private aircraft. The difference in takeoff time spent on the runway between winter and summer is significant. You'd likely have a different point of view if you had that experience.
 
Sillydriver, I don't think that you explained the original finding that the rated miles seem to vary based on the outside temperature as detected at the end of charging. I think the original point was to determine what calculation Tesla is using that results in the varying rated range. I do think that you have demonstrated that the SOC is fixed and does not contribute to this variation.
At least on the energy app, and probably also on the instrument panel (although I'm not sure, I have the indicator on my IP set to % SOC), they appear to divide 76 kWh by recent actual consumption in Wh/mi, perhaps over the last 30 miles, to get the range on a full charge, and then multiply by % SOC to get the car's current projected range. So the fully charged rated range varies inversely with recent consumption in Wh/mi, where the factors that affect that consumption are what make range vary seasonally. My posts above talk about the factors that I think cause consumption to be higher in the winter: mostly battery heating and the lack of regen in a cold battery, with a small effect from the higher density of cold air increasing drag.
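As a sketch, the energy-app "projected range" calculation described here would look like this (76 kWh is the assumed usable figure backed out earlier in the thread):

```python
def projected_range_mi(recent_wh_per_mi, soc_fraction, usable_kwh=76):
    """Projected range: full-charge range from recent consumption, scaled by SOC."""
    full_charge_mi = usable_kwh * 1000 / recent_wh_per_mi
    return full_charge_mi * soc_fraction

print(round(projected_range_mi(336, 0.84)))  # ~190 mi, matching the energy-app example
```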
 
To me it seems there is quite a bit of difference between A/C and heating power. The reason is that the A/C is variable speed, while the heat is only on or off.

My guess here is that you've never piloted a private aircraft. The difference in takeoff time spent on the runway between winter and summer is significant. You'd likely have a different point of view if you had that experience.
You are right, heating the cabin a given number of degrees should take more power than cooling it that same number of degrees, but the reason is not variable-speed A/C versus fixed heat. The reason is that heating is done with a resistive element that just turns electricity into heat, producing more entropy than it needs to for the amount of heating. Cooling is done with a refrigerator that pumps a larger amount of heat out of the cabin with that same electrical power, operating closer to the maximum efficiency that thermodynamics allows. Overall, I don't think the power consumption delta between heat and A/C is big in the overall energy budget of the car.
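A tiny illustration of that difference, with made-up numbers (the 3 kW cabin load and COP of 2.5 are placeholder assumptions, not measured figures):

```python
cabin_heat_kw = 3.0   # thermal power needed to heat or cool the cabin (assumed)
cop_ac = 2.5          # coefficient of performance assumed for the refrigeration cycle

electric_kw_resistive = cabin_heat_kw    # resistive heat: 1 kW in -> 1 kW of heat out
electric_kw_ac = cabin_heat_kw / cop_ac  # A/C moves COP times its electrical input

print(electric_kw_resistive, electric_kw_ac)  # 3.0 vs 1.2 kW of electrical draw
```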

A plane taking off in the cold is helped by a double effect. First, the engine breathes denser air, burns more fuel and produces more power to accelerate the aircraft more quickly. Second, the denser air means a greater mass of air is moving over the wings at a given speed, producing more lift. Drag force on the MS is proportional to density (analogous to the second effect helping lift), but there is nothing like the first effect increasing aircraft engine horsepower when it comes to the MS. So experience with aircraft that have internal combustion engines is likely to lead one to overestimate the effect of higher air density on an electric car.
 
At least on the energy app, and probably also on the instrument panel (although I'm not sure, I have the indicator on my IP set to % SOC), they appear to divide 76 kWh by recent actual consumption in Wh/mi, perhaps over the last 30 miles, to get the range on a full charge, and then multiply by % SOC to get the car's current projected range. So the fully charged rated range varies inversely with recent consumption in Wh/mi, where the factors that affect that consumption are what make range vary seasonally. My posts above talk about the factors that I think cause consumption to be higher in the winter: mostly battery heating and the lack of regen in a cold battery, with a small effect from the higher density of cold air increasing drag.

Mine doesn't vary with a sample that small, or my captured numbers would be far more variable. There are times when my last 30 miles were spent running my son to high school and back, a bunch of very short trips that generate a very high Wh/mi average (sometimes > 400), and there are times when my last 30 miles were spent going to/from STL, which tends to average 280ish. Perhaps it's > 100 mi or so, but I doubt it.

I don't believe recent consumption affects the rated range displayed at all. The API does have an "est_battery_range" which is based upon consumption, but "battery_range" appears to be disconnected from any recent consumption. If there is any connection, it is going to be based on a very large moving-average sample.
 
Mine doesn't vary with a sample that small, or my captured numbers would be far more variable. There are times when my last 30 miles were spent running my son to high school and back, a bunch of very short trips that generate a very high Wh/mi average (sometimes > 400), and there are times when my last 30 miles were spent going to/from STL, which tends to average 280ish. Perhaps it's > 100 mi or so, but I doubt it.

I don't believe recent consumption affects the rated range displayed at all. The API does have an "est_battery_range" which is based upon consumption, but "battery_range" appears to be disconnected from any recent consumption. If there is any connection, it is going to be based on a very large moving-average sample.

You are right. I just looked in my car again. The formula I gave was the way "projected range" is calculated in the "consumption" tab in the energy app on the touch screen. It says 190 miles "projected range" based on the last 30 miles history. I then went to settings and switched the battery meter on the IP from % SOC to range and it currently estimates 208 miles of range. So the meter on the IP does use a different formula than I said. My apologies to Marcira! It might be, as you say, a longer moving average. Or it adjusts history by the length of trips. Or it might make individual estimates for factors affecting consumption like battery heating. Nevertheless, I stand by my underlying point. A range meter, if it is to do any good at all, has to reflect the range the car is likely to actually get. And if the capacity of the pack in kWh is stable over the year, then the range that the car will actually get will be inversely proportional to the average consumption in Wh/mi that it will actually get. I have tried to identify the factors that drive the seasonal change in consumption in the posts above, and I think they are correct.
 
I have had individual legs come in as low as 64 and as high as 80. The problem is that if there is only, say, a five percentage point difference in the SOC at the beginning and end of the leg, the car's rounding error when the meter reports the SOC can throw off the calculated capacity by 20% either way, which is a lot. What I do is keep each leg as a line in a spreadsheet. The way to do the average is to sum up the SOC used (in percentage points) for all the legs and divide that into the summed-up kWh used for all the legs. The more legs are summed up, the more the rounding errors wash out.

Thanks. Spreadsheet and formulas for average created. Starting to gather data. I'll report back on kWh I seem to have left in my pack.
 
You are right. I just looked in my car again. The formula I gave was the way "projected range" is calculated in the "consumption" tab in the energy app on the touch screen. It says 190 miles "projected range" based on the last 30 miles history. I then went to settings and switched the battery meter on the IP from % SOC to range and it currently estimates 208 miles of range. So the meter on the IP does use a different formula than I said. My apologies to Marcira! It might be, as you say, a longer moving average. Or it adjusts history by the length of trips. Or it might make individual estimates for factors affecting consumption like battery heating. Nevertheless, I stand by my underlying point. A range meter, if it is to do any good at all, has to reflect the range the car is likely to actually get. And if the capacity of the pack in kWh is stable over the year, then the range that the car will actually get will be inversely proportional to the average consumption in Wh/mi that it will actually get. I have tried to identify the factors that drive the seasonal change in consumption in the posts above, and I think they are correct.

See, this is what confuses me - if rated range is different from actual projected range, why should it vary at all? I would have assumed rated range referred to the EPA rating, but that doesn't ever vary. I definitely agree that if a range meter is to be the most useful it has to reflect the range the car will actually get, but I figured the useful range meter was the estimated/projected range, not the rated range.
 
See, this is what confuses me - if rated range is different from actual projected range, why should it vary at all? I would have assumed rated range referred to the EPA rating, but that doesn't ever vary. I definitely agree that if a range meter is to be the most useful it has to reflect the range the car will actually get, but I figured the useful range meter was the estimated/projected range, not the rated range.
Yes, they should call it something other than 'rated range'. They should also disclose how they calculate it, so we understand the factors that make the range we will actually get different from their number. This is why I have the little battery symbol on the instrument panel display % charge rather than rated range. At least knowing that, I can make my own estimate of range based on temperature and other conditions. Having rated range on the dash is like having an ICE car use a trip odometer combined with EPA rated mileage to tell you how much gas is left in your tank.