Actual energy (from the wall) usage vs Tesla figures

I've owned my 85 kWh Model S for about 2 months and have tried to monitor the number of miles driven and the energy used throughout this time. During this time I've driven a bit over 3,500 miles and have averaged about 295 Wh/mi. I've also been able to track the "vampire drain" at a fairly consistent level (approx. 0.33 mi/hr, or about 8 mi/day; some days worse than others).

What I'm having a hard time calculating is how much energy I'm actually pulling from my energy supplier (which is what I have to pay for), versus the energy Tesla says was used for the miles driven.

Breaking it down a bit further, my battery usage for the miles driven typically averages around 290 Wh/mi. But if I factor in the vampire drain, I come in at about 340 Wh/mi, roughly a 20% premium.
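For reference, here's roughly how I get that effective figure (a quick Python sketch; the ~60-day window is just my 2 months of ownership, and I value vampire miles at the same ~295 Wh/mi as driving miles):

# Quick sketch of the effective Wh/mi math above (Python)
miles_driven   = 3500
driving_wh_mi  = 295      # car-reported average
days           = 60       # roughly 2 months of ownership
vampire_mi_day = 8        # rated miles lost per day while parked

driving_kwh = miles_driven * driving_wh_mi / 1000            # ~1033 kWh
vampire_kwh = days * vampire_mi_day * driving_wh_mi / 1000   # ~142 kWh

effective_wh_mi = (driving_kwh + vampire_kwh) * 1000 / miles_driven
print(round(effective_wh_mi))   # ~335 Wh/mi, in the ballpark of the ~340 above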

Has anyone else figured out the premium between what the car reports and the actual energy pulled from the wall to charge the battery?

Thanks.
 
I believe you also need to factor in charging loss, since there is some loss between the wall outlet and the battery. I think it's around 10%.

When my Model S turned over 8,000 miles recently, I estimated that I'd spent a grand total of $360 in electricity. About 1,000 of those miles came from free charging stations.
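As a rough check on that figure, the cost per mile works out like this (a small Python sketch; it simply assumes the $360 covers the ~7,000 miles that weren't charged for free):

# Rough cost-per-mile sketch from the figures above (Python)
total_cost_usd = 360
total_miles    = 8000
free_miles     = 1000     # miles covered by free charging stations

paid_miles  = total_miles - free_miles
cost_per_mi = total_cost_usd / paid_miles
print(round(cost_per_mi * 100, 1))   # ~5.1 cents per mile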
 
I have a dedicated meter on my charging circuit and record stats each month. Here are mine for August:

Car:

1997.4 miles
605.3 kWh
303.0 Wh/mi

Meter:

715.673 kWh
358.3 Wh/mi (calculated)

So, a bit over 110 kWh lost to charging inefficiencies, vampire losses etc.
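If it helps, here's how those numbers relate (a quick Python sketch using the August figures above):

# Loss calculation from the August numbers above (Python)
car_kwh   = 605.3     # energy used, as reported by the car
meter_kwh = 715.673   # energy delivered, per the dedicated meter
miles     = 1997.4

loss_kwh   = meter_kwh - car_kwh          # ~110 kWh unaccounted for
loss_pct   = loss_kwh / meter_kwh * 100   # ~15% of wall energy
wall_wh_mi = meter_kwh * 1000 / miles     # ~358 Wh/mi from the wall
print(round(loss_kwh, 1), round(loss_pct, 1), round(wall_wh_mi, 1))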
 
Though I agree with the 3 kWh per day loss to vampire drain, I think 10-15% loss from the wall to the car's battery is fairly steep.

I spent some time doing a bit of research yesterday while charging the car. I had 88 miles left, and a charge to standard (90%) would require 152 miles of charge. So, if 265 rated miles is 100%, then 152 miles is roughly 57.4% of the 85 kWh pack, which is approximately 49 kWh. (The 152 figure factors in the vampire loss.)

My charge was set at about 235 V @ 30 A, which works out to just about 7 kW. When I started the charge, the car said 7 hours, and it was done right at the 7-hour mark. So that is approximately 49 kWh as well. There may be some rounding errors, but nowhere close to a 10% loss. Why do you use 10-15% as a rule of thumb?

What am I missing here that I don't understand?
If you are talking about a 10-15% loss from the energy that was produced at a power generation plant to my house, then I can understand. But that doesn't impact the cost of the energy my Tesla uses, as I get billed on the amount registered by my meter.
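For what it's worth, here's that same arithmetic written out (a quick Python sketch of the two estimates in my post; note the second estimate is wall-side power multiplied by time):

# The two estimates from the post above (Python)
rated_full_mi = 265      # rated miles at 100% charge
pack_kwh      = 85
miles_added   = 152      # rated miles needed to reach 90%

battery_est_kwh = miles_added / rated_full_mi * pack_kwh   # ~48.8 kWh

volts, amps, hours = 235, 30, 7
wall_est_kwh = volts * amps * hours / 1000   # ~49.4 kWh, measured at the wall

print(round(battery_est_kwh, 1), round(wall_est_kwh, 1))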
 
HiTech: I think the 10% loss was meant as the part of the charging process that is dissipated as heat. If you had turned off everything else, you could have looked at your meter before and after charging and seen whether it rose by 49 kWh or by, say, 54 kWh. With everything else running as well, you'd have to monitor your average household usage and subtract that amount, but that's quite error-prone and might not reveal the 10% extra component within the errors of the method...
 
I've measured my "vampire" losses at 2.5 kWh daily (while I was on vacation, the car would top up exactly 5 kWh every other day).

If I take 2.5 x 31 days = 77.5 kWh off the 110 kWh discrepancy noted in my post above, that leaves about 33 kWh unaccounted for, or a hair over 1 kWh a day.

My August daily average usage (from my energy meter on the wall plug) is 23 kWh, so that translates to about a 4% charger loss.

These are my numbers based on my driving patterns in summer. Winter will be a lot "worse".
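Here's the same breakdown written out (a quick Python sketch; the 2.5 kWh/day vampire figure is my own measurement from above):

# Splitting the August discrepancy into vampire vs. charger losses (Python)
meter_kwh, car_kwh, days = 715.673, 605.3, 31

total_loss_kwh   = meter_kwh - car_kwh                  # ~110 kWh
vampire_loss_kwh = 2.5 * days                           # ~77.5 kWh measured while idle
charger_loss_kwh = total_loss_kwh - vampire_loss_kwh    # ~33 kWh, a hair over 1 kWh/day

daily_meter_kwh  = meter_kwh / days                     # ~23 kWh/day from the wall
charger_loss_pct = charger_loss_kwh / meter_kwh * 100   # roughly 4-5% of wall energy
print(round(charger_loss_kwh, 1), round(daily_meter_kwh, 1), round(charger_loss_pct, 1))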
 
Though I agree with the 3 kWh per day loss to vampire drain, I think 10-15% loss from the wall to the car's battery is fairly steep. [...] Why do you use 10-15% as a rule of thumb?

10-15% charger losses are not that far off. On my Chevy Volt it typically took about 11.7-12 kWh to fill up 10.3 kWh of usage, which is about 14-17% in losses. This does impact what you get charged on your bill. You're not just getting billed for what your car indicates it is using; it's what the car used plus the charging losses, vampire losses, and battery thermal management.
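For the record, the Volt percentage works out like this (a small Python sketch; the loss is expressed relative to the energy that actually made it into the battery):

# Chevy Volt charging overhead from the figures above (Python)
wall_kwh_low, wall_kwh_high = 11.7, 12.0   # measured at the wall
battery_kwh = 10.3                         # usable energy replaced

overhead_low  = (wall_kwh_low  - battery_kwh) / battery_kwh * 100   # ~14%
overhead_high = (wall_kwh_high - battery_kwh) / battery_kwh * 100   # ~17%
print(round(overhead_low), round(overhead_high))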
 
Get a used electric meter and hook your charger into it. You might be able to find one locally through an electrical contractor. Not something I would want to have shipped, however.
A couple of options:
EKM Metering makes a number of meters, including "smart" meters that you can interface with: EKM Metering
Hialeah Meter sells refurb utility grade meters - parts are cheap (meter/socket for as low as $25): Hialeah Meter Company

I have a dedicated meter on my charging circuit and record stats each month. [...] So, a bit over 110 kWh lost to charging inefficiencies, vampire losses, etc.
Nice data - so it looks like about 15% of the wall energy is lost relative to the in-car meter at roughly 2,000 miles/month, which is fairly typical. It would be interesting to see how this varies depending on how many miles are driven per month.
 
Though I agree with the 3 kWh per day loss to vampire drain, I think 10-15% loss from the wall to the car's battery is fairly steep. [...] Why do you use 10-15% as a rule of thumb?

Why did you try to use rated miles to estimate the energy used when the trip meter shows it directly? Like mknox, I compare the energy used as reported by the car to what the meter shows it took to charge. My normal commute uses about 19.5 kWh by the trip meter. On days I don't commute, my car uses 3 kWh at night. On days I do commute it uses 25 kWh. If you subtract the 3 kWh it would have used anyway, then it took 22 kWh to replace 19.5 kWh. This is entirely normal for charging batteries. As a previous poster said, the energy is lost because the batteries heat up during charging, and more energy is used to keep them from heating up too much (pumping the coolant).
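Written out, my commute example looks like this (a quick Python sketch; the 3 kWh is what the car draws on a day it just sits):

# Commute-day charging loss from the numbers above (Python)
wall_commute_day_kwh = 25.0   # wall energy on a commute day
wall_idle_day_kwh    = 3.0    # wall energy on a day the car just sits
trip_kwh             = 19.5   # energy used, per the trip meter

charging_kwh = wall_commute_day_kwh - wall_idle_day_kwh     # ~22 kWh to replace 19.5 kWh
loss_pct = (charging_kwh - trip_kwh) / charging_kwh * 100   # ~11% of the charging energy
print(round(loss_pct, 1))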
 
As a previous poster said the energy is lost because the batteries heat up during charging and more energy is used to keep them from heating up too much (pumping the coolant).
Lithium batteries are very efficient at charging (around 97%), so there's not much energy lost there under normal conditions. Most of the energy is lost in the AC/DC conversion (chargers are typically low-90s percent efficient at best), and the rest goes to running the cooling pumps and other associated electronics. Overall, 75-90% charging efficiency is pretty typical.

So seeing mknox's ~360 Wh/mi is good confirmation that, at least if you drive a lot, vampire and other losses aren't affecting your overall efficiency that much; 360 Wh/mi lines up right with the US EPA efficiency estimates.
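As a ballpark, those component figures compose like this (a quick Python sketch; the pump/electronics overhead number is illustrative, not measured):

# Composing the component efficiencies mentioned above (Python)
battery_eff  = 0.97   # lithium cell charge efficiency
charger_eff  = 0.92   # AC/DC conversion, low 90s at best
overhead_eff = 0.97   # cooling pumps and other electronics (assumed for illustration)

overall_eff = battery_eff * charger_eff * overhead_eff
print(round(overall_eff * 100, 1))   # ~86.6%, within the 75-90% range quoted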
 
Okay... Got it. As gregincal realized, you guys are talking about figures based on the car-reported energy usage (typically around 300 Wh/mi), while I used a figure that already included the "vampire drain" for the month (rated miles). But in the end my actual usage also comes out to about 360 Wh/mi, which is the figure you are saying is about 20% higher. Makes sense. I was thinking we were talking about a premium on top of the 360!
I'll blame that on being sick for the last few days! :)