
Utility companies taking a beating while I charge my S


wk057

Former Tesla Tinkerer
So, I just finished a good chunk of the AC side wiring for my off-grid solar project. I'll post details of that elsewhere, but I moved the wiring for my HPWC in the process.

All of the wiring I used for my project is copper, and probably a gauge larger than needed. My sub-panel to HPWC run is 2 gauge copper, roughly 40 ft. The sub-panel is fed from a 200A breaker in the utility service panel, over 3/0 gauge copper and through a 200A transfer switch. (The transfer switch will let me cut off the utility later...)

So anyway... I measured my voltage drops trying to figure out my resistive losses.

At the HPWC before starting a charge I was at 249V. It's only me and one other house on a rather large transformer currently, so the good starting voltage is pretty normal. Only about 2kW of other loads on the 320A (400A) service.

Measuring the voltage at the HPWC and at my service panel, they were virtually identical under no load (within a couple tenths of a volt, within a reasonable margin of error).

After starting to charge at 80A, voltage at the car dash sagged to 237V, and inside the HPWC to about 238-239V (I know the HPWC's cable is already less than ideal, so this was expected).

At my 200A sub panel where the HPWC is connected, 239-240V, and at my service panel 240V.

So, between the car and my service panel I have roughly a 3V drop. This comes out to only ~240W of heat radiating off of the 25' HPWC cable, 40' of 2 gauge from the HPWC to the sub panel, and ~30' of 3/0 between the sub panel and service panel, the majority of it from the charge cable itself. This is pretty good and shows that all of my connections are pretty solid on my wiring. After 90 minutes, confirming with my FLIR cam shows a pretty negligible temperature rise on that run: virtually nothing on the 3/0, about a 20F rise on the 2 gauge, and the 100A breaker sitting at about 35F above room temp. All normal or better than normal.
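
If anyone wants to sanity-check that figure, the heat in a run is just the measured drop times the current. A quick Python sketch with the readings above (nothing assumed beyond those measurements):

current_a = 80.0        # charge current
drop_v = 240.0 - 237.0  # service panel vs. car dash, both under load

heat_w = drop_v * current_a
print(f"{heat_w:.0f} W of heat across my wiring")  # ~240 W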

So... we started with 249V... that means there is a 9V drop between my service panel's mains and the transformer. 9V @ 80A is 720W of heat coming off of that wiring... and that wiring is on the utility's side of the meter, sans ~4 feet of it (a straight line on Google Maps from my meter to the transformer is about 170').

So, let's say I charge for 3 hours (what I'm doing now, since my HPWC was offline for a couple of days for my rewire job). That's 3*720W = 2.16kWh in heat losses in the service conductors. 2.16kWh that the utility is paying for, not me, since it is on their side of the meter. The losses on my side of the meter are negligible. That is a 3-4% "toll" the utility is paying.

Granted, that comes out to about 1 kWh they lose for every 80 miles I drive, so it's not a whole lot. But I'm guessing if you added this up for every Model S owner, the utility companies are probably losing a measurable amount of power here. For the average driver at 12,000 miles per year, that's 150 kWh of heat losses on the utility side. Let's say there are 1,000 Model S in the state (an out-of-the-air figure)... that's 150 megawatt-hours per year, or about $18,000 in electricity at the national average rate, in power that just heats wires between your house and the transformer.
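
For anyone who wants to follow the math, here is the whole chain as a rough Python sketch. The $0.12/kWh rate is my stand-in for the national average; everything else comes from the numbers above:

current_a = 80.0
drop_v = 249.0 - 240.0   # pre-charge voltage minus loaded service panel voltage
hours = 3.0
rate = 0.12              # assumed national average, $/kWh
fleet_size = 1000        # the out-of-the-air statewide Model S count

loss_w = drop_v * current_a              # 720 W on the utility side
session_kwh = loss_w * hours / 1000.0    # 2.16 kWh per 3-hour charge
per_car_kwh = 12000 / 80.0               # ~150 kWh/year at 1 kWh per 80 miles
fleet_cost = fleet_size * per_car_kwh * rate
print(loss_w, session_kwh, per_car_kwh, fleet_cost)  # 720.0 2.16 150.0 18000.0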

Now I'm sure others have better or worse losses, but still... it seems interesting to me, and something that I bet utilities will eventually start taking into account. However, I bet the losses over X years are still less than the cost premium of copper service entrance conductors over aluminum... so maybe not.

-wk
 
A 720 Watt loss out of 20 kW (when charging at the full 80 Amps) is 3.6%. The total losses in power distribution and transmission are about 7% (USA national average). Seems pretty normal.

Well, I figure my losses are substantially higher than a non-Tesla owner's, since I don't think there are many other loads at a regular residence that will pull 20kW for hours. Even my auxiliary resistive heating only runs for short periods.

I'd expect the losses between my meter and transformer would, on average, be substantially less if I did not have the HPWC.
 
In many service areas, at least here in California, 20 kW or thereabouts is the maximum power that a business can draw before demand charges kick in. Some have speculated that utilities might one day institute demand charges for residences. In any event, with enough 10-20 kW charging stations out there, utilities might seek permission to structure the rates to prefer lower, continuous draws. Of course, "smart grid" technology would be helpful here - during periods of "excess" renewable energy generation, your car's charge rate could be throttled up to the full 20 kW at a good price.
 
The resistance of the wiring on the utility side of the meter is essentially a constant. If you pull 80 amps for an hour, 40 amps for 2 hours, or 10 amps for 8 hours, the total power lost is the same on those wires. It doesn't matter if you're charging your S or running your microwave, the percentage lost before the meter is the same. The utility is aware of the power lost in transmission and accounts for it in the price we pay per unit of electricity.
 
A 720 Watt loss out of 20 kW (when charging at the full 80 Amps) is 3.6%. The total losses in power distribution and transmission are about 7% (USA national average). Seems pretty normal.

This is exactly it. Have you ever seen those huge oil-filled radiators on the sides of transformers? Those are pretty hot, and 100% of that heat is lost. When the power company figures out the "cost of transmission," they factor all that loss in. That's a neat way for you to discover all this, though. I always tend to run a thicker gauge wire on circuits I know are going to be hammered, just for that purpose. It's all about efficiency ;)

- - - Updated - - -

The resistance of the wiring on the utility side of the meter is essentially a constant. If you pull 80 amps for an hour, 40 amps for 2 hours, or 10 amps for 8 hours, the total power lost is the same on those wires. ...

Actually, if the power lines heat up because of the higher draw (say 80A), their resistance increases. That extra resistance increases the power losses.
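
For scale, copper's resistance rises roughly 0.4% per degree C. A minimal Python sketch, assuming the textbook 0.00393/C coefficient and an illustrative 20 C rise:

ALPHA_CU = 0.00393   # copper temperature coefficient, per deg C
r_cold = 0.02        # ohms at ~20 C (illustrative value)
rise_c = 20.0        # assumed temperature rise under load

r_hot = r_cold * (1 + ALPHA_CU * rise_c)
print(f"{100 * (r_hot / r_cold - 1):.1f}% more resistance")  # ~7.9%, and I^2*R losses to match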
 
The resistance of the wiring on the utility side of the meter is essentially a constant. If you pull 80 amps for an hour, 40 amps for 2 hours, or 10 amps for 8 hours, the total power lost is the same on those wires. It doesn't matter if you're charging your S or running your microwave, the percentage lost before the meter is the same. The utility is aware of the power lost in transmission and accounts for it in the price we pay per unit of electricity.
Please be careful citing incorrect things as fact.

Resistance absolutely increases as a function of Amperage. This is the same reason why you need to increase gauge as amperage increases, and if you oversize the wire, the resistance will be less than if you use the minimum code size of the wire. @wk057's issue suggests that either the wire or the transformer is undersized on the utility side.
 
Resistance absolutely increases as a function of Amperage. This is the same reason why you need to increase gauge as amperage increases, and if you oversize the wire, the resistance will be less than if you use the minimum code size of the wire. @wk057's issue suggests that either the wire or the transformer is undersized on the utility side.

I'm pretty sure resistance (measured in ohms) is a property of the wire/metal. Resistance increases with temperature. wk057 said the wire outside the meter was not noticeably hotter, though.

- - - Updated - - -

That copper 3/0 wire should have about 0.2 ohms/km resistance per conductor at normal temperature.
http://en.wikipedia.org/wiki/American_wire_gauge#Tables_of_AWG_wire_sizes
 
Curious, which model FLIR do you use?

I have the E5

- - - Updated - - -

[Thermal image: FLIR0284.jpg]


Junction box used for extending the 2 gauge wire for my HPWC to the new panel after ~2 hours of use.

- - - Updated - - -

[Photo: FLIR0284- photo.jpg]


Visible-light shot of the same junction box.

- - - Updated - - -

Can do a ton with the software too. I like it.

[Thermal image, processed in the FLIR software: FLIR0284a.jpg]
 
Unfortunately it is ~35F outside, and the wiring is underground. I plan on FLIRing the transformer next time I charge for a while though. :)

I misread your original post on that. I took your 3/0 gauge copper to be the wire at the meter that was not warming up at all, but that was your wire to the sub-panel. I would imagine that the wire feeding your 400A panel is at least as large (probably larger), so we would not expect that to be any warmer.

The distance to the transformer is 170 feet; counting both current-carrying conductors, call it 0.1 km of wire at most. The resistance for even a 3/0 wire should be 0.2 ohm/km * 0.1 km = 0.02 ohm. Power lost to heat at 80 amps should be (P = I^2 * R): 80^2 * 0.02 = 128 Watts. Your wires are probably larger than that, so this is probably an overestimate. The 9V drop you're seeing could be because:

1. Significantly more resistance than what should be there (roughly 5-6x more, to get to your 720W calculation) because of some problem with the wire or connections.
2. The voltage regulator at the transformer just isn't perfect. This is the more likely culprit. The regulators generally allow the voltage to move quite a bit. As long as you're within 10% (maybe even more) of the rated voltage the power company would say it's working fine.
3. Something else?

IMO, you're not losing nearly that much power to heat in the feeder wires. The regulator just lets the voltage move. In fact, for 240V service, since it comes down to that 240V under a normal-ish load for that size transformer, it's probably working as well as it ever could. The 249V reading you got is probably because the regulator was operating on the very light load side of its specs.
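
Spelled out as a Python sketch (the doubling of the ~170 ft to ~0.1 km of wire, to count both current-carrying conductors, is my assumption):

r_per_km = 0.2   # ohm/km for 3/0 copper, per the AWG table linked above
wire_km = 0.1    # ~170 ft each way, both conductors counted
current_a = 80.0

r_wire = r_per_km * wire_km      # 0.02 ohm
loss_w = current_a**2 * r_wire   # 128 W
drop_v = current_a * r_wire      # 1.6 V -- nowhere near the observed 9 V,
                                 # which is what points at the regulator
print(r_wire, loss_w, drop_v)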

The FLIR looks very cool. I wish I had one of those or, better yet, someone to borrow one from. I can think of a few places where it would be handy, but only once or twice.
 
The resistance of the wiring on the utility side of the meter is essentially a constant. If you pull 80 amps for an hour, 40 amps for 2 hours, or 10 amps for 8 hours, the total power lost is the same on those wires. It doesn't matter if you're charging your S or running your microwave, the percentage lost before the meter is the same.
Ohm's law says that the power lost in a resistor goes up with the square of the current.

For example, if you draw 80A and are losing 1600W to resistance in the wire, cutting your draw to 40A will drop that to only 400W.

P = I^2 * R
1600W = 80^2 * R
R = 0.25 Ohm

0.25 Ohm * 40^2 = 400W

So really, the easiest thing to do is to charge at a slower rate unless you need the speed. It will be easier on your equipment, too.
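
To put numbers on that: for the same energy delivered, halving the current cuts the I^2 term 4x while only doubling the time, so the wire loss halves. A sketch using the 0.25 Ohm example above (illustrative; it ignores the car's fixed charging overhead, which runs longer at low rates and works the other way):

R_WIRE = 0.25      # ohms, from the example above
VOLTS = 240.0
KWH_NEEDED = 38.4  # illustrative session size

for amps in (80.0, 40.0):
    hours = KWH_NEEDED / (VOLTS * amps / 1000.0)  # 2 h at 80 A, 4 h at 40 A
    lost_kwh = amps**2 * R_WIRE * hours / 1000.0  # I^2*R integrated over the session
    print(f"{amps:.0f} A: {hours:.0f} h, {lost_kwh:.1f} kWh lost in the wire")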
 
So really, the easiest thing to do is to charge at a slower rate unless you need the speed. It will be easier on your equipment, too.
Intuitively this makes sense, but is there any evidence that slower charge rates are easier on the car's chargers?

Once past the meter, I've found no statistical difference in kWh per Ideal Mile needed to charge at any amperage (but the experiment continues and comments are welcome):

What is the Most Efficient Charging Amperage?

Also, though I'm not tracking the value daily, I lose ~3 volts when charging at 80 amps.
 
Intuitively this makes sense, but is there any evidence that slower charge rates are easier on the car's chargers?

Generally with any electronic items, the cooler they run and the more constant the power, the longer they last. Computers and storage devices in a data centre that is cool last far longer than in one that is warm. Light bulbs last longer if they are not turned on and off frequently. I'd see no reason why the chargers in the Tesla would behave differently. On a practical level, does it make a difference? I don't know--probably not. If the design lifespan is twenty years and charging them at a lower rate makes them last thirty, would it make a difference?

Someone with a Fluke could measure the heat at various charging rates and then estimate the component life based on temperature and power cycling.
 
Ohm's law says that the power lost in a resistor goes up with the square of the current.

For example, if you draw 80A and are losing 1600W to resistance in the wire, cutting your draw to 40A will drop that to only 400W.

P = I^2 * R
1600W = 80^2 * R
R = 0.25 Ohm

0.25 Ohm * 40^2 = 400W

I agree. In absolute terms, you lose more with higher current. I used that same I^2 formula when calculating the losses on the feeder wire above. The difference here is when you consider the percentage of your power lost in the wire. The load and the wire see the same current (all of the current must flow through both). The total power used by both the load and the wire is: P_tot = P_wire + P_load. The percentage lost to the wire is P_lost(%) = 100 * P_wire / (P_wire + P_load). From there:

P_lost% = 100 * (I^2 * R_wire) / (I^2 * R_wire + I^2 * R_load)
P_lost% = 100 * (I^2 * R_wire) / (I^2 * (R_wire + R_load))
The I^2's cancel:
P_lost% = 100 * R_wire / (R_wire + R_load)

Those Rs don't change with current (temperature is negligible in this case), at least as far as I'm aware. The current disappears from the mix when you consider the power lost to the wire as a percentage of total power from the utility. You're always losing X% when charging your S. It doesn't matter what current you use (within reason). To my original point - the power company is aware of that X% and already charges us for it.
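
A quick numeric check of that cancellation, modeling the load as a fixed resistance the way the algebra does (both resistances are illustrative):

R_WIRE = 0.02
R_LOAD = 3.0

for volts in (240.0, 120.0):
    i = volts / (R_WIRE + R_LOAD)           # different currents each pass
    p_wire = i**2 * R_WIRE
    p_load = i**2 * R_LOAD
    pct = 100.0 * p_wire / (p_wire + p_load)
    print(f"I = {i:.1f} A, wire loss = {pct:.3f}%")  # same percentage both times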
 
I think the larger issue isn't some relatively small fraction of energy lost. Rather, it is the load placed on the wider grid if many cars plug in at 20 kW during peak hours or when there's no surplus renewable power. In this time of declining electricity consumption (thanks to efficiency improvements), EVs are a very good thing for the utilities. But they are going to want to take steps to manage EV demand patterns.
 
I think the larger issue isn't some relatively small fraction of energy lost. Rather, it is the load placed on the wider grid if many cars plug in at 20 kW during peak hours or when there's no surplus renewable power. In this time of declining electricity consumption (thanks to efficiency improvements), EVs are a very good thing for the utilities. But they are going to want to take steps to manage EV demand patterns.

Currently I am not on time-of-use metering... so, on my end, I couldn't care less when I charge at 20kW.

When I get my solar setup up and running, I will likely charge only within a couple of hours of solar noon.