After checking out the article here <American wire gauge>, it appears that AWG #6 is sufficient for a 50-amp circuit (NEMA 14-50). However, I recall reading some posts here on TMC saying it would be better to go bigger (#4; I'm not sure #5 is common). To my mind, a thicker wire would lose less energy to heat, increasing the charging efficiency. Would it be worthwhile to go this route?
At 40 A over a 5 m wire run for 9 hours (one 300-mile full charge), using these resistances:

AWG#6 - 1.296 mΩ/m
AWG#5 - 1.028 mΩ/m (21% less than #6)
AWG#4 - 0.8152 mΩ/m (37% less than #6)

the energy lost in the wire per charge is:

for AWG#6: (40 A)^2 x (5 m x 1.296 mΩ/m) x 9 hr = 0.093 kWh
for AWG#5: (40 A)^2 x (5 m x 1.028 mΩ/m) x 9 hr = 0.074 kWh
for AWG#4: (40 A)^2 x (5 m x 0.8152 mΩ/m) x 9 hr = 0.059 kWh
After 300,000 miles (about 1,000 full charges), AWG#4 will have saved around 34 kWh over AWG#6.
At $0.08/kWh, that's only $2.72, spread over several years at least.
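And the lifetime-savings step, for completeness (1,000 full charges at the per-charge losses computed above, electricity priced at $0.08/kWh):

```python
# Lifetime savings of AWG#4 over AWG#6, using the per-charge losses above.
loss_awg6_kwh = 0.093
loss_awg4_kwh = 0.059
charges = 300_000 / 300      # 300k miles at 300 miles per full charge
price_per_kwh = 0.08         # $/kWh

saved_kwh = (loss_awg6_kwh - loss_awg4_kwh) * charges
print(f"{saved_kwh:.0f} kWh saved ≈ ${saved_kwh * price_per_kwh:.2f}")
# → 34 kWh saved ≈ $2.72
```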
So it seems getting a thicker wire won't save meaningful money, but it will run cooler during charging.
I guess my questions are:
1) How much more expensive is AWG#4 / #5 than AWG#6?
2) How much cooler will the thicker wire be?