
HEADS-UP: Simultaneously Charging a Tesla and a LEAF

That's how voltage converters work: they generate X volts at Y amps on the output, and they draw whatever input current is needed to deliver that power. They really can't be built any other way if you want a regulated output.

So what is the functional relationship between X and Y for voltage converters?

It seems to be self-defeating, as increasing the current coming from the power lines is normally going to drop the line voltage even more.
 

No, it's just basic physics.

Short answer:

Your circuit needs a certain amount of power to run. Let's say it needs 10 watts. That's 2 amps at 5V, 1 amp at 10V, or 0.5 amps at 20V. If the voltage goes down, then the current has to go up to compensate.
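To put rough numbers on that, here is a minimal sketch (plain Python, nothing device-specific) of the constant-power relationship I = P / V, using the 10 watt figure above:

```python
# A constant-power load draws more current as its supply voltage drops:
# I = P / V. The 10 W figure matches the example above.
P_LOAD_W = 10.0  # power the circuit needs, in watts

for volts in (20.0, 10.0, 5.0):
    amps = P_LOAD_W / volts
    print(f"{volts:4.1f} V -> {amps:.2f} A")
# 20.0 V -> 0.50 A
# 10.0 V -> 1.00 A
#  5.0 V -> 2.00 A
```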

More detailed answer:

Most electronic devices require a constant voltage power supply to operate correctly. Depending on the circuit this might be 12V, 5V, 3.3V, 400V, whatever. But fundamentally most circuits need clean, reliable, constant voltage input. If they need more power, then they draw more current, but the voltage needs to stay the same. That requires a voltage regulator. You'll find voltage regulators in 99% of all electronic equipment that you use.

The simple way to regulate the voltage is to put some sort of resistor in series, and adjust it to maintain a constant voltage as conditions change. Usually this "resistor" is actually a transistor, and there's a feedback circuit to maintain the right voltage. Current in equals current out. Very simple. The problem is that this regulator is very wasteful. If 12V comes in and you only need 6V, you have to burn off half the power in your regulator. This kind of simple regulator tends to get very toasty hot, because although it is simple it is quite inefficient.
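To illustrate the "burn off half the power" point with numbers (a toy calculation; the 1 A load current is just an assumption for the example):

```python
# Series (linear) regulator: the excess voltage is dropped across the
# regulator itself, so the wasted power is (Vin - Vout) * I.
v_in, v_out, i_load = 12.0, 6.0, 1.0   # 12 V in, 6 V out, 1 A (illustrative)

p_out = v_out * i_load                  # power delivered to the load
p_burned = (v_in - v_out) * i_load      # power turned into heat in the regulator
efficiency = p_out / (p_out + p_burned)

print(f"delivered {p_out:.0f} W, burned {p_burned:.0f} W, efficiency {efficiency:.0%}")
# delivered 6 W, burned 6 W, efficiency 50%
```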

A better technique is to use a switching regulator. It can convert between two voltage levels without burning lots of energy. To do this it chops the incoming power with a switch (a transistor of course, but wired up to always switch all-on or all-off very hard) and then uses a coil or transformer to "boost" or "buck" that chopped power to the right voltage (some converters can boost, some can buck, some can do both). Because it's being driven by a switch, all-off or all-on, it doesn't burn off power -- the converter is much more efficient. No switching regulator is 100% efficient, but they are often in the high 90's.
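As a deliberately idealized sketch of the "buck" case (ignoring switching and conduction losses, just showing the averaging effect of the chopper):

```python
# Idealized buck (step-down) converter: if the switch is on for a fraction
# D of each cycle, the averaged output is roughly D * Vin, and since almost
# nothing is burned off, input power is about equal to output power.
def ideal_buck(v_in, duty, i_out):
    v_out = duty * v_in        # average of the chopped waveform
    p_out = v_out * i_out
    i_in = p_out / v_in        # lossless assumption: P_in == P_out
    return v_out, i_in

v_out, i_in = ideal_buck(v_in=12.0, duty=0.5, i_out=1.0)
print(f"Vout = {v_out:.1f} V, input current = {i_in:.2f} A")
# Vout = 6.0 V, input current = 0.50 A (the linear regulator above drew 1.0 A)
```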

So you need an output of, say, 10V at 1 amp. That is 10 watts. Ignoring the few percent efficiency loss, it needs 10 watts in. If the input voltage is 10 volts, it draws 1 amp = 10 watts. If the input voltage is 20V it draws 0.5 amps = 10 watts. As you can see, as the input voltage goes up, the current goes down. And as the voltage goes down, the current has to go up. It needs that 10 watts. So if your incoming line voltage starts to droop, it has no choice but to either draw more current, or switch off altogether (which a good design will do automatically if the voltage droops too much - just like the Roadster does).
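A rough sketch of that behaviour (the 90 V cutoff threshold below is made up for illustration, not a real Roadster spec):

```python
# A regulated load keeps pulling its 10 W, so input current rises as the
# line voltage sags, until an undervoltage cutoff shuts the converter off.
P_NEEDED_W = 10.0
CUTOFF_VOLTS = 90.0   # illustrative threshold only

def input_current(line_volts):
    if line_volts < CUTOFF_VOLTS:
        return 0.0    # a good design switches off rather than pulling ever more current
    return P_NEEDED_W / line_volts

for v in (120, 110, 100, 95, 85):
    print(f"{v:3d} V line -> {input_current(v):.3f} A drawn")
```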
 
Excellent description, Doug!

Yes, the Roadster includes a charging system that is well worth the price paid. Hopefully they can maintain the high level of quality over the years as each new model (S, X) comes out at a lower price.
 

That's what I was saying. They're being stupid (IMHO) to try to keep the voltage level up by increasing the current. Increasing the current will cause a further drop in voltage. What I was wondering is whether they use any tricks to get around this. For example, raising the peak current but chopping its pulse width, so that the average current is actually reduced while the peak current and voltage remain high. That's the approach used in a triac (IIRC) circuit I did many years ago (probably from a Radio Shack project book) that let you effectively dim fluorescent lights.
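For what it's worth, the arithmetic behind that pulse-width idea is just averaging over the duty cycle (the numbers below are made up):

```python
# If the load is only switched on for a fraction of each cycle, the average
# current is roughly the peak current times the duty cycle, so the average
# draw can fall even while the peak stays high.
i_peak = 15.0   # amps during the "on" portion of each cycle (illustrative)
duty = 0.4      # fraction of the cycle the load is actually on

i_avg = i_peak * duty
print(f"peak {i_peak:.0f} A at {duty:.0%} duty -> average {i_avg:.1f} A")
# peak 15 A at 40% duty -> average 6.0 A
```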

The physics isn't my problem, it's the power electrical engineering.

P.S. Maybe they aren't stupid. They increase the load so that the competition (e.g. a Tesla with a better designed system) is kicked off line.
 
Well, yes, if you have problems with power electrical engineering then some things might look stupid. But sometimes it may be more due to your misunderstanding than anything else.

They're not increasing the current (amps) to keep the voltage level (volts) up, they're doing it to keep the power level (watts) up. As Doug says, if your input has too little power, your only choice is to try and pull more current or just shut down. Simpler circuits will just pull more current, more expensive circuits like the Tesla will smoothly shut down and issue a warning.

It's like a power amplifier rated for 100W at 8Ω. If the user mistakenly attaches a 4Ω speaker, then the amp will still create the same voltage, but double the current will flow through half the resistance, resulting in 200W. The problem appears if the circuit cannot actually produce 200W, because then your amplifier burns out (or shuts down if it's well made). The worst case is a 0Ω speaker wire - i.e. shorting the amp outputs. This results in nearly infinite current due to the very, very small resistance, and then the amp must produce nearly infinite power (wattage) if it is to maintain the same voltage. In all of these examples, the amplifier designer is not specifically trying to produce more and more current as the resistance drops; it's just Ohm's law. In other words, basic physics controls most of these things. It's simply more expensive to design the circuit for all possible unexpected conditions.
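Putting rough numbers on the amplifier analogy (the 0.1 Ω figure just stands in for a near-short):

```python
# With a fixed output voltage, the power demanded by the speaker is V^2 / R,
# so halving the impedance doubles the power the amp has to supply.
import math

P_RATED_W = 100.0
R_RATED_OHMS = 8.0
v_out = math.sqrt(P_RATED_W * R_RATED_OHMS)   # ~28.3 V for 100 W into 8 ohms

for r_load in (8.0, 4.0, 0.1):   # 0.1 ohm stands in for a shorted output
    p_demanded = v_out ** 2 / r_load
    print(f"{r_load:4.1f} ohm load -> {p_demanded:7.1f} W at the same voltage")
# 8 ohm -> 100 W, 4 ohm -> 200 W, 0.1 ohm -> 8000 W
```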
 
So is it only Roadster 1.5 cars that are at risk, or are Roadster 2.0 cars acting weird when a Leaf is charging nearby?
Not clear on that one. The problem has been demonstrated with a 1.5 and an early LEAF. A later LEAF should not exhibit the problem. A 2.0 has, AFAIK, not been checked. My guess (and that's all it is) is that, due to the significant PEM differences, the 2.0 would be ok. But as I said before somewhere ... with a toy that expensive, I'd be paying extra attention while charging near an early LEAF. (Wish I could tell you the VIN difference between "early" and "later" for the LEAF ... but that info is currently not known (outside Nissan anyway). My very conservative guess is that a LEAF VIN after 2,000 is ok for sure ... but for CA & WA & AZ that "leaves" (!) a lot of "early" LEAFs to "worry about" ...)
 
I now have an early VIN Leaf (651 I think), which I bought used in addition to my 1.5 Roadster. I hear people talking about caution with respect to these cars charging, but other than the Tesla not completing its charge I haven't heard anyone say that the Leaf is actually damaging the Roadster. What are the current thoughts on this issue? I have now separated my charging routines for the two cars, as of tonight, just in case, but I have had the Leaf for a couple of months and have not experienced any issue with the cars charging. I can't say that I know how often they might have charged at the same time in the past since they both charge while I am sleeping. Anyway, I am curious if anyone has actually experienced any damage from the Leaf charging.
 
Some early LEAFs (mine included) seem to produce some kind of "line noise" when charging from 240V.
Apparently some "very squeaky wheels" managed to convince Nissan to replace their charger with something updated that has better noise filtering. When I was charging during daytime from 240V, I would hear a buzzing noise coming out of my solar inverter. The inverter recently failed and had to be replaced, but it is hard to know if the LEAF was really the cause... It might have been.
I heard anecdotes about Roadster 1.5 charging circuits being adversely affected too, but I don't know how severe.

So, there may well be a problem, but I doubt you'll get Nissan or Tesla to volunteer to fix things on their own.
There could be finger-pointing: is the Nissan charger too noisy, or is the Tesla charger too sensitive?