Welcome to Tesla Motors Club

400 volts vs 800 volts charging speeds question

I was wondering if someone could "make it make sense" for me in regards to 400 volt vs 800 volt battery charging speeds. I obviously understand that, all other things being equal, 800 volt batteries will charge faster than 400 volt batteries. However, my understanding is that Tesla compensates for their lower voltage by increasing the amperage output at Superchargers to achieve a similar kW output. For some reason, though, I keep seeing or hearing that 800 volt batteries "charge faster" than 400 volt batteries; the Hyundai Ioniq 5, for example, apparently holds the record for the fastest 10% to 80% SoC charging time. Is Hyundai simply sacrificing battery longevity for faster charging speeds whereas Tesla is not? Does this have anything to do with Ohm's law, i.e. less resistance when voltage is higher, meaning less power loss?

In my example below the kW output of the charger is the same 280 kW, so would they not charge in the exact same amount of time (all other things being equal)?

Hyundai Ioniq 5 - 800 volts x 350 amps = 280 kW
Tesla Model 3 - 400 volts x 700 amps = 280 kW
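The arithmetic in the question can be sanity-checked in a few lines of Python (the voltage and current figures are the illustrative round numbers from above, not measured values):

```python
# Power (kW) = volts * amps / 1000 -- the same-kW comparison from the question.
def power_kw(volts: float, amps: float) -> float:
    return volts * amps / 1000

ioniq5 = power_kw(800, 350)   # 280.0 kW
model3 = power_kw(400, 700)   # 280.0 kW
assert ioniq5 == model3 == 280.0
```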

Please note I understand there are different battery form factors, charging curves, other limiting factors, nominal vs max voltage, etc. I am simply interested in whether there is something about 800 volt architecture that makes it inherently charge faster than 400 volt architecture, i.e. something that can't be overcome with a 400 volt architecture.

The only articles I have found to date that get at an actual reason are below. It may have something to do with the heat generated at high amperage to achieve the same kW output...



 
You have that part right: for the same power delivered, the current in an 800V architecture is half what it would be in a 400V architecture.

But the reason that 800V batteries charge "faster" is simply because the charger itself is only capable of a given amperage, likely due to the specs of the components it's built from.

Let's take the example of an Electrify America 350kW charging station that is labeled that way because it supports 1000V / 350A.

A DC fast charger is going to start out charging at Constant Current (CC) (in this case 350A, because that's what the charging station is spec'ed for and can output). In this mode, the charging station is acting like a pump, pushing a fixed amount of current (the lesser of 350A or whatever the car is asking for) into the car. The voltage is going to rise along with the battery voltage (which increases slightly as the state of charge increases), but generally stays close to the car's battery voltage. So plug a 400V battery architecture vehicle into it, and you're going to be getting 140kW. Plug in an 800V battery, and you'll get 280kW. Since the voltage increases slightly as the battery fills, you actually get a slight increase in the power transfer during this phase.

At some point (when the car's BMS decides), it shifts from CC mode to Constant Voltage (CV). At this point, the charging station provides a fixed voltage (typically a bit above the "fully charged" voltage of the vehicle), and the battery just accepts whatever current naturally flows into it, which is a function of the difference between the CV voltage and the current battery voltage (we'll call that ∆V) and the resistance of the battery. During this phase, as the battery "fills" and its voltage increases, ∆V becomes smaller and the current drops (this is simple Ohm's law, I = ∆V/R: R is pretty much constant, but as ∆V drops, I drops along with it). Since V is constant and I is dropping, the power (P = I*V) drops. This is the tapering effect we all love.
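The CC/CV behavior described above can be sketched as a toy simulation. All the numbers below (open-circuit voltage curve, internal resistance, CV setpoint, capacity) are invented round figures, not specs for any real pack:

```python
# Toy CC/CV charging model for the taper described above. Assumes a linear
# open-circuit voltage curve and a fixed internal resistance -- both made up.
def simulate(i_cc=350.0, v_cv=810.0, r=0.1, v_empty=620.0, v_full=800.0,
             capacity_ah=100.0, dt_h=0.01):
    soc, history = 0.10, []
    while soc < 0.99:
        v_oc = v_empty + (v_full - v_empty) * soc  # pack voltage rises with SoC
        i_cv = (v_cv - v_oc) / r                   # current Ohm's law allows in CV mode
        i = min(i_cc, i_cv)                        # CC until the CV limit takes over
        soc += i * dt_h / capacity_ah
        history.append((round(soc, 3), round(i, 1)))
    return history

curve = simulate()
# Current holds at 350 A early on (CC), then tapers as the pack voltage
# approaches the CV setpoint.
```

The taper falls out of Ohm's law alone: once the current the CV setpoint allows drops below the CC limit, the current decays as ∆V shrinks.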

So getting back to why 800V charging is "faster". This is simply because a given charging station is spec'ed for a maximum current. Plug a 400V car into it, and it will only draw half as much power as an 800V car.

Now, could a Tesla Supercharger, which can provide 250kW at 400V (625A), output that much current at 800V? Well, no, it's not quite that easy. There are other components in the charging station that take AC from the grid and rectify it into DC, and then either boost that DC to whatever voltage is commanded by the car or, when in CC mode, pump out the requested current. The components that do these conversions are only spec'ed to a certain power level, so the station won't just arbitrarily pump twice the power into an 800V battery.

The reason that cars are shifting to 800V architectures does have to do with power loss due to resistive losses. Power lost to IR drop (current times resistance) can become significant at high values of I (current). And one way you can combat this is by going to higher voltages. Double the voltage, and you halve the current. Or, alternatively, if you can live with a given amount of current (say 350A) in your charging cable and connector, by doubling the voltage you operate at, you can get twice as much power through that component and only experience the same loss. And it's not just the actual lost power that we're worried about here, but rather the heat generated by that loss. You don't want to melt the connector by trying to shove 1000A of current through it!
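A quick sketch of the I²R point, assuming an invented 10 mΩ of cable resistance:

```python
# Resistive loss in a charging cable for the same delivered power at two pack
# voltages. The 10 milliohm cable resistance is an assumed round number.
def cable_loss_w(power_kw: float, volts: float, r_cable: float = 0.010) -> float:
    amps = power_kw * 1000 / volts
    return amps ** 2 * r_cable          # P_loss = I^2 * R

loss_400 = cable_loss_w(250, 400)   # 625 A   -> ~3906 W of heat in the cable
loss_800 = cable_loss_w(250, 800)   # 312.5 A -> ~977 W, a quarter of the loss
```

Halving the current cuts the loss by a factor of four, which is exactly why the same connector survives at 800V power levels that would cook it at 400V.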

Hope this helps (and is not too inaccurate).
 
@RTPEV thank you for the detailed response! To simplify it further for me as I am only slightly more informed than a layman:

As you noted above, a Tesla SuC gets 250 kW @ 400 volts @ 625 amps, would that not provide the exact same charging time as a Hyundai Ioniq 5 @ 800 volts @ 312.50 amps for example? Both outputs equal 250 kW.

I understand if the amps are constant that 800 volts win every time. What I don't understand is why a 400 volt charger cannot just increase amps to account for the lack of volts, which is what Tesla SuC do. The only way I see an 800 volt charger being actually faster than a 400 volt charger is when the 400 volt charger can no longer increase amps to compensate for the lack of volts - i.e. 800 volts @ 500 amps may be doable (400 kW) but 400 volts @ 1,000 amps (400 kW) may not be doable. Is that correct?
 
@RTPEV thank you for the detailed response! To simplify it further for me as I am only slightly more informed than a layman:

As you noted above, a Tesla SuC gets 250 kW @ 400 volts @ 625 amps, would that not provide the exact same charging time as a Hyundai Ioniq 5 @ 800 volts @ 312.50 amps for example? Both outputs equal 250 kW.
It all depends on the charging curves, and more specifically, when the BMS switches from CC to CV (which initiates the taper). Teslas do this at a relatively low SOC (I think in the low 20%'s). I don't know when the Ioniq5 switches to CV, but with the 800V architecture and the ability to get more power in at lower current, it may not switch until a much higher SOC.

Note that the observed taper point might be misleading. For example, if I charge my Tesla at a V3 Supercharger, it will start tapering at 20% (or so). But if I charge at a V2 Supercharger, it will taper at 53%. This is simply because the charger itself is not capable of providing the amount of current that the car is requesting, artificially (from the car's point of view) flattening the charge curve.
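That clipping effect can be expressed in one line: the observed power is the lesser of what the car asks for and what the cabinet can deliver. The current ratings below are assumed round numbers for illustration:

```python
# Observed charging power is limited by whichever side is weaker: the car's
# request or the cabinet's current rating.
def observed_kw(car_request_kw: float, pack_volts: float,
                charger_max_amps: float) -> float:
    charger_max_kw = pack_volts * charger_max_amps / 1000
    return min(car_request_kw, charger_max_kw)

lower_amp_cabinet = observed_kw(250, 400, 330)   # 132.0 kW: cabinet flattens the curve
higher_amp_cabinet = observed_kw(250, 400, 650)  # 250 kW: the full request gets through
```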

Here's the Ioniq 5's charge curve (not sure at what kind of charger, but the up-taper on the left indicates to me that the charging station is not limiting the car here):

[Image: Hyundai Ioniq 5 DC fast-charging power curve]


As you can see, it appears to maintain its CC until about 50%. And I'm not sure what's going on between 50-65% (the article speculates that maybe the battery got hot and the BMS lowered the current; after 65%, it starts to look more like a CV taper-down).

If the car is able to maintain CC at 225kW all the way to 50% (or higher), THAT's the reason it's charging faster, not so much because it's an 800V architecture.

Where the 800V architecture becomes important is that it's going to be easier for an Ioniq driver to find a high power charging station because high power charging stations are going to be easier to build at 800V than 400V (because their current-sensitive components only need to be spec'ed to half the current level).

I understand if the amps are constant that 800 volts win every time. What I don't understand is why a 400 volt charger cannot just increase amps to account for the lack of volts, which is what Tesla SuC do.
They can, but they become more expensive.

The only way I see an 800 volt charger being actually faster than a 400 volt charger is when the 400 volt charger can no longer increase amps to compensate for the lack of volts - i.e. 800 volts @ 500 amps may be doable (400 kW) but 400 volts @ 1,000 amps (400 kW) may not be doable. Is that correct?
It's not that it's not doable, but the components get more expensive. And extra cooling equipment may be required to draw off the lost heat. It's just easier and cheaper to go with an 800V architecture. Plus it is more efficient. However, 400V cars are essentially only going to get half the nameplate power from an 800V station.
 
What I don't understand is why a 400 volt charger cannot just increase amps to account for the lack of volts, which is what Tesla SuC do.
They can, at a cost, within the constraints of the physical infrastructure all the way from the charging equipment to the individual cells.

More current requires bigger conductors to manage resistance and heat. So everything from the charging station electronics, the charging cable, the charge port, the HV vehicle cabling, the bus bars in the battery, etc etc etc need to be sized up at significant cost, weight penalty, and so on.

Your underlying assumption is correct - power is power whether provided at high voltage and lower amps or vice-versa. ALL other things being equal the batteries would charge at the same rate.

But practically speaking, high voltage low current charging produces much less heat/resistance so there is less thermal management required, charging peaks may be able to be sustained for longer without exceeding thermal thresholds, and so on.
 
They can, at a cost, within the constraints of the physical infrastructure all the way from the charging equipment to the individual cells.

More current requires bigger conductors to manage resistance and heat. So everything from the charging station electronics, the charging cable, the charge port, the HV vehicle cabling, the bus bars in the battery, etc etc etc need to be sized up at significant cost, weight penalty, and so on.

Your underlying assumption is correct - power is power whether provided at high voltage and lower amps or vice-versa. ALL other things being equal the batteries would charge at the same rate.

But practically speaking, high voltage low current charging produces much less heat/resistance so there is less thermal management required, charging peaks may be able to be sustained for longer without exceeding thermal thresholds, and so on.
Thanks for the response. I think this answers my question. 800V architecture cars charge quicker because there is less heat generated at the same amount of power. More heat = worse charging curve = slower charging time. I'm guessing that's why our Teslas hit 250 kW quickly but don't stay there very long: that 250 kW is at ~625 amps, where 250 kW for an 800V architecture is under 350 amps (so less heat generated).
 
In my example below the kW output of the charger is the same 280 kW, so would they not charge in the exact same amount of time (all other things being equal)?

Hyundai Ioniq 5 - 800 volts x 350 amps = 280 kW
Tesla Model 3 - 400 volts x 700 amps = 280 kW
Your understanding of this is all basically correct. The kW level is energy delivered per second, so yes, 280 kW and 280 kW are putting energy in at the same rate, and they would charge equally fast. The reason people say 800V is faster rests on assumptions hiding in that "all other things being equal" clause. The thickness of the wires in a charging cable needs to scale with the current, so you generally wouldn't find the same cable used in the two examples above supporting either 350A or 700A. The assumption people are making is that the cable is some given size and can support some given current. So when they say 800V > 400V, they are assuming the currents are in roughly the same ballpark, limited by cable thickness, and so doubling the voltage also roughly doubles the kW of power, and therefore the speed of energy delivery.
 
Your understanding of this is all basically correct. The kW level is energy delivered per second, so yes, 280 kW and 280 kW are putting energy in at the same rate, and they would charge equally fast. The reason people say 800V is faster rests on assumptions hiding in that "all other things being equal" clause. The thickness of the wires in a charging cable needs to scale with the current, so you generally wouldn't find the same cable used in the two examples above supporting either 350A or 700A. The assumption people are making is that the cable is some given size and can support some given current. So when they say 800V > 400V, they are assuming the currents are in roughly the same ballpark, limited by cable thickness, and so doubling the voltage also roughly doubles the kW of power, and therefore the speed of energy delivery.
Agreed. It seems to me though that with the same kW output, 800 volt architecture cars still seem to charge quicker. That was really what I was confused about, but it appears that the excess heat generated from higher amperage causes the 400 volt cars to charge slower due to a worse taper effect.
 
Agreed. It seems to me though that with the same kW output, 800 volt architecture cars still seem to charge quicker. That was really what I was confused about, but it appears that the excess heat generated from higher amperage causes the 400 volt cars to charge slower due to a worse taper effect.

As others have said, the wires simply need to be thicker at lower voltage to offset I²R losses. Elon was actually asked about this at the 2Q2022 stockholders meeting. He basically said they would only save about $100/car by moving to 800V. Presumably this cost savings is due to less copper, but probably offset by more sophisticated insulation on those wires and increased costs for other high voltage components.

Also realize we are talking voltage at the battery pack level. The individual cells within the pack don't know how they are wired up. (I.e., a certain number of cells in a series string at 400V, and some number of those strings wired in parallel vs twice as many cells in series for 800V and half as many parallel strings.)
 
Yeah, I saw Mr. @jjrandorin's post earlier today, but was in the middle of diagnosing fried inductors 😁.

So, I'm going to attempt to summarize all the stuff that's come above and perhaps make it all a bit clearer, and, in the process, give a solid understanding of what's going on with all the Power Conversion Stuff.

So, let's start off on the correct foot: What we're attempting to do is move energy from City Power to the Battery. The rate at which energy gets moved from point A to point B is measured in Watts. (We got a lot of dead early researcher names floating around here.. Watt, Joule, Ohm, Ampere, etc.)

Just to make sure that we speak in clear language, I'm not going to play around with the words, "kilowatt hour" much. That's a weird one put together by a bunch of power companies back in the day. Let's stick with Joules. A Joule is a unit of energy, the symbol is "J". There's 3.6 MJ in a kW-hr.

A Watt is a rate, like gallons per minute. In fact, the definition of a Watt is, exactly, 1 J/s. That 100W lightbulb in the fixture to your left? Right, it's using 100J of energy each and every second. A lump of coal, when burnt, gives off 23.9 MJ per kg; power companies pay for coal, they burn the stuff, that boils the water, that turns the turbine, that spins the generator, that generates the energy. You all pay for the energy which is directly attributable to that lump of coal, and you pay your fair share to the electric company.

Fine. Great. So, the number of Watts flowing on electric wires is the rate of energy transfer from the power company to wherever it is we want it.

So, Mr. Ohm was interested in this flow of energy and spent a ridiculous amount of time characterizing these flows across different materials. Turns out, it wasn't easy figuring out it all, but, eventually, he came up with what we call Ohm's Law: Voltage = Current * Resistance in a circuit. Further, Power = Voltage * Current. (Note: These equations are great when one is playing around with electromagnetic fields in and around conductors, but it tends to break when energy gets transferred into the Ether. Maxwell's equations cover everything, including photons traveling through free space, but it's a lot simpler to stick with Mr. Ohm if one is playing with wires.)

Let's define current. An Ampere is defined as the movement of one Coulomb of charge per second through a conductor. A Coulomb is just a (very large) number of elementary charges: about 6.242 x 10^18 electrons' worth. (The 6.022 x 10^23 figure some of you remember from Middle School chemistry is Avogadro's number, a mole, and that's the one tied to atomic weights: the atomic weight of iron (Fe) is 55.85, so a mole of the stuff weighs 55.85 g.)

Let's talk about that resistance a bit. Let's say that we have a conductor. Further, let's talk about metal conductors. The point about a metal conductor is that it's an array of atoms that are embedded in a gas of free-moving electrons. At room temperature, silver turns out to be the material where the electrons in the conduction band flow the easiest, in that they're not coupled very much to the atomic nuclei in the conductor. However, nothing's perfect: When current is flowing in silver, the occasional electron is going to nail the silver atom (with all the electrons that aren't in the conduction band) dead on, at speed: This makes the silver atom bounce around and, well, that's temperature. Electrons change orbitals as a result, then head back to their original spots, giving off photons of infrared as they go; those photons get absorbed by other silver atoms, get radiated away, and so on: But run a big enough current through ye silver wire and it's going to get hot. Hot enough, and it melts. And/or vaporizes. And, in any case, converting electrical power to heat energy doesn't do a heck of a lot of good about depositing energy into a battery: It's a loss, and we like not having losses.

So, those of us who muck with conductors have a term, called the resistivity. Like many other fundamental terms, the EE's have co-opted a Greek letter for this term, rho. The units of this beast are in Ohm-meters. Why that? Because the resistance of a straight conductor is
R = rho * (length of the conductor)/(Cross-Sectional Area of the conductor).
Rho for copper, a decent conductor, is 16.8e-9 Ohm-meters. Rho for silver, a better but rarer (and more expensive) conductor, is 15.9e-9 Ohm-meters. Gold, which doesn't corrode but is really expensive, is about 24.4e-9 Ohm-meters. Everybody tends to use copper.

So, why high tension lines? Suppose that we're trying to move 250 kW from point A to point B. Let's pick a voltage, any voltage. Let's try.. 1 V!

Since P = V*I, the current to carry this power would be I = 250e3/1.0 = 250e3 Amps. Yeah. Suppose we want to carry this a kilometer, and we'd like to lose, say, 0.1% of the power being transferred to heat in the wire. The current is the same everywhere in the wire, and the voltage dropped across the wire is V = I*R (Ohm's law), so the loss in the wire is P = I * (I*R) = I^2*R. At 250 kA, losing only 0.1% means the maximum power we'd like to dissipate is 250 kW * 0.001 = 250W. So the resistance would have to be R = P/(I*I) = 250W/(250e3*250e3) = 4e-9 Ohms. Four nano-Ohms. Sure.... Well, R = rho*length/(area). We know rho, it's a kilometer long wire, so the area would have to be (in meters)
A = 16.8e-9 * 1000/(4e-9) = 4.2e3 square meters. If that thing were a square bus bar, that's 64 meters on a side. And a kilometer long. I think the accountants would have a problem with that.

So, let's go to the other extreme: Let's run at 100kV, and we'd still like 0.1% losses in that kilometer. First, I = P/V, so 250 kW/100kV = 2.5A. That's a lot better. Next, we're still losing 250W for 0.1%. But P = I*I*R, so R = 250/(2.5*2.5) = 40 Ohms. Finally, a copper wire a kilometer long with 40 Ohms across the length would have an area of A = 16.8e-9*1000/(40) = 4.2e-7 meters squared, or (if it were a square wire) 648e-6 m, which is 0.648 mm, on a side. That's roughly a 20 AWG wire. The accountants won't get as mad.
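The wire-sizing arithmetic above reduces to two formulas, P_loss = I²R and R = rho*L/A; a few lines of Python reproduce both endpoints:

```python
# Conductor cross-section needed to move a given power a given distance with a
# fixed fractional loss, per the 1 V vs 100 kV example above.
RHO_CU = 16.8e-9   # resistivity of copper, ohm-meters

def required_area_m2(power_w: float, volts: float,
                     length_m: float = 1000.0,
                     loss_fraction: float = 0.001) -> float:
    amps = power_w / volts
    r_max = power_w * loss_fraction / amps**2   # from P_loss = I^2 * R
    return RHO_CU * length_m / r_max            # from R = rho * L / A

a_1v = required_area_m2(250e3, 1.0)       # 4200 m^2: the absurd 64 m bus bar
a_100kv = required_area_m2(250e3, 100e3)  # 4.2e-7 m^2: a ~0.65 mm square wire
```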

So, why aren't we running 100 kV everywhere and knocking out losses?
  1. Corona discharge. Not mentioned in any of the above is the Electric Fields around those wires. It's typically measured in Volts/Meter. These fields terminate on the wire at one end and ground at the other; and they're strongest right at the surface of the wire. With a 20 GA wire, the electric field is strong enough to rip the electrons right off the molecules in the air. This makes for a really pretty light show and more losses. For that reason, one will notice that many high tension wires are physically thick, like an inch or two, precisely to cut back on this kind of discharge. The Really High Tension systems have, instead of one conductor, a quad of conductors that are, like, four or five inches to a face, that emulates an even larger physical wire which cuts down on the corona discharge even more.
  2. Solid insulators aren't that helpful either. A standard pane of window glass, in the presence of an E-field like that, will go, "Snap" and start conducting as the E-fields rip the electrons loose from SiO2. If you've ever looked at a high tension wire and the insulators that hang those wires to the towers.. those insulators are, physically, lllloooonnnnngggg, to reduce the volts per meter from one side of the glass insulator to the other.
  3. Safety. Not only will those high E-fields rip electrons loose from air molecules, they'd have no problems ripping electrons loose from you. Which is not healthy. There's Reasons why those power switch yards have big fences up with signs saying, "Go No Further".
Now that we got the basics cleared, let's go play with Teslas and other BEVs.

First, another factoid. These days, so long as one is below roughly 400V to 500V, there exist highly efficient DC-DC conversion boxes. These functional blocks take in a DC voltage and current; then, using high-voltage, high-current switching transistors, this current is put back and forth through a winding on a very efficient transformer. That's the primary of said transformer. On the other side of the transformer, the secondary, a voltage and current appears there, too; that can be synchronously rectified and filtered to make a second, output, DC voltage and current. By messing around with the switching transistors and the turns ratio on the transformer, one can take in pretty much any arbitrary voltage and current and make another voltage and current, at the same power as the input (minus, roughly 1%-5% due to losses in the transistors and transformer). We're talking 95% to 98% power efficiency here. So long as the switching frequencies are up above 150 kHz or so (and they are: The really advanced stuff gets up to 10 MHz), the physical component sizes are small and the parts are relatively cheap. And it'll all fit into a car, whereas those power transformers one sees on telephone poles and boxes on street corners won't.
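To first order, such a converter is just a power-balance box: output power equals input power times efficiency, at whatever output voltage the control loop commands. A minimal sketch, with illustrative numbers (the 97% efficiency is an assumed figure from the range quoted above):

```python
# Idealized DC-DC converter: conserve power (minus losses), and let the
# control loop pick the output voltage.
def dcdc_output_amps(v_in: float, i_in: float, v_out: float,
                     efficiency: float = 0.97) -> float:
    return v_in * i_in * efficiency / v_out

# 400 V / 250 A in, converted for a 350 V battery:
i_out = dcdc_output_amps(400, 250, 350)   # ~277 A: lower voltage, higher current
```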

Second, bonus factoid: The battery charger in a BEV is exactly one of these blocks. DC in from rectifiers hooked to 120VAC or 240 VAC, or from a Supercharger; DC to the battery, out. And the output voltage and/or current of the charger is varied by the battery control module.
(I've heard reports that, with Tesla, the Supercharger DC voltage is varied by commands from the car, rather than there being an actual charger in the car. Believable, but I haven't been able to find a decent block diagram yet to check. Doesn't change, really, what's to follow, though.)

So, why would 800V be better or worse than 400V, fundamentally? There's pluses and minuses.

800V is twice as high; therefore, the current is half as much at the input to the BEV.
  1. Pluses
    1. The current is half as much for the same amount of power, so, copper losses are less.
    2. Likewise, the current in the switching transistors is half as much, so less power dissipation in them, as well as the primary side windings in the transformers.
  2. Minuses
    1. Alrighty, then: 800V is a lot bigger voltage and one has to worry about the insulation. That's some thick insulation on the 400V wires; it has to be twice as thick for 800V, and that's space and money. Maybe enough to negate the use of less copper for 800V vs 400V.
    2. Switching transistors. So, I play, occasionally, with power transistors designed to handle 100V or so. The higher the voltage that a MOSFET transistor has to handle, the bigger its internal resistance gets to be. That causes additional losses over lower-voltage transistors used in 400V systems. Again, it's I*I*R, but we got half as much current.. But I don't know what R gets to when one bumps the voltage up to 800V. There are ways to keep the efficiency up, though:
      1. More transistors in parallel. ($$).
      2. Move to a different type of transistor. There are high-power, semiconductor switching transistors that are made out of Silicon Carbide. Really. ($$$).
      3. Live with the additional heat (Silicon Carbides are good for that kind of thing). But it's possible that one might need to apply some kind of cooling to those hot switching transistors - liquid, air blowing by, etc. That might not be hard to come by, given the liquid cooling found in every BEV except, apparently, the Leaf. But it'll be ($$), or even ($$$$).
    3. Safety. Yeah, it's all inside the car and sealed away. And 400V is lethal anyway. But 800V is certainly more lethal. Not that any hot-rodder should be messing around near those big wires in any case. But.. car crashes? Is some safety expert going to worry about voltages like those in an electric switching yard (which never undergoes a rear-ender) riding around in cars that, inevitably, will?
So, I dunno. I've seen the comments above about how the current can be modified with the higher voltage in strange ways that results in better battery charging profiles. I.. don't think I agree. I think the main reason for going to the higher voltage is to cut back on the losses and, eventually, the cost of building the car by thinning out the copper and reducing the power consumption needs in the transistors and motor.
 
Great post, I agree with nearly everything.

Except one question: is the thickness of the insulation on typical low voltage (in power industry 800V is 'low voltage') cables actually at the minimum necessary for electric fields like insulation thickness/depth is for true HV equipment? I thought that the insulation was well over-thick but was there mostly for abrasion resistance and mechanical properties and safety.
 
Second, bonus factoid: The battery charger in a BEV is exactly one of these blocks. DC in from rectifiers hooked to 120VAC or 240 VAC, or from a Supercharger; DC to the battery, out. And the output voltage and/or current of the charger is varied by the battery control module.
(I've heard reports that, with Tesla, the Supercharger DC voltage is varied by commands from the car, rather than there being an actual charger in the car. Believable, but I haven't been able to find a decent block diagram yet to check. Doesn't change, really, what's to follow, though.)
This is true.

The battery charger in the car is what takes AC from the charge port (when doing L1 or L2 AC charging), rectifies it to DC, and provides (via a DC/DC converter) the BMS-commanded current or voltage to charge the battery. The built-in charger can only handle a limited amount of power, from 3.3kW for early LEAFs & Volts to around 12kW in modern EVs.

When DC fast charging, this internal charger is completely bypassed. Essentially the charger is inside the Supercharger cabinet, and yes, the BMS in the car directs the now-external charger with current/voltage requests just as it does the internal one, except the commands are now being received by the Supercharger.

And here is where Tesla had an enormous advantage in the early days (that still lingers to today): for the first iterations of the Supercharger, all they did was take 12 of the charger modules out of the Model S and stuck them into a Supercharger cabinet in parallel.

[Image: a rare look inside a Tesla Supercharger cabinet]


Obviously there is a bit more going on, but think about it: they had these modules that were already designed and in volume production, and bam, they use those in their Supercharger! Other DC fast charger suppliers had to design their power supplies from scratch, and those get manufactured in relatively low volume. This is how Tesla was able to get higher power Superchargers out there faster and FAR more cheaply than the competition. And they were more or less proven. I think that is a highly underappreciated aspect of the whole Supercharger story.

One other minor note: while it is correct that there are isolation transformers in the charger circuits, the actual voltage conversion is not done by a fixed transformer turns ratio alone (that would not allow for the variable current/voltage required of a battery charger circuit). Instead, the switched-mode power supply circuit referred to uses a capacitor, diode, and inductor, in addition to the switch (some kind of transistor, be it MOSFET, IGBT, GaN or SiC), to control the output voltage.
 
Agreed. It seems to me though that with the same kW output, 800 volt architecture cars still seem to charge quicker. That was really what I was confused about, but it appears that the excess heat generated from higher amperage causes the 400 volt cars to charge slower due to a worse taper effect.
I think you are a bit too focused on the heat. Heat is just a waste byproduct of what is actually going on here, and the taper effect is not caused by heat, but rather by the battery engineer who decides that for the given battery chemistry and architecture inside the pack, that the BMS will switch from CC to CV at point X (and the tapering happens when the charger is in CV mode).

Looking at the charging curves for the Tesla vs. the Hyundai, we see that Tesla switches over to CV at about 20% SOC, while the Hyundai goes to about 50% before it does this.

There are a number of reasons why this might be the case, and yes, one of those reasons (and maybe the key reason) is the heat generated by shoving 625A through a series chain of about 100 battery cells (each Li-ion cell sits at around 4V, to use a round number) vs. shoving only about 300A through a series chain of 200 cells. Of course the heat could be mitigated by a larger or more effective cooling system. There could be an advantage to the cylindrical form factor of Tesla vs. the pouch form factor of Hyundai here, for example. Engineering usually comes down to tradeoff decisions like this.
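As a toy comparison of that pack-heating point, naively assume the same resistance per series group in both layouts (the 0.2 mΩ per-group figure is invented, and in reality rewiring the same cells changes the per-group resistance too):

```python
# I^2 * R heating in the series chain: doubling the series count doubles the
# pack resistance, but halving the current quarters I^2, so net heat halves.
def pack_heat_w(amps: float, series_groups: int,
                r_group: float = 0.0002) -> float:
    r_pack = series_groups * r_group   # series resistances add
    return amps ** 2 * r_pack

heat_400v = pack_heat_w(625.0, 100)    # ~7.8 kW of heat in the chain
heat_800v = pack_heat_w(312.5, 200)    # ~3.9 kW: half the heat, same power in
```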

But there could be other factors at play as well. There may be other components in the system (of the Tesla) that are unable to handle a sustained 625A current for more than a few minutes (doubtful, but possible). Or Hyundai's battery suppliers could have a superior chemistry, or even better cooling, that allows for sustained high power operation.

It's a good guess that reduced stress on the battery (and heat generated) is probably the main driver of why the Hyundai is able to charge at CC for longer. But probably the only ones who know for sure are the battery engineers at each company who designed the charging profile and decided when the BMS should switch from CC to CV (and, as the Hyundai's charging curve suggests, what the current in CC mode should be, based on environmental factors such as temperature).
 
Great post, I agree with nearly everything.

Except one question: is the thickness of the insulation on typical low-voltage cables (in the power industry, 800V still counts as 'low voltage') actually at the minimum necessary for the electric fields involved, the way insulation thickness is for true HV equipment? I thought the insulation was well over-thick and was there mostly for abrasion resistance, mechanical properties, and safety.
Look, I play with wires, the kind used for hook-up and testing, all the time. The power cord wire that, in the U.S., goes to your bog-standard lamp or toaster is typically rated for 600V pk.

Think about that for a minute. The wall-socket voltage in the U.S. is, say, 125V RMS. That's the root-mean-square voltage; the peak voltage, plus or minus from "ground" (ha!), is +-125*sqrt(2), or about +-177Vpk. That implies that the bog-standard lamp cord wire is overengineered by a factor of about 340%.
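The arithmetic behind that 340% figure, for anyone who wants to check it:

```python
import math

# Back-of-envelope check: U.S. wall power at 125 V RMS vs. lamp cord
# insulation rated for 600 V peak.
v_rms = 125.0
v_peak = v_rms * math.sqrt(2)  # peak of the sine wave, ~177 V
margin = 600.0 / v_peak        # rated headroom, ~3.4x

print(f"peak voltage: {v_peak:.1f} V, insulation margin: {margin:.1f}x")
```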

But is it, really? OK: I'm not a power engineer. But I do play with good old Lightning Surge. (You should see the test equipment: it's impressive. The ESD gun I sometimes use is kind of old and obsolete, but it looks like it's straight out of Star Wars. A 15 kV zap, though, only jumps about a quarter of an inch.) It's not at all unusual for transient voltages from a lightning strike down the street, or switching transients from the power yard, to put 1 kV pulses on the power feeds here, there, and everywhere. In fact, pretty much anything that gets hooked up to City Power has to be designed to handle these kinds of transients without lethally failing in order to get that UL/CSA mark. And when I say "anything", I mean anything: your HVAC system, the TV set, your toaster, the oven, the fridge, the PC and its wall wart, the charger for your phone, and it goes on and on.

A few years ago some fellow, I think at HP, decided to take a good close look at Apple-brand chargers vs. the Chinese knock-offs and posted pictures. The Apple-branded stuff had, relatively speaking, Big Gaps on the circuit board between City Power and the charger's DC output, good-quality components that could handle a surge for real, and it really did meet those UL/CSA/EU safety standards. All the $BS stuff sold on Amazon that this guy looked at? Not so much: no gaps, poor-quality components, and the possibility of Halt And Catch Fire For Real While You're Sleeping In The Next Room Over. And exactly who are you going to sue, assuming you survive? For sure, Amazon isn't checking whether these guys meet standards. But the chargers one tends to find in, say, Staples, Walmart, or Best Buy likely do. All three of those places are brick and mortar, process servers can find the corporate offices, and the lawyers at those places know it.

So, overdesigned by 340%, right? Wires get rubbed against stuff and abrade. The vacuum cleaner runs over the lamp cord. Pets Exist. So Do Toddlers. Plastic gets old and cracks. Cars get hit by things, and we'd prefer that a fender-bender doesn't result in the people in the car getting immolated. Lightning hits the pole outside a Supercharger, and we'd prefer that neither the Supercharger, the cars, or the people in and around those cars catch on fire.

Want a really fun one? Say that lightning strikes a perfectly innocent, properly built telephone pole with a ground wire at the top and a wire running straight down the pole into a ground stake plunged six feet into the ground. Great: a very sharp rise-time pulse of tens of thousands of amps goes straight into the ground. Everything's safe, right? Ha, ha.

Let's say that the Supercharger cabinet is right next to the telephone pole and has its own ground stake plunged into the ground. Further, let's say that there's two or three ground stakes down the row of Supercharger stalls, with one at the end some 200' away from the cabinet.

Now, we've just dumped tens of thousands of amps and a serious number of coulombs of electrons into the ground at the base of that telephone pole. Quick bit of news: the earth is not a superconductor. The voltage around the pole is going to act like a pond with a rock thrown into it. There's going to be a peak spike of voltage at the ground stake, maybe a few thousand volts, that will propagate away circularly, at Ye Speed Of Light, in all directions away from the stake. It's gonna take time for that surge to (a) reach the ground stake at the end of that row of stalls and (b) stop having such a sharp rise time as the energy dissipates out and away and gets absorbed by the resistivity of the Earth. In the meantime, the ground stakes in this system will all be at different voltages for tens of nanoseconds to microseconds. And we're talking hundreds of volts.
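For a feel for the magnitudes, here is an idealized "ground potential rise" sketch for that scenario: treat the struck ground stake as a hemispherical electrode in uniform soil, so the potential at distance r is V(r) = rho * I / (2 * pi * r). This is a steady-state approximation that ignores the transient wave behavior described above, and both the soil resistivity and the stroke current are assumed round numbers, not measurements.

```python
import math

RHO = 100.0      # soil resistivity in ohm-meters (assumed round number)
I_STROKE = 30e3  # lightning stroke peak current in amps (typical order)

def ground_potential(r_m, rho=RHO, current=I_STROKE):
    """Potential at distance r_m from a hemispherical ground electrode."""
    return rho * current / (2 * math.pi * r_m)

# Potential difference between a stake 3 m from the pole and one ~60 m
# (about 200 ft) away, i.e. what the "grounds" at two stalls could see:
delta_v = ground_potential(3.0) - ground_potential(60.0)
print(f"potential difference between stakes: {delta_v / 1000:.0f} kV")
```

Even with these hand-wavy inputs, the two "grounds" momentarily sit at wildly different potentials, which is the whole point of the pond-ripple analogy.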

Ya think the 400 VDC going into the car, with respect to ground, is going to stay at 400V in all of this? Ya feeling lucky, kid?

People at power switching yards have MOVs (metal-oxide varistors, designed to start conducting at some voltage, with the intent of clamping surges) that are the size of hockey pucks or larger and can absorb hundreds of joules of energy without going ka-boom. The bitty ones I play with are good for four or five joules; typical in-building lightning surge energies (people have actually measured this stuff) are a quarter of that.

But, in the presence of the possibility of surges of all flavors I would say: "You betcha, a 3.4X increase in the amount of insulation sounds like a right good idea."

Lightning surges are bad, but they're especially bad when there's the possibility of a direct lightning strike to something. I don't work on stuff that's installed on street corners, but there's this old-time Telcordia standard that I was reading in detail for various reasons, GR-1244. There's a test for street-corner telephony boxes. One opens up the box, drapes a certain grade of cheesecloth (yes, cheesecloth, the kind one makes cheese with) across the equipment, closes the box, zaps it with a serious lightning surge, and, I quote:

"The equipment under test shall not become a fire or fragmentation hazard as a result of this test."

I suspect that there's similar requirements for BEVs and the equipment used to charge them. Not for the faint of heart.

(As it happens, I've never had to run that particular test. But I've had co-workers who've been through this.. One reports that, at one time, their spanking newly designed equipment didn't pass on the first try. Whee.)
 
  • Informative
Reactions: house9
There's good information in this thread; however, while it pretty much appears correct (I haven't read carefully through everything), it gets a bit lost in the details.
Here's a more simple approach to looking at the benefits between providing high power (P=I*V) by increasing either I (current) or V (voltage):
- increasing current requires thicker wires and/or the ability to handle more heat caused by more current
- increasing voltage requires more insulation in the wires in order to handle the danger of having very high voltages running around
Tesla, probably for many of the reasons already discussed, limited their systems to 400 volts. Therefore, they have to liquid-cool the hot charging cables so the conductors can stay pliable enough for folks to handle.
Porsche launched 800v charging, mostly as a lame way to "one-up" Tesla on paper. The jury is still out, IMHO, on whether the additional hazards of putting such high voltages in the hands of the general populace will be acceptable. It will only take one cheap company using cheap insulation, or failing to inspect and replace it as it ages, to cause a big disaster. Given the poor design and maintenance exhibited by pretty much all non-Tesla DCFC manufacturers and operators, I don't have high confidence.
 
But practically speaking, high-voltage, low-current charging produces much less resistive heating, so less thermal management is required, charging peaks may be sustainable for longer without exceeding thermal thresholds, and so on.

These discussions have to be clear which component is being talked about. So let's try to narrow it down a bit: Can an 800v architecture sustain a higher C rate for the same pack capacity than a 400v architecture due to less heat accumulation in the pack ?
 
  • Like
Reactions: Jeremy3292
These discussions have to be clear which component is being talked about. So let's try to narrow it down a bit: Can an 800v architecture sustain a higher C rate for the same pack capacity than a 400v architecture due to less heat accumulation in the pack ?

Thanks - this was essentially the question I was getting at but maybe didn't word it properly

Maybe add the word "longer" to make it more accurate:

Can an 800v architecture sustain a higher C rate longer for the same pack capacity than a 400v architecture due to less heat accumulation in the pack ?

Again, the heat accumulation can be mitigated if you are willing to throw enough cooling at the problem. But in the interest of keeping things as apples to apples as possible, let's toss out differences like upgraded cooling systems, thicker conductors and beefier components that would need to be added to the 400V system to allow it to compete with the 800V system. Strip it right down to the cells themselves.

To keep it simple, let's assume 200 cells in both systems. In the 400V case, you have 2 parallel chains of 100 cells in series, and in the 800V case you have a single serial chain of 200 cells. And let's assume you are trying to get 200kW into the battery. This implies 500A in the 400V case, and 250A in the 800V case.

However, in the 400V case the current splits between the two parallel chains, resulting in 250A through each chain.

In other words, it's basically equivalent to the 800V case.
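The equivalence argument above can be checked with a few lines of arithmetic. The cell count, cell voltage, and charge power are the illustrative round numbers from the post, not real pack specs:

```python
# Apples-to-apples comparison: 200 cells arranged either as 2 parallel
# strings of 100 in series (~400 V) or 1 string of 200 in series (~800 V),
# both accepting 200 kW.
POWER_W = 200_000
CELL_V = 4.0  # round-number Li-ion cell voltage

def per_cell_current(cells_in_series, parallel_strings):
    """Current through each individual cell for a given pack layout."""
    pack_voltage = cells_in_series * CELL_V
    pack_current = POWER_W / pack_voltage      # total current into the pack
    return pack_current / parallel_strings     # splits across the strings

i_400 = per_cell_current(cells_in_series=100, parallel_strings=2)  # 500 A / 2
i_800 = per_cell_current(cells_in_series=200, parallel_strings=1)  # 250 A / 1

print(f"per-cell current, 400 V layout: {i_400:.0f} A")
print(f"per-cell current, 800 V layout: {i_800:.0f} A")
```

Both layouts put the same current through each individual cell, which is why, at the cell level, neither architecture is inherently faster.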

So no, I don't think there is any inherent advantage to the 800V architecture in that respect. But as you start actually assembling these things into real-world packs (cramming the cells next to each other, adding the cooling system, conductors, relays, and onboard charger elements; even though the onboard charger is bypassed during DC fast charging, it still must produce 800V to charge the battery), the tradeoffs start to emerge. And maybe the 800V system plays a few additional tricks (as appears to be the case in the Hyundai) to actively manage heat accumulation during the CC phase, so it can maintain that peak charge rate longer.

I know you would like a simple answer to the question, but I'm not sure one can be provided. On the whole, yes, I believe there is an advantage to the 800V system, but it's hard to pinpoint the reason down to a single factor.