There was a massive thread with a lot of arguing about range and capacity calculations that I don't want to resurrect here. I'm deliberately not linking it so it doesn't get thread-necro'd! It has been dead for a few years now...
The information below was from page 57 of that thread and spoke to the process the BMS/BMBs use to balance bricks inside our packs. I'm curious if there have been any updates in the last few years about how the BMS balances packs in the S/X using the bleed resistors? What is the current understanding and best practices? wk057 posted this in that thread on 4/21/19:
Worth an update here.
Tesla has changed the balancing algorithm many many times over the years. Originally this was a very dumb setup that would only kick in once a cell group reached a threshold voltage, usually around 90-93% SoC. This is no longer the case.
First, let me point out that Tesla's BMS software has come a LONG way... I'd consider it a work of art now. Lots of genius in there. It's absolutely amazing and full kudos where kudos are due here.
One thing they're now able to do is to calculate out the capacity of individual bricks of cells (96 in the 85/90/100, 84 in the rest) based on a ton of factors and compute this in near real time, in a full range of conditions, with almost magical accuracy. They're basically running physics simulations (similar to how they calculate out unmeasurable metrics in the inverter firmware, like rotor temperature) of the entire pack based on measured power usage/charge, balancer usage, temperature, temperature delta based on coolant flow and coolant temp, predicting temperature gradients, and probably 100 more variables. This is the holy grail of proper balancing for safety and longevity for a battery pack. This is not a dumb system anymore by any means. Knowing the actual capacity of the individual bricks allows them to know exactly which ones need cell bleeders enabled, and for exactly how long. With this data, they can balance on the fly at any SoC, and just use top and bottom SoC windows for fine tuning, validation, and calibration.
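The core idea in that paragraph is that once the BMS has an accurate charge/capacity estimate for each brick, deciding which bleed resistors to enable and for how long becomes simple arithmetic. Here is a toy sketch of just that math (my own illustration; the resistor and voltage values are made-up assumptions, and Tesla's real model is obviously far more involved):

```python
# Minimal sketch of passive (bleed-resistor) balancing arithmetic.
# Resistance and voltage are invented values, not Tesla's parameters.

def plan_bleed_times(brick_charge_ah, bleed_r_ohm=33.0, brick_v=3.8):
    """Return per-brick bleed durations (hours) that would pull each
    brick's charge down to match the least-charged brick."""
    target = min(brick_charge_ah)             # bleeding can only remove charge
    bleed_current_a = brick_v / bleed_r_ohm   # I = V / R while the bleeder is on
    return [(q - target) / bleed_current_a for q in brick_charge_ah]

# Example: brick 1 holds 1.5 Ah more than brick 0, brick 2 holds 0.4 Ah more
print(plan_bleed_times([230.0, 231.5, 230.4]))
```

At ~0.115 A of bleed current, even a 1.5 Ah excess takes on the order of 13 hours to burn off, which is why knowing the needed durations ahead of time (rather than waiting for a voltage threshold at the top of charge) matters so much.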
The car balances all the time, whenever it's needed. It knows when a cell group will need balancing before it's even out of balance... which is really freaking weird when you think about it, especially if you're watching a playback of the pack balancing and voltages and see it engage a balancer on a cell group that doesn't look out of balance at all, then watch it still fall completely in line at the end of a charge or discharge cycle. It keeps track of which groups will need it, which won't, how long they'll need it, how much they've been balanced, etc.
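One way to picture that predictive behavior: the BMS can project each brick's SoC forward through a planned charge using its per-brick capacity estimate, and flag bricks that would run ahead of the pack before any voltage imbalance is visible. A toy illustration with invented numbers (my sketch, not Tesla's algorithm):

```python
# Toy predictive-balancing check: project each brick's SoC to the end of a
# planned charge using its estimated capacity, and flag bricks that would
# finish noticeably above the lowest brick. All values are invented.

def bricks_to_bleed(capacity_ah, soc, add_ah, margin=0.002):
    """Return indices of bricks predicted to end the charge more than
    `margin` SoC above the lowest brick (candidates for early bleeding)."""
    predicted = [s + add_ah / c for c, s in zip(capacity_ah, soc)]
    floor = min(predicted)
    return [i for i, p in enumerate(predicted) if p - floor > margin]

# Brick 1 has slightly less capacity, so the same added charge raises its
# SoC faster; it gets flagged even though all bricks start identical:
print(bricks_to_bleed([232.0, 228.0, 231.0], [0.5, 0.5, 0.5], 100.0))  # [1]
```

The flagged brick looks perfectly balanced at the moment its bleeder engages, which matches the "engage a balancer on a cell group that doesn't look out of balance at all" observation above.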
It really is an epic setup now.
The short answer to the balancing question: It balances any time it needs to balance.
As for SoC shenanigans, yes getting closer to 100% or 0% will give it a chance to tune things better... but it's not needed anymore. Just charge like you need to, and drive.