How does a battery gauge work???

daniel

My Roadster tells me, apparently with decent accuracy, how much range I have left: both ideal range, based on a steady 55 mph with no heater or A/C, and estimated range if I continue to drive the way I've been driving for the last 30 miles.

How does it know the actual number of kWh remaining in the pack?

Presumably it's not just measuring the Wh used and subtracting them from an ideal full pack, because the gauge seems to keep up even as the pack degrades with age.

Does the pack voltage tell you how many kWh remain in the pack, even as the pack degrades with age? That would surprise me, given the level of precision it would need to detect the difference in pack voltage as remaining range declines by just a few miles. If not, what else can it measure to calculate this?
 
What they used to do was simply measure the voltage of the battery pack, but this is unreliable because it varies long term with things like age and even short term with things like temperature (among many others). Most modern batteries owe their accuracy to what is called coulomb counting; if you have seen the new smart batteries (like the ones in your computer and cell phone), then you have seen this form of charge measurement. Essentially, an onboard chip sits between the actual battery and the charger and 'counts' all of the charge (amp-hours) flowing into and out of the battery. This allows cool innovations like the battery being able to measure its own age (some of the IBM computers will estimate degradation for you [I would be ultra surprised if the Model S didn't have this feature tucked away somewhere]). It also allows the battery itself to automatically reduce incoming power as it approaches full charge (which is why battery charging has become so simple lately for the end user).
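If it helps to see the idea, here is a minimal coulomb-counting sketch in Python (purely illustrative, not any vendor's firmware): the meter simply integrates current over time and never looks at voltage.

Code:
# Illustrative coulomb counter: track charge by integrating current over time.
class CoulombCounter:
    def __init__(self, capacity_ah, soc=1.0):
        self.capacity_ah = capacity_ah          # assumed usable capacity of the pack
        self.charge_ah = soc * capacity_ah

    def update(self, current_a, dt_s):
        """current_a > 0 while charging, < 0 while discharging."""
        self.charge_ah += current_a * dt_s / 3600.0
        self.charge_ah = max(0.0, min(self.charge_ah, self.capacity_ah))   # clamp to physical range

    @property
    def soc(self):
        return self.charge_ah / self.capacity_ah

# Example: a 160 Ah pack at 90% discharged at 100 A for one hour
meter = CoulombCounter(capacity_ah=160.0, soc=0.9)
for _ in range(3600):
    meter.update(current_a=-100.0, dt_s=1.0)
print(round(meter.soc, 3))   # ~0.275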

In situations where it matters, no one metric is used by itself. For example, in a Coulomb counting setup, you can charge a battery to full and shelve it for some time while it loses charge (unbeknownst to the meter). You can then have fun watching how confused the battery meter gets when it thinks it has 30 or so percent left and is really almost at zero. It then only takes a full charge cycle for the meter to get its bearings again.

However, this can be mitigated if the voltage is monitored as well.
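For example (my own sketch, with a made-up voltage-to-SoC table; a real one is chemistry- and temperature-specific): once the pack has rested, the open-circuit voltage can be mapped back to a state of charge and used to re-seat the drifted counter.

Code:
# Illustrative only: re-seat a drifting coulomb counter from resting voltage.
OCV_TABLE = [(3.0, 0.00), (3.3, 0.10), (3.5, 0.30), (3.7, 0.60), (3.9, 0.80), (4.2, 1.00)]

def soc_from_ocv(cell_volts):
    """Linear interpolation of SoC from a rested cell's open-circuit voltage."""
    if cell_volts <= OCV_TABLE[0][0]:
        return 0.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if cell_volts <= v1:
            return s0 + (s1 - s0) * (cell_volts - v0) / (v1 - v0)
    return 1.0

# After a long rest the meter can trust this more than its drifted count:
print(round(soc_from_ocv(3.62), 2))   # 0.48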

Hope this is somewhat helpful.

-Cameron
 
Thanks, Cameron.

The Porsche originally had an Ah meter. It told me how many Ah I'd used, or if I preferred, I could have it count backwards. The problem was that I had to assume the pack actually held its nominal 24 (?) kWh. (I forget now how many kWh the pack was supposed to have.) If the pack actually held less, my remaining-range estimate would be off.

Let's say that a pack has degraded to 50% of its original capacity. Continuous Coulomb counting wouldn't reveal that unless you ran the pack to dead empty so that it could count the Coulombs going back in. There's just no way it could be accurate enough over a period of several years. Plus it has to account for losses, which I'm sure cannot be predicted precisely. Would this pack not charge to full nominal voltage? Could they tell, just from the fully-charged voltage, the degree of degradation? We assume that the ideal-range-remaining meter knows how much energy is in a fully-charged battery pack.

Of course I'm not asking about Tesla's algorithm. I'm just asking what they are measuring. I can think of nothing but voltage and Coulomb counting, and those two seem insufficient to detect battery degradation over time, even though they'd be adequate when the pack is relatively new and holds the amount of energy it was designed for.

And FWIW, Gordy took out the Ah meter from the Porsche, as the new speed controller has a built-in SoC function, along with a lot of other fancy stuff. There's also a BMS now (not sure if that's part of the controller) that sounds an alarm if any cell in the pack gets too low.
 
Temperature is the other basic variable to measure, but the real scope for being clever is in combining these variables: you can measure the rate of change of the voltage with respect to the charge or discharge current, both short-term (which gives the internal resistance of the pack) and longer-term (which tells you how much the state of charge varies with the charge you take out of it). Combine that with the fact that you've got individual voltage measurements on every cell (even with fairly crude battery controllers), and you have more data than you know what to do with.
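As a rough illustration of the short-term measurement (my own sketch, invented numbers): a step change in load gives a voltage sag, and the ratio of the two changes is the pack's internal resistance.

Code:
# Illustrative: estimate pack internal resistance from a step change in load.
v_before, i_before = 390.0, 20.0    # volts, amps (light load)
v_after,  i_after  = 378.0, 220.0   # volts, amps (hard acceleration)

r_internal = (v_before - v_after) / (i_after - i_before)   # ohms
print(f"estimated internal resistance: {r_internal * 1000:.0f} mOhm")   # 60 mOhm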

At a crude level, voltage directly gives you SoC, but it requires correcting for various factors and you can't measure it accurately enough to give you the precision you want on your range meter. Coulomb counting will let you interpolate between SoC points (i.e. make your meter count down slowly). Long-term measurements let you correct the pack capacity (SoC declined from 50% to 40%, during which time Coulomb counting tells you 10 Ah were extracted, so the pack capacity is probably 100 Ah). Temperature measurements need to be factored in to remove inaccuracies in these measurements. And there's another source of error in that you are typically making your measurements under conditions of highly variable load.
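The long-term capacity correction arg describes is essentially one division (an illustrative sketch of his 50% -> 40%, 10 Ah example):

Code:
# Re-estimate pack capacity from two rested SoC readings and the charge counted between them.
soc_start, soc_end = 0.50, 0.40   # from resting voltage, temperature-corrected
ah_counted = 10.0                 # Coulomb counter total between the two readings

capacity_ah = ah_counted / (soc_start - soc_end)
print(round(capacity_ah, 1))      # 100.0 -> the pack capacity is probably about 100 Ah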

So, in theory it's easy, in practice there are sufficient sources of error that it's hard.

[I happen to be writing some battery management software right now - but my pack has only 8 cells and an accuracy of 10% on my SoC figure will be fine. So I'm not in the Tesla league].
 
A couple things I could touch on here:
-> The primary driver of degradation is cycling. A battery can (in theory) sit on a shelf and lose charge without 'aging' (degrading) the battery.
-> It will still charge to nominal voltage - as a battery approaches full charge, its internal resistance changes (which is why charging has to slow down). The controllers in 'smart batteries' look for this resistance change and will still charge to max regardless of how soon (or late) it approaches max.
-> Losses can be predicted by comparing how many Ah it takes to reach max charge versus how many it should take (see the sketch just after this list)
-> The fully-charged voltage will be the same in a degraded battery as in a new one; a degraded one simply delivers fewer Ah over its discharge profile. If this sounds counter-intuitive, then you have stumbled across one of the many reasons voltage alone is an unreliable measuring tool (good for SOC but near useless for total capacity).
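A crude version of that loss prediction could look like this (my own sketch with invented numbers, not Tesla's method): from the meter's SoC you know how many Ah the pack 'should' need to reach full, and whatever extra the counter actually sees is charging overhead (or a hint that the meter or the assumed capacity is off).

Code:
capacity_ah   = 160.0
soc_at_plugin = 0.40
ah_expected   = (1.0 - soc_at_plugin) * capacity_ah   # 96 Ah in an ideal, lossless world
ah_measured   = 101.5                                 # what the Coulomb counter actually saw

loss_ah = ah_measured - ah_expected
print(f"charging overhead: {loss_ah:.1f} Ah ({loss_ah / ah_measured:.1%})")   # 5.5 Ah (5.4%)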

Another fun bit I could point out is that it is typically a technical impossibility to charge a battery to the absolute max, for a number of reasons. For starters, it becomes harder and harder to deliver energy to a battery nearing full charge (re: the internal resistance change). Furthermore, the energy you deliver to the battery is actually manifested in the battery - the electrolyte becomes more volatile, etc. In theory you could force a battery to charge linearly by increasing the voltage to match the internal resistance, but in reality nasty things happen to the electrolyte (think boil-off/overheating/burning).
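A toy constant-current/constant-voltage simulation makes the 'harder and harder near the top' point concrete (my own gross simplification with made-up numbers; real charge control is far more involved):

Code:
# Toy CC/CV charge: full current until the cell hits its voltage limit,
# then hold the voltage and let the current taper until it is negligible.
v_limit, r_internal   = 4.2, 0.05    # volts, ohms
current, taper_cutoff = 2.0, 0.05    # amps
v_open = 3.6                         # open-circuit voltage, rises as charge goes in

while True:
    if v_open + current * r_internal >= v_limit:
        current = (v_limit - v_open) / r_internal   # CV phase: current tapers off
        if current < taper_cutoff:
            break                                   # declare 'full' well before the true max
    v_open += 0.0005 * current                      # crude stand-in for charge raising the OCV

print(f"charge terminated at OCV {v_open:.3f} V, taper current {current:.3f} A")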

All of this conspires to make sure that a battery is never quite fully charged. :p

-Cameron
 
Thanks to both arg and stAtrill for that info. This is a fascinating subject for me. The complexity you both mention perhaps explains why some cars (e.g. Leaf) get it so wrong while others (e.g. Roadster) seem to do such a good job.

If anyone else has anything to add, I'm always interested.
 
Both my LEAF and Ford RangerEV will make "adjustments" to the range gauge after the vehicle sits for a while (in the powered-on state).

When you are driving they tend to measure current output from the pack and make an "educated guess" as to how much is probably still left.
Measuring voltage under load (while driving) doesn't work for judging pack fullness, since the load drags the voltage down, so they have to try to track energy use instead.
When the vehicle sits for a while powered on, but with no drive-train load, it can do a "recalc" based on a consistent voltage reading.

With the RangerEV, I noticed that if I let it sit for a long time (many days or weeks), when I turn it on the DTE (Distance To Empty) shows what it last showed when I last powered down, but after many minutes it will suddenly drop a bunch when it realizes that some charge had "leaked out" while it had been sitting. I suppose a more intelligent gauge could have used a calendar clock and other data to estimate probable energy leakage over time, and factor that in rather than showing a misleading stale reading until it sees the lower consistent voltage.
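A crude version of that "smarter gauge" idea might look like this (purely hypothetical sketch, not Ford's or Nissan's logic; the self-discharge rate is invented): age the last known SoC by the time parked, then let a resting-voltage reading settle the matter once one is available.

Code:
SELF_DISCHARGE_PER_DAY = 0.0005   # assumed fraction of capacity lost per day at rest

def soc_after_parking(last_soc, days_parked, rested_ocv_soc=None):
    """Age the stale SoC by estimated leakage; prefer a resting-voltage reading if we have one."""
    estimate = max(0.0, last_soc - SELF_DISCHARGE_PER_DAY * days_parked)
    return rested_ocv_soc if rested_ocv_soc is not None else estimate

print(round(soc_after_parking(0.80, days_parked=21), 4))              # calendar estimate: 0.7895
print(soc_after_parking(0.80, days_parked=21, rested_ocv_soc=0.74))   # resting voltage wins: 0.74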

So, there is a lot of estimation, recalibration and smoothing/buffering going on to present you with some value that is the "best educated guess" as to what the computer thinks is likely in the pack at any given time. There isn't one exact approach to doing this right. The engineers come up with the best algorithm they can think of given the constraints of available inputs, battery behavior and available time/budget on the project.
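As one concrete example of the smoothing/buffering part (my own sketch, not any manufacturer's filter): many gauges run the raw estimate through something like an exponential moving average so the displayed number doesn't jump with every hill and throttle stab.

Code:
def smooth(displayed_miles, raw_estimate_miles, alpha=0.05):
    """Exponential moving average: smaller alpha gives a steadier but slower display."""
    return displayed_miles + alpha * (raw_estimate_miles - displayed_miles)

display = 180.0
for raw in [178, 173, 181, 169, 175]:   # noisy instantaneous range estimates
    display = smooth(display, raw)
print(round(display, 1))                 # 178.9 -- drifts gently toward the raw readings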

Tesla did remarkably well compared to some other manufacturers that probably had more engineers and more budget to try to get it right.