The TI chip on the monitoring boards of each module uses a 14-bit ADC with a 0 to 6.250 V range and a 2.5 V precision reference. Here is a snippet from the user's guide:
The converter returns 14 valid unsigned magnitude bits in the following format:
<00xxxxxx xxxxxxxx>
Each word is returned in big-endian format in a register pair consisting of two adjacent 8-bit registers. The MSB of the word is located in the lower-address register of the pair, i.e., data for cell 1 is returned in registers 0x03 and 0x04 as 00xxxxxx xxxxxxxxb.
Cell Voltage Measurements
Converting the returned cell measurement value to a dc voltage (in mV) is done using the following formula (all values are in decimal).
mV = (REGMSB x 256 + REGLSB) x 6250 / 16383
Example:
Cell_1 == 3.35 V (3350 mV);
After conversion, REG_03 == 0x22; REG_04 == 0x4d
0x22 x 0x100 + 0x4d = 0x224d (8781 decimal)
8781 x 6250 / 16383 = 3349.89 mV ~ 3.35 V
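
To make the guide's math concrete, here is a quick Python sketch that combines a big-endian register pair and applies the conversion. The function name is mine, not from the TI documentation:

    def cell_mv(reg_msb, reg_lsb):
        # combine the big-endian register pair; the top two bits are always 0
        raw = (reg_msb << 8) | reg_lsb
        # scale the 14 valid bits (0..16383) to the 0..6250 mV full-scale range
        return raw * 6250 / 16383

    # the example from the user's guide: REG_03 = 0x22, REG_04 = 0x4D
    print(cell_mv(0x22, 0x4D))  # 3349.89 mV, i.e. ~3.35 V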
I'm thinking that the cell and module voltages (and temperatures) are sent in this format to the main BMS board at the back of the pack, where they are scaled, reformatted, and sent over the CAN bus. From looking at Kalud's data I had guessed that they were biased to 2.5 V with a 0 to 5 V range over 14 bits.
So the formula would be this: [value x 5000 / 16383] + 2500 = cell mV
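
If that guess is right, the decode would look something like this (a sketch of my hypothesis only, not anything from TI's documentation):

    def biased_cell_mv(raw):
        # hypothesis: the 14-bit value spans a 0..5000 mV range
        # offset by the 2.5 V (2500 mV) reference
        return raw * 5000 / 16383 + 2500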
----------
edit:
I was using 0x0AFA (2810 decimal) from the 6% SOC listing in Kalud's post 630 data, and the 2500 mV offset seemed to be necessary to land near the expected cell voltage of 3.353 V:
2810 x 5000 / 16383 = 857.6 mV; 857.6 mV + 2500 mV = 3357.6 mV, or about 3.358 V
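
The same check in code, using the hypothetical biased_cell_mv() from above:

    print(biased_cell_mv(0x0AFA))  # ~3357.6 mV, vs. the 3.353 V expected from Kalud's data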
Can someone post a few lines in hexadecimal of ID 6F2 data?