diy solar

BMS SOC algorithm?

RayE

New Member
Joined
Jan 16, 2023
Messages
37
Location
Philippines
It would be nice to know how their SOC algorithm works. It must partly be based on some user settings like battery capacity, and I would have thought some voltage window that can bring back cumulative coulomb-count errors over time? I have a battery out on an EV that's cycled twice per day (24s 120 Ah LiFePO4); it's been running since May 2022 and the BMS states the capacity is 58 Ah. I'll get that battery back to my place soon and do a capacity test on it to determine whether it's the cells or the BMS giving bad readings.
 
Trying to figure out how the SOC algorithm works on JK BMS as well as others.
 
Here is a technical paper from Analog Devices that's worth a read. From my understanding, and to be as brief as possible:

For coulomb counting, it does indeed periodically need to re-estimate the SOC, based either on the OCV after a rest period (possibly at the end of charge) or on the cell voltages under load; it knows the load, so it can reference a lookup table to calculate an estimated SOC. It needs to do this periodically to negate the cumulative errors introduced by coulomb counting alone.
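To make the idea concrete, here's a rough C++ sketch of coulomb counting with periodic OCV recalibration. The ocvToSoc() table numbers are invented for illustration and aren't from the paper or any particular BMS:

    #include <cstdio>

    // Illustrative LFP OCV-to-SoC points (volts, percent); real tables are
    // chemistry- and cell-specific.
    double ocvToSoc(double ocv) {
        const double v[] = {2.50, 3.20, 3.25, 3.30, 3.33, 3.40, 3.60};
        const double s[] = {0.0, 10.0, 30.0, 60.0, 90.0, 99.0, 100.0};
        if (ocv <= v[0]) return 0.0;
        for (int i = 1; i < 7; ++i)
            if (ocv < v[i])  // linear interpolation between table points
                return s[i - 1] + (s[i] - s[i - 1]) * (ocv - v[i - 1]) / (v[i] - v[i - 1]);
        return 100.0;
    }

    struct SocEstimator {
        double capacityAh;   // user-configured capacity setting
        double socPercent;
        // Coulomb counting: integrate current over time (charge positive).
        void count(double currentA, double dtSeconds) {
            socPercent += (currentA * dtSeconds / 3600.0) / capacityAh * 100.0;
        }
        // Periodic recalibration: after a rest the terminal voltage is close
        // to OCV, so overwrite the drifting count with the table estimate.
        void recalibrate(double restCellVoltage) {
            socPercent = ocvToSoc(restCellVoltage);
        }
    };

    int main() {
        SocEstimator est{120.0, 50.0};        // 120 Ah pack, starting at 50%
        est.count(-30.0, 3600.0);             // one hour at 30 A discharge
        printf("counted: %.1f%%\n", est.socPercent);       // 25.0%
        est.recalibrate(3.28);                // rest voltage pulls it back
        printf("recalibrated: %.1f%%\n", est.socPercent);  // 48.0%
        return 0;
    }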
 

Attachments

  • a-closer-look-at-state-of-charge-and-state-health-estimation-techniques.pdf
    2.1 MB
Cool thread! Gaining an understanding of how state of charge is estimated with various chemistries is super interesting.
 
It would be nice to know how their SOC algorithm works. It must partly be based on some user settings like battery capacity

Yep. Can't be meaningful without knowing how big the battery is.

and I would have thought some voltage window that can bring back cumulative coulomb-count errors over time?

Yep. Periodic synchronization to 100% OR sufficient charging to achieve near 100% without meeting the "charged" criteria, i.e., the BMS literally counts past 100% and resets to 100%.
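Roughly, in code (the thresholds are placeholders, not any vendor's actual criteria):

    // Hypothetical resync: either the formal "charged" test passes, or the
    // coulomb counter simply runs past 100% and gets clamped back.
    void updateSoc(double& socPercent, double ahDelta, double capacityAh,
                   double packVoltage, double currentA) {
        socPercent += ahDelta / capacityAh * 100.0;
        const double chargedVoltage = 82.8;  // assumed: 3.45 V/cell on a 24s pack
        const double tailCurrentA   = 2.0;   // assumed tail-current threshold
        bool charged = (packVoltage >= chargedVoltage && currentA <= tailCurrentA);
        if (charged || socPercent > 100.0)
            socPercent = 100.0;  // discard the accumulated counting error
    }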

I have a battery out on an EV that's cycled twice per day (24s 120 Ah LiFePO4); it's been running since May 2022 and the BMS states the capacity is 58 Ah. I'll get that battery back to my place soon and do a capacity test on it to determine whether it's the cells or the BMS giving bad readings.

what BMS?

24S? what application?

That's actually not quite correct. You get errors while coulomb counting from current measurement, battery aging, etc. This means there is more to SOC calculation than just measuring energy in/out. As mentioned in the video, there can be an intermittent open-circuit voltage measurement (zero load) that is used to effectively recalibrate the SOC.

This is generally not used for LFP. The flatness of the voltage curve makes this wildly inaccurate. This is very common and accurate for Lithium NMC, LMO, NCA, etc... the "3.6-3.7V" nominal Lithium chemistries.

There is DEFINITELY more to SOC than just coulomb counting and this is what I want to try to understand.

Yep. In the vast majority of cases, the BMS synchronizes to 100% with regular charges to full.

NiMH is even worse than LFP, and NiMH is never fully charged or even close in HEV applications. Essentially, they have detailed "lookup" tables of voltage, current and temperature that are combined with current counting to estimate SoC. It's pretty amusing to watch SoC calculations made by the car jump wildly due to a significant temperature change even though it hasn't moved since the last calculation was made.

I suspect the NiMH method is also used for LFP and may be applied to other lithium chemistries as they need to track state of health, i.e., if the coulomb count says you're at 50%, but the voltage/current/temperature data says it's actually lower, it would estimate the reduced capacity and a less than 100% SoH.
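A toy version of that table-driven correction might look like the following. The resistance numbers are made up, and ocvToSoc() is the lookup from the earlier sketch; a real implementation would blend this estimate with the coulomb count rather than simply overwriting it.

    double ocvToSoc(double ocv);  // lookup table from the earlier sketch

    // Assumed internal resistance versus temperature (illustrative values;
    // resistance rises sharply when cold).
    double cellResistanceOhm(double tempC) {
        if (tempC < 0.0)  return 0.0040;
        if (tempC < 25.0) return 0.0020;
        return 0.0015;
    }

    // Correct the loaded terminal voltage back toward an OCV, then look up
    // SoC. currentA is positive on discharge.
    double socFromTable(double terminalV, double currentA, double tempC) {
        double ocvEstimate = terminalV + currentA * cellResistanceOhm(tempC);
        return ocvToSoc(ocvEstimate);
    }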
 
what BMS?

24S? what application?

The BMS I'm using in the prototype battery is the JK-B2A24S20P, and the application is a 6-passenger commercial E-Trike.
 
This is generally not used for LFP. The flatness of the voltage curve makes this wildly inaccurate. This is very common and accurate for Lithium NMC, LMO, NCA, etc... the "3.6-3.7V" nominal Lithium chemistries.
Yes, totally agree on that. Perhaps the JK bases the synchronization on when the "top knee" kicks in near full charge? I'm putting together another 4 of these batteries; these new ones use grade A EVE LF105 cells direct from the factory. No more grade B cells with untraceable QR codes ;-)

I will log the charge/discharge/SOC data and determine this.
 
Is there a PC app for logging the JK BMS data via RS485/BT? Googling comes back with ESPHome, which appears to be an add-on to Home Assistant. I have some ESP32 modules at hand, so perhaps I'll take that route?
 
The most variability is in the determination of "fully charged". Some monitors just rely on a trigger voltage, like slightly below the charger's absorb voltage setting. It is usually good enough to get you within a few percent of full.

Victron battery monitors can be pretty critical, requiring a voltage level and a taper-off current that must both occur within a given time period. The greater the charge current rate, the longer the taper-off period.
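A sketch of that style of "charged" detector is below; the threshold and hold-time values are placeholders, not Victron's actual defaults.

    // "Charged" only when voltage holds above the charged threshold while the
    // current stays tapered below the tail threshold for a full hold period.
    struct ChargedDetector {
        double chargedVoltage = 27.6;  // assumed: a bit below a 28.4 V absorb setting
        double tailCurrentA   = 4.0;   // assumed: ~2% of a 200 Ah battery
        double holdSeconds    = 180.0;
        double timerSeconds   = 0.0;
        bool update(double packV, double chargeCurrentA, double dtSeconds) {
            if (packV >= chargedVoltage &&
                chargeCurrentA >= 0.0 && chargeCurrentA <= tailCurrentA)
                timerSeconds += dtSeconds;
            else
                timerSeconds = 0.0;  // a load spike or passing cloud restarts the timer
            return timerSeconds >= holdSeconds;
        }
    };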

When you have an inverter operating with variable AC loading in parallel with an active PV charge controller, using taper current can be misleading. Maybe the current taper is just clouds going by during the absorb cycle. This is why most PV charge controllers do a timed absorb cycle and restrict the number of absorb cycles allowed in a given period of time (like once or twice a day) to avoid stressing the battery with too many full absorb cycles.

It can be tough for a unit to determine full charge state, particularly with a PV charge controller with variable charge current and with an inverter randomly putting load on the battery while PV is trying to charge it. Most battery monitors trigger 100% full prematurely and just hold 100% until the charger finishes absorb, and the monitor starts counting down again when the battery voltage drops off its peak absorb voltage. Make the "full" criterion too tough to meet and you may not get enough monitor reset references to clear the accumulated error in the amp-second tally.

You have to get a full-charge reference reset periodically to clear accumulated errors in the amp-second tally. A common mistake is not periodically charging the battery to a high enough voltage to trigger the monitor to a 100% full reset. The monitor just becomes more inaccurate over time without a fresh reference point of a full battery. This is like navigating by dead reckoning: you need to find a known reference point to periodically reset errors.

Current measurement accuracy also varies with the level of current being measured, which adds to the accumulated errors in the amp-second tally over time.
 
Is there a PC app for logging the JK BMS data via RS485/BT? Googling comes back with ESPHome, which appears to be an add-on to Home Assistant. I have some ESP32 modules at hand, so perhaps I'll take that route?
I spent quite a bit of time googling an app to log data from the JK BMS but could not find one that suits my particular application. I spent a few hours understanding the protocol, using a TTL/USB converter into a PC terminal app, and have started on an ESP32-to-cloud interface so I can log/graph the data. The PC end will become an Arduino IoT Cloud dashboard that I've used in the past and am happy with. Maybe another 7 to 20 days of work on this, but I'll make it available to anyone interested. This will allow me to figure out when the SOC algorithm actually does its resets, amongst other things I need to know.
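For anyone wanting a starting point, here's a bare-bones ESP32 (Arduino core) skeleton that just hex-dumps whatever the BMS sends over UART so the frames can be decoded against the protocol document. The 115200 baud rate and pin numbers are assumptions to check against your wiring and the protocol doc, and some JK models only talk after receiving a request frame, which I've left out since it's model-specific:

    #include <Arduino.h>

    HardwareSerial bmsSerial(2);  // UART2 on the ESP32

    void setup() {
        Serial.begin(115200);                         // USB serial to the PC
        bmsSerial.begin(115200, SERIAL_8N1, 16, 17);  // assumed baud, RX=16, TX=17
    }

    void loop() {
        // Raw hex dump of incoming bytes; framing/CRC checks are done offline
        // against the JK protocol document.
        while (bmsSerial.available()) {
            Serial.printf("%02X ", bmsSerial.read());
        }
        delay(10);
    }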
 
Pretty sure it adjusts the capacity when it hits bottom or top, bottom or top being when cell under- or over-voltage protection is triggered. So, if for example you switch on a big load that causes a cell voltage to drop rapidly, you can prematurely trigger it to go to 0% and reset the measured capacity to a smaller value than you would actually get in normal use.
 
A lot of the new "smart" monitors can independently determine, and update, battery Ah capacity by measuring Ah over a full-charge to full-discharge cycle.
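That capacity-learning idea could look something like this, with hypothetical event hooks:

    // Learn capacity by integrating Ah between a confirmed full-charge event
    // and a confirmed empty (cutoff) event.
    struct CapacityLearner {
        double ahSinceFull       = 0.0;
        double learnedCapacityAh = 0.0;  // zero until one full cycle is seen
        bool   countingFromFull  = false;
        void onFullCharge()          { ahSinceFull = 0.0; countingFromFull = true; }
        void onDischarged(double ah) { if (countingFromFull) ahSinceFull += ah; }
        void onEmpty() {
            if (countingFromFull) learnedCapacityAh = ahSinceFull;
            countingFromFull = false;
        }
    };

It also shows why a momentary undervoltage trip under a big load, as described in the post above, would lock in a pessimistically small capacity.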

Some "real smart" monitors will measure battery voltage when there has been near-zero load current for a while to set the percent capacity remaining, based on the predictable OCV-to-SoC relationship for a given battery chemistry. The problem with this is the current really needs to be close to zero. For LFP cells there is 10-20 mV of overpotential voltage shift from OCV for just a small amount of cell current.

A 10-20 mV uncertainty on LFP cell OCV can be about 10-15% SoC error when the cell is in the flattest part of the OCV curve, in the 50-80% SoC range.
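For scale: if the OCV curve only moves about 30 mV between 50% and 80% SoC (roughly 1 mV per 1% of SoC), then a 15 mV overpotential offset reads as 15 mV / (1 mV per %) = about 15% of SoC error.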

I personally prefer setting the battery's Ah capacity myself. I usually put in a lower Ah capacity than the battery's actual. This gives a little safety margin for errors in the percent-remaining readout. For an LFP battery you really want to avoid a BMS undervoltage shutdown. For lead-acid you might want to set the monitor to 50-60% of the battery's rated capacity to encourage not draining a lead-acid battery below 50% state of charge.
 
Pretty sure it adjusts the capacity when it hits bottom or top, bottom or top being when cell under- or over-voltage protection is triggered. So, if for example you switch on a big load that causes a cell voltage to drop rapidly, you can prematurely trigger it to go to 0% and reset the measured capacity to a smaller value than you would actually get in normal use.
I would certainly not want the SOC algorithm to reset at zero (user-definable). It would seem safer to base it on how many Ah it takes in the charge process to bring the battery up to the top knee. Anyway, once I have my JK BMS logger finished I can determine where these reset points are (hopefully).

It seems the JK counts how many Ah have been drawn over the life of the battery; I'm wondering if this is how aging (SOH) can be determined?
 
It seems the JK counts how many Ah have been drawn over the life of the battery; I'm wondering if this is how aging (SOH) can be determined?
Cell aging has many factors. The best test is keeping track of the overpotential voltage slump for a given current load at a given temperature. The voltage slump is highly dependent on cell temperature, so temperature must be controlled.
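One way to track that slump is a simple delta-V over delta-I measurement around a load step, something like this sketch (the minimum step size is illustrative):

    // Estimate DC internal resistance from the voltage slump across a load
    // step; a rising trend at the same temperature indicates aging.
    struct DcirTracker {
        double restV = 0.0, restI = 0.0;
        void onRest(double volts, double amps) { restV = volts; restI = amps; }
        // Call with readings taken just after a sizeable load step. Returns
        // ohms, or a negative value if the step was too small to be useful.
        double onLoadStep(double loadedV, double loadedI) {
            double dI = loadedI - restI;
            if (dI < 5.0) return -1.0;  // illustrative minimum step of 5 A
            return (restV - loadedV) / dI;
        }
    };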

The normal aging process is the fracturing and regrowth of the Solid Electrolyte Interface (SEI) coating around the anode graphite during charging. Constant repairing of the SEI layer during charging consumes some of the available lithium, reducing cell capacity over time. It also thickens the SEI layer, which increases the impedance of the cell, resulting in a greater overpotential voltage slump required to drive the lithium-ion migration to support the demanded current, and it degrades the electrolyte.

There are other damaging factors that can degrade a cell. Electrolyte degradation due to overcharging and holding a high SoC increases the number of electrons escaping the negative anode graphite into the electrolyte during charging, which chemically decomposes the electrolyte. Charging at low temperature, and over-discharging, create lithium metal formation in the anode region that can grow conductive dendrites, which create leakage paths and cell shorts if they punch through the separator.

For series-connected cells, not starting with matched cells and/or not keeping them balanced in SoC will result in some of the series-connected cells aging at different rates. Matched cells are not just cells with the same Ah capacity; they also need similar overpotential voltage versus current demand.

There is a saying that many Li-Ion batteries do not die of old age, they are murdered first.
 