diy solar

JBD BMS - How to correct inaccurate SOC value ✅

@Vi s Hi, I started the process today. After charging my battery to the max and letting it rest for a night, I discharged it by 20% (40Ah). I used my PZEM and the BMS to get the Ah value, but the problem is they don't agree. While I reached 40Ah with the PZEM, the BMS only showed 34Ah. I don't know which one to trust; I chose the PZEM and stopped the experiment there for this first step.

That's strange, they don't give the same value. Also, while discharging the battery with a hair dryer, the instantaneous current shown by the BMS was 42A, and 48A on the PZEM. That's the same 6A difference as I had with the Ah.
Do you know what could cause this, please?

The hair dryer was plugged into an inverter, which was connected to the PZEM's shunt like everything else.
Also, after an hour or so, the BMS was at 50-60 degrees Celsius (120-140 Fahrenheit). I don't know if that's a normal value; it's the first time I've used it with such a high current.

I'm now letting the cells rest and I'll check their voltage tomorrow.
Thanks

EDIT: I forgot to do the capacity test; I'll probably discharge the whole battery tomorrow to do it.
Important: the discharge or charge current should be relatively steady. Use a clamp meter (keep in mind its ±2% accuracy) to check the current and voltage, then compare the values with the PZEM and BMS. Also check the current before and after the BMS, and before and after the PZEM, to see if there is anything unusual.
The BMS should be more accurate above 5W; below that it doesn't show or count.

The PZEM and BMS have different refresh rates, so it's normal that they don't show the same values during an inrush current or brief current fluctuations.

50 to 60°C is still acceptable, not yet too high.

Yes, you should do the capacity test first. Nevertheless, you can check and compare your measurement instruments while discharging to find out which are most accurate (any meter is at best ~98% accurate).
 
Also, ideally for this test, don't charge or discharge your battery at more than 0.5C.
0.5C of 200Ah is 100A!

Check the datasheet of your battery to see which charge and discharge rate was used for the factory capacity test.
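For a sense of scale: both the PZEM and the BMS arrive at their Ah figures by coulomb counting, i.e. integrating current over time, so a steady offset in the current reading turns directly into an Ah gap. A minimal Python sketch with made-up numbers (not real logger data), plus the 0.5C arithmetic:

```python
# Coulomb counting: Ah is just current integrated over time.
# Hypothetical constant-current samples, standing in for real meter logs.

def coulomb_count_ah(current_samples_a, dt_s):
    """Sum current samples (A), taken every dt_s seconds, into Ah."""
    return sum(current_samples_a) * dt_s / 3600.0

dt_s = 1.0                                    # one sample per second
hour = 3600
bms_readings = [42.0] * hour                  # what the BMS reported
pzem_readings = [48.0] * hour                 # what the PZEM reported (+6 A)

print(coulomb_count_ah(bms_readings, dt_s))   # 42.0 Ah
print(coulomb_count_ah(pzem_readings, dt_s))  # 48.0 Ah -> a ~6 Ah gap per hour

print(0.5 * 200)                              # 0.5C of a 200 Ah pack = 100 A
```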
 
Thanks for your reply, very much appreciated.
Yes, I checked with a clamp ammeter too, and it reads 40A of instantaneous current, measured at several places (battery +, battery -, after the shunt, etc.); I always get 40A.
I didn't mention that I was discharging with a hair dryer (40A) plus 2 USB LED lights (1A total) and a little fan (1A, I think). So the total is no more than 42 or 43A.
So, as you say, the BMS doesn't see the LEDs and fan since their power is too low, yet it gives 42A as the total current consumption, which is very close to my calculation. I don't understand why the PZEM gives 48A though; that's strange, since it should have been the most accurate, with its shunt counting the coulombs precisely.
And my clamp meter, with its 40A, is roughly in agreement.
I used the PZEM to stop my first 20% discharge, but I think I should have chosen the BMS value. Now I may have discharged only 17 or 18% instead of 20%. Jeez, this stuff is mind-boggling :)
 
Sorry, I have to correct myself: the JBD counts everything, but doesn't show anything below 4-5W.
The BMS should be more accurate than the PZEM; the BMS also does coulomb counting.
Just use this discharge to test and compare your meters and the BMS.
Important: before you start your capacity test (start charging after discharging), write 220Ah into your BMS as the full capacity, because otherwise it will stop counting when you hit 200Ah. Good cells should have more Ah than declared.
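A rough illustration of why that headroom matters, assuming (as described above) that the counter simply stops at the configured full capacity; this is my reading of the advice, not JBD documentation:

```python
# Hypothetical clamp behavior: a pack that really delivers 205 Ah would be
# reported as only 200 Ah if the configured full capacity caps the count.

def reported_capacity_ah(real_capacity_ah, configured_full_ah):
    """Ah the counter would report for one full discharge, given the clamp."""
    return min(real_capacity_ah, configured_full_ah)

print(reported_capacity_ah(205, 200))  # 200 -> true capacity hidden by the clamp
print(reported_capacity_ah(205, 220))  # 205 -> headroom lets the real value show
```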
 
Thanks, then I'll continue my test today with the BMS values; I'll discharge the remaining 6Ah (the difference between the PZEM and the BMS).
Well noted for the capacity test, I will do that.
I will update soon with my final results. Thank you, I feel more reassured with your advice.
 
Calibrate the BMS current first.
 
What do you mean, please? How do I do that?
I am still doing the discharging, 40% remaining (I do 20% per day so the cells rest completely).
It's in the BMS configuration settings in the XiaoxiangBMS iOS app; I imagine it's the same for other apps. You calibrate the internal shunt against an external meter so that it reads the current (and consumption) correctly.
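The idea behind that calibration, as I understand it (a sketch, not the app's actual math): with a steady load running, you enter the value from a trusted external meter and the BMS derives a correction factor for its internal shunt reading.

```python
# Shunt calibration sketch (illustrative only, not the XiaoxiangBMS internals):
# derive a gain that maps the raw internal reading onto a reference meter.

def shunt_gain(bms_reading_a, reference_a):
    """Correction factor so the BMS reading matches the external meter."""
    return reference_a / bms_reading_a

gain = shunt_gain(bms_reading_a=42.0, reference_a=40.0)  # clamp-meter reference
print(round(gain, 3))           # ~0.952
print(round(42.0 * gain, 1))    # corrected reading: 40.0 A
```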
 
Ok, I finished my calibration process. Each point was set after a few hours of rest.

First I charged each cell to 3.65V, let them rest a few hours, and they settled at 3.506V. That's my 100% starting point.

The BMS cuts discharging when the cells reach 2.5V; after a few hours the cells stabilized at 2.87V, and that's my 0% point.

I have 2 questions:

1/ How can the BMS estimate the SOC accurately, since the cells have to rest several hours before giving an accurate voltage? I mean, the SOC displayed instantly does not take this latency into account, so it might be wrong if I check the SOC while using the battery?

2/ Since I didn't do the capacity test before doing all these measurements, does that mean I have to redo them? Let's say my 200Ah battery is in fact a 220Ah; all my measurements would then be wrong, no?

Anyway, I think I already did the capacity test without knowing it while discharging the battery, because for the test I discharged 200Ah from the battery (5 times 40Ah), from 100% full down to the BMS cut-off, so I guess the battery has done one complete cycle and its capacity is 200Ah, no more, no less.

Anyway, here are the graphs in case someone wants to use them as a reference for their tests (I have Lishen prismatic cells).

Thanks for any feedback @Vi s

[Attachment: calibration.jpg]
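For anyone wanting to reuse such a table, the resting-voltage points essentially define a lookup curve. A small interpolation sketch; only the 100% (3.506V) and 0% (2.87V) endpoints come from this post, and the intermediate pairs are placeholders standing in for the values in the attachment:

```python
# Resting cell voltage -> SOC via linear interpolation between calibration points.
# Endpoints (2.870 V = 0 %, 3.506 V = 100 %) come from the post above; the
# intermediate points are placeholders, not the real attachment data.

CALIBRATION = [          # (resting cell voltage, SOC %)
    (2.870, 0),
    (3.200, 20),         # placeholder
    (3.280, 40),         # placeholder
    (3.300, 60),         # placeholder
    (3.330, 80),         # placeholder
    (3.506, 100),
]

def soc_from_resting_voltage(volts):
    """Interpolate SOC from a *rested* cell voltage (not a voltage under load)."""
    if volts <= CALIBRATION[0][0]:
        return 0.0
    if volts >= CALIBRATION[-1][0]:
        return 100.0
    for (v0, s0), (v1, s1) in zip(CALIBRATION, CALIBRATION[1:]):
        if v0 <= volts <= v1:
            return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)

print(soc_from_resting_voltage(3.29))   # placeholder curve, illustrative only
```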
Thank you, I will look at it.
 
Regarding your first question: it can't. Voltage can be very misleading, as you've identified. Use coulomb counting. I wish the firmware made it possible to force this.
 
1. The BMS calculates SOC via coulomb counting and estimates it through reference voltage values (100% down to 0%).
How it seems to work:
In real time it uses coulomb counting (it doesn't reference voltage); after a manual reset, a reboot, a change of values in the settings, etc., it uses the reference voltage values as a first estimate until the next full charge (auto reset).
2. No need, since these are trustworthy cells. Calculate with the factory value of 202Ah.

How I would have done it to get real-life values:
1. Charge the pack full with the solar charger the pack is normally charged with, using the settings you usually use. Say you charge your cells only to 3.55V, then use that. That's your 100%.
2. Then discharge it in one go to the actual stop voltage (2.9V or whatever) you have set in your charge controller (low-voltage disconnect). That's your 0%. Whatever Ah value you get after this discharge (I estimate it will be close to 200Ah) is your actual available capacity.
3. Charge again and discharge in 10% steps. Wait a few hours between each step, or just as long as your battery would at most sit idle in your real-life usage.
4. Enter all the voltage values in the BMS settings.
5. Charge again and observe over a few weeks while using the battery, to see whether there is any abnormal behavior and whether you are satisfied with its SOC accuracy.

If you don't use your solar charger, your values will later not match when you use your pack in real life.
 
We've cycled our 4s 200Ah pack at least 30-40 times with our JBD 200A BMS, many times at 0.6C, powering our truck camper A/C unit continuously from 100% to 0% (2.7V low-voltage cut-off). All the presets I'm referencing are from the premium (~$7) iOS Xiaoxiang BMS app (not the XiaoXiangElectric app).

Once our pack has been fully charged above Cell Full Voltage, then discharged below Cell Minimal Voltage, after a few minutes of charge time the coulomb counter rewrites the previous Total Battery Capacity with the Ah count it obtained for the last complete cycle. As it charges, the coulomb counter tracks the charge current and updates the SOC display until the pack is fully charged (above Cell Full Voltage). SOC is then reset to 100%. At this point the process starts over again: it will establish a new Total Battery Capacity shortly after the next discharge cycle. We've found the coulomb counter to be quite accurate. Our Total Battery Capacity typically hovers between 197-199Ah after each cycle (0.6C discharge; it sometimes reaches 200Ah at a lower C rate). Cold temperatures will drop the values further.

The only time our JBD 200A BMS uses the Cell xx% Capacity Voltage as a reference for the SOC is if the coulomb counter is interrupted. A change to any of the voltage/current/temperature presets will immediately reset the SOC using the appropriate Cell xx% Capacity Voltage. Unfortunately, if you're doing a high-C discharge, the voltage sag will temporarily result in a very inaccurate SOC reading. After a few minutes it may or may not readjust, and even then it's typically still off by 10% or so. We've readjusted the Cell xx% Capacity Voltage values a number of times to no avail. In any case, after any reset the coulomb counter takes over the count using the Cell xx% Capacity Voltage reset as a baseline. If this baseline reset is inaccurate, the SOC will remain inaccurate until the Cell Full Voltage and Cell Minimal Voltage are re-established.

As long as we don't fiddle with any presets, the coulomb counter does its job well: the SOC and Ah count are very accurate. Fiddle with the presets, and all bets are off. I might also add that our JBD 200A BMS's coulomb counter will not track charge or discharge current lower than 0.45A; over time this can throw the overall Ah count and SOC off significantly. Fortunately, our truck camper's parasitic current hovers around 0.60A, so we're good. Another tidbit: if the Total Cycle Capacity value is higher than the Total Battery Capacity, the cycle counter will not increment.
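Based on that description (my paraphrase of observed behavior, not JBD firmware or documentation), the logic can be sketched roughly like this:

```python
# Rough model of the behavior described above: coulomb counting drives the SOC,
# a full charge resets it to 100 %, a complete full cycle rewrites Total Battery
# Capacity with the counted Ah, and currents under ~0.45 A are not tracked.
# This paraphrases forum observations; it is not the JBD's actual firmware.

TRACKING_FLOOR_A = 0.45          # reported minimum current the counter tracks

class JbdSocSketch:
    def __init__(self, total_capacity_ah=200.0):
        self.total_capacity_ah = total_capacity_ah
        self.soc_pct = 100.0
        self.cycle_discharged_ah = 0.0
        self.hit_min_voltage = False

    def step(self, current_a, dt_s):
        """current_a > 0 = discharge, < 0 = charge; integrate above the floor."""
        if abs(current_a) < TRACKING_FLOOR_A:
            return                                  # parasitic loads are missed
        delta_ah = current_a * dt_s / 3600.0
        if delta_ah > 0:
            self.cycle_discharged_ah += delta_ah
        self.soc_pct -= delta_ah / self.total_capacity_ah * 100.0
        self.soc_pct = max(0.0, min(100.0, self.soc_pct))

    def on_cell_min_voltage(self):
        """Discharged below Cell Minimal Voltage: the cycle counts as complete."""
        self.hit_min_voltage = True

    def on_cell_full_voltage(self):
        """Charged above Cell Full Voltage: SOC snaps to 100 %; after a complete
        cycle, the counted Ah becomes the new Total Battery Capacity."""
        if self.hit_min_voltage and self.cycle_discharged_ah > 0:
            self.total_capacity_ah = self.cycle_discharged_ah
        self.soc_pct = 100.0
        self.cycle_discharged_ah = 0.0
        self.hit_min_voltage = False

    def on_preset_change(self, soc_from_voltage_table_pct):
        """Any settings change resets SOC from the (sag-prone) voltage table."""
        self.soc_pct = soc_from_voltage_table_pct

# One simulated cycle: ~1.65 h at 120 A (0.6C) drains ~198 Ah of a 200 Ah pack.
bms = JbdSocSketch()
for _ in range(int(1.65 * 3600)):
    bms.step(120.0, 1.0)
bms.on_cell_min_voltage()
bms.on_cell_full_voltage()
print(round(bms.total_capacity_ah, 1))   # ~198 Ah, like the figures reported above
```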
 
Thanks @Vi s and @OTRwSolar for the helpful additional information.
I think I will use my data as is for a few cycles and see how the BMS behaves.

If the BMS just uses these values as an initial estimate, then it doesn't seem useful for me to do another round of measurements, since it will switch to coulomb counting pretty quickly to readjust the SOC.

I just hope it will remain accurate over several charging phases, since I'm using both solar panels and a DC-DC charger to charge the cells.

I have a question though. I am now using 3.506V as 100% in the BMS app. Does that mean it will stop charging when it reaches this value? I need it to continue charging above that, because 3.506V is the value after several hours of rest. If it stops there, the voltage will drop on its own after a few hours, which is not what I want.
Same question for my Victron app; I think I need to use a higher value so it continues charging.

Thanks again for your help, guys.
 
Charging will continue up to the high-voltage cutoff. The Cell Full Voltage only resets the SOC to 100%.
 
The BMS can't control the charging unless you have other equipment it is talking to.
You don't want to be using the over-voltage disconnect to stop charging; it's there for protection.
Use a charge voltage of 13.9V or similar and you will get to 100%, or thereabouts, without any stress to the cells.
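For context, that pack voltage works out per cell as follows (simple arithmetic, assuming a 4s 12V pack like the one discussed here):

```python
# Per-cell charge voltage for a 4s LiFePO4 pack: plain division, nothing more.
pack_charge_v = 13.9
cells_in_series = 4
print(pack_charge_v / cells_in_series)   # 3.475 V per cell, under the 3.65 V max
```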
 
13.9V seems a bit low, since my 100% is at 14.02V, no?
I was thinking about putting 14.5V in the Victron app, because after resting that would mean 14V.
 
What voltages you use to charge your LiFePO4, and what voltage you use for establishing 100% SOC, is all a matter of personal preference, within reason, of course :)

That being said, I believe your "resting" voltage is a tad optimistic.

A charge voltage of 13.60V (3.400V cell voltage) will still charge a 12V LiFePO4 battery to 99-100% SOC. Charge times at 13.60V will be much slower than at 14.00V-14.60V, but the end result (SOC) will be the same for all intents and purposes. We charged our 12V LiFePO4 to 14.40V yesterday, then activated the battery disconnect switch (no charge or discharge current). Battery voltage right now is 13.68V (cell voltage 3.420V) and it's still dropping. The *final* resting battery voltage will ultimately be less than 13.60V by a fair margin.
 
If I take your example, you charged to 14.4V and the battery is now at 13.68V. Imagine if you had charged it at 13.6V in the first place; it would drop to something like 13V or less after resting, which is not 100% SOC.
I don't know how to express this (English is not my primary language, sorry), but I think you have to charge the battery to a level higher than the desired voltage to get 100% SOC after rest.
That's why I'd like to set the Victron to charge up to 14.5V or so, so that after resting I get 14V, which is the 100% SOC I determined earlier during my discharging process.
 