diy solar

JK BMS SoC?

Ai4px · Solar Enthusiast · Joined Feb 20, 2021 · Messages: 249
I got my JK BMS turned on last night and let it balance some 280 Ah LFP cells. It did well and was balanced in a couple of hours... I hooked up a 48 V charger and now they are at 55.2 V and balancing again. Here's the question: my SoC guessometer shows 37%. How can I force it to 100%? Does it need to just sit at 55 V for some hours to adjust?
 
I have found that changing the battery capacity setting (i.e. 280 Ah) to some bogus number and back to 280 makes it recalculate SoC. It's still not right... the battery charged to 3.47 V per cell and has been showing 76% for days now. On the other hand, a second pack I am presently charging had its setting edited a few hours ago and has since gone from a guessometer reading of 90% to 100%, and it is still filling. The SoC is clipped at 100%... so when the battery finally gets full, the SoC guessometer will be waiting on it.
 
Hi... I was wondering if you had found a solution for this problem. I have a JK BMS, and after I changed some settings the SoC went from 100% to 78%. The battery is fully charged at 54.4 V... how can I get it back to 100%?
 
What is the software interface? I have the XiaXiang (spelling?) software on a different BMS and it does the same thing. One BMS works; the second does not.

I was hoping charging cycles would make it go away, but the one still reads inaccurately.
 
A battery monitor keeps track of the amp-seconds going into and out of the battery, sometimes with fudge-factor adjustments for things like charging efficiency.

To report Ah remaining or % full, it must know a few reference items. The two most important items the user must set are the Ah size of the battery and what battery voltage constitutes a 'full' battery. The full-battery voltage is usually set a little below your absorb voltage for charging.

Some newer BMSes can fill in the Ah capacity by observing a full discharge range, but this takes a full cycle to learn.

The 'full' battery voltage is very important for the user to set up. Reaching it causes the cumulative amp-seconds counter to reset, clearing any cumulative errors that have grown over usage time due to errors in the current reading. Typically a shunt current monitor is less accurate at low currents.

If you do not recharge the battery up to the 'full' reset trigger voltage for a long time, the battery monitor continues to run on 'dead reckoning', meaning small current-measurement errors keep accumulating, and your Ah reading and % full gauge become more and more inaccurate.
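The dead-reckoning-plus-reset behaviour described above can be sketched in a few lines. Everything here (the class name, the 280 Ah capacity, the 55.2 V 'full' trigger, the 50% power-up guess) is illustrative, not actual JK BMS firmware logic:

```python
# Minimal sketch of coulomb counting with a 'full' voltage reset.
# Values are illustrative, not JK BMS internals.

class CoulombCounter:
    def __init__(self, capacity_ah=280.0, full_voltage=55.2):
        self.capacity_as = capacity_ah * 3600.0      # amp-seconds
        self.remaining_as = self.capacity_as * 0.5   # unknown at power-up
        self.full_voltage = full_voltage

    def sample(self, current_a, pack_voltage, dt_s=1.0):
        """current_a > 0 while charging, < 0 while discharging."""
        self.remaining_as += current_a * dt_s
        # Reaching the 'full' voltage resets the tally, clearing drift.
        if pack_voltage >= self.full_voltage:
            self.remaining_as = self.capacity_as
        # Clip so SoC never leaves 0..100 %.
        self.remaining_as = max(0.0, min(self.remaining_as, self.capacity_as))

    @property
    def soc_percent(self):
        return 100.0 * self.remaining_as / self.capacity_as

cc = CoulombCounter()
cc.sample(current_a=28.0, pack_voltage=53.0, dt_s=3600)  # 1 h at 28 A
print(round(cc.soc_percent, 1))   # 60.0: 50 % start + 28 Ah of 280 Ah
cc.sample(current_a=5.0, pack_voltage=55.2)              # hits 'full' voltage
print(cc.soc_percent)             # 100.0 after the reset
```

Without the reset, the small per-sample errors have nothing to anchor against, which is exactly the dead-reckoning drift described above.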
 
I just went through this. The JK needs to go through an entire cycle to get itself synced to the SoC of the battery: fully charged, discharged at least as far as you think your normal cycle will go, and then back up to full. At that point it seems to know where it is. I'm still new enough that I'm not sure if it is accurate after that, but it does get in sync at that point.
 
I believe the JK BMS has a user setting to enter your battery's Ah size to get you started. It has no way to know your battery size initially.

What I am not sure of is whether it overrides your entered Ah setting, based on how many Ah it has accumulated in its coulomb counter, if you discharge the battery to its default low-battery voltage.

For it to fill in the battery Ah setting automatically, it must first be charged at or above its default 'full' battery voltage, then discharged to its default low-battery voltage setting.

I think, based on your battery-type setting and number-of-cells setting, it defaults to an initial full-discharge voltage, probably something like 2.5 V per cell × number of cells, and a default 'full' reset voltage of something like 3.5 V × number of cells. You can manually enter your own values for these two voltages.
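Those guessed defaults work out as below. The helper name and the per-cell values are the poster's supposition, not confirmed JK defaults:

```python
# Sketch of the default pack 'empty'/'full' voltages guessed above.
# 2.5 V and 3.5 V per cell are the poster's estimates for LFP.

def default_pack_voltages(cell_count, empty_vpc=2.5, full_vpc=3.5):
    """Return (empty, full) pack voltages from per-cell defaults."""
    return cell_count * empty_vpc, cell_count * full_vpc

print(default_pack_voltages(16))  # 16S '48 V' LFP pack -> (40.0, 56.0)
print(default_pack_voltages(4))   # 4S '12 V' LFP pack  -> (10.0, 14.0)
```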
 
OK... let me see if I understand: I have to discharge the battery to its lowest point and then fully charge it, and then the BMS will register the SoC %. Yesterday I tried to put the battery's values back in; I turned off the BMS, restarted it, and nothing... JK should have put a synchronization tap for the SoC % in the application.
 
I have 10 BMSes. Only two have an accurate SoC %: Chargery and JK. I have 10 battery meters, half with SoC. Only one is accurate: a 100 A shunt type made by AiLi.
 
Interesting. If you have that many, how are you judging which is accurate?
I use DC clamp-on current measurements, Sol-Ark measurements, and watching charge/discharge cycles. I trust my Chargery and JK BMS with AiLi current shunts more than the other 8 BMSes and Hall current sensors. Over the last 7 weeks off-grid, they have always remained spot on at 100% charged and very close to the Sol-Ark SoC % while discharging down to 20%.
 
And how did you achieve 100% accurate SoC in the JK... help me please.
The first step is to set the battery capacity. You can set it a little larger than the actual capacity; for example, for a 100 Ah battery, set 120 Ah or 150 Ah.
The second step is to discharge the battery through a discharger until the BMS triggers cell undervoltage protection.
Step 3: charge the battery without interruption until the BMS triggers cell overvoltage protection.
After these steps, the capacity is updated automatically. The app then shows the actual capacity of the battery, at which point the SoC, and everything capacity-related, will be accurate.
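A rough sketch of the capacity-learning cycle those steps describe: count amp-hours from the cell-undervoltage event (empty) to the cell-overvoltage event (full) and adopt the total as the pack capacity. The function and event names are made up for illustration, not taken from JK firmware:

```python
# Sketch of a BMS learning pack capacity across one full cycle.
# Event names ('cell_undervolt', 'cell_overvolt') are illustrative.

def learn_capacity(samples, dt_s=1.0):
    """samples: iterable of (current_a, event) tuples, where event is
    'cell_undervolt', 'cell_overvolt', or None. Returns learned Ah,
    or None if a complete empty-to-full cycle was never observed."""
    counting = False
    counted_as = 0.0
    for current_a, event in samples:
        if event == "cell_undervolt":       # battery empty: start counting
            counting = True
            counted_as = 0.0
        elif event == "cell_overvolt":      # battery full: cycle complete
            if counting:
                return counted_as / 3600.0  # amp-seconds -> amp-hours
        elif counting and current_a > 0:
            counted_as += current_a * dt_s
    return None

# One hour of 280 A charging between the two protection events:
samples = [(0, "cell_undervolt")] + [(280.0, None)] * 3600 + [(0, "cell_overvolt")]
print(learn_capacity(samples))  # 280.0
```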
 
Thanks...(y)
 
My JK messes up SoC if it lets go for something (trips a protection). I unplug the power for a while, then hook it back up and power it on. That gets it back to what seems OK.
 

Would it be possible to add a feature to manually start a new cycle? That is, when I know the battery is at 100%, reset the coulomb counter so that the reading at that point becomes simply 100%, xxx Ah remaining.
 
It would be ideal if you could put in 0%, 100%, whatever you want. Then you could resume where you left off after a power interruption to the BMS.
 
My JK BMSes (I have two) were reading 100% SoC with the batteries full and on standby for weeks. But a few weeks ago they started showing a lower and lower SoC. Power went out the other night and we started pulling from the batteries, and the SoC just kept going down from there. I soon realized that SoC is just a reading and really has no function... discharge would be disabled if any one cell gets low, which has nothing to do with the indicated SoC.

Is this correct?
 
The JK BMS (and the JBD BMS, and probably most others) does do Coulomb counting. Although it may be a bit less accurate than a Victron Smart Shunt, I don't think that is the problem seen by @Ai4px. Although I don't watch it all the time, what I've seen from the JK BMS and the JBD BMS seems to match pretty closely with the Victron Smart Shunt in two systems I've built.
It may be a silly question, but do you know what the actual SoC was of the batteries during this time? Do you have some other device that is measuring the SoC?

Most Coulomb counting devices require that the battery getting monitored periodically get to 100% SoC.
 
SoC as computed by a BMS requires a couple of cycles to "set" relative to the pack.
The JKs have 3-decimal accuracy (internally), which is really the only way to make correct SoC calcs (as best as possible).
The SoC numbers are also affected by temperature, which affects the chemistry and can throw things off somewhat, e.g. when cold at 40 °F versus hot at 100 °F.

External shunts with 2- or 1-decimal-place accuracy are less accurate, with lower resolution. This is observable across different devices; unfortunately, even the Victron Smart Shunts are 2-decimal-place accurate, while the Midnite Solar WhizBang Jr (which uses a Deltec 500 A / 50 mV shunt) is only 1-decimal-place accurate.
This is also a problem with most DMMs/DVOMs, as the general ones are 1-decimal-place accurate while others (more $) can be 2-decimal-place accurate (thanks to the AC voltage requirements, which are essentially "brute force", like lead-acid batteries).

Another issue is how one views SoC %. Too many people assume that 0% SoC = 2.500 Vpc and 100% = 3.650 Vpc, which is grossly WRONG! That is the allowable voltage range, while the working voltage range actually runs from 3.000 Vpc to 3.400 Vpc.
* Vpc = volts per cell.

For external monitoring, some key settings to fill in if the software/firmware allows for it:
Battery charge efficiency for LFP = 99%
Tail current / end amps = battery Ah × 0.05 (e.g. 280 Ah × 0.05 = 14 A)
Peukert exponent = 1.05 for LFP. (When subjected to high discharge rates, lithium batteries perform much better than lead-acid batteries.)
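Worked versions of the tail-current rule and Peukert exponent above can help make the settings concrete. The 280 Ah figure is the post's own example; the function names and the 20-hour rated discharge time are my assumptions:

```python
# Tail current and Peukert runtime, as described in the settings above.

def tail_current(capacity_ah, fraction=0.05):
    """Tail current / end amps = capacity * 0.05 (the rule quoted above)."""
    return capacity_ah * fraction

def peukert_runtime_h(capacity_ah, discharge_a, k=1.05, rated_hours=20):
    """Peukert's law: t = H * (C / (I * H)) ** k, with H the rated
    discharge time in hours. k = 1.05 (LFP) barely shortens runtime at
    high current; lead acid (k ~ 1.2-1.3) loses far more."""
    return rated_hours * (capacity_ah / (discharge_a * rated_hours)) ** k

print(tail_current(280))                      # 14.0 A, matching the post
# 280 Ah pack at 140 A: ideal runtime is 2 h; k = 1.05 trims it slightly.
print(round(peukert_runtime_h(280, 140), 2))
```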

Some smart shunts and monitors will also allow you to set your top (100%) and bottom (0%) SoC values, while most others do not, so this adds further complications given the diversity of available products.

Lastly, if you are using an external monitoring device that is only 1-decimal-place accurate and it reads, say, 12.8 / 25.6 / 51.2 V, that translates to 3.2xx volts per cell ("50% SoC", but not necessarily). One such reading covers cell voltages from 3.200 to 3.299 V, and those 0.099 V of difference hide a lot of amp-hours. This is why the very flat LFP curve REQUIRES finer resolution.
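The resolution point can be made concrete: a 4S pack displaying 12.8 V at one decimal place could have any cell voltage in a 25 mV band, and on the flat part of the LFP curve that band spans a large slice of the remaining amp-hours. A small sketch (the helper name is mine):

```python
# Cell-voltage ambiguity hidden behind a rounded pack-voltage reading.

def cell_voltage_range(displayed_pack_v, cells=4, resolution=0.1):
    """All per-cell voltages consistent with a pack reading rounded
    to 'resolution' volts."""
    low = (displayed_pack_v - resolution / 2) / cells
    high = (displayed_pack_v + resolution / 2) / cells
    return round(low, 4), round(high, 4)

print(cell_voltage_range(12.8))                    # (3.1875, 3.2125)
print(cell_voltage_range(12.8, resolution=0.01))   # 10x finer band
```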
 
Of all the devices mentioned, the DC clamp-on meter is likely the least accurate.

There are several things that dominate the accuracy of a battery monitor. The current through the shunt resistor produces a voltage that represents that current; the voltage is positive or negative depending on whether the battery is charging or discharging. The battery monitor samples the current periodically and keeps a running tally of cumulative amp-seconds.

First is the accuracy of the shunt resistance. Good-quality large external shunts are usually the most accurate: their resistive elements are made of a combination of metal alloys to give them a low temperature coefficient of resistance. Shunts can get quite hot. If you have a 500 A / 50 mV shunt, it is 0.1 milliohm and produces 1 mV of voltage drop per 10 amps of current. At 200 amps, the heating loss in the shunt is I² × R = 200² × 0.1 milliohm = 4 watts. This will make the shunt hot, so its ability to hold the same resistance at high temperature matters for accuracy.
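The shunt arithmetic in that paragraph, written out (values straight from the 500 A / 50 mV example):

```python
# Shunt resistance and self-heating from the 500 A / 50 mV example.

def shunt_resistance_ohm(rated_a=500, rated_mv=50):
    """R = V / I at the shunt's rated point."""
    return (rated_mv / 1000.0) / rated_a   # 0.05 V / 500 A = 0.1 milliohm

def shunt_heating_w(current_a, r_ohm):
    """Dissipation P = I^2 * R."""
    return current_a ** 2 * r_ohm

r = shunt_resistance_ohm()
print(r)                        # 0.0001 ohm
print(shunt_heating_w(200, r))  # 4.0 W at 200 A, matching the post
```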

BMSes use several chip resistors in parallel as their shunt, likely very cheap ones. Their accuracy and temperature stability are not as good as a good-quality external shunt's, and they are also subjected to additional heat from the BMS's MOSFET switches.

The second part of accuracy is the voltmeter circuitry that measures the voltage across the shunt. This is the main part that is adjusted for calibration. It has to measure the very small voltage from the shunt resistor. There is also a filter to average out the 2× AC-line-frequency ripple current produced by a sine-wave inverter.

The third part is any DC offset in the voltage measurement across the shunt. It must be set to zero at zero current and should not drift with the temperature of the voltmeter circuitry. Any slight DC offset on the voltage reading has a big impact on measuring low currents accurately.

The voltage reading accuracy and therefore the current accuracy is worse at low currents.

Beyond the accuracy of the monitor, there are other factors involving losses in the battery. Some monitors have separate fudge factors for charging and discharging current. A lead-acid battery has significant charging-efficiency losses that also depend on where in the recharge the battery's state of charge sits. Overall, charging a lead-acid battery is 85-90% efficient.

LFP batteries are very efficient in both charge and discharge. Efficiency gets a little worse at greater charge and discharge currents, but in the range of 0-30% of the capacity-referenced current, C(A), their charge and discharge efficiency is about 99%; round trip is about 98%. As cells get older and their internal impedance rises, their efficiency drops and their actual capacity drops.

On the downside, LFP battery capacity gets significantly lower at low temperatures. Internal ion migration gets more sluggish in the cold, and the drop-off accelerates as temperature falls below 15 °C. It is actually the higher internal impedance caused by poorer ion migration that translates to greater terminal-voltage slump under moderate discharge current. The actual Ah are still there; more of the battery's energy is just consumed by internal losses. And as an inverter's DC input voltage falls with battery terminal-voltage slump, it draws more battery current to make a given AC output power.
 
My own numbers are completely bogus after 17 days of run time.

My very new JK BMS is presenting values for "Remain Battery" (percentage) and "Remaining Capacity" (Ah) without regard to the measured cell voltages, and the values decline over time. Right now (after about 17 days of runtime) the V4.7.1 Android app is showing totally bogus values of just "4%" and "9.2 Ah" remaining.

The 4S pack shows a total voltage of 13.28 V on the display, with each cell shown as 3.320 V and no cell deviating by more than 0.002 V. (The cell-voltage display is a bit "jumpy", with one cell or another reading high by 0.001 or 0.002 V, and the "high cell" moving from one app update to the next; that happens 2-3 times per second while connected and running via Bluetooth.)

The CORRECT SoC on these LFP cells as I write this, with the pack at room temperature, is slightly above 70%. Even if the displayed value excludes the roughly 5% of capacity I reserved at the bottom via undervolt protection (set at 2.85 V, approximately 5% left), a "remaining usable" figure should be at least 65%, and the remaining Ah should be around 150 Ah.

These are new cells which have been capacity tested against an LFP voltage curve: coulomb-counted releasing 221 Ah, starting from a fully balanced state of 3.620 V per cell (top) down to the point at which the lowest cell reached 2.850 V. Adding back the almost 5% left unused at the bottom end, that is a capacity slightly higher than the nominal '230 Ah' in the product description for the EVE cells. The discharge was into a totally non-reactive array of large resistors, over many hours. A second round of charge-up and discharge gave virtually identical results.

Summary: the figures presented in Android app V4.7.1 from an almost brand-new JK-1A8S20P-H are completely bogus. As a workaround, I TOTALLY IGNORE THEM and simply use the curve "in my head", matching it to the pack voltage (= cell voltage × 4 cells). In nearly all other respects, I am delighted with my new-to-me JK 200 A unit.
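For anyone wanting to reproduce that "curve in my head" workaround in software: a sketch that interpolates resting pack voltage against a rough LFP open-circuit-voltage table. The table values are illustrative guesses tuned to match the poster's ~70% at 3.320 V/cell, not calibration data, and resting voltage only tracks SoC after the pack has sat unloaded for a while:

```python
# SoC estimate from resting pack voltage via a rough LFP OCV table.
# Table points are illustrative, not measured calibration data.

OCV_TABLE = [(2.85, 0), (3.20, 10), (3.25, 20), (3.30, 45),
             (3.32, 70), (3.35, 85), (3.45, 100)]   # (cell OCV, SoC %)

def soc_from_pack_voltage(pack_v, cells=4):
    """Linear interpolation of resting pack voltage onto the OCV table."""
    cell_v = pack_v / cells
    if cell_v <= OCV_TABLE[0][0]:
        return 0.0
    if cell_v >= OCV_TABLE[-1][0]:
        return 100.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= cell_v <= v1:
            return s0 + (s1 - s0) * (cell_v - v0) / (v1 - v0)

print(round(soc_from_pack_voltage(13.28)))  # pack at 3.320 V/cell -> ~70 %
```

Note how steep the table is near its ends and how flat it is in the middle: a few millivolts of measurement error around 3.30 V moves the estimate by many SoC percent, which is the resolution problem discussed earlier in the thread.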
 
My relatively new JK continuously presents values for "Remain Battery" (percentage) and "Remaining Capacity" (Ah) without regard to the measured cell voltages. Right now (after about 17 days of runtime) they are totally bogus: "4%" and "9.2 Ah" respectively.

Cell voltages don't relate to state of charge. You need to do a full cycle, empty to full, to calibrate the BMS. I've asked @Nami before (and will do it again now): can you add a feature to allow manually resetting the state of charge to e.g. 100% (when I know the battery is full), or some similar scheme?
 
With the highest respect (because you're THE BEST :giggle:), I must protest that cell voltages manifestly do relate to state of charge, although the voltage changes that must be measured vary by relatively small amounts within most of the 'working range' of LFP cell voltages.

Those voltage values can only be determined while the cells are not being discharged in any significant way, and only after waiting a considerable time for "surface effects" to spread through the pack.

Using dual coulomb counters, I have found that the current consumed by the unloaded JK itself is very small (less than 0.10 A) when the display/switch attachment is disconnected and the app is not running. My current JK-shown voltage measurements meet those requirements, except that the app is running (probably forcing the JK to consume another watt or two, with very little effect on its cell-voltage measurements). I have kept the JK's topmost "power connector" wire separate from the topmost cell's balance/measurement wire, so the current flowing in the black power wire does not significantly affect the voltage measurement on the red wire: there are separate ring terminals on the topmost battery post for those two wires.

Your request is a good one, but I feel the JK app and BMS firmware need to be fixed before bothering to add a new user-entry parameter. The presented values seem to be "leaking" power into the luminiferous aether over a period of multiple days. The luminiferous aether does not exist; the "missing" power is definitely still in the pack.

Thanks!
 