diy solar

Connecting JK-BMS to Inverter - Any major advantage?

I know I am a little late to the party but my 2 cents on this:

Bat-Inv communication is quite important when you have several batteries in parallel and the inverter is more powerful than any individual battery, for example 100 A batteries with an inverter that can do 200 A. In that scenario, while your inverter is charging at full whack, the other batteries hit full and block charging current, leaving one battery on its own to eat 200 A, which of course will cause its BMS to shut down. Not best practice, and it could happen every time you hit full (depending on your panels). This could slowly wear one of your batteries out over time (it is likely to keep happening to the same battery in the set-up).
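A quick back-of-the-envelope sketch of that scenario (my own illustration; the even current split is an idealisation, real sharing depends on cell impedance and state of charge):

```python
# Two 100 A batteries in parallel on a 200 A inverter, as described above.

def per_battery_current(total_current_a, batteries_still_charging):
    """Assume the charge current splits evenly between batteries still accepting it."""
    return total_current_a / batteries_still_charging

inverter_charge_a = 200   # inverter can push 200 A of charge
bms_limit_a = 100         # each battery's BMS charge limit

# Both batteries accepting charge: 100 A each, within limits.
print(per_battery_current(inverter_charge_a, 2))            # 100.0 -> fine

# One battery hits full and its BMS blocks charging; the other now sees all 200 A.
remaining = per_battery_current(inverter_charge_a, 1)
print(remaining, "A ->", "BMS trips" if remaining > bms_limit_a else "OK")
```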

If you don't have Inv power > 1 Bat power, then whether it is worth it or not depends on how smart the inverter is at acting on the BMS info. If the inverter is smart enough to taper charge current when a cell voltage peaks, to give the BMS time to balance and avoid OVP kicking in, then this would be good for battery longevity. I am dubious that many inverters do that; correct me if I am wrong.

If the communication gives you control over the charging strategy of the inverter, then that would be worth it for battery longevity. A lot of inverters simply use CC-CV and hold at the requested charge voltage; this keeps the battery at an unnecessarily high voltage, which contributes to degradation.
Others do CC-CV, then stop charging after a set absorption time / tail current, then wait until the battery voltage hits a threshold to begin charging again. The problem with this is that after the battery hits full, loads draw current from the battery, so you end up in a cycle of 100% - 95% - 100% - 95% (assuming charging kicks in at 95%), which is again unnecessary wear and tear on the battery from cycling at the top end.
The ideal (IMHO) charging behaviour is CC-CV, then absorb for a set time (say 0.5-1 h) to allow balancing, then reduce the voltage to ~3.35 V per cell (the natural resting voltage of full LFP chemistry) and actively hold it there until the next day; this is done with a float mode. That way, when loads draw power the inverter supplies the current needed to hold the float voltage, which stops the battery micro-cycling, and less time is spent at a high charging voltage.
Inverters don't really need the communication to do this (and I hope this behaviour is becoming more common), but the only way I know of to reliably force and control it is BMS-Inv communication, either with the add-on boards mentioned or with the newer JK inverter BMS.
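For what it's worth, a minimal sketch of that CC - CV - absorb - float profile. This is my own pseudo-logic, not any particular inverter's firmware; the per-cell setpoints follow the numbers above, while the 5% tail-current threshold and the state names are assumptions:

```python
# Sketch of the charge profile: CC -> CV -> absorb (balance window) -> float.

CV_SETPOINT = 3.45      # per-cell absorption/CV voltage (55.2 V / 16 cells)
FLOAT_SETPOINT = 3.35   # per-cell float, near the resting voltage of a full LFP cell
ABSORB_SECONDS = 3600   # 0.5-1 h absorption to give the balancer time to work

def charge_step(state, max_cell_v, charge_current_a, cc_limit_a, absorb_elapsed_s):
    """Return (new_state, target_cell_voltage, current_limit) for one control step."""
    if state == "CC":
        # Constant current until the highest cell reaches the CV setpoint.
        new_state = "CV" if max_cell_v >= CV_SETPOINT else "CC"
        return new_state, CV_SETPOINT, cc_limit_a
    if state == "CV":
        # Hold CV; once the tail current drops below ~5% of CC, start absorbing.
        new_state = "ABSORB" if charge_current_a < 0.05 * cc_limit_a else "CV"
        return new_state, CV_SETPOINT, cc_limit_a
    if state == "ABSORB":
        # Fixed absorption window, then drop to float instead of sitting at full voltage.
        if absorb_elapsed_s >= ABSORB_SECONDS:
            return "FLOAT", FLOAT_SETPOINT, cc_limit_a
        return "ABSORB", CV_SETPOINT, cc_limit_a
    # FLOAT: the inverter actively holds ~3.35 V/cell and supplies the loads,
    # so the battery is neither micro-cycled nor held at a high charge voltage.
    return "FLOAT", FLOAT_SETPOINT, cc_limit_a
```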

If you don't have Inv power > 1 Bat unit power, AND the communication won't allow you to improve the charging behaviour, AND you are not trying to more accurately trigger a charging source such as a genset, then I think the communication is mostly a gimmick not worth the hassle. If you can do some of the above, then it might improve battery life.

Sorry for the essay. The topic has been in my head for a while.
 
Growatt inverters are famous for doing this 95-100% nonsense.

Plus, the latest JKBMS Inverter BMS has a serious issue with its SOC and coulomb counting.
There are two ways to trigger 100% SOC: coulomb counting and the cell voltage 100% value.
The problems with these two approaches:
1) Coulomb counting: it doesn't take account of the cells' self-discharge rate or any power consumption by other equipment (such as a connected Neey active balancer). Eventually, the actual SOC will be lower than the SOC reported by the BMS (a rough sketch of this drift is shown after the list). I can only hope for JK to add some sort of offset to account for this; unfortunately, I have sent a few emails to them every month and never heard back for the past 4 months.

2) If any cell hits the defined cell voltage 100% value, the Voltronic OEM inverter shuts off charging immediately, leaving no time for balancing to take place at all.
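To illustrate point 1, a rough sketch of how an unaccounted parasitic draw makes the reported SOC drift above the real one. The 50 mA combined BMS + balancer figure is my assumption, not a measured value:

```python
# Coulomb counter vs reality for an idle pack with a parasitic draw bypassing the shunt.

CAPACITY_AH = 280.0
PARASITIC_A = 0.05          # assumed combined BMS + active balancer draw

reported_soc = 100.0        # what the coulomb counter says
actual_soc = 100.0          # what is really left in the cells

for day in range(30):
    # Nothing flows through the shunt, so the counter subtracts nothing,
    # but the pack still loses PARASITIC_A * 24 h of charge every day.
    actual_soc -= (PARASITIC_A * 24.0) / CAPACITY_AH * 100.0

print(f"after 30 idle days: reported {reported_soc:.1f}%, actual {actual_soc:.1f}%")
# Only a full recharge (a cell hitting the 100% voltage) re-syncs the counter.
```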

Additional info on Growatt:
Luckily, there is a workaround for Growatt inverters which makes use of USE mode. The downside is that there is no communication between the BMS and the inverter. (Per-cell equivalents of these voltages are worked out below the settings.)
Program 19 CV set to 55.2 V (for 16 cells). When the pack voltage reaches 55.2 V, it moves straight to float.
Program 20 floating voltage set to 53.6 V.

Program 43 Battery equalization set to Enable.
Program 44 Battery equalization voltage set to 55.2 V.
Program 45 Battery equalized time set to 60 minutes (1 hour) (similar to the JKBMS RFV).
Program 46 Battery equalized timeout left at the default 120 minutes.
Program 47 Equalization interval set to 1 day (or any duration from 1 day to 90 days?).
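For reference, the per-cell equivalents of those pack voltages (assuming 16 LFP cells in series):

```python
# Per-cell equivalents of the Growatt USE-mode settings above.
CELLS = 16
settings_v = {
    "Program 19 (CV)":           55.2,
    "Program 20 (float)":        53.6,
    "Program 44 (equalization)": 55.2,
}
for name, pack_v in settings_v.items():
    print(f"{name}: {pack_v:.1f} V pack = {pack_v / CELLS:.2f} V/cell")
# 55.2 V -> 3.45 V/cell (CV/absorb), 53.6 V -> 3.35 V/cell (the float level discussed earlier)
```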
 
Thanks for the examples! That is exactly the kind of behaviour which sucks. Yeah, sadly I think it depends on the quality of the inverter programming (which isn't something you can guarantee simply by paying more!), and that is inverter-specific: you might have an inverter with great charging behaviour, in which case the comms would be redundant, or the opposite could be true.

I wouldn't have thought that the BMS and balancer would make much difference to a big battery: at ~20 mA it would take about 6 days to use 1% of a 48 V 280 Ah battery, or about 1.5 days on a 12 V 280 Ah, which is negligible if you are charging every week (quick arithmetic below). The self-discharge of LFP is also very slow, and we are talking weeks again. I think the fundamental problem is coulomb counting, which accumulates errors and only gets reset if you hit the top or bottom. There isn't any real way around this without EXTREMELY accurate current measurement with TINY time resolution, which just isn't around for a sensible price. It's like trying to figure out where you are by summing up the length and direction of all your strides since you left home: you would almost certainly conclude you are inside a brick wall somewhere. It might half work for a short trip into your garden and back. I never assume it is better than 20% from the truth.
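A quick check of those drain figures (my arithmetic; the 12 V case assumes the electronics draw roughly the same ~1 W, i.e. about 80 mA at 12 V):

```python
def days_to_lose_1_percent(capacity_ah, drain_a):
    """Days for a constant parasitic drain to consume 1% of the pack's Ah capacity."""
    return (0.01 * capacity_ah) / drain_a / 24.0

print(days_to_lose_1_percent(280, 0.020))   # 48 V 280 Ah at ~20 mA: ~5.8 days ("about 6")
print(days_to_lose_1_percent(280, 0.080))   # 12 V 280 Ah at ~80 mA (~1 W): ~1.5 days
```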

I do agree with you, though: for batteries that get on-off usage it would be useful to have a parasitic-draw parameter.

Also, thanks for messaging! I think the reason we aren't all stuck with Daly is that people keep telling JK what would be useful, so thanks for your contribution.

The situation could also be improved if they let you enter several calibration points for the current, as I have found I can either make it accurate for small currents (0-10 A) or higher ones (50+ A), but not both at the same time, probably due to heating effects making the current readings non-linear. You can sometimes turn trash into gold with enough calibration maps and drift monitoring.
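As a sketch of what I mean, a simple piecewise-linear correction between a few raw-reading / reference-meter pairs; the calibration numbers here are made up for illustration:

```python
# Multi-point current calibration: interpolate between measured (raw, reference) pairs
# instead of applying a single gain factor across the whole range.

CAL_POINTS = [            # (raw_amps_from_bms, amps_from_reference_meter) - illustrative
    (0.0,    0.0),
    (5.0,    4.8),
    (10.0,   9.7),
    (50.0,  51.2),
    (100.0, 103.0),
]

def corrected_current(raw_a):
    """Piecewise-linear correction; clamps outside the calibrated (charge-side) range."""
    if raw_a <= CAL_POINTS[0][0]:
        return CAL_POINTS[0][1]
    for (x0, y0), (x1, y1) in zip(CAL_POINTS, CAL_POINTS[1:]):
        if raw_a <= x1:
            return y0 + (raw_a - x0) * (y1 - y0) / (x1 - x0)
    return CAL_POINTS[-1][1]

print(corrected_current(7.5))   # interpolated between the 5 A and 10 A points -> 7.25
```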

With LFP, voltage is king and SOC is for indication only.
 