I know I am a little late to the party but my 2 cents on this:
Bat-Inv communication is quite important when you have several batteries in parallel and the inverter is more powerful than any individual battery. For example, 100 A batteries and an inverter that can do 200 A: when your inverter is charging at full whack and the other batteries hit full and block charging current, one battery is left on its lonesome to eat 200 A, which of course causes its BMS to shut down. Not the best practice, and it could happen every time you hit full (depending on your panels). This could slowly wear one of your batteries out over time (and it's likely to be the same battery in the setup each time).
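To put rough numbers on that failure mode (using the 100 A / 200 A figures above; the even current split is an idealisation, real sharing depends on cabling and internal resistance):

```python
def per_battery_current(inverter_a, batteries_accepting_charge):
    """Idealised even split of charge current across the batteries
    still accepting charge (real sharing is resistance-dependent)."""
    return inverter_a / batteries_accepting_charge

# Both 100 A batteries charging: 100 A each, within spec.
print(per_battery_current(200, 2))  # 100.0

# One BMS blocks charge at full: the last battery sees the whole 200 A,
# double its rating, so its BMS trips on overcurrent.
print(per_battery_current(200, 1))  # 200.0
```

With Bat-Inv communication the inverter would know a battery has dropped out and could cut its charge current to what the remaining battery can take.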
If you don't have Inv power > 1 Bat power, then whether it is worth it depends on how smart the inverter is at acting on the BMS info. If the inverter is smart enough to taper charge current when a cell voltage peaks, giving the BMS time to balance and avoiding OVP kicking in, then this would be good for battery longevity. I am dubious that many inverters do that; correct me if I am wrong.
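The kind of taper I mean could be as simple as a linear ramp on the highest cell voltage (the 3.40 V knee and 3.50 V top values here are my own illustrative numbers, not from any particular BMS or inverter):

```python
def taper_current(max_cell_v, limit_a, knee_v=3.40, top_v=3.50):
    """Linearly reduce the requested charge current as the highest cell
    approaches the charge voltage, giving the balancer time to work.
    Hypothetical knee/top values for illustration only."""
    if max_cell_v <= knee_v:
        return limit_a                       # well below full: full current
    if max_cell_v >= top_v:
        return 0.0                           # at the top: stop before OVP trips
    # In between: ramp current down linearly with cell voltage.
    return limit_a * (top_v - max_cell_v) / (top_v - knee_v)
```

An inverter doing this needs the per-cell voltages from the BMS, which is exactly what the communication link provides.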
If the communication gives you control over the inverter's charging strategy, then that would be worth it for battery longevity. A lot of inverters simply do CC-CV and hold at the requested charge voltage - this keeps the battery at an unnecessarily high voltage, which contributes to degradation.
Others do CC-CV, then stop charging after a set absorption time / tail current, and then wait until battery voltage hits a threshold to begin charging again. The problem with this is that after the battery hits full, loads draw current from the battery, so you end up in a cycle of 100% - 95% - 100% - 95% (assuming charging kicks in at 95%), which is again unnecessary wear and tear on the battery from cycling at the top end.
The ideal (IMHO) charging behaviour is CC-CV, then absorb for a set time (say 0.5-1 h) to allow balancing, then reduce the voltage to ~3.35 V per cell (the natural resting voltage of full LFP chemistry) and actively hold it there until the next day - this is done with a float mode. When loads draw power, the inverter supplies the current needed to hold the float voltage, which stops the battery micro-cycling, and less time is spent at a high charging voltage.
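That CC -> CV -> absorb -> float sequence can be sketched as a small state machine. The per-cell setpoints, tail current, and absorb time below are my own illustrative numbers for LFP, not pulled from any specific inverter:

```python
# Hypothetical per-cell setpoints for an LFP pack; tune for your cells.
CV_SETPOINT = 3.50   # charge voltage per cell during CC/CV
ABSORB_S    = 3600   # absorb for ~1 h so the BMS balancer can work
# After ABSORB the charger drops to a ~3.35 V/cell float and holds it.

def next_state(state, cell_v, current_a, t_in_state, tail_a=5.0):
    """One step of the CC -> CV -> absorb -> float strategy described above."""
    if state == "CC" and cell_v >= CV_SETPOINT:
        return "CV"                      # hold voltage, current tapers off
    if state == "CV" and current_a <= tail_a:
        return "ABSORB"                  # effectively full; give balancer time
    if state == "ABSORB" and t_in_state >= ABSORB_S:
        return "FLOAT"                   # hold ~3.35 V/cell for the rest of the day
    return state                         # otherwise stay put
```

The key point is the final FLOAT state: because the charger actively holds the float voltage, daytime loads are fed by the inverter rather than by repeatedly dipping into and topping up the battery.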
Inverters don't really need the communication to do this (and I hope this behaviour is becoming more common), but the only way I know of to reliably force and control it is via BMS-Inv communication, either with the add-on boards mentioned or with the newer JK inverter BMS.
If you don't have Inv power > 1 Bat unit power, AND the communication won't let you improve the charging behaviour, AND you are not trying to more accurately trigger a charging source such as a genset, then I think the communication is mostly a gimmick not worth the hassle. If you can do some of the above, then it might improve battery life.
Sorry for the essay. The topic has been in my head for a while.