JKBMS Inverter Edition Problems/Issues | No Support / Help to fix major issues. - DO NOT BUY ! Warning (as of Oct.12.2024)

Interesting. I have my morning coffee in hand and going.
I re-read @Nami's responses a few times. Must be the translation software being used; it makes little to no sense, and it does not answer the points presented through this thread. Well, that's more déjà vu, here we go again... oi... I do not think the "Terminology Differences" excuses apply either.

Throttling the charge rate makes no sense when internal resistance already does that through the chemical interactions within the cells: the increasing internal resistance as they charge up naturally reduces the incoming charge from the outside source.

EDIT:
One place that charge throttling MAY be important is where you have far more charge capacity (amperage) available than the battery packs can handle at once. So instead of a BMS cutting off charge when the amps available exceed what is allowed (i.e. 100 A max), it throttles the input down to the maximum allowable amperage as set in the BMS. This could occur with a large bank that has large solar input capacity: if packs cut off for any reason, there could be excess solar input available for a short period of time.

SOC calculations depend on what the USER/OWNER has set as their 0%-SOC and 100%-SOC voltages, regardless of ANY other settings.
If 0%-SOC is set to 2.800 V and 100%-SOC is set to 3.450 V, that leaves a 0.650 V range over which to calculate SOC for the user-set values.
The transition from Bulk/Absorb to FLOAT can only be triggered by using the amps taken by the battery packs.
Once the amps taken have dropped to the EndAmps value, the cells are full and can be floated for active balancing & topping off.

EndAmps or TailCurrent is the only way to evaluate the depth of charge within a cell at a set voltage point. When the cell is locked to charge no higher than 3.450 V, it will take as much amperage as it can until it reaches that specific voltage, and it will continue taking that amperage until the internal resistance starts to reduce the amount being accepted. For a 280 Ah cell that value is 14 A (280 Ah × 0.05 = 14 A).
This is when the cells are "saturated", or fully charged to the preset voltage setting.
It does not matter whether the user chooses 3.400 V or 3.500 V per cell; once the cell can take no more than 14 A, it is saturated/full.
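The EndAmps rule above is simple enough to sketch in code. A minimal Python illustration (function names are invented; the 0.05C fraction and 280 Ah value are just the numbers from this post, not anything from JK firmware):

```python
# Sketch (not JK firmware): deciding the Absorb -> Float transition
# from tail current, as described above. All names are illustrative.

def end_amps(capacity_ah: float, fraction: float = 0.05) -> float:
    """EndAmps / TailCurrent threshold, e.g. 280 Ah * 0.05 = 14 A."""
    return capacity_ah * fraction

def should_float(bank_amps_taken: float, capacity_ah: float) -> bool:
    """True once the bank accepts no more than the EndAmps threshold
    while being held at the set Absorb voltage."""
    return bank_amps_taken <= end_amps(capacity_ah)

print(end_amps(280))            # 14.0
print(should_float(33.0, 280))  # False: still in Bulk/Absorb
print(should_float(13.5, 280))  # True: saturated, transition to Float
```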

Example: (simplified for translator)
6 battery packs, 280 Ah each, in parallel; each 200 A output & 100 A charge input capable.
200 A charge input from solar to battery bank (Bulk/Absorb mode, Constant Voltage/Constant Current).
All battery packs take a proportionate charge of approximately 33 A, with current floating up/down slightly (normal).
All packs take all available amperage until the batteries reach the SET Bulk/Absorb charge voltage of 3.450 V per cell.
They continue to do so until internal resistance reduces the amount of amperage they can take at that set voltage of 3.450 V per cell.
When the SmartShunt reads that the amperage taken by the bank has fallen to 14 A, it transitions the solar controllers to FLOAT mode (Constant Voltage/Variable Current).
Within 15 minutes of entering Float mode, all packs are at 3.450 V per cell with a maximum deviation of 0.003 V between cells in ALL packs.
This is the proper and typical behaviour for a properly configured battery bank & solar charge controller(s).

Temperature considerations for SOC are more complex because temperature affects capacity, regardless of cell quality or grading. It also affects the C-rates (capability): frozen cells at 0°C discharge & charge much slower than at 25°C, and at high temps like 50°C they do not charge/discharge the same as at 25°C either. And of course every chemistry is very different, especially when it comes to temperatures. This is not something that can be addressed in a "generic BMS" such as these. An EV maker can deal with it because their BMS is programmed for the specific battery chemistry they are using; the same applies to commercial energy storage systems like Tesla Powerwalls, which are programmed for the specific cell chemistry they use.

I hope this translates well and is understood as intended.

EDIT 08:30 EST
For READERS of this Post/Thread.
Please be aware that not all equipment (inverter/AIO or SCC) exposes values such as EndAmps/TailCurrent, and in some cases it may not even have such capability (see value gear). Some equipment may use SmartShunts and related software to interact with SCCs, inverter/chargers, etc... Other higher-end AIOs may also have internal shunts with this capability. There are also a few that perform this internally and do not expose it to the end user. Basically, just because you may not have settings or values for EndAmps/TailCurrent does not mean this is not a key aspect of battery system charging; it applies to all chemistries.
 
Subsequently, the SOC-OCV curve will be corrected according to the MSDS (spec sheets) of different brands of cells.
This can improve the accuracy of SOC.
At the same time, we will improve the balancing efficiency. We will update this scheme later.


Basic definition of SOC:
Qmax - maximum allowable charge and discharge capacity of the battery, which can be understood as rated capacity × SOH
Ieff - charge/discharge current or self-discharge current; charging is negative
Coulombic efficiency of charge and discharge
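As a rough illustration of ampere-hour integration with the terms just defined (a sketch, not JK's actual algorithm; function and variable names are invented, and the sign convention follows the definition above, with charging negative):

```python
# Sketch of ampere-hour (coulomb) counting using the terms above:
# q_max_ah = rated capacity * SOH, i_eff_a = current (charging negative),
# eta = coulombic efficiency. Purely illustrative.

def update_soc(soc: float, i_eff_a: float, dt_h: float,
               q_max_ah: float, eta: float = 1.0) -> float:
    # Charging current is negative, so subtracting it raises SOC.
    charge_eta = eta if i_eff_a < 0 else 1.0  # apply CE only when charging
    soc -= charge_eta * i_eff_a * dt_h / q_max_ah
    return min(max(soc, 0.0), 1.0)            # clamp to [0, 1]

soc = 0.50
soc = update_soc(soc, -28.0, 1.0, 280.0)  # charge at 28 A for 1 h
print(round(soc, 2))  # 0.6
```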


The current list of industry algorithm solutions is as follows, including four schemes: ampere-hour integration, open-circuit voltage, artificial neural network, and Kalman filter.

SOC-OCV curve acquisition
Basic test:
At room temperature (22 ± 3 °C), a new battery is tested at 1C current. During the experiment, a precision current-measurement device calculates the battery capacity by integrating the current. During the test, each time SOC changes by 5% the battery is left standing for 3 hours, the steady-state open-circuit voltage is measured, and the SOC-OCV relationship of the battery is plotted.
Condition test:
The above experiments are repeated at different temperatures (-20, 0, 10, 40 °C) and for batteries in different SOH states (capacity decreased to 90%, 80%, 70% of the original capacity).
Note: since there is no unified SOC-OCV test standard at present, the test methods of different battery manufacturers may differ. The SOC-OCV curve cluster can be selected along two dimensions: the experimental temperature T and the life state SOH. According to the experimental data, the battery's working environment and health state are used to correct its SOC-OCV characteristic curve.
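The resulting SOC-OCV curve is typically used as a lookup table. A minimal sketch of that lookup (the table values below are invented for illustration, not from any manufacturer's MSDS):

```python
# Sketch: estimating SOC from rested open-circuit voltage using a
# piecewise-linear SOC-OCV table. Values are illustrative LFP-like
# numbers, not real manufacturer data.

OCV_TABLE = [  # (OCV volts, SOC fraction), measured after a long rest
    (3.00, 0.00), (3.20, 0.10), (3.28, 0.30),
    (3.32, 0.60), (3.34, 0.90), (3.45, 1.00),
]

def soc_from_ocv(ocv: float) -> float:
    pts = OCV_TABLE
    if ocv <= pts[0][0]:
        return pts[0][1]
    if ocv >= pts[-1][0]:
        return pts[-1][1]
    # linear interpolation between the two bracketing table points
    for (v0, s0), (v1, s1) in zip(pts, pts[1:]):
        if v0 <= ocv <= v1:
            return s0 + (s1 - s0) * (ocv - v0) / (v1 - v0)

print(soc_from_ocv(3.30))  # ~0.45 (midway between 3.28 and 3.32)
```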
Let me try some translation software.

I'm not sure if this is a language barrier... but I'll try anyway. I hope the translation software does a good job.


Refer to the SMA CAN-Bus protocol and the SMA Sunny Island inverter. The SMA manual states: "Please note that battery charging is not stopped based on the SOC value (e.g. at 100%). Only battery discharging is stopped based on the defined SOC value." Victron inverters take the same approach as SMA.

However, the core problem is that Growatt/Luxpower/Voltronic inverters do it differently: they stop charging immediately on receiving a 100% signal. That is exactly what we want to prevent; these inverters essentially override the BMS command to continue charging. Imagine I set the SOC 100% voltage to 3.449 V and the start-balance voltage to 3.450 V. Suppose I have badly unbalanced cells, with one cell charged higher than the others. If that particular cell reaches 3.449 V, it triggers the JKBMS SOC reset to 100%, which causes the Growatt/Luxpower/Voltronic inverter to stop charging immediately. There is then no time for active cell balancing, and the other cells in the pack are simply never fully charged.

Note: Growatt/Luxpower/Voltronic inverters only resume charging after SOC% drops below 95%.

The second key problem is the coulomb counter. With the same settings (SOC 100% voltage at 3.449 V, start-balance voltage at 3.450 V), the coulomb counter's inaccuracy can push the SOC percentage to 100% even though none of the 16 cells in the pack has reached 3.449 V. You can see where this is going: because a 100% signal is sent to the inverter, charging is terminated prematurely.

Next, the current JKBMS firmware does not appear to account for coulombic efficiency (CE), also known as faradaic efficiency. Recharging a battery always takes more charge than the battery delivers. For example, recharging a fully discharged 280 Ah pack takes about 290 Ah, even though only 280 Ah was just used. From what I can verify, JKBMS seems to expect a 1:1 charge/discharge coulomb count, which does not happen in reality... This causes negative SOC% drift. Because the JKBMS coulomb counter expects a perfect 1:1 charge/discharge ratio, each charge cycle that is cut short by the counter's SOC hitting 100% leaves about 10 Ah of capacity uncharged. If the pack is used this way for a long time, the cells' true state of charge drifts completely, eventually causing the BMS to disconnect on undervoltage protection. Imagine missing 10 Ah every charge cycle: by day 28, the JKBMS reports 100% while the pack's actual state of charge is at or very near empty.
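The drift described here can be sketched numerically (the numbers follow the 280 Ah / 290 Ah example in this post; this is an illustration of the claimed failure mode, not JK's actual counter logic):

```python
# Sketch of coulombic-efficiency drift: a counter that assumes a 1:1
# charge/discharge ratio overestimates SOC when real CE is below 1.
# Numbers follow the 280 Ah / 290 Ah example above; illustrative only.

CAPACITY_AH = 280.0
CHARGE_IN_PER_CYCLE = 290.0  # Ah that must be pushed in to truly refill
USABLE_PER_CYCLE = 280.0     # Ah actually stored and retrievable

# A naive 1:1 counter credits every Ah pushed in, so it reaches "100%"
# about 10 Ah before the pack is really full on each cycle.
shortfall_per_cycle = CHARGE_IN_PER_CYCLE - USABLE_PER_CYCLE

for day in (1, 14, 28):
    missing = min(shortfall_per_cycle * day, CAPACITY_AH)
    true_soc = 100.0 * (CAPACITY_AH - missing) / CAPACITY_AH
    print(f"day {day}: reported 100%, true SOC about {true_soc:.0f}%")
```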


edit:
Dear Nami, you can reply directly in Chinese; we have multiple translation tools/services, so don't worry. For some unknown reason, the translation software you are using performs poorly.
 
@Nami : wouldn't it be easier to just release the software as open source so we can pool resources? There really isn't anything secret in the code after all. JK could release official versions, the community could build their own for certain edge cases, implement protocols they want, etc. both can take parts from one another... Personally, I believe that would lead to a much more robust and future-proof situation.
 
@Nami : wouldn't it be easier to just release the software as open source so we can pool resources? There really isn't anything secret in the code after all. JK could release official versions, the community could build their own for certain edge cases, implement protocols they want, etc. both can take parts from one another... Personally, I believe that would lead to a much more robust and future-proof situation.
This would be a TERRIFIC solution. And could dramatically improve sales as a result!

The problem is..... if the user accidentally flashes the wrong firmware and bricks the BMS..... who is going to deal with the warranty and related issues?
 
My guess is that as with true version control, the computer software could verify the target BMS before performing the update. A good feature to have regardless.
 
The problem is..... if the user accidentally flashes the wrong firmware and bricks the BMS..... who is going to deal with the warranty and related issues?

This is a very easy thing to fix. Essentially, you implement a protected area in flash on the microcontroller where you place a bootloader. Even if everything else fails, the bootloader is there and on BMS power-up it can be triggered to get the firmware from the PC.
 
Exactly. Don't forget it usually takes more power to recharge a battery than you get back on discharge, due to cell inefficiency.
Actual data from LFP cells, taken at 40 A charge and 40 A discharge. Efficiency will be slightly better when currents are reduced.

Round trip efficiency loss is relatively small for LFP at reasonable cell currents. About 3% to 3.5% round trip power loss for a good cell @ 40A chg & 40A dischg. Typical inverter/charger losses will be 2 to 3 times the LFP battery efficiency losses.
LFP cell chg_dischg curves.png
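The ~3-3.5% figure quoted above is just an energy ratio. A quick sketch (the Wh numbers are made up to fall in that range, not taken from the attached curves):

```python
# Sketch of the round-trip efficiency arithmetic: energy recovered on
# discharge divided by energy pushed in on charge. Wh values invented.

wh_in = 920.0   # energy pushed in during a full charge
wh_out = 890.0  # energy recovered on the following discharge
round_trip_eff = wh_out / wh_in

print(f"round-trip efficiency: {round_trip_eff:.1%}")  # ~96.7%
print(f"round-trip loss: {1 - round_trip_eff:.1%}")    # ~3.3%
```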
 
I'm trying to update one of my JK Inverter BMSs from v15.29 to v15.30, but when I try to update, the progress bar stays at 0%, then the BMS hangs and reboots by itself. Has anyone seen this issue before, and what's the solution? I updated another BMS yesterday without an issue, so it doesn't appear to be my RS485 adapter or wiring.

Here's a video of the issue:

Edit: Tried a forced update with the same results. Update fails to start, BMS hangs then reboots by itself

Video of the forced update attempt here:
 
I'm trying to update one of my JK Inverter BMSs from v15.29 to v15.30, but when I try to update, the progress bar stays at 0%, then the BMS hangs and reboots by itself. Has anyone seen this issue before, and what's the solution? I updated another BMS yesterday without an issue, so it doesn't appear to be my RS485 adapter or wiring.

Here's a video of the issue:

Edit: Tried a forced update with the same results. Update fails to start, BMS hangs then reboots by itself

Video of the forced update attempt here:
I have exactly the same issue with one BMS; I'm pretty sure it is a hardware issue. I have been in contact with JK BMS, but they keep sending me the same procedure to try to force-update the device. Does yours still work as an RS485 slave? Mine did not see any slaves (when configured as master) and was not seen by the master when configured as slave.

I found possibly another bug. I use an ESP with ESPHome as a Bluetooth-to-WiFi bridge and connect it to Home Assistant, and while that works like a charm, my DCL in Victron VRM goes all over the place; as soon as I stop the Bluetooth connection, it is fine. See the attachment: in the middle, the Bluetooth connection is turned off.
 

Attachments

  • JK BMS bluetooth issue.PNG
I have exactly the same issue with one BMS; I'm pretty sure it is a hardware issue. I have been in contact with JK BMS, but they keep sending me the same procedure to try to force-update the device. Does yours still work as an RS485 slave? Mine did not see any slaves (when configured as master) and was not seen by the master when configured as slave.

I found possibly another bug. I use an ESP with ESPHome as a Bluetooth-to-WiFi bridge and connect it to Home Assistant, and while that works like a charm, my DCL in Victron VRM goes all over the place; as soon as I stop the Bluetooth connection, it is fine. See the attachment: in the middle, the Bluetooth connection is turned off.
I'm in contact with their Alibaba store at the moment and they're sending me the same forced update procedure but it's not working:

1.Disconnect the host.
2.Select Force update from PC software. (Need authorization code, I will provide you)
3.Press the activation button without releasing it.
4. Press the reset button and release it quickly. The PC software will then connect and start the firmware update.
5.If this fails, repeat the preceding steps.

I've tried this so many times now and still no luck. Out of interest, what firmware version is your faulty BMS running? I only received my BMSs within the last week, so I have some recourse to file a warranty claim, I guess.
 
It makes a lot of sense.
First charge to 3.45 V with constant current, then hold constant voltage until the current falls to 0.05C.
However, there is a risk that during constant-current charging, due to cell inconsistency, some cell voltages will exceed 3.45 V, which will cause the actual charged capacity of the battery to be smaller than the rated capacity.
On the premise of balance and consistency, we can do this if the inverter supports constant-voltage charging.
At present, our standard is to charge at constant current first and then at constant voltage, so all charging strategies can be realized on the premise of good battery consistency.
 
This is a very easy thing to fix. Essentially, you implement a protected area in flash on the microcontroller where you place a bootloader. Even if everything else fails, the bootloader is there and on BMS power-up it can be triggered to get the firmware from the PC.

Agree this is the common / best-practice way to handle updates in microcontrollers, and if sufficient flash storage is available you can even use two images, allowing boot from the previous image (fallback) in case the update got interrupted or the new image fails to start or crashes.
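The A/B-image idea can be sketched roughly as follows (illustrative Python only; a real bootloader would do this in C on the MCU, and the slot layout, CRCs, and version numbers here are all invented):

```python
# Sketch of dual-image (A/B) fallback: the bootloader boots the newest
# image whose CRC checks out, falling back to the previous image if the
# new one was corrupted by an interrupted update. Illustrative only.
import zlib

def valid(image: bytes, stored_crc: int) -> bool:
    return zlib.crc32(image) == stored_crc

def select_image(slot_a, slot_b):
    """Each slot is (image_bytes, crc, version); boot the newest valid one."""
    candidates = [s for s in (slot_a, slot_b) if valid(s[0], s[1])]
    if not candidates:
        return None  # no valid image: stay in bootloader, wait for the PC
    return max(candidates, key=lambda s: s[2])

a = (b"firmware-v15.30", zlib.crc32(b"firmware-v15.30"), 1530)
b = (b"firmware-v15.29", zlib.crc32(b"firmware-v15.29"), 1529)
corrupt_a = (b"firmware-v15.30"[:-1], a[1], 1530)  # interrupted update

print(select_image(a, b)[2])          # 1530: boots the new image
print(select_image(corrupt_a, b)[2])  # 1529: falls back to the old one
```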

I would be super happy if we can have an open source firmware for these BMSes, but agree the concerns about potential warranty claims.

While there is a possibility to "brick" the device (which in practice means that, depending on the uC used, you may need another way to (re)load the bootloader/software), the biggest risk I see is that improper firmware may drive the hardware in ways that damage the circuitry:

- board running outside its design limitations / safe operating area, e.g. higher current than the hardware can handle
- driving outputs in such a way that things get damaged (not sure this is possible with this hardware design -- I did not see any schematics)

...or simply some of the safety features not functioning correctly due to software bugs.

All of this can potentially happen with 'official vendor-provided' firmware, 'released open source' firmware, or 'self-compiled' firmware; in all cases, a proper testing procedure as part of the software release process is key to preventing issues in the battery systems in use.
=> for me this would probably mean I'll set up a smallish non-production / testing setup so I can run various tests before deploying to the BMSes in the production environment. I guess not all users want to go to that extra trouble (and expense)....

What in my eyes would be an excellent path forward is a solid way to identify and prove whether a board is running vendor-supported (signed?) firmware or 'something else'. If not running the vendor-supported firmware --> warranty void; you're on your own risk.

Getting to the point where the JK BMSes would be running open source firmware would allow JK to benefit from the community feedback and contributions (git pull requests?), while staying in control of quality and release vendor supported firmware.

At the same time I can think of other BMS manufacturers being very interested in that firmware repository too, potentially resulting in (more?) clones of the hardware... There are many aspects to consider before making the firmware open source :cool:
 
At the same time I can think of other BMS manufacturers being very interested in that firmware repository too

I don't think anyone would be really, except for maybe protocol implementations for Victron or such. All the rest is bog standard current/voltage measurement and controlling some switches. The BT module is a standard one connected to a UART; the protocol is nothing special. I have students doing this in their final year embedded systems project.
 
It makes a lot of sense.
First charge to 3.45 V with constant current, then hold constant voltage until the current falls to 0.05C.
However, there is a risk that during constant-current charging, due to cell inconsistency, some cell voltages will exceed 3.45 V, which will cause the actual charged capacity of the battery to be smaller than the rated capacity.
On the premise of balance and consistency, we can do this if the inverter supports constant-voltage charging.
At present, our standard is to charge at constant current first and then at constant voltage, so all charging strategies can be realized on the premise of good battery consistency.
I am sorry that whatever translator you use cannot properly translate what I tried to share.
It is clear you did not understand my posting or that of others. This is terribly unfortunate.
Your answer makes no sense... 0.05C for a 100 Ah battery is 5 A, not 0.5C / 50 A, which is what it can accept for charging.

The vast majority of energy storage system users (like the members here) also do not charge their batteries, and especially banks of batteries, at 0.5C; rather, it is fairly common for most of us to charge our batteries at around 0.2-0.3C.
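The C-rate arithmetic in question is just capacity times rate; a quick sketch (the function name is invented, values from the post above):

```python
# Sketch of C-rate arithmetic: current (A) = capacity (Ah) * C-rate.
def c_rate_amps(capacity_ah: float, c_rate: float) -> float:
    return capacity_ah * c_rate

print(c_rate_amps(100, 0.05))  # 5.0  A: EndAmps-level tail current
print(c_rate_amps(100, 0.5))   # 50.0 A: typical LFP max charge
print(c_rate_amps(100, 0.25))  # 25.0 A: the common 0.2-0.3C range
```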
 
I am sorry that whatever translator you use cannot properly translate what I tried to share.
It is clear you did not understand my posting or that of others. This is terribly unfortunate.
Your answer makes no sense... 0.05C for a 100 Ah battery is 5 A, not 0.5C / 50 A, which is what it can accept for charging.

The vast majority of energy storage system users (like the members here) also do not charge their batteries, and especially banks of batteries, at 0.5C; rather, it is fairly common for most of us to charge our batteries at around 0.2-0.3C.
Seems to make sense to me....
Charge at the higher rate while in constant-current mode, then at a reduced rate of 0.05C in constant-voltage mode...
i.e. a current-limited, constant-voltage supply from the charge controller (not all are capable of this, however...)
 
I am sorry that whatever translator you use cannot properly translate what I tried to share.
It is clear you did not understand my posting or that of others. This is terribly unfortunate.
Your answer makes no sense... 0.05C for a 100 Ah battery is 5 A, not 0.5C / 50 A, which is what it can accept for charging.

The vast majority of energy storage system users (like the members here) also do not charge their batteries, and especially banks of batteries, at 0.5C; rather, it is fairly common for most of us to charge our batteries at around 0.2-0.3C.
For what it's worth, I'd like to get them to figure out proper functioning at higher C rates for a single battery instance, the worst-case scenario. Then, if you want to add multiples and reduce C rates, that's fine for those users that choose to.
 
Seems to make sense to me....
Charge at the higher rate while in constant-current mode, then at a reduced rate of 0.05C in constant-voltage mode...
i.e. a current-limited, constant-voltage supply from the charge controller (not all are capable of this, however...)
Using the simplest 100 Ah battery values:
1C / 100 A discharge & 0.5C / 50 A charge are the typical values for common LFP. These are the maximums (excluding variants).
When charging a 100 Ah pack to 3.450 V per cell (24 V = 27.6 V, or 48 V = 55.2 V), you can push 50 A at it at that voltage, but once the "set" voltage is reached, the amps taken by the pack decrease naturally (due to IR) and continue to fall until it takes nothing (0.0 A) and goes idle. The BMS has nothing to do with this... This is a result of the chemical (physical) interaction within the cells, and there's nothing anyone can do about it.... It is what it is.

FLOAT = Constant Voltage / Variable Current!
It keeps the voltage at 27.6/55.2 and will push that 50 A (if capable), but shortly after the cells reach that SET point, IR starts to build and the input is reduced as a result. There is NO NEED for the BMS to manipulate current; it is outside of that and irrelevant.

A solar controller or charger in FLOAT mode will lower the amps available relative to IR. This applies to ALL charging systems that have a FLOAT function (which is all of them, even the cheapos). That being said, some of the cheap value-grade SCCs/chargers are not so good at it.

MOST IMPORTANTLY: LFP (like other lithium-based batteries) uses only 2-STAGE charging, unlike FLA/AGM etc.:
Constant Voltage/Constant Current (Bulk/Absorb)
Constant Voltage/Variable Current (FLOAT)

Case in point, using my bank of 6x 280 Ah packs, which can deliver 1200 A on discharge and take 600 A charge. I can solar-charge with 170 A and also full push-charge using the inverter, adding another 80 A for a total of 250 A charge, which is NOT the 600 A they can take collectively. Right now my battery bank is charging (low solar) and, for all intents and purposes, trickling the packs. Solar will pick up and push more, and they'll take it without a blink. On a nice sunny day the system usually reaches FLOAT around 12:00-13:00, and the solar system's SmartShunt is set to use EndAmps (14 A) for the ABSORB transition to FLOAT. When the SmartShunt reads 14 A taken (by the bank), it transitions to FLOAT (variable current), and within 30 minutes ALL packs are at 3.450 V per cell with no more than 0.003 V deviation.
BULK charges the packs up "full throttle", then switches to ABSORB, which runs for a set 1 hour OR until EndAmps is reached (which usually takes only 25-35 minutes), and then flips to Float.

I have watched all my packs sit at 100%, taking 0.1 A to 0 A after balancing is done. Float at this point only provides whatever energy is "requested" by the inverter and bypasses pulling anything from the battery bank. It will only draw from the bank if there are not enough solar input amps to service the demand. As soon as the draw is over, Float resumes re-topping the packs and rebalancing as needed. This is all transparent and requires zero intervention.

THROTTLING by the BMS is needed for handling things like PRE-CHARGING an inverter/AIO on startup, as the initial pull surge is often seen by a BMS as a SHORT, shutting it down. The new JKs have this function and it works quite well. There is no other "throttling" required. If the BMS can output 200 A and handle a 5-second surge up to 350 A, that is what you have to work within (constraints). If you need to deliver more constant current or surge handling, then you need a bigger battery & BMS to match, or more packs to divide/share both load & charge.
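The two-stage behaviour described in this post can be sketched as a tiny decision function (thresholds taken from the post's 3.450 V / 280 Ah EndAmps example; purely illustrative, not any controller's real firmware):

```python
# Sketch of the two-stage LFP charge logic described above:
# Bulk/Absorb (push all available current toward the set voltage)
# until EndAmps, then Float (hold voltage, variable current).

ABSORB_V_PER_CELL = 3.450
END_AMPS = 14.0  # 280 Ah * 0.05, per the example in this thread

def charge_stage(cell_v: float, bank_amps_taken: float) -> str:
    if cell_v < ABSORB_V_PER_CELL:
        return "BULK/ABSORB"  # below set voltage: take all available amps
    if bank_amps_taken > END_AMPS:
        return "ABSORB"       # at set voltage, current still tapering
    return "FLOAT"            # saturated: hold voltage, serve loads/balance

print(charge_stage(3.30, 200.0))  # BULK/ABSORB
print(charge_stage(3.45, 40.0))   # ABSORB
print(charge_stage(3.45, 12.0))   # FLOAT
```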
 
I'm in contact with their Alibaba store at the moment and they're sending me the same forced update procedure but it's not working:



I've tried this so many times now and still no luck. Out of interest, what firmware version is your faulty BMS running? I only received my BMSs within the last week, so I have some recourse to file a warranty claim, I guess.
15.12
 
Using the simplest 100 Ah battery values:
1C / 100 A discharge & 0.5C / 50 A charge are the typical values for common LFP. These are the maximums (excluding variants).
[...]
THROTTLING by the BMS is needed for handling things like PRE-CHARGING an inverter/AIO on startup, as the initial pull surge is often seen by a BMS as a SHORT, shutting it down. The new JKs have this function and it works quite well. There is no other "throttling" required. If the BMS can output 200 A and handle a 5-second surge up to 350 A, that is what you have to work within (constraints). If you need to deliver more constant current or surge handling, then you need a bigger battery & BMS to match, or more packs to divide/share both load & charge.
Well aware of how they work (it's been my job and 'hobby' for most of my life, lol, and I'm well past 50...). However, many systems (well, over here, where solar is actually affordable and widespread) have arrays well in excess of what is needed for charging on 'good days' and are specced for the 'worst case' days, and they do need throttling on good days to prevent over-current charging of the battery bank...

Re precharging: this is something many DIYers seem to have gotten into the habit of ignoring (with resultant long-term damage to their inverters; high-current surges into the inverter's capacitors are 'bad, m'kay' and will result in premature failure of those capacitors), all for the sake of omitting one lousy 5A MCB breaker and a high-wattage resistor (literally a QH headlight bulb will do on 12 or 24V, or a pair of 24V ones in series on 48V; total cost under $10).
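The headlight-bulb precharge trick works because the series resistance turns a dead-short inrush into a gentle RC charge of the inverter's DC-link capacitors. A back-of-envelope sketch, where the 4 Ω cold-filament resistance and 10 mF link capacitance are assumed purely for illustration:

```python
# Rough numbers for the precharge trick: a series resistor (or cold headlight
# filament) limits inrush into the inverter's DC-link capacitors. The 4 ohm
# and 10 mF values are illustrative assumptions, not measured figures.
import math

V_BATT = 51.2    # nominal 48 V LFP bank voltage
R_PRE = 4.0      # precharge resistance, ohms (assumed)
C_LINK = 0.010   # inverter DC-link capacitance, farads (assumed)

peak_inrush = V_BATT / R_PRE   # ~12.8 A instead of a multi-kA spike
tau = R_PRE * C_LINK           # RC time constant, seconds

def cap_voltage(t_seconds: float) -> float:
    """DC-link voltage t seconds after closing the precharge path."""
    return V_BATT * (1.0 - math.exp(-t_seconds / tau))
```

After about five time constants (0.2 s here) the capacitors sit above 99% of pack voltage, and the main breaker can close with essentially no spark.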

Cute little batteries ;-) My own 20kWh LYP bank (16x 400Ah) can safely take 3C on charge and discharge, with peaks of up to 10C on discharge (so 1200A, with peaks up to 4000A, on discharge). The last big system I put in had 50kWh of battery storage on a 50kW solar array...
 
Correct logic.
There are two ways to solve this problem.
First: three-stage or multi-stage charging. The logic is to gradually reduce the current during charging (constant voltage with stepped current reduction, implemented in stages), setting different charge and discharge rates according to SOC.
Generally there are three stages. The ideal approach is to set different charging logic according to the MSDS and SOC-OCV curve of each manufacturer's cells, but that path is more complex. That scheme is generally used in electric-vehicle systems, while the cells in the energy-storage market are varied and come from many brands; it is difficult to maintain a different MSDS-based profile for each one, so we can only build complete plans for a few brands.
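One possible reading of the "different charge rates per SOC" staging above is a simple current-limit lookup table. The breakpoints and C-rates below are illustrative assumptions, not values from any manufacturer's MSDS or SOC-OCV curve:

```python
# Three-band charge-current limit table as a sketch of staged charging.
# Breakpoints and C-rates are illustrative assumptions only.

CAPACITY_AH = 280

def max_charge_current(soc_pct: float) -> float:
    """Charge-current ceiling (in amps) for a given state of charge."""
    if soc_pct < 80.0:
        return 0.5 * CAPACITY_AH    # 0.5C while the cells accept it easily
    if soc_pct < 95.0:
        return 0.2 * CAPACITY_AH    # taper as the pack approaches full
    return 0.05 * CAPACITY_AH       # C/20 top-off band near 100%
```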
Second: improve balancing efficiency.
For large-capacity cells, due to cell-to-cell inconsistency, the energy-transfer scheme can feel weak in balancing efficiency. At present, the maximum BMS current in the energy-transfer balancing scheme is 2A, transferred from the high cell to a supercapacitor and then to the low cell. This means that, even without considering energy losses, the theoretical effective balance current is 1A.
The current balancing schemes are as follows.
Assume there are currently eight cells,
namely:
① 3.50V ② 3.46V ③ 3.44V ④ 3.48V ⑤ 3.42V ⑥ 3.30V ⑦ 3.38V ⑧ 3.36V
1. Inductive balancing: the logic is adjacent-cell balancing.
Transfer from cell ① to adjacent cell ②, then from cell ② to adjacent cell ③. Only one channel balances at a time, and only one maximum voltage and one minimum voltage are balanced.
2. Capacitor balancing: the logic is whole-group, energy-transfer balancing.
Transfer from the highest cell ① to ⑧, then from ④ to ⑦. Only one channel balances at a time, and only one maximum voltage and one minimum voltage are balanced.
3. Unidirectional DC balancing: the logic is to transfer the total (pack) voltage to a single cell.
Here the total voltage is 27.34V. The virtual total voltage transfers energy to ⑧, then to ⑦.
Multiple channels balance at the same time; depending on the number of AFEs, several minimum-voltage cells can be balanced at once.
This scheme is generally paired with a passive equalization function as well, using resistive discharge to bleed down high-voltage cells.
4. Bidirectional DC balancing: the logic is that the total voltage transfers to each cell, and each cell can transfer back to the total voltage at the same time.
Here the total voltage is 27.34V. The virtual total voltage is transmitted directly to ⑧ through the transformer, and likewise to ⑦,
while at the same time ① converts its voltage into the virtual total voltage, and ② does the same.
Multiple channels balance at once; depending on the number of AFEs, there can be multiple maximum- and minimum-voltage cells balancing simultaneously.
5. SOC balancing: pre-equalization is carried out according to the system strategy. Each cell's SOC determines whether it needs to be balanced in advance, and SOC equalization is controlled for maximum efficiency.
Within the SOC balancing range, the pack capacity can always be kept within 5% of a single cell's capacity, until pack capacity falls below 80% in the 13th year. In our testing this can increase battery life by 30%.
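The highest-to-lowest energy-transfer balancing in schemes 2-4 can be mimicked with a toy model. Real balancers move charge (not voltage) and LFP voltage is far from linear in SOC, so treat this only as a sketch of the convergence behavior: one channel at a time, a fixed "packet" per step:

```python
# Toy model of highest-to-lowest energy-transfer balancing. One channel at a
# time, a fixed voltage "packet" per step; a deliberate simplification of how
# real transfer balancers behave.

def balance_step(cells, delta_v=0.01):
    """Move one packet from the highest cell to the lowest cell."""
    cells = list(cells)
    hi = cells.index(max(cells))
    lo = cells.index(min(cells))
    cells[hi] -= delta_v
    cells[lo] += delta_v
    return cells

cells = [3.50, 3.46, 3.44, 3.48, 3.42, 3.30, 3.38, 3.36]  # example from the post
for _ in range(60):
    cells = balance_step(cells)
spread = max(cells) - min(cells)  # shrinks from 0.20 V toward the packet size
```

Note that the total (27.34V here) is conserved; balancing only redistributes it, it cannot create capacity.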
Using the simplest 100AH battery values:
1C / 100A discharge & 0.5C / 50A charge are the typical values for common LFP. These are the maximums (excluding variants).
When charging a 100AH pack to 3.450V per cell (24V = 27.6V, 48V = 55.2V), you can push 50A at it at that voltage, but once the SET voltage is reached, the amps taken by the pack decrease naturally (due to IR) and continue to fall until the pack takes nothing (0.0A) and goes idle. The BMS has nothing to do with this... it is a result of the chemical (physical) interaction within the cells, and there's nothing anyone can do about it... It is what it is.

FLOAT = Constant Voltage / Variable Current!
It keeps the voltage at 27.6/55.2 and will push that 50A (if capable), but shortly after the cells reach that SET point, IR starts to build and the input is reduced as a result. There is NO NEED for the BMS to manipulate current; that happens outside the BMS and is irrelevant to it.

A Solar Controller or Charger in FLOAT mode will lower the amps delivered as IR rises. This applies to ALL charging systems that have a FLOAT function (which is all of them, even the cheapos). That being said, some of the cheap value-grade SCCs/chargers are not so good at it.

MOST IMPORTANTLY: LFP (like other lithium-based batteries) uses only 2-STAGE charging, unlike FLA/AGM etc.:
Constant Voltage/Constant Current (Bulk/Absorb)
Constant Voltage/Variable Current (FLOAT)
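The two-stage profile can be modeled minimally as constant current below the setpoint and a naturally tapering current once the setpoint is held. The exponential shape and 30-minute time constant below are assumptions for illustration; the real taper is set by the cells' internal resistance:

```python
# Minimal two-stage (CC/CV) charge model. The exponential CV taper and the
# 30-minute time constant are illustrative assumptions, not measured LFP data.
import math

I_BULK = 50.0    # 0.5C constant-current limit on a 100 Ah pack, amps
V_SET = 55.2     # 16S LFP at 3.450 V/cell
TAU_MIN = 30.0   # assumed taper time constant, minutes

def charge_current(pack_v: float, minutes_at_setpoint: float) -> float:
    """Current the pack accepts: full CC below setpoint, tapering in CV."""
    if pack_v < V_SET:
        return I_BULK                                          # stage 1: CC
    return I_BULK * math.exp(-minutes_at_setpoint / TAU_MIN)   # stage 2: CV taper
```

This matches the observed behavior in the thread: nothing the BMS does creates the taper; holding the voltage is enough, and the current falls on its own.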

 
If you charge a Battery Pack to 55.2V and that is the SETTING within any/all charging devices, they will not go above that voltage.
You can push 5A or 500A, but if the physical battery will NOT take it, you cannot force it... It's not like compressed gas; you cannot squeeze more into it under pressure, and trying to do so would result in very nasty outcomes. The chemistry's physical limits are just that: limits. You can charge to 55.2V and the amps taken will drop to 0A; nothing more will go in once the internal resistance is high enough, which is the point where the cells are totally "saturated" and incapable of taking any more "at that set voltage". Now if you up it to 55.3V, they'll take a wee bit more, but will soon take nothing again as IR is pushed back up.

Not gonna dance around this... no point...

BOTTOM LINE: JK is STILL BUSTED, and the SOC values displayed/presented by the BMS are TRASH and useless.
The BMSes work just fine in standalone mode and behave as they should for all intents & purposes.
The BMSes have issues interacting with AIOs, which are largely protocol-implementation failures that must be solved for proper operation & interoperability.
Software, both the PC & phone apps, needs updating, as well as a few functions to make operating these easier/better/more efficient. E.g., having a BMS profile you can set & upload to your whole BMS fleet instead of having to painfully do each one.
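The "set one profile, upload to the whole fleet" idea might look like the sketch below. `apply_profile` and the pack dicts are hypothetical stand-ins; a real tool would have to speak JK's actual register protocol, which this does not model:

```python
# Hypothetical sketch of broadcasting one settings profile to a fleet of
# packs. apply_profile and the pack dicts are invented for illustration;
# this does not model any real JK BMS protocol.

PROFILE = {
    "cell_ovp": 3.55,         # per-cell over-voltage protection, volts
    "cell_uvp": 2.80,         # per-cell under-voltage protection, volts
    "balance_start_v": 3.40,  # begin active balancing above this
    "charge_limit_a": 100,    # charge-current limit, amps
}

def apply_profile(fleet, profile):
    """Merge the same settings into every pack instead of editing each one."""
    return [{**pack, **profile} for pack in fleet]

fleet = [{"id": f"pack-{n}"} for n in range(6)]  # six packs, like the bank above
fleet = apply_profile(fleet, PROFILE)
```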

Active Balancing with the new JK is NOT A PROBLEM. My own experience with MANY of the previous versions & the current Inverter model, the New Inverter Model BMS is much better at the Active Balancing. It is smoother & faster actually.

SOC stating that it has reached 100% when ALL cells are at 3.385 (NOT 3.450 as SET in the Control Panel of the BMS) IS a problem (for anyone relying on the SOC to AIO interface for charging management).
 
I don't think anyone would be really, except for maybe protocol implementations for Victron or such. All the rest is bog standard current/voltage measurement and controlling some switches. The BT module is a standard one connected to a UART; the protocol is nothing special. I have students doing this in their final year embedded systems project.
 
For the SOC control logic of the inverter, the inverter's protocol needs to be analyzed again.
We used a Cerbo before,
then a Raspberry Pi.
Now the inverter itself is used for debugging.
 
