diy solar

What is my correct SOC? BMS ok?

BigBrosMo
New Member
Joined: Feb 6, 2021
Messages: 93
Hi all, I've been lurking and learning for a while and finally just built a new 24V, EVE 280Ah LiFePO4 system. I am cost conscious, so I'm using components that have been recommended by the various well-known YouTubers:

EPEVER 4210 SCC
TR16/AiLi Battery Monitor (350A shunt)
JK/Jikong 8s BMS K-B2A8S20P (spec sheet says it has an onboard coulomb counter)

I top balanced my batteries to 3.64V, with current down to 0.1A according to the charger used.

I temporarily connected the system up in series and attached the BM and BMS. Voltages measured were all about the same - or so close that I assumed an acceptable margin of error. I needed the power for the night so I attached my inverter. The next day I set up the system in its permanent home and connected the SCC.

I am on an island in Thailand and conditions have not been perfect. It has basically been raining nonstop since I connected it all (today looks like it might get sunny, fingers crossed).

So here I am, roughly 2.5 days in, with intermittent discharge of the battery and very little incoming charge. But now all of the units are reporting different voltages and SOC, so I have no idea which one to rely on! Here's a snapshot in time:

BMS is reporting:
26.35V system / cells balanced at 3.293V each
69% of 280Ah capacity
0.75A incoming charge (the SCC has minimally kicked in and the system is under small load)

Battery Monitor is reporting:
26.22V
43.9% of 280Ah capacity
0.635A incoming charge

SCC is reporting:
26.4V
2.4A of charging current

The Volt Meter is reporting:
26.02V total / 3.253V per cell at the batteries themselves
26.07V and 2A output at the SCC

So my question is: since everything is reporting a different value, how do I trust that the SCC and BMS will do their jobs correctly and adhere to the settings I put in? How can they vary so much from the VM, and how do I know which meter is correct for %SOC (BM or BMS)?

For the BMS, a 0.4V difference per cell is quite big, and for the SCC my charging profile will be all out of whack, won't it?
 
Could you provide a schematic of the system?
Did you set the Ah rating of the battery in the AiLi settings?
 
You need a complete cycle or three for the BMS to read accurately, so you need to accomplish that before looking for any other issues.
Small voltage differences can be caused by voltage drop (use larger wires to reduce that) or poor connections.
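As a rough illustration of how quickly wire resistance adds up, here is a minimal Python sketch; the cable length, gauge, and currents are made-up example numbers, not measurements from this system:

# Approximate round-trip voltage drop across a copper cable run.
# Copper resistivity is roughly 0.0175 ohm*mm^2/m.
def voltage_drop(current_a, length_m, area_mm2, resistivity=0.0175):
    resistance = resistivity * (2 * length_m) / area_mm2  # out and back
    return current_a * resistance

print(round(voltage_drop(2, 3, 6), 3))   # ~0.035V at 2A of charge current
print(round(voltage_drop(40, 3, 6), 3))  # ~0.7V at 40A of inverter load

The drop is negligible at trickle-charge currents but grows linearly with load, which is why meters at different points in the system disagree more under load.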
 
First... make absolutely certain all your connections are properly torqued. Loose connections will screw you at every turn.

BMS is reporting:
26.35V system / cells balanced at 3.293V each
69% of 280Ah capacity
0.75A incoming charge (the SCC has minimally kicked in and the system is under small load)

Battery Monitor is reporting:
26.22V
43.9% of 280Ah capacity
0.635A incoming charge

The two are reporting NET current including charge sources and loads.

Concerns are the different battery voltages and currents.

These voltages are "open circuit", i.e., the measurements aren't affected by current.

SCC is reporting:
26.4V
2.4A of charging current

When charging, the SCC will typically report a higher voltage than the BMS/monitor: it's measuring the voltage in response to the charge current, so no surprise it's higher. The current is also higher because the SCC only measures what it's feeding to the batteries.

The Volt Meter is reporting:
26.02V total / 3.253V per cell at the batteries themselves

Are these actually cell measurements or total divided by 8?

26.07V and 2A output at the SCC

Again, I'm not concerned about the voltage difference because the meter is reading OCV. Are you confident in the accuracy of the meter? I've seen them off a bit and had to adjust them. I have a Fluke that is my "master".

So my question is: since everything is reporting a different value, how do I trust that the SCC and BMS will do their jobs correctly and adhere to the settings I put in? How can they vary so much from the VM, and how do I know which meter is correct for %SOC (BM or BMS)?

Consider that everything you're relying upon is cheap Chinese hardware and doesn't necessarily conform to any accuracy standard. It's reasonable to trust that the equipment will function in accordance with the parameters you set and the values they measure.

For the BMS, a 0.4V difference per cell is quite big, and for the SCC my charging profile will be all out of whack, won't it?

I'm seeing a 0.04V difference per cell, not a 0.4V difference. If a BMS cuts off at 3.69 vs. 3.65, no biggie.

The SCC profile is mostly self-correcting: as it hits absorption voltage it reduces current, and the voltage approaches OCV.

Unfortunately, you're flying blind using multiple data sources with no confidence of accuracy. That said, I would probably pick the Aili monitor as the "master" and trust that the BMS is accurate enough to properly protect the battery.
 
Great info, thank you. I will digest it a bit, but right off the bat I can respond to two items:

The battery measurements were direct, i.e. at the end terminals I got 26.02V, and then 3.253V at each cell. No division by 8.

You said you're seeing a difference of 0.04V... OK! Yes, I was wrong there. Thanks for the correction.

I guess the BMS and AiLi are so different because of the currents they are seeing. The AiLi has a shunt; the BMS incorporates a coulomb counter, but I am not sure how. I would think they would both be accurate and hence nearly the same. It's annoying, as my own usage calculations, and hence the sizing of the system, better reflect what the BMS is reporting. So if I choose to go with the AiLi as reference, I am using WAY more power than anticipated and I don't know why. In the past, with my lead-acid system, I would use just under 2kWh per day with a pump, fridge, lights, device charging and occasional power tool use. Now, if the AiLi is to be believed, I am using 2.8kWh per day with no pump or power tool use?! I'll give it a few days, but something doesn't feel right.
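For scale, a quick back-of-envelope conversion of that 2.8kWh/day into amp-hours, assuming a nominal ~25.6V for a 24V LiFePO4 pack (the nominal voltage is an assumption, not a reading from this system):

daily_kwh = 2.8
nominal_v = 25.6  # assumed nominal voltage of an 8s LiFePO4 pack
daily_ah = daily_kwh * 1000 / nominal_v
print(round(daily_ah))              # ~109Ah per day
print(round(100 * daily_ah / 280))  # ~39% of the 280Ah pack per day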
 
The battery measurements were direct, i.e. at the end terminals I got 26.02V, and then 3.253V at each cell. No division by 8.

Good. That means you're getting correlation between the meter and the BMS - a little more faith in that at least.

I guess the BMS and AiLi are so different because of the currents they are seeing. The AiLi has a shunt; the BMS incorporates a coulomb counter, but I am not sure how. I would think they would both be accurate and hence nearly the same. It's annoying, as my own usage calculations, and hence the sizing of the system, better reflect what the BMS is reporting. So if I choose to go with the AiLi as reference, I am using WAY more power than anticipated and I don't know why. In the past, with my lead-acid system, I would use just under 2kWh per day with a pump, fridge, lights, device charging and occasional power tool use. Now, if the AiLi is to be believed, I am using 2.8kWh per day with no pump or power tool use?! I'll give it a few days, but something doesn't feel right.

Go buy 10 multimeters and use them to read the same battery. You'll probably get 5-10 different readings depending on how many decimal places you use. Accuracy is a function of standards and money.

Based on reading this forum, the BMS current counting is "inconsistent," so I inherently don't trust them that much. The next time you have a more significant load, measure the same current the BMS/Battery Monitor see. Maybe the BMS is more accurate. Maybe the AiLi can be calibrated.

What else has changed since the lead-acid system? If you've upgraded your inverter or up-sized it, it's likely using more power. I didn't account for that in my initial calculations, and my Quattro's use of 0.72kWh/day surprised me. It was about 30% of my "background" load.
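That 0.72kWh/day is easy to sanity-check as a continuous draw:

idle_kwh_per_day = 0.72
idle_watts = idle_kwh_per_day * 1000 / 24
print(round(idle_watts))  # ~30W of "background" draw, around the clock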
 
BMS is reporting:
26.35V system / cells balanced at 3.293V each
69% of 280Ah capacity
0.75A incoming charge (the SCC has minimally kicked in and the system is under small load)

Battery Monitor is reporting:
26.22V
43.9% of 280Ah capacity
0.635A incoming charge

SCC is reporting:
26.4V
2.4A of charging current

The Volt Meter is reporting:
26.02V total / 3.253V per cell at the batteries themselves
26.07V and 2A output at the SCC
I didn't see it mentioned, but 2 amps from the SCC with a charge current at the batteries of around 0.7A is fairly normal for me. I have an idle load that can be around that. When my inverter is on, that's automatically 1 amp, and when my radio is on, another 5 amps. I'd get similar readings to what you see.

My equipment is difficult to take readings from because the sun in my area is not laboratory quality and my readings keep fluctuating, so I work with averages.

I am using my AiLi meter on my crate setup. Once the battery is charged again, I recommend resetting yours to 100%. That's one thing you can do to calibrate. Your batteries are not likely 280Ah each. In my case my meter is set to 50Ah, but my capacity test was 48Ah. So you can adjust the battery capacity setting to get a more correct number. Hard to do without knowing your actual battery capacity.
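To see why the configured capacity matters, here is a toy coulomb counter in the spirit of what these shunt monitors do; a simplified sketch (real monitors integrate current continuously and handle efficiency, self-discharge, etc.):

# Toy coulomb counter: SOC% is just remaining Ah over the configured capacity.
class CoulombCounter:
    def __init__(self, capacity_ah):
        self.capacity_ah = capacity_ah
        self.remaining_ah = capacity_ah  # assume we start synced at 100%

    def step(self, current_a, hours):
        # Positive current = charging, negative = discharging.
        self.remaining_ah = min(self.remaining_ah + current_a * hours,
                                self.capacity_ah)

    def soc_percent(self):
        return 100 * self.remaining_ah / self.capacity_ah

# The same 28Ah drawn through two monitors, one configured for 280Ah and
# one mis-set to 250Ah, reports noticeably different SOC:
for cap in (280, 250):
    monitor = CoulombCounter(cap)
    monitor.step(-28, 1)  # 28A for one hour
    print(cap, round(monitor.soc_percent(), 1))  # 90.0 vs. 88.8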

Despite this, it still appears my AiLi will show my batteries at 100% when my SCC is still charging. I think it has to do with how efficiently the batteries charge. Say the AiLi sees 10Ah taken out of my 50Ah pack and shows 80%. Once 10Ah go back in, it shows 100%. But battery charging is not 100% efficient, so the pack may actually need 11Ah put back in to be truly charged, which is where my SCC tapers down to float voltage.
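The arithmetic behind that guess, with a 90% charge efficiency picked purely for illustration (real LiFePO4 coulombic efficiency is typically higher):

ah_removed = 10
charge_efficiency = 0.90  # illustrative assumption, not a measured value
ah_needed = ah_removed / charge_efficiency
print(round(ah_needed, 1))  # ~11.1Ah back in before the pack is truly full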

My Victron monitor has more options and seems to be more accurate. The Victron has something called a tail current, where you adjust the setting so the SOC goes to 100% at the correct point. Many more options with the Victron Battery Monitor, but quite a bit more expensive also.
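A minimal sketch of how that tail-current sync works, as I understand it; the thresholds here are illustrative placeholders, not Victron's actual defaults:

# Sync SOC to 100% only when the pack is held at its charged voltage AND
# the charge current has tapered below a tail threshold (a fraction of capacity).
def is_fully_charged(pack_v, charge_current_a, capacity_ah,
                     charged_v=28.4, tail_fraction=0.04):
    return pack_v >= charged_v and charge_current_a <= capacity_ah * tail_fraction

print(is_fully_charged(28.5, 1.5, 50))  # True: current below the 2A tail
print(is_fully_charged(28.5, 5.0, 50))  # False: still absorbing significant current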
 
Why does no one ever tell the new members to calibrate & correct their systems?
Voltage @ the battery terminals is one thing, but then you have lugs, wires, switches, shunts & fuses, and ALL add resistance, which adds up fast. This MUST be addressed or you will get inconsistent readings across the system, and that CAN cause issues.... Lithium-based is NOT brute force like lead; even 1/2 a volt out can result in problems. You will NEVER EVER see the same voltage at the battery terminals & the inverter on the "far end of the system".

Take a 24V system: you want the inverter to cut off @ 22.0V (2.75Vpc), BUT it reads 0.5V out from the actual battery. When it sees 22.0V it will still cut off at what it thinks is 22.0V, but in reality the battery may be at 21.5V (2.69Vpc).
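The per-cell numbers in that example, for the 8 cells in a 24V pack:

# Pack voltage divided across 8 series cells:
for pack_v in (22.0, 21.5, 21.0):
    print(pack_v, round(pack_v / 8, 3), "Vpc")  # 2.75, 2.688, 2.625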

Key Issues:
Charging: When charging, the SCC or Inverter/Charger (if charging via that) will always show a higher voltage than what is at the battery terminals, as the drop through the mix of "things" has that effect. But the KEY is the charge cutoff points for voltage and EndAmps/tail current. From the SCC's point of view it must be accurate when charging, or it can overcharge. What the SCC sees when it isn't charging is pretty well moot. IF an Inverter/Charger is in use & used for charging from grid/genset, then its charging voltage has to be matched up & calibrated as well.

Discharging:
The SCC itself doesn't care when it isn't charging and will likely be off. The SCC has only 1 job: to charge.
The Inverter, on the other hand, MUST be calibrated for the discharge voltage drops. The last thing you want is the inverter to think it is cutting off at 22.0V when it is in reality at 21.0V (2.63Vpc), because by then weak cells will likely be triggering cutoffs (really not a good thing).

The links in my signature cover this and more.
Good Luck
Steve
 
Just wanted to thank you, Steve (but also the rest of you!), for your response. Your signature does have a lot of useful information. My notes/observations/takeaways:

1. I will tailor the SCC voltages in consideration of the voltage differences from the BMS.

2. I was able to find a way to adjust the calibration voltage on the BMS, but unfortunately it doesn't allow me to go as far as I need. I.e. the best I can do shows a BMS voltage that's 0.2V higher than the VM measurement of the bank, or roughly 0.025V per cell higher. This tells me that the BMS cutoff points won't be accurate. Should I adjust for this in its config then?

3. I used the current calibration on the BMS, and now the AiLi battery monitor matches the BMS pretty much spot on (0.02V delta).

4. My inverter doesn't have adjustable cutoff points, so I will have to rely on the BMS's disconnect points.

So... in the end my only "issue" (I think) is the 0.2V difference between the BMS and volt meter. I will try another meter and/or see if there's a way I can calibrate the BMS voltage further (it seems it won't allow me to set a voltage that's closer than 0.2V, though I can go higher - so maybe it has an adjustment range and I am at the bottom end).

thanks again
 
Just following up for anyone interested. I used another meter and it lined up with the AiLi and the BMS. Taking that as accurate, I found a way online to calibrate the original volt meter itself by adjusting a potentiometer inside the unit. Everything (except the SCC, which I am accounting for in config) is within 0.01V now. I believe I am GOLDEN. Thanks again for all your assistance and thoughts.
 