Delayed Capacity Tests: A real-world capacity test?

Alkaline · Solar Wizard · Joined Jul 26, 2020 · Orange, TX
I think we need to change the way we are testing cells, or at least have more than one way. What I'm proposing is a delayed capacity test: charge the cell up to 3.65 V, make sure it holds voltage at 0.5 A or less, and then just set it aside for 30 days. After 30 days you perform the normal capacity test with something like a ZKE-40. The resulting capacity should be about 95% of the rated capacity, since cells normally lose 1-2% per month; if the loss is greater than that, I think it's an indicator of a cell that will be a pain to keep balanced.
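For anyone who wants the pass/fail arithmetic spelled out, here's a minimal sketch. The 1-2%/month and 30-day figures come from the proposal above; the function name, threshold choice, and sample numbers are just illustrative:

```python
# Sketch of the proposed delayed-capacity pass/fail check.
RATED_AH = 280.0           # nameplate capacity (e.g. a 280 Ah cell)
MAX_LOSS_PER_MONTH = 0.02  # upper end of "normal" self-discharge: 2%/month

def delayed_test_verdict(measured_ah: float, rest_days: float = 30.0) -> str:
    """Compare a post-rest capacity test against the expected floor."""
    months = rest_days / 30.0
    expected_floor = RATED_AH * (1.0 - MAX_LOSS_PER_MONTH * months)
    loss_pct = 100.0 * (RATED_AH - measured_ah) / RATED_AH
    if measured_ah >= expected_floor:
        return f"PASS: {measured_ah:.1f} Ah ({loss_pct:.1f}% loss) is normal self-discharge"
    return f"SUSPECT: {measured_ah:.1f} Ah ({loss_pct:.1f}% loss) -- likely hard to keep balanced"

print(delayed_test_verdict(274.4))  # ~2% loss after 30 days -> PASS
print(delayed_test_verdict(252.0))  # ~10% loss -> SUSPECT
```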

I say this because too much focus is placed on capacity and not on cell behaviour or the other foolishness a cell gets up to in a pack. Self-discharge seems to be the main culprit, and I suspect it's the main reason manufacturers *fail* a cell, not capacity. Look at the latest 280K V3: even the grade B cells, with the BIG BOLD B on them, easily hit 290+ Ah, so why are these considered B?

To test my theory I have some 280N cells from an 8-cell Basen pack. One of them completely failed, so we took the pack apart and salvaged 4 of them to make a 12s battery. That leaves us with 3 cells that appear to be OK. Let it be clear: these are about 4-year-old grade B cells (even though Basen said they were A...).

So I'm going to do a full charge, wait 1 hour, and then do a capacity test at 40 A. I'll record the info, charge it back up, leave it alone for 30 days, and then do another capacity test and compare.

If you have any input, let me know, because time is an element in this test.
 
I have also arrived at the conclusion that self-discharge is the primary cause of routine loss of top balance.

The other commonly specified culprits don't make a lot of sense:

Capacity differences will cause runners at the bottom, but a capacity difference should not inherently cause any imbalance when they all return to the top. They all got the same coulombs in and same coulombs out.

Resistance differences could only cause imbalance through energy being lost to internal heating, and my gut feeling is that effect is too small to explain the need for routine balancing; otherwise the heating would be noticeable.
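To put a rough number on that gut feeling, here's some back-of-the-envelope arithmetic. The resistance values, current, and capacity are assumed illustrative figures, not measurements:

```python
# Back-of-envelope: extra heat in a higher-IR cell during one full discharge.
I = 40.0                          # discharge current, A
r_low, r_high = 0.20e-3, 0.30e-3  # assumed internal resistances, ohms
hours = 280.0 / I                 # 7 h to empty a 280 Ah cell at 40 A

p_extra = I**2 * (r_high - r_low)  # extra heating power in the worse cell, W
e_extra = p_extra * hours          # extra heat over the discharge, Wh
print(f"{p_extra:.2f} W extra, {e_extra:.1f} Wh per discharge")
print(f"= {100 * e_extra / 900:.2f}% of a ~900 Wh cell")
# -> 0.16 W, ~1.1 Wh, ~0.12%: far too small to explain routine imbalance
```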

Therefore, it seems like IR differences are merely correlated with overall cell quality, and it's actually the higher self-discharge of the lower-quality cells that produces the routine imbalance.

I would modify your suggestion and say that two tests should be run, immediate capacity and 30-day stored capacity, and then it's the difference between these two that reveals the self-discharge property separated from the overall capacity property.

Edit: Oh wait that is your suggestion. Got it.
 
I think we need to change the way we are testing cells, or at least have more than one way. What I'm proposing is a delayed capacity test: charge the cell up to 3.65 V, make sure it holds voltage at 0.5 A or less, and then just set it aside for 30 days. After 30 days you perform the normal capacity test with something like a ZKE-40. The resulting capacity should be about 95% of the rated capacity, since cells normally lose 1-2% per month; if the loss is greater than that, I think it's an indicator of a cell that will be a pain to keep balanced.

Sounds like a very interesting theory and experiment.

Will be watching.
 
How well cells stay in balance is determined primarily by internal resistance, self-discharge rate, and the stability of those parameters over time. Low-quality cells tend to have internal resistance and self-discharge rates that change differently over time. These parameters will tend to be more consistent if a pack is built from cells from the same manufacturing batch, and they can be measured with the right test setup. Internal resistance is typically the most important parameter.

Cell capacity is not a factor unless you let a cell's voltage go outside the safe range during a charge or discharge cycle due to a difference in capacity. Doing so will degrade the internal resistance and self-discharge rate of the individual cells that have gone outside the safe voltage range.

If you have cells with consistent internal resistance and self-discharge rates but different capacities, or even if the cells in the pack aren't initially at the same SOC, they will still stay in relative balance. The only implication of not having the cells at the same initial SOC, assuming your BMS is operating properly (stopping charge when the highest cell hits the upper voltage limit and stopping discharge when the lowest cell hits the low voltage limit), is that you won't get the absolute maximum capacity out of the pack. But the relative SOC of the individual cells in the pack should stay the same in relation to each other.

If the cells in a pack have different internal resistances or self-discharge rates, the only way the pack will stay in balance is if the system has a good implementation of an automatic balancing system. But a good balancing system is actually very difficult to engineer properly, and a lot of low-cost BMSes don't have very good implementations.
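As a toy illustration of the self-discharge point (not a model of real cell chemistry; all rates are made-up assumptions), series cells see identical current, so only differing self-discharge separates their SOC over time:

```python
# Toy illustration: series cells see identical current, so only differing
# self-discharge separates their SOC. All rates are made-up assumptions.
DAYS = 90
leak_per_day = {
    "matched_1": 0.0005,  # ~1.5%/month, within the "normal" band
    "matched_2": 0.0005,
    "leaky":     0.0020,  # ~6%/month, a high self-discharge cell
}

soc = {name: 1.0 for name in leak_per_day}
for _ in range(DAYS):
    for name, leak in leak_per_day.items():
        soc[name] = max(0.0, soc[name] - leak)  # pack current affects all equally

for name, s in soc.items():
    print(f"{name}: {100 * s:.1f}% SOC after {DAYS} days")
# The matched cells track each other exactly; the leaky one ends ~13% lower,
# which is exactly the drift a balancer has to keep correcting.
```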
 
I have some “grade A” LF280N from Docan that I was thinking about testing.
 
Well, I did the test and... I'm a bit shocked 🤯 to be honest.

The 4-year-old grade B Basen cell tested to 279.4 Ah and 903 Wh. This thing outperformed my Luyuan grade A 280K V2 cells.

So I'm going to charge it up again to 3.65 V at a 0.5 A cut-off, then set it aside for 30 days, and then we'll run this test again.

But damn, I could have sworn I'd only get maybe 240... 🤨
 

Attachments

  • 2024 basen test.zip (140 KB)
  • basen-test1.png (141 KB)
  • basen-test2.jpg (224.5 KB)
  • qr-code-basen.jpg (241.3 KB)
OK, charged back up; it took 278 Ah to recharge. As for cycles, I estimate about 50-60 total, so not much. But all signs point to this being pretty decent: after the charge it held 3.54 V at idle. So let's see what it does after 30 days.
 

Attachments

  • charge-up.png (145.1 KB)
Nice test.

I have a set of eight EVE 280 that came from Xuba before Amy Wan started the new company. They're about three years old and have been in service the entire time. I didn't do an initial capacity test, which I regret only for intellectual purposes. They stay balanced, but they are also charged up every day and don't see much discharge.

Which unit are you using to discharge? You mentioned the ZKE-40 in your original post, but I wasn't sure if that's what you went with. The picture of the graphs has ZKE in it so it looks like you did. Are you testing one cell at a time?
 
Yes, the ZKE40. I have been pondering this for a while and it has been driving me nuts for at least 1-2 years. I'm sure others are wondering why their top balance magically disappears.
 
I have two 4s batteries that were top balanced and then run down to about 40% SOC and stored. If I remember right I removed the busbars. You've got me curious as to where they sit.
 
@upnorthandpersonal @RCinFLA

When a series string of batteries has different IRs, does this result in unequal amounts of energy being dissipated into them by charging? And if it does, do different proportions of that energy become heat vs. chemical energy, resulting in more even or less even chemical energy formation?
 
When a series string of batteries has different IRs, does this result in unequal amounts of energy being dissipated into them by charging?

Yes.

Look at it this way as a simplified equivalent:

(+)---R(1 Ohm)---R(2 Ohm)---(-) --> hook this up to 12V

You'll have a current flow of 4 A. The dissipation in each resistor is P = I²R, so 16 W in the 1 Ohm resistor and 32 W in the 2 Ohm resistor.

In the case of a battery, this is energy dissipated that does not make it into chemical stored energy.
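For completeness, the same arithmetic in a few lines of Python:

```python
# The 12 V / 1 Ohm / 2 Ohm example above, verified numerically.
V = 12.0
r1, r2 = 1.0, 2.0
I = V / (r1 + r2)                 # series circuit: same 4 A through both
p1, p2 = I**2 * r1, I**2 * r2     # P = I^2 * R
print(f"I = {I:.0f} A, P1 = {p1:.0f} W, P2 = {p2:.0f} W")  # 4 A, 16 W, 32 W
```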
 
Well, I did the test and... I'm a bit shocked 🤯 to be honest.

The 4-year-old grade B Basen cell tested to 279.4 Ah and 903 Wh. This thing outperformed my Luyuan grade A 280K V2 cells.

So I'm going to charge it up again to 3.65 V at a 0.5 A cut-off, then set it aside for 30 days, and then we'll run this test again.

But damn, I could have sworn I'd only get maybe 240... 🤨

I wouldn't be surprised if this holds after 30 days. I've seen very few cells with high self-discharge, and most of those were cells that had been used in high-current applications, kept at high voltage, charged outside acceptable temperatures (or stored at high temps), etc. Typically, these also don't have their rated capacity anymore. Dendrites are one thing that will give you higher self-discharge, but they tend to form under the conditions mentioned, again, especially temperature.
 
Yes.

Look at it this way as a simplified equivalent:

(+)---R(1 Ohm)---R(2 Ohm)---(-) --> hook this up to 12V

You'll have a current flow of 4 A. The dissipation in each resistor is P = I²R, so 16 W in the 1 Ohm resistor and 32 W in the 2 Ohm resistor.

In the case of a battery, this is energy dissipated that does not make it into chemical stored energy.
Thank you! These are exactly the separate understandings I'm working on reconciling.

My circuit understanding tells me, as above, that there has to be uneven energy dissipation.

My battery understanding is that the chemical energy is always equal for all batteries in the string. This is the one I am uncertain of and questioning.

Is it that the entire difference goes into heat, and the chemical energy is even across the string? Or can there be uneven chemical energy distribution in the string?
 
Yes, this is possible. There are more complex equivalent battery schematics that account for this, but it essentially depends on manufacturing tolerances, chemical composition differences, etc.
In your best guess, do you think the small imbalances that we regularly address with our BMS balancers come more from uneven self-discharge or from these complex uneven chemical energy dynamics during the charge/discharge process?
 
complex uneven chemical energy dynamics

These. I'm not convinced it's self-discharge in most cases. It's possible there are cases where it is, but not the majority. Keep in mind that internal resistance is also dynamic: it depends on state of charge, among other things. That's why an internal resistance measurement at rest does not indicate much: you want to measure the internal resistance over the entire charge/discharge cycle in order to match cells. Even slight imperfections during manufacturing can change these.
There are also more complicated aspects to keep in mind, such as the solid electrolyte interface (SEI) layer, and many more...

But a question: how much of an imbalance do you really see? I have some 5-year-old cells (32 of them, definitely not grade A) that have been in constant use 24/7. They go through dark winters where they never reach a full cycle (we had a pretty much 7-month winter this year), and I recently charged them all the way up again. There was an imbalance, but it was only a couple Ah at most and was resolved within an hour or two with a 0.6 A balance current. Does anyone see more than that?
 
But a question: how much of an imbalance do you really see? I have some 5-year-old cells (32 of them, definitely not grade A) that have been in constant use 24/7. They go through dark winters where they never reach a full cycle (we had a pretty much 7-month winter this year), and I recently charged them all the way up again. There was an imbalance, but it was only a couple Ah at most and was resolved within an hour or two with a 0.6 A balance current. Does anyone see more than that?
Yeah, it's not much. The only quantifiable balancing I have done so far was the initial balance of 6-month-old new EVE 304s, and that took 16 hours with 150 mA balancing. My balancer pauses to cool off about a third of the time during that process as well. But that's not really pack imbalance yet; I would guess there are small differences in the exact charge they left the factory with.

I have been running it in use now for two months and haven't been able to identify any significant balancing necessity at all. It will hit the balancer, which is set to a 0.010 V delta, and during the initial balance without any load I could reliably observe the balance down to 0.007 V, but in the live system, under an active and dynamic inverter load, it seems too erratic to reliably observe lower than 0.020 V or so. So the BMS chews on that 0.020 V and I don't know if it's able to make more progress; I have let it balance for up to 4 hours and I don't think it ever satisfied itself that it had achieved the 0.010 V target delta in the live system.

In my imagination the IR differences combined with the 60 Hz fluctuating load create a sort of voltage noise on top of the chemical voltages, but there is probably voltage noise in the chemical voltages this way too, like the minute differences in chemistry creating minute differences in the way the cells chemically follow the load.

What I have learned for sure is that I need to fully charge for SOC calibration much more often than for balancing. It looks like the SOC can get about 5% off after two weeks of middling operation. In my BMS I have to set SOC drift points to do this calibration, and I am using drift up to 95% at 3.40 V and drift up to 100% at 3.437 V (see the sketch below). I am sure there is improvement possible in those drift points, because they were just my own first guess, but they are working well enough. Luckily my SOC errors seem to tend in the downward direction, i.e. after two weeks with no full charge, when nearing full charge, the BMS will believe it's only come back up to around 95% when it hits the 100% drift point. My system relies on low-SOC triggers alone to call for grid input backup, so that's the safer error at least, with an accumulating underestimation of the charge state.
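For what it's worth, here is a minimal sketch of how I picture that drift-point recalibration working. The two voltage thresholds are the ones from my setup above; the function, capacity value, and update loop are just an illustrative stand-in for whatever the BMS actually does internally:

```python
# Sketch of SOC drift-point recalibration. Thresholds (3.40 V -> 95%,
# 3.437 V -> 100%) are the setup's values; everything else is illustrative.
DRIFT_POINTS = [(3.437, 100.0), (3.400, 95.0)]  # (cell volts, SOC floor %)

def update_soc(soc_pct: float, amps: float, dt_h: float,
               cell_v: float, capacity_ah: float = 280.0) -> float:
    """Coulomb-count, then drift the SOC estimate up at voltage landmarks."""
    soc_pct += 100.0 * (amps * dt_h) / capacity_ah  # amps > 0 while charging
    for v_thresh, soc_floor in DRIFT_POINTS:
        if cell_v >= v_thresh and soc_pct < soc_floor:
            soc_pct = soc_floor  # throw away the accumulated counting error
            break
    return min(soc_pct, 100.0)

# After two weeks the counter reads ~95% as the cell actually hits 3.437 V:
print(update_soc(95.0, amps=10.0, dt_h=0.1, cell_v=3.44))  # -> 100.0
```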
 
These. I'm not convinced it's self-discharge in most cases. It's possible there are cases where it is, but not the majority. Keep in mind that internal resistance is also dynamic: it depends on state of charge, among other things. That's why an internal resistance measurement at rest does not indicate much: you want to measure the internal resistance over the entire charge/discharge cycle in order to match cells. Even slight imperfections during manufacturing can change these.
There are also more complicated aspects to keep in mind, such as the solid electrolyte interface (SEI) layer, and many more...

But a question: how much of an imbalance do you really see? I have some 5-year-old cells (32 of them, definitely not grade A) that have been in constant use 24/7. They go through dark winters where they never reach a full cycle (we had a pretty much 7-month winter this year), and I recently charged them all the way up again. There was an imbalance, but it was only a couple Ah at most and was resolved within an hour or two with a 0.6 A balance current. Does anyone see more than that?
I see the same behavior in both my "A", "A-" and B cells...

Makes me wonder if the extra I paid for my "A" cells was even worth it, but time will tell I guess.
 
OK, so it took a little while longer, but I ran the other test and the cell actually did very well: only about a 2 Ah loss. It tested to 277 Ah, and the voltage had dropped from 3.63 V down to 3.445 V after about 45 or so days.

So yeah, this kind of disproves my theory :unsure:😖. Either that, or this was actually a grade A cell that got sold alongside some grade B cells; maybe it was old grade A stock?

I suppose the good old capacity test is, well, good, old, and it works! If you get good results initially you will probably have good results in the future.

So I'll probably do this test again with another cell that tested lower initially and see what it does.
 

Attachments

  • basen-test2-45day.png (143.1 KB)
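Putting numbers on that result, using only the figures reported above (279.4 Ah initially, 277 Ah after roughly 45 days), the implied self-discharge rate is well inside the 1-2%/month "normal" band:

```python
# Self-discharge implied by the 45-day result, using only figures above.
initial_ah, delayed_ah, rest_days = 279.4, 277.0, 45.0

loss_ah = initial_ah - delayed_ah                             # 2.4 Ah
pct_per_month = 100.0 * (loss_ah / initial_ah) / (rest_days / 30.0)
print(f"{loss_ah:.1f} Ah lost -> {pct_per_month:.2f}%/month self-discharge")
# -> ~0.57%/month, comfortably inside the 1-2%/month "normal" band
```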
Very interesting. Thank you for making this test and posting about it.

My "test of choice" for battery health is DC internal resistances. Resistances? As in more than one? Yes. More below. First let me say to do a test you need ability to draw lots of current (at least 0.3C) and an oscilloscope (preferrably digital so you have time to analyse the trace).

So first, about these DC resistances. I read in a science article that lithium cells' DC resistance has (at least) two components: an immediate resistance and a slightly delayed resistance. The article explained exactly what happens to develop this delayed resistance, but I forgot.

The best way to measure both is to set the oscilloscope recording on a slow trace and apply a high current (I like 0.5C) while measuring cell voltage. Keep the current on for 5-10 s and then switch it off, all while recording. The trace will look something like this:
[Attached scope trace: Compress_20240807_090450_0965.jpg]
The first step is the voltage drop caused by the immediate DC resistance. The second can be measured once the voltage reaches steady state, by measuring the total rebound voltage and subtracting the first step (some cells will also show a time-delayed rebound).

What the values should be can be inferred from the discharge curves in the datasheet, but also, once you test a few good cells you'll know what to expect.

The best thing about the test? It takes seconds and you can do it without pre-charging.
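If you log the trace points rather than eyeballing the scope, the two components fall out of simple division (R = ΔV/I). A sketch under a couple of assumptions: the post reads the components off the rebound at switch-off, while this version uses the equivalent drop at switch-on, and the voltages, current, and function name are all illustrative:

```python
# Sketch: pulling the two DC resistance components out of a load-step trace.
# The post reads them off the rebound at switch-off; the drop at switch-on
# gives the same two components, which is what this version uses. All
# voltages, the current, and the function name are illustrative assumptions.

def dc_resistances(v_rest: float, v_instant: float, v_steady: float,
                   current_a: float) -> tuple[float, float]:
    """Return (immediate, delayed) resistance in milliohms.

    v_rest    -- open-circuit voltage just before the load step
    v_instant -- voltage immediately after the current is applied
    v_steady  -- voltage after 5-10 s under load (steady state)
    """
    r_immediate = (v_rest - v_instant) / current_a
    r_delayed = (v_instant - v_steady) / current_a
    return r_immediate * 1e3, r_delayed * 1e3

# Made-up numbers for a 280 Ah cell pulsed at 0.5C (140 A):
r_imm, r_del = dc_resistances(3.300, 3.272, 3.258, current_a=140.0)
print(f"immediate: {r_imm:.2f} mOhm, delayed: {r_del:.2f} mOhm")
# -> immediate: 0.20 mOhm, delayed: 0.10 mOhm
```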
 
