LiFePO4 Voltage Chart?

I've been watching my battery bank voltage alongside my battery monitor kit, which seems to be a good judge of Ah in and out. I reset my Ah meter at full charge yesterday and now my 345Ah bank is down 110Ah and showing 46.0V (my 14S configuration is fully charged at 51.0V). According to the voltage chart on the first page, I am at approx 3.285V per cell, which is right around 60% SOC.

I understand the SOC % is not going to be nearly as accurate as a shunt reading, but it seems to be in the close ballpark.
My OverKill Solar showed a lower SOC vs. the Victron SmartShunt, so I changed the OverKill Solar SOC settings to better match the charts here.
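For anyone who wants to replay that cross-check, here's a minimal Python sketch using only the numbers quoted above. The gap between the two estimates is the "close ballpark" being described:

```python
# Cross-check of coulomb-counting SOC vs. the voltage chart, using the
# numbers quoted above (14S pack, 345Ah bank, 110Ah drawn, 46.0V).

BANK_AH = 345.0      # rated bank capacity, Ah
AH_USED = 110.0      # Ah drawn since resetting the meter at full charge
PACK_V = 46.0        # measured pack voltage
CELLS = 14           # 14S configuration

soc_coulomb = (BANK_AH - AH_USED) / BANK_AH * 100   # shunt-style estimate
cell_v = PACK_V / CELLS                             # per-cell voltage

print(f"Coulomb counting: {soc_coulomb:.1f}% SOC")  # -> 68.1%
print(f"Per-cell voltage: {cell_v:.3f} V")          # -> 3.286 V, ~60% on the chart
```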
 
I’ve been looking at this thread and using it as a reference since I started building my first 8S LiFePO4 battery over a year ago.

But now that I’m starting to get increasingly familiar with my newly-built 560Ah 8S battery, I’ve got some questions:

First and foremost, in terms of recommended minimum discharge levels (how close to 0% SOC should you discharge), is it the depletion of charge that can result in wear/aging of a LiFePO4 cell or is it a low voltage level?

I’m fully discharging my battery nightly (to whatever minimum voltage level I set) and I’m trying to understand what safe voltage I can deplete my cells to.

I’ve seen the chart indicating the start of the ‘red zone’ at 3.00V and the start of the ‘yellow zone’ at 3.15V which is what I’ve been using.

I’ve got my LVD set at 25.2V which is working out pretty close to 3.15V per cell (my battery is bottom-balanced).

Soon after cutting off discharge at LVD, my battery ‘Springs Back’ by ~250mV or ~31mV per cell.

The IR on my cells is not great - at least double the estimated 0.25mOhm per cell - so high IR is probably to blame for at least half of this voltage recovery. In any case, I'm assuming it is the voltage of the settled cell, with no current flowing, that reflects SOC, correct?

So the cells I disconnected at 3.15V could actually be discharged by an additional ~31mV to 3.12V (24.96V), and assuming they settle at ~3.15V, I'm still in the 'green zone', correct?

And my second question is about the meaning of the 'yellow zone'. If I discharge my cells down to an LVD of 24.00V (which means they should spring back to ~3.031V per cell, if not a bit higher), what risk am I taking in terms of reduced cycle life for my cells?

The 'chart' indicates ~4.5% capacity between 3.15V and 3.00V, but I'm starting to suspect my cells actually have more of their capacity in that range. If I knew the true 'danger zone' was based on charge and not voltage, I might discharge to 2.5V (truly empty) then charge to 10% SOC to determine the voltage levels associated with the true 'danger zone' for my specific cells…
 
To answer your first question, I believe the risk of harm to your cells comes from going below 2.5V. I know the real lower "knee" in the discharge curve is closer to 3.15V, but if you go to 3.0V you are still plenty above the danger zone.

Let me give you some no-kidding real data. I spent way too much money on one of the ECB-A40L capacity testers. It discharges a cell at a fixed current (up to 40A), graphs the voltage as it discharges, and logs it so it can be saved in a spreadsheet.

My cells are 230Ah cells, and they all exceeded the advertised capacity. So let's look at one of them. I was discharging at a constant 25A, and it went from fully charged to cut-off at 2.5V in 9:32:49 (9 hours, 32 minutes, 49 seconds). If you express that in decimal it is 9 + (32/60) + (49/3600) = 9.546944 hours. Ah is amps times hours, so 25 x 9.546944 = 238.67Ah.

Now I look at the spreadsheet to see where (or actually, when) the voltage got to 3.15V and to 3.00V. The 3.15V crossover was at 31853 seconds. The 3.0V crossover was at 33449 seconds. So the time between 3.15V and 3.0V was 33449 - 31853 = 1596 seconds, which is 1596/3600 = 0.44333 hours. At 25A discharge, Ah (amps x hours) = 25 * 0.44333 = 11.08Ah. If you go with the rated capacity of 230Ah, this is 11.083/230 = 4.8%. If you want to go with the actual capacity of 238.67Ah, it is 11.083/238.67 = 4.6%. So I'm gonna say there's a bit under 5% of the available capacity between 3.15V and 3.00V.
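If you'd rather let the computer do the math, here's the same arithmetic as a few lines of Python - nothing in it beyond the numbers already quoted:

```python
# The arithmetic above, replayed: at constant current, Ah = amps x hours.

I = 25.0                          # discharge current, A
RATED, ACTUAL = 230.0, 238.67     # rated vs. measured capacity, Ah

t_total = 9 + 32/60 + 49/3600     # 9:32:49 -> 9.547 h
print(I * t_total)                # 238.67 Ah total delivered

t_315, t_300 = 31853, 33449       # seconds at the 3.15V and 3.00V crossings
ah = I * (t_300 - t_315) / 3600   # 11.08 Ah between the two crossings
print(ah / RATED * 100)           # 4.8% of rated capacity
print(ah / ACTUAL * 100)          # 4.6% of measured capacity
```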

Now, if you wanted to see what was left between 3.0V and 2.5V, I'll save you having to see me doing the math. It works out to be 6.35Ah, or about 2.7% of the rated capacity.

So in total, if you stop at 3.15V there is still 7.5% of the capacity in the cell. If you stop at 3.0V there is only 2.7% of the capacity left. I've made the personal decision that I'm comfortable with stopping at 3.0V/cell rather than 3.15V per cell, as I've still got 0.5V/cell before danger. I'm setting my inverter LVCO at 24V (3.0V per cell). The danger is going below 2.5V. But stopping at 3.0V is plenty safe.
 
I should add: There is of course a risk that in an 8S pack there are some "downward runners" and that the cells will likely be more out of balance at the bottom, since you did top balancing. However, the real steep drop is after 3.0V. In my limited experiments I've only seen about 300mV - 400mV worst case of deviation between the cells when they get close to 24V. I have my BMS LVD set to 2.5V. If I start seeing the BMS do an LVD much before the LVCO on the inverter, I'll revise my position. ;)
 
Thank you for the awesome breakdown @Horsefly of ampere hours vs state of charge % and voltage!

Grateful for the data.

Voltage difference * Number of cells is one margin metric I try to keep myself aware of when looking at total pack voltage and stuff.
 

I did top balance before I first put my battery together, but I pretty quickly figured out that bottom-balanced works better for my application (time-shift).

I’ve been disconnecting at 25.2V and my cells are all within 5mV of 3.150V upon LVD (and then bounce back to ~3.18V soon after, as I stated earlier).

I start charging when the sun comes up, start discharging around 50% SOC, and my battery continues to charge until evening, hitting a peak near 70-75%. Discharge continues through peak hours and into the night, generally hitting LVD just before midnight (now; it should be later in summer).

So my battery never gets close to being fully-charged and my SCC has literally never made it out of Boost / Bulk.

If I’ve got runners, they will hit me near 90% SOC, not 5-10% SOC.

That’s the reason I started out asking about damage / wear at the low end - my battery will hit LVD 365 days a year, but will hit float much, much less often.

My biggest concern is whether my battery has the capacity to store all of the increased power available in summer. I only started testing the system 3 weeks ago, so it'll be ~8 months before I know…
 
Well, if you bottom balanced, I would not hesitate to go to 3.000V per cell. It is safe, as evidenced by your low cell differential voltage at 3.15V. Go for it! Don't leave that cash (Ah) on the table!
 
Cool. I'll change my LVD setting this evening. And what do you make of my 'bounce-back' of 31mV? Would you think I can safely set LVD at 2.97V, knowing the cells will likely bounce back to 3.0V as soon as the load is cut?

Assuming you’ve taken the risk of a runner away through a proper balance, is the cost / risk / wear of discharging to 10% generally considered to be less than the cost / risk / wear of charging above 90%?
 
Don't think of the 31mV rise as a "bounce". My observation of both AGM and LFP batteries is that there is a "sag" in the voltage during a discharge, and that the sag is bigger for an AGM than it is for an LFP. I've also been told that the sag is bigger for cheap AGMs than it is for higher quality AGMs. I don't have any references to give, but I believe I've read that this sag is actually just the voltage drop of the battery / cells, based on the current multiplied by the internal resistance. If that is true, the "bounce" that you are seeing is just the battery representing its true voltage with no load.
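If that IR explanation is right, the arithmetic is just Ohm's law. A minimal sketch - the 3.18V and 3mOhm below are illustrative guesses (roughly consistent with the ~31mV at ~10A mentioned earlier), not measurements:

```python
# Under load, the terminal voltage reads low by roughly I * R_internal,
# so the no-load "bounce" is just that drop disappearing.

def loaded_voltage(v_open_circuit: float, current_a: float, r_internal_ohm: float) -> float:
    """Terminal voltage under a discharge load, simple IR model."""
    return v_open_circuit - current_a * r_internal_ohm

# e.g. a cell resting at 3.18V, 10A load, 3mOhm effective resistance:
print(loaded_voltage(3.18, 10.0, 0.003))   # -> 3.15 V, a 30 mV "sag"
```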

Having said that, I wouldn't set the LVD down to 2.97V. You're making out like a bandit by setting it to 3.0V. Be happy! You won!

I'm not sure I can answer your second question. The upper knee is much steeper than the lower knee. Without dragging up all those spreadsheets again, I can tell you that there was almost nothing - like 3-4Ah between 3.65V and 3.3V. So your question isn't fair, since the top and the bottom are so different. In addition, another forum member shared with me some data that (counter to the point he was trying to make) showed that LFP is pretty tolerant of overcharging. I don't want to encourage anyone to do anything bad, but I will say that it seems going below 2.5V is way, way worse than going over 3.65V. I'll just leave it at that.

I remember thinking when I saw this data on my cells that it might make more sense for me to use bottom balancing, because there was so much more to be gained squeezing out a bit below the bottom knee than was available above the top knee. However, I determined that in my case - vast majority of the time in float every day - top balancing still made more sense.
 
Well, you are very welcome. I'm really glad you appreciated it!!

If you like that, here's some more just to show statistical relevance. I used the same capacity tool (ECB-A40L) on three other cells. The numbers were similar, but not quite the same.

Cell 4 was the one I gave the data on above:
11.083Ah from 3.15V to 3.0V (4.8% of capacity)
6.35Ah from 3.0V to 2.5V (2.7% of capacity)

Cell 5:
13.0Ah from 3.15V to 3.0V (5.6% of capacity)
6.416Ah from 3.0V to 2.5V (2.79% of capacity)

Cell 6:
11.93Ah from 3.15V to 3.0V (5.2% of capacity)
6.45Ah from 3.0V to 2.5V (2.8% of capacity)

Cell 7:
11.34Ah from 3.15V to 3.0V (4.9% of capacity)
6.34Ah from 3.0V to 2.5V (2.76% of capacity)

So it's pretty consistent, and if anything it trends toward there being more usable Ah between 3.15V and 3.0V than I reported in my first post.

Even though I guess I knew it, the epiphany for me was when I realized that with a constant current discharge, the Ah were just time. It makes the curves more meaningful and it makes the calculations easy.
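Since the Ah really are just time at constant current, the crossover bookkeeping is easy to script. A sketch in Python - the (seconds, volts) log format here is an assumption, so adapt it to whatever your tester exports:

```python
# Ah delivered between two voltage crossings in a constant-current test.

def ah_between(log, v_high, v_low, current_a):
    """Ah delivered between the first crossings of v_high and v_low.

    log: list of (t_seconds, volts) with monotonically decreasing voltage.
    """
    t_high = next(t for t, v in log if v <= v_high)  # first time at/below v_high
    t_low = next(t for t, v in log if v <= v_low)    # first time at/below v_low
    return current_a * (t_low - t_high) / 3600.0     # A x h = Ah

# e.g. with Cell 4's crossings (31853 s at 3.15V, 33449 s at 3.00V):
log = [(31853, 3.15), (33449, 3.00)]
print(ah_between(log, 3.15, 3.00, 25.0))   # -> 11.08 Ah
```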
 
How much variability did you see between the charge curves of the different cells?

I don't have a data-logger like yours, but I've seen that a whole 66% to 75% of my 16 cells have charge/discharge curves that are very similar (they have similar 'voyages' from 0% to 100%); the other 25% to 30% vary - 'faster' in some areas and 'slower' in others.

As long as capacity between min and max is similar, it all comes out in the wash, but I assume this is the difference between true Grade-A-matched cells and the cheapo aftermarket cells we’re getting from resellers…
 
Internal Resistance is a simplistic metric for a much more complex system.

I went ahead and set LVD to 24.0V last night.

Here is what I saw on a representative cell:

Just before LVD: 2.995V (~9.3875% SOC)
Seconds after LVD: 2.997V (9.4325%)
~1 minute after LVD: 3.005V (9.65%)
~6 hours after LVD: 3.038V (10.64% SOC)

The 2mV delta immediately upon shutdown is likely due to ‘true’ path resistance. Current being drawn just before LVD was ~10A meaning a 2mV delta corresponds to 0.2mOhm.

I've got 6" of 2/0 welding cable between cells representing ~0.05mOhm, plus the IR of 2 cells in parallel, which should be less than 0.25mOhm / 2 = 0.125mOhm but I suspect is higher. Whether it's contact resistance between lugs and terminals or higher IR than spec, it all pencils out, and there is only ~0.025mOhm of true cell resistance unaccounted for.
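Penciled out in Python, under the same assumptions (spec-sheet cell IR, estimated cable resistance):

```python
# Tallying the resistance budget from the post above.  The 2 mV step the
# instant the 10 A load dropped implies ~0.2 mOhm of true ohmic path.

delta_v = 0.002             # V, instantaneous jump at LVD
i_load = 10.0               # A, current just before disconnect
r_total = delta_v / i_load  # 0.2 mOhm total ohmic path

r_cable = 0.00005           # ~0.05 mOhm for 6" of 2/0 cable
r_cells = 0.00025 / 2       # two cells in parallel, 0.25 mOhm spec each

print((r_total - r_cable - r_cells) * 1000)  # -> ~0.025 mOhm unaccounted
```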

The additional ~8mV of voltage increase over the subsequent minute is much slower and cannot be due to 'true' current x resistance drops. It's caused by redistribution of charge within the cell: 'deeper' / further-from-the-'surface' charge starts equalizing with the more depleted 'surface' charge as the minutes go by.

This charge equalization process continues, and while I did not track it overnight to understand when the cell had reached equilibrium, by early the next morning, ~6 hours later, that same cell had settled to a voltage of 3.038V, another 33mV higher.

That’s the reason I call it a ‘bounce’. During discharge, the ‘surface’ / ‘closer-to-terminal’ portions of a cell are more depleted of charge than the deeper / further portions and once discharge has ceased, as that charge redistributes, it will result in voltage measured at the terminal ‘bouncing’ back from the seemingly more depleted levels being measured during active discharge.

Of course, resistance is the impeding of current flow, and 'internal resistance' is what governs and slows down the time / process needed for a cell to reach equilibrium following cessation of discharge, but 'resistance' is a one-dimensional metric for a multi-dimensional system of charge transport within these cells.

Hence the reason I think of it as a 'bounce' back from surface-level charge depletion to equilibrium-level charge depletion (as opposed to the instantaneous delta_V / delta_I effect, which is caused by true and simple resistance, both internal and external).
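One way to picture the slow phase is as a first-order relaxation toward the equilibrium open-circuit voltage (a single RC branch in an equivalent-circuit model). A rough sketch with an assumed time constant - and notice that one time constant can't fit both the 1-minute and 6-hour points above, which is itself evidence that a single 'resistance' number doesn't capture the whole system:

```python
import math

# Terminal voltage relaxing exponentially toward equilibrium OCV.
# tau is a made-up illustrative value, not a measurement.

def rebound(v_cutoff, v_eq, tau_s, t_s):
    """Terminal voltage t_s seconds after the load is removed."""
    return v_eq - (v_eq - v_cutoff) * math.exp(-t_s / tau_s)

V0, VEQ, TAU = 2.997, 3.038, 3600.0   # V0/VEQ from the log above; tau assumed
for t in (60, 3600, 6 * 3600):
    print(t, round(rebound(V0, VEQ, TAU, t), 3))
```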


My main goal in defining my minimum SOC / voltage is because that is where I will bottom-balance my cells.

In actual daily use, my battery has way more capacity than I need, so I’ll be able to set LVD above 20% (3.2V).

As we get through winter and into next summer, I'll need to use more of my battery's capacity if I want to capture all of the increased daily production.

Not clear yet whether I’ll need to push LVD all the way down to 3.00V to avoid wasting some summertime production or not…

That’s helpful to know - thanks.

Until I start generating vastly more daily power, I won't really know how balanced my cells are near the upper knee. They started out perfectly top-balanced and I only had to bleed two cells to get them bottom-balanced, so that hopefully means 6 of my cells remain top-balanced and only the 2 higher-capacity cells will be ~3-5% behind those 6 cells approaching the upper knee.

In that case, I should be able to define a Boost Voltage at 3.4V per cell which should translate to switching to Float at 27.2V when 6 cells are slightly above 3.4V and just those 2 higher-capacity cells are below 3.4V.

My BMS disconnects at 3.75V, so that is my worst-case overcharge, but I’d rather avoid the abrupt BMS disconnect if I can.

And the BMS has a passive balance function that kicks in at 3.55V that I’d also rather avoid since it would screw up my painfully-attained bottom balance…
 
Found this: https://www.google.com/amp/s/www.powertechsystems.eu/home/tech-corner/lithium-iron-phosphate-lifepo4/?amp

‘In standard environment, and for 1C cycles, we can get from the chart the below life cycle estimation for LFP :
  • 3 000 cycles at 100% DoD
  • 4 500 cycles at 80% DoD
  • 10 000 cycles at 60% DoD
  • etc.
It should be noted that following the number of completed cycle, the batteries still have a nominal capacity > 80% of the original capacity.’

So the question of how far to discharge your battery really seems to be a question of what you are aiming for.

I’m discharging and charging at less than 0.1C which should be a factor in my favor but currently not heating my battery to 25C / 77F which might be a factor against me.

4500 cycles to 80% of original capacity is way more than I need, so I can certainly discharge to 90% and can even push discharge closer to 100% when needed.

At 60% DoD, my LiFePO4 battery would outlive me (not a high priority ;)).
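Those cycle numbers translate into calendar time pretty directly if you hit LVD once a day. A quick sketch - it assumes the quoted figures apply as-is, which they won't exactly at 0.1C and cooler temperatures:

```python
# Calendar life at one full cycle per day, using the PowerTech figures
# quoted above (1C cycles, standard environment, to >80% remaining capacity).

CYCLES_TO_80PCT = {1.00: 3000, 0.80: 4500, 0.60: 10000}   # DoD -> cycles

for dod, cycles in sorted(CYCLES_TO_80PCT.items(), reverse=True):
    print(f"{dod:.0%} DoD: ~{cycles / 365:.1f} years at one cycle per day")
# -> 100% DoD ~8.2 y, 80% ~12.3 y, 60% ~27.4 y
```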
 
Yeah, I'm 63. I've said here before that I don't care much about the "fixture" compression of cells to get 3500 cycles instead of the uncompressed 2500 cycles. My reasoning is that our cabin only gets used 5-6 months out of the year, so I am confident that I will be dead before the cells are dead. I think it is revenge that my children will have to figure out what to do. :ROFLMAO:

Besides, it is highly likely that a much better technology will overtake LiFePO4 before these cells are done.
 
SOC | V (48V setup) | V/cell | My limits (self-imposed): V/cell | 12V setup | 24V setup | 48V setup
100.00% | 58.40 | 3.65 | 3.294 | 13.2 | 26.4 | 52.7
99.50% | 55.20 | 3.45 | - | - | - | -
99.00% | 54.00 | 3.38 | - | - | - | -
90.00% | 53.60 | 3.35 | 3.277 | 13.1 | 26.2 | 52.4
80.00% | 53.20 | 3.33 | 3.260 | 13.0 | 26.1 | 52.2
70.00% | 52.80 | 3.30 | 3.242 | 13.0 | 25.9 | 51.9
60.00% | 52.40 | 3.28 | 3.225 | 12.9 | 25.8 | 51.6
50.00% | 52.20 | 3.26 | 3.208 | 12.8 | 25.7 | 51.3
40.00% | 52.00 | 3.25 | 3.191 | 12.8 | 25.5 | 51.1
30.00% | 51.60 | 3.23 | 3.173 | 12.7 | 25.4 | 50.8
20.00% | 51.20 | 3.20 | 3.156 | 12.6 | 25.2 | 50.5
14.00% | 50.40 | 3.15 | - | - | - | -
9.50% | 48.00 | 3.00 | 3.156 | 12.6 | 25.2 | 50.5
5.00% | 44.80 | 2.80 | - | - | - | -
0.50% | 40.60 | 2.54 | - | - | - | -
0.00% | 40.00 | 2.50 | 2.950 | 11.8 | 23.6 | 47.2


My self-imposed limits are based on what the cells return to after a bulk charge. For the first month, I was bulk charging to 3.55V per cell; they always returned to 3.295V on their own, so why try to force them? Let them be happy at 3.295V and "call it 100%".

These are for USED cells.

New cells are just that. Battleborn expressly states to use them 100 to 0, but charging them to 90% will let the cells last far, far longer. It takes 3 times as long to go from 80% to 100% as it does from 0% to 80%. On discharge it's the opposite: it takes about 30 minutes to go from 100% to 80%, and 24 hours to go from 80% to 0%.

I set my 0% SOC point at the top of the slope down to zero, at about 15% actual SOC; I simply don't use that very low end, so I can avoid further degradation. LiFePO4 cannot be fixed once degraded; it's a one-way process.
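If you want to use a table like the one above programmatically, a piecewise-linear lookup is about all there is to it. A sketch using a handful of the rows - resting voltage only, so it won't be valid under load:

```python
# SOC estimate from resting cell voltage, interpolated between chart rows.

CHART = [  # (cell volts at rest, SOC fraction), taken from the table above
    (2.50, 0.000), (3.00, 0.095), (3.15, 0.140), (3.20, 0.200),
    (3.23, 0.300), (3.26, 0.500), (3.30, 0.700), (3.35, 0.900), (3.65, 1.000),
]

def soc_from_voltage(v):
    """Piecewise-linear SOC estimate; only meaningful for a rested cell."""
    if v <= CHART[0][0]:
        return 0.0
    if v >= CHART[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(CHART, CHART[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

print(f"{soc_from_voltage(3.285):.0%}")   # -> ~62%, near the ~60% in the opening post
```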

I think this is a very good baseline set of values, but after spending months researching this stuff and talking to different technicians, I have discovered that in reality these voltages can be up or down by a bit depending on the chemistry the manufacturer is using.
One company's voltage for 50% SOC in a 48V pack could be a volt lower or higher than another's. Charge/absorb voltages can range from 58V down to 54.4V. I have also gained a new appreciation for ESR. Mixing various cells or packs with different ESR can have a huge effect on which pack is doing the heavy lifting and which one is lagging, and also on which one charges faster and which one takes longer.
 
If it takes 1 hour to get to 80%, it takes 3 hours to get from 80% to 100%.

I usually have a 1500 watt load for an 8-hour continuous drain, running my entire house.
I have 400Ah, and it takes about 4 hours to charge my batteries with 5kW of panels.
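Those numbers pencil out, roughly. A quick sketch - the 52V average pack voltage is my assumption, and it ignores charger efficiency and the fact that panels rarely sit at full rated output:

```python
# Rough check: 1500 W x 8 h overnight, recharged from 5 kW of panels.

V_PACK = 52.0                     # assumed average pack voltage, 48V-class bank
ah_used = 1500.0 * 8 / V_PACK     # ~231 Ah drawn overnight
i_charge = 5000.0 / V_PACK        # ~96 A at full rated panel output

print(ah_used / i_charge)         # -> ~2.4 h ideal; ~4 h real-world is plausible
```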
 
I use that logic all the time when on trips in an EV. In the time it takes me to pee I can put 100 miles in my EV and be on the road to the next fast DC charger.
Of course it takes a lot less energy to get the batteries from 80 to 100 percent because Lithium is more efficient than Pb.
 
A lot of the wasted time getting to 100% is really going from 98% to 100% as you climb that exponential curve on the graph.
I get very mixed replies when I ask if it's worth it. Most companies tell me to set the charge cutoff at a lower voltage and it will save a lot of battery life. Makes sense, as going from 100% to 98% SOC only takes minutes during discharge.
 
Absolutely agree with you.
Those tables are good as a baseline; then I can do more research myself.
 
I have 4 packs of Grade A cells (Fortress Power eFlex 5.4kWh batteries) and I just ran some tests on the most recent one, with only 10 cycles on it. It will charge to the factory-recommended 55V but will drop down to 53.5V after an hour or so. I did some log checks on the other three batteries and found the same thing, with no measurable degradation as they reached their 340th cycle. They started at 53.5V and are still at 53.5V. This equates to 3.43V per cell at the 100% charge point.

I can force them higher but they will eventually still settle down at 3.43V per cell. I also have one EG4LL, which brings the bank size up to 27kWh.

Here is a night time battery usage chart from my home on Sol-Ark 12K set to go fully on battery at 10pm and run until the PV takes over in the morning.

[Attachment: FinishedBat2.jpg - night-time battery usage chart]
 