GUIDE to properly Top-Balance and Charge a LFP Battery: Part 1

That's the question, and we'll never really know whether our batteries lasted 30 years but could have made 35, or whether we added 5 years to them. I guess nobody really knows.

What I do know is that I'm going against Victron's LiFePO4 profile and LiTime's settings, and going with this group's advanced-thinking approach.

If the batteries can be charged to 3.5 V per cell (14 V for the pack) no matter what charge rate is available, then I'll set the chargers to 14 V and let them run. Some days they'll get less charge and some days more, but I'm not going to babysit batteries every day.

If it is beneficial to charge them every 6 months until the BMS shuts them down, then I’ll do that also.

I’m not looking to change minds or offer suggestions, I’m looking for someone to tell me how to set my solar chargers and let them work.

I’ll see if I can attach a snap of my solar settings, and the history graph of what these settings do. I get a big hockey stick when the charge voltage gets above 13.6. It climbs rapidly to 14V or 14.2V, whatever I have set.

I can relate to what you're saying in this thread, and I also use a Victron controller. If you don't have perfect knowledge then there is no perfect way, only acceptable compromises.

For example, unless your charge controller reads a shunt at your battery, the charge controller cannot know the current actually flowing into your battery. It only knows its own output, which is the inverter current plus the charge current. As you mentioned, the inverter current can randomly be high or low, and that will mess with the current-taper cutoff in the controller. Another issue is that the Victron doesn't have voltage-sense cables: it measures the battery voltage at its own terminals. So unless you get an external sense dongle, it will only be accurate at low currents.
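To put rough numbers on that (a minimal sketch with made-up values; the 0.02C cutoff figure is the one discussed later in the thread):

```python
# Why a controller-side tail-current cutoff can misfire when loads run
# from the same bus and there is no shunt at the battery.

CONTROLLER_OUTPUT_A = 20.0   # what the controller measures: its own output
INVERTER_LOAD_A = 15.0       # load drawn at the same time, invisible to it
TAIL_CUTOFF_A = 18.4         # e.g. 0.02C on a 920 Ah bank

battery_current = CONTROLLER_OUTPUT_A - INVERTER_LOAD_A  # 5 A into the battery

# The controller compares its own 20 A output against the cutoff and keeps
# absorbing, even though the battery itself is already down to a 5 A tail.
print(f"controller sees {CONTROLLER_OUTPUT_A} A, battery gets {battery_current} A")
print("controller terminates?", CONTROLLER_OUTPUT_A < TAIL_CUTOFF_A)   # False
print("battery actually done?", battery_current < TAIL_CUTOFF_A)       # True
```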

To make sense of this myself, here is how I deal with it: A battery getting slightly undercharged is no big deal. I seriously doubt it has any long term effect. Some say LFP has memory effects, but I've seen very little evidence of it. However, overcharging LFP on a daily basis probably degrades the battery, so I try to stay away from it.

The only bulk voltage you can be 100% sure will not overcharge your battery is the float voltage, 3.37 V. But of course that's not practical; the charging would take forever. So instead you use 3.45 V to 3.55 V and adjust your absorption time and current cutoff so that you get to 95-97% SOC in the most common scenarios. You look at typical inverter current, typical controller current, and check the charge diagrams (a good one is included below). You configure the controller to hit the desired SOC, then live with the fact that once in a blue moon it will go over 100%. Also keep in mind that LFP cells are not kittens. They are designed to be durable and tough, to be used in EVs.
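As a rough sketch of the arithmetic for a 4-cell 12 V pack (the per-cell targets are the ones above; the bank size and tail fraction are example values taken from elsewhere in this thread):

```python
# Converting per-cell targets to pack-level charger settings for a 4S LFP bank.

CELLS_IN_SERIES = 4
BANK_AH = 920            # example bank capacity
TAIL_FRACTION = 0.02     # example tail-current cutoff as a C-rate

absorb_low = 3.45 * CELLS_IN_SERIES    # 13.8 V, conservative absorption target
absorb_high = 3.55 * CELLS_IN_SERIES   # 14.2 V, aggressive absorption target
float_v = 3.37 * CELLS_IN_SERIES       # 13.48 V float
tail_a = TAIL_FRACTION * BANK_AH       # 18.4 A tail-current cutoff

print(f"absorption {absorb_low:.2f}-{absorb_high:.2f} V, "
      f"float {float_v:.2f} V, tail {tail_a:.1f} A")
```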

Or you just ignore all of the theory as pointless navel gazing, and just do what Victron tells you. I suspect that they know more about this than all of us together.

[Attached image: LF280Ah battery charge/discharge curves, 0.1C-1.0C]
 
You can utilize the tail-current feature of your Victron hardware. Set a bulk/absorption voltage like 3.475 V/cell and a float of 3.37 V/cell, with a corresponding tail current. It is better than any absorption timer.
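A minimal sketch of the logic a tail-current termination implements (the thresholds echo the ones just mentioned, scaled to a 4S pack; the polling loop and sensor callbacks are hypothetical stand-ins, not Victron firmware):

```python
import time

ABSORB_V = 3.475 * 4     # 13.9 V absorption target
FLOAT_V = 3.37 * 4       # 13.48 V float target
TAIL_A = 18.4            # e.g. 0.02C on a 920 Ah bank

def charge_cycle(read_voltage, read_current, set_target):
    """Bulk -> absorption -> float, ending absorption on tail current."""
    set_target(ABSORB_V)
    # Bulk: constant current until the pack reaches the absorption voltage.
    while read_voltage() < ABSORB_V:
        time.sleep(10)
    # Absorption: hold the voltage; current tapers as the cells fill.
    while read_current() > TAIL_A:
        time.sleep(10)
    # Tail current reached: the pack is essentially full, drop to float.
    set_target(FLOAT_V)
```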
The tail current goal changes as the charge rate changes, correct?

I’ve played with settings and was able to monitor the green flash opportunity as the controllers all acted out their roles today.

I was charging along at 14 V and between 40-55 A today. The bank is 920 Ah, so that's about 0.05C.

When the bank reached 13.6 V the amperage held at 40 A and stayed there; clouds or sun be damned, the amperage quit fluctuating. When the bank reached 13.75 V at 40 A, the graph showed two straight diverging lines as the voltage came up to 13.96 V and the amperage dropped to 4 A. I'm assuming the batteries were full at this point, because I was still making good solar but the batteries didn't want it. Then there was a rapid rise to 14 V as the amperage dropped to zero, and my scheduled 3-minute absorption held the batteries there at 14 V on fractional amperage. Cue the float stage at 13.45 V, where they sit until the solar dies out.

So I'm back to my original question: where to set controllers for best battery health. On these slow-charge days I didn't want anything beyond the 13.75 V or maybe 13.8 V setting. Is this where I want to utilize the tail-current setting and apply a 0.02C limit (18 A on my system) to discontinue charging?

Leave the 14 V target for when solar plus the alternator or generator are running, dumping 200-250 A into the bank, but set the tail current to 0.02C? Or will the tail current override all the other settings?
 
So I'm back to my original question: where to set controllers for best battery health. On these slow-charge days I didn't want anything beyond the 13.75 V or maybe 13.8 V setting. Is this where I want to utilize the tail-current setting and apply a 0.02C limit (18 A on my system) to discontinue charging?

I've set my 2 Victron charge controllers to 14.08 V absorption and 13.4 V float, and I'm doing the 10-year test now; so far it seems good. I'll report back in 8 years...
 
I've set my 2 Victron charge controllers to 14.08 V absorption and 13.4 V float, and I'm doing the 10-year test now; so far it seems good. I'll report back in 8 years...
Pretty much the same.

4-cell 300 Ah Sinopoly LiFePO4 battery.
50 A solar controller, 80 A alternator, and 30 A mains charger, all limited to 14.10 V maximum.
According to my two shunt-based battery monitors, the pack is always at 100% SOC once the terminals reach 14.1 V, regardless of charge current. Absorption occurs but is unnecessary.
"Float" is 13.40 V and maintains 98-100% SOC while the sun shines.
10 years of full-time RV travel so far without any issues.
 
So I’m back to my original question: where to set controllers for best battery health.
You have already set your controllers close to the best they're capable of.
The next step, if you're still not satisfied, can be implementing BMS CAN bus comms.
The tail current goal changes as the charge rate changes, correct?
I don't think this is implemented. At the same time, you shouldn't worry too much about the 0.1% edge case where your battery is nearing full charge just as the solar output suddenly drops, and stays low for an extended period.
You've already got better hardware than, I'd say, 95% of the people out there.
There is no need to worry, and 13.4 V is a great FLOAT voltage.


I've set my 2 Victron charge controllers to 14.08 V absorption and 13.4 V float, and I'm doing the 10-year test now; so far it seems good. I'll report back in 8 years...
That is a clever way of describing a two-year-old battery.
 
Hi All. I’m about 5 pages in out of 11. Apologies if this is addressed later. Thought I’d add my 5c commentary.

I am a BSc engineer specialising in building lithium-ion batteries. I own a company that builds batteries and consult for large telecommunication operators in South Africa, where the storage technology is mainly LFP and NMC.

The funny thing is that manufacturing is difficult, and the goal is not to squeeze every bit of life out of the battery but to meet the warranty/ROI period without getting called out into the field to deal with problems caused by operating the batteries at the settings that cause issues: right at the top or right at the bottom.

Top-end balancing is VERY necessary to ensure all cells in the pack behave in a similar manner and don't cause battery disconnection... site going down... angry customer. I see that many say they didn't do it on their pack and it's fine, pointless exercise, etc. To be fair, the quality of cells from suppliers has improved, and typically a small order of cells arrives impedance-matched and at 30% SOC for transport (UN38.3). They are very much in line and hence no issues. In high volumes this isn't always the case, and hence proper top balancing is a necessary step in battery production.

As the pack ages, differences between the cells will become more prevalent. The set points must ensure this doesn't cause battery disconnection (bear in mind the energy in this region is inconsequential to battery operation). Typically a charger would aim for around 3.458 V per cell. This leaves room for overshoot if the charger's PID isn't great or if it's a dumb charger (no communication with dynamic current limits over CAN). A good BMS will jump the SOC to 100% once one cell hits this target, the rest are within an acceptable bandwidth, and the current drops below a threshold.
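A sketch of that full-charge detection rule as I read it (only the 3.458 V target comes from the paragraph above; the bandwidth and current threshold are assumed values):

```python
# BMS "jump to 100%" rule: one cell at the target, the rest within an
# acceptable bandwidth, and the charge current tapered below a threshold.

TARGET_V = 3.458            # per-cell charge target from the post
BANDWIDTH_V = 0.030         # assumed acceptable cell-to-cell spread
CURRENT_THRESHOLD_C = 0.05  # assumed taper threshold as a C-rate

def pack_is_full(cell_voltages: list[float], charge_current_c: float) -> bool:
    spread = max(cell_voltages) - min(cell_voltages)
    return (max(cell_voltages) >= TARGET_V
            and spread <= BANDWIDTH_V
            and charge_current_c <= CURRENT_THRESHOLD_C)

print(pack_is_full([3.460, 3.452, 3.448, 3.455], 0.03))  # True
print(pack_is_full([3.460, 3.380, 3.448, 3.455], 0.03))  # False: spread too wide
```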

The cell will be effectively fully charged if the above is met. Review the SOC-OCV curve from the manufacturer to confirm this. I don't like people saying it's always 3.37 V for LFP: not true, as it varies with temperature and internal resistance. Again, this isn't an argument, as it's supplied by the cell manufacturers. As the cell ages, the 100% point on the SOC-OCV curve drops closer to the 3.3 V range.

3.65 V is the cutoff voltage where the battery must disconnect. It will be damaged beyond this, but not before; the notion that it would be is ridiculous, and not one I've seen damage any pack. Operating in this region is like flogging a dead horse. You could make the argument that operating here makes the cell work harder and increases temperature, which adversely impacts cycle life, but it's negligible for a product.

On balancing with a DC power supply: there is talk of a tail current slowly overcharging the cell. The physics of the system prevents this. If the PSU is set to 3.65 V and the cell reaches 3.65 V, the current goes to 0 A, as there is no driving potential left to make current flow, the same as your battery not charging when charger voltage = battery voltage. It is important to ensure current reduction at the top (described in the datasheet), as high current will cause overshoot.
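A minimal sketch of that physics (the internal resistance and supply current limit are illustrative assumptions, not datasheet figures):

```python
# Current from a voltage-limited PSU is driven by the potential difference
# across the cell's internal resistance; it vanishes as the cell fills.

PSU_SETPOINT_V = 3.65
PSU_CURRENT_LIMIT_A = 10.0  # assumed bench-supply current limit
R_INTERNAL_OHM = 0.0002     # assumed 0.2 milliohm for a large LFP cell

for cell_ocv in (3.40, 3.60, 3.649, 3.65):
    ohmic = (PSU_SETPOINT_V - cell_ocv) / R_INTERNAL_OHM
    current = min(ohmic, PSU_CURRENT_LIMIT_A)  # supply clamps the current early on
    print(f"cell OCV {cell_ocv:.3f} V -> charge current {current:.1f} A")

# At 3.65 V the driving potential is zero, so the current is exactly 0 A:
# the tail tapers to nothing instead of pushing the cell past the setpoint.
```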

Cell imbalance is caused by differential aging of the individual electrochemical systems (increase in impedance) and differential temperature across the pack, not to mention current surges, calendar aging, parasitic loading, and variations in self-discharge rate. You have 15 or 16 individual systems that you are trying to match.

I don't agree with the method of obtaining the full charge voltage (FCV), as it implies all cells are electrochemically identical (across all brands) and is based off capacity. Your linear regression through (FCV, 0 C) and (3.65 V, 0.05 C) uses a current based on cell capacity to determine the set points. This should instead be done with the SOC-OCV curve for the cell at the desired temperature, supplied by the manufacturer. It can also be done empirically by discharging the cells in increments, allowing for rest, logging the data, and forming the SOC-OCV curve. Vary this with temperature to correctly characterize the cell.
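For reference, here is a minimal sketch of the regression being criticized, as I read it from the thread (the FCV value is a placeholder; only the 3.65 V / 0.05 C anchor is from the datasheet convention):

```python
# The disputed method: draw a straight line between (0 C, FCV) and
# (0.05 C, 3.65 V), then read off a charge-voltage set point for the
# C-rate you actually charge at.

FCV_V = 3.45        # assumed full-charge voltage at zero current
MAX_V = 3.65        # datasheet charge-voltage limit
MAX_RATE_C = 0.05   # datasheet cutoff current as a C-rate

def setpoint_for_rate(c_rate: float) -> float:
    """Linearly interpolate the charge-voltage target for a given C-rate."""
    return FCV_V + (MAX_V - FCV_V) * (c_rate / MAX_RATE_C)

for rate in (0.0, 0.01, 0.02, 0.05):
    print(f"{rate:.2f} C -> {setpoint_for_rate(rate):.3f} V per cell")
```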

Important to note that a 230 Ah cell will exceed 230 Ah from the start, especially with EVE and CATL cells. On average we see up to 10% surplus. This alone violates your method.


But yes: if you set the charge voltage to between 3.3…V and 3.65 V, for all practical purposes the cell is fully charged. No harm would be done by charging only to this point. In a perfect world it would be great, as there would be an additional safety factor preventing overcharging of the cells. In practice (at pack level), chargers need dead-bands and targets to work within to safely charge to 100% without overshooting.

Hope it helps! Very interesting discussion and I look forward to the ongoing debate.
 
Top-end balancing is VERY necessary to ensure all cells in the pack behave in a similar manner and don't cause battery disconnection... site going down... angry customer. I see that many say they didn't do it on their pack and it's fine, pointless exercise, etc.

The main reason why is that active-balancing BMSes are common now. You can just throw the cells together and have the BMS balance them automatically over time. This means you don't need to go through the step of balancing with a power supply, which we've seen quite a few people screw up, with bloated cells as a result...

Welcome to the forum!
 
This should instead be done with the SOC-OCV curve for the cell at the desired temperature, supplied by the manufacturer.
It can also be done empirically by discharging the cells in increments, allowing for rest, logging the data, and forming the SOC-OCV curve. Vary this with temperature to correctly characterize the cell.
Thank you for mentioning the crucial "discharging the cells in increments, allowing for rest" part.

Many people here routinely love to "prove others wrong" with basic SOC-OCV curves that were not made by discharging cells in increments and allowing the battery to rest in between. I can only hope they see your post and quickly realize why they are wrong.

Your linear regression through (FCV, 0 C) and (3.65 V, 0.05 C) uses a current based on cell capacity to determine the set points.
As long as that end point corresponds to the end of charging, I fail to see much difference.
Important to note that a 230 Ah cell will exceed 230 Ah from the start, especially with EVE and CATL cells. On average we see up to 10% surplus. This alone violates your method.
Violates the method in what way?

I am not concerned much with decimals. As long as we manage to target the current-tapering behaviour at the end of the charge cycle, I expect the cells to end up just shy of a full charge, with some safety margin factored in.
So even if someone uses, say, 0.08 C instead of 0.05 C as the cutoff current, they will still end up with a battery just marginally shy of full charge.

Me (with a CALB 100 Ah) and many others (EVE 230 Ah cells) have been successfully using this logic in our own production systems for months now.
It has functioned as expected, with me routinely discharging my batteries to 80-90% DOD on some days in the summer heat.

The main reason why is that active-balancing BMSes are common now. You can just throw the cells together and have the BMS balance them automatically over time.
Large-current (active) balancing is overkill (provided proper convergent balancing is implemented), even for large-capacity battery packs. Cells getting unbalanced at the end of every charge cycle comes down to improper balancing parameters plus the charging woes already discussed earlier.
It's a slippery slope of not recognising the root cause of the imbalance, and an unnecessary race for more active-balancing amps.

I personally BMS-balance my cells manually on the order of weeks, and even that finishes in just a few hours.
That should really put into perspective what proper balancing should look like.
 
Large-current (active) balancing is overkill (provided proper convergent balancing is implemented), even for large-capacity battery packs. Cells getting unbalanced at the end of every charge cycle...

I wasn't talking about cells getting imbalanced at the end of the charge cycle. I was talking about newly bought cells, put into a pack without any preparation, with the BMS balancing the cells over time rather than in a specific balancing step during pack construction. It doesn't even have to be a large-current active balancer: the smallest JK with 0.6 A will do fine even with 300 Ah cells. It just takes some time over summer, and assumes the pack is fully charged at frequent intervals, as is usually the case in summer with typical solar applications.
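For a rough feel for the timescale, a back-of-envelope sketch (the initial imbalance and the daily full-charge window are assumed numbers):

```python
# How long a small balancer needs to pull factory-mismatched cells together.

IMBALANCE_AH = 6.0        # assumed worst-case SOC spread in a new 300 Ah pack (2%)
BALANCE_CURRENT_A = 0.6   # the smallest JK balancer mentioned above
WINDOW_H_PER_DAY = 0.5    # assumed daily time near full charge, where balancing runs

total_hours = IMBALANCE_AH / BALANCE_CURRENT_A   # 10 h of actual balancing needed
days = total_hours / WINDOW_H_PER_DAY            # spread over ~20 sunny days
print(f"{total_hours:.0f} balancing hours, roughly {days:.0f} days of daily full charges")
```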
 
As long as that end point corresponds to the end of charging, I fail to see much difference.

Violates the method in what way?
Thank you. I'm interested in your thoughts; perhaps I'm just not grasping your method. In my mind this will affect the gradient of the regression line, and I'm concerned about setting a fixed value that should vary with temperature.

Using your example, could you give us an example of the FCV you would calculate?

The SOC-OCV 100% point for the EVE 230 Ah cells comes out as 3.37 V @ 0 °C, 3.469 V @ 25 °C and 3.543 V @ 45 °C.

That's a 170 mV delta depending on temperature. This is why I don't agree that there is cycle-life damage below 3.65 V.
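If one wanted a temperature-compensated target from those three points, a piecewise-linear interpolation is an obvious sketch (the figures are the ones quoted above; how they were measured is questioned later in the thread):

```python
# Piecewise-linear interpolation of the 100% SOC-OCV point vs temperature,
# using the three EVE 230 Ah figures quoted above.

POINTS = [(0.0, 3.370), (25.0, 3.469), (45.0, 3.543)]  # (deg C, volts per cell)

def ocv_100pct(temp_c: float) -> float:
    """Interpolate (and clamp) the full-charge OCV for a given cell temperature."""
    if temp_c <= POINTS[0][0]:
        return POINTS[0][1]
    for (t0, v0), (t1, v1) in zip(POINTS, POINTS[1:]):
        if temp_c <= t1:
            return v0 + (v1 - v0) * (temp_c - t0) / (t1 - t0)
    return POINTS[-1][1]

print(f"10 C -> {ocv_100pct(10):.3f} V/cell")   # ~3.410 V
print(f"35 C -> {ocv_100pct(35):.3f} V/cell")   # ~3.506 V
```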


Me (with a CALB 100 Ah) and many others (EVE 230 Ah cells) have been successfully using this logic in our own production systems for months now.
It has functioned as expected, with me routinely discharging my batteries to 80-90% DOD on some days in the summer heat.
Does it change based on temp?
Large-current (active) balancing is overkill (provided proper convergent balancing is implemented), even for large-capacity battery packs. Cells getting unbalanced at the end of every charge cycle comes down to improper balancing parameters plus the charging woes already discussed earlier.
It's a slippery slope of not recognising the root cause of the imbalance, and an unnecessary race for more active-balancing amps.
Inherently the cells will become unbalanced due to differences in internal resistance, differences in temperature across the pack, differences in voltage-tap cable length, terminal torque, vibration in motive applications, and, most importantly, charge-current overshoot at the end of the charge cycle.

Standard balancing circuits (150 mA) won't be able to rectify a pack that wasn't top-end balanced, and they will also lose the fight in the long term, depending on pack capacity and number of cycles. The large-current active balancers are a good idea. The balancing current is also limited by the power rating of the balancing resistor; this often results in less than the full balancing current being applied.
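A small sketch of that last point (the bleed resistor and its power rating are assumed values for illustration):

```python
# A passive balancer bleeds the high cell through a resistor. The nominal
# current is V_cell / R, but the resistor's power rating caps what can be
# applied continuously (via duty cycling / derating).

CELL_V = 3.45          # cell voltage near the top, where balancing runs
R_BLEED_OHM = 15.0     # assumed bleed resistor
P_RATED_W = 0.5        # assumed resistor power rating

nominal_a = CELL_V / R_BLEED_OHM            # ~0.23 A nominal bleed current
dissipation_w = CELL_V**2 / R_BLEED_OHM     # ~0.79 W if left on continuously
duty = min(1.0, P_RATED_W / dissipation_w)  # duty cycle to stay within the rating
effective_a = nominal_a * duty              # ~0.14 A delivered on average

print(f"nominal {nominal_a*1000:.0f} mA, effective {effective_a*1000:.0f} mA "
      f"at {duty:.0%} duty")
```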

I personally BMS-balance my cells manually on the order of weeks, and even that finishes in just a few hours.
That should really put into perspective what proper balancing should look like.
Once the battery leaves the facility, it is very costly to go and manually do anything; hence the care in tolerances to avoid problems. Don't even get me started on what manually balancing an HV BESS system would take 😂 But hey, to each his own.
 
The SOC-OCV 100% point for the EVE 230 Ah cells comes out as 3.37 V @ 0 °C, 3.469 V @ 25 °C and 3.543 V @ 45 °C.
Could you share the datasheet for this?
I have 32 EVE in the basement and currently run a slightly higher voltage in the winter when the temperature is cooler... It sounds like I shouldn't be doing that.
 
When the pack is in bulk charge (constant current), it reaches 100% SOC at a certain voltage, which depends on the charge current. This can be seen in your diagram: a lower charge current gets the pack to 100% SOC at a lower voltage. Termination should happen at that specific voltage so as not to overcharge the pack.

But the chemistry also has an upper voltage limit of around 3.65 V. So at higher charge currents you hit that voltage limit first, before fully reaching 100% SOC, and you have to switch to absorption (constant voltage) and wait for the tail current.

In other words: for high-current charging you monitor the absorption tail current, which is the topic of this thread. For low-current charging you instead monitor the bulk charge voltage. Looking at your diagram, the breakpoint seems to be around 0.03C.
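A minimal sketch of that two-regime termination rule (the 0.03 C breakpoint is read off the diagram; the zero-current full-charge voltage is an assumed placeholder, and the low-current target reuses the interpolation idea from earlier in the thread):

```python
# Two-regime LFP charge termination, as described above:
#  - low current  (< ~0.03 C): terminate on bulk voltage alone
#  - high current (>= ~0.03 C): hold 3.65 V (CV) and wait for the tail current

BREAKPOINT_C = 0.03
CV_LIMIT_V = 3.65
TAIL_CUTOFF_C = 0.05
FCV_V = 3.45  # assumed full-charge voltage at zero current

def should_terminate(cell_v: float, current_c: float) -> bool:
    if current_c < BREAKPOINT_C:
        # Low-current regime: 100% SOC is reached below the CV limit,
        # at a voltage that scales with the charge current.
        target_v = FCV_V + (CV_LIMIT_V - FCV_V) * current_c / TAIL_CUTOFF_C
        return cell_v >= target_v
    # High-current regime: ride the 3.65 V limit and watch the tail current.
    return cell_v >= CV_LIMIT_V and current_c <= TAIL_CUTOFF_C

print(should_terminate(3.50, 0.01))   # True: low-rate charge, done below 3.65 V
print(should_terminate(3.65, 0.20))   # False: still absorbing at high current
```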
Spot on. The aggressive charge current is a big contributor to imbalance.
 
Could you share the datasheet for this?
I have 32 EVE in the basement and currently run a slightly higher voltage in the winter when the temperature is cooler... It sounds like I shouldn't be doing that.
Cheers,
 

Attachments

  • LF230 SOC-OCV.pdf (1 MB)
Here's a nifty Excel sheet from Lithium Balance. Just adjust the number of datasets to 3.
 

Attachments

  • Lithium_Balance_Tools - SOCvsOCV + ChargerPID.zip (202.3 KB)
Thank you. I'm interested in your thoughts; perhaps I'm just not grasping your method. In my mind this will affect the gradient of the regression line, and I'm concerned about setting a fixed value that should vary with temperature.
You can always inspect the source code here at https://github.com/Sleeper85/esphom...lopment/packages/smart_bms_cutt-off_logic.cpp

Using your example, could you give us an example of the FCV you would calculate?

The SOC-OCV 100% point for the EVE 230 Ah cells comes out as 3.37 V @ 0 °C, 3.469 V @ 25 °C and 3.543 V @ 45 °C.

That's a 170 mV delta depending on temperature. This is why I don't agree that there is cycle-life damage below 3.65 V.
Before I am in a position to say anything, I want to know how this FCV figure was determined.

The standard charging model according to EVE documentation is:
CC at 0.5 C, then CV at 3.65 V with a 0.05 C cutoff current.

Was this FCV determined after a 30-minute rest at each temperature (0, 25, 45 °C)?

Once the battery leaves the facility, it is very costly to go and manually do anything; hence the care in tolerances to avoid problems. Don't even get me started on what manually balancing an HV BESS system would take 😂 But hey, to each his own.
I meant to say that I turn ON the BMS balancer every once in a while, to MANUALLY control when balancing takes place; it is off otherwise.
I am not aware of any currently available BMS that implements features ensuring convergent balancing.
 