LiFePO4 Voltage Chart?

The charge profile is set to LFP (14.4V), but I have been suspicious of this charger (Cristec 40A). I think I will be looking to swap it out for a Victron charger/inverter next season or over the winter.
I will also check the connectors to rule out loose connections as the cause of the fluctuating voltage readings.
The charging is settling at 13.73V now at ~3.8A.
13.73V @ 3.8A is going the right way; the amps should continue to drop as the cells saturate at that voltage level. Without individual cell monitoring via a smart BMS or similar, it's hard to say what each cell is doing and taking.
 
For a "12V" LFP Battery the MAX Voltage is 14.6 Volts (3.650Vpc) which is UNHEALTHY ! they should MAX at 3.500Vpc or 14.0V.
I totally agree that 14.0V is more than enough voltage to get you to 100% capacity with an AC charger or SCC. I’m confused as to why you would say 14.6V is UNHEALTHY when some battery manufacturers say they need between 14.2 and 14.6V in order to balance the cells.

Would it be fair to say that you should mostly only ever charge to 14.0V, and only occasionally charge to 14.2V or higher (depending on the battery manufacturer's recommendation) in order to properly balance the cells using the battery's BMS?
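To put numbers on the per-cell vs pack targets being discussed, here is a minimal sketch in Python (the series counts and per-cell voltages are just the ones from this thread):

```python
# Per-cell target -> pack voltage, for the series counts common here.
def pack_voltage(volts_per_cell: float, series_cells: int) -> float:
    """Pack voltage is simply the per-cell voltage times the series count."""
    return volts_per_cell * series_cells

for vpc in (3.40, 3.45, 3.50, 3.65):
    print(f"{vpc:.3f} Vpc -> 4S: {pack_voltage(vpc, 4):.2f} V, "
          f"16S: {pack_voltage(vpc, 16):.2f} V")
# 3.500 Vpc -> 14.00 V / 56.00 V; 3.650 Vpc -> 14.60 V / 58.40 V
```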
 
The kick with 14.6 is that it is the Gross Max of 3.650Vpc; from 3.500 to 3.650 is the Cliff Climb, with not much in the way of AH capacity. We generally say 10-90% OR 20-80% of total capacity for Maximum Cycles & Lifespan. This is factored with the actual "Working Voltages" as well, because LFP will always settle post-charge. The settling amount is determined by the saturation of the charge level. Saturation is just how deeply the cell is charged at the "set" voltage you are charging to. The saturation of individual cells within an assembled pack varies depending on the Grade/Quality & matching of the cells used, which immediately impacts the overall amount of charge which can be retained.

I'm sorry, but this starts to get into far deeper things which I cannot get into at this time (Real Life is happening and needs my attention), as there are many contributing factors at play. Saturation alone is one such topic, and it has raised a pile of flotsam of opinions.

In simplest form, it is best to charge to the Target Voltage with CC, i.e. 14.0V or even 14.2V, and then Float Charge (CV) @ 13.8 or 14.0, which will allow the cells to remain below cutoff points while allowing saturation (full deep charge) and balancing out.

When you actively watch pack activity @ the cell level during the float stage, you see the cell voltages vary and fluctuate as they take what they need and as the balancing levels them up (provided there is a balancer function on the BMS or an external module). As they fully saturate out, the pack takes less & less till it goes to storage mode. Then float from an SCC would just continue feeding whatever demand from the inverter etc. is left, as it's capable of providing.
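For anyone wanting to watch that leveling-out in a script rather than by eye, here is a rough sketch of the kind of check involved (the sample readings and the 30mV threshold are my own illustration, not a spec):

```python
# Example 4S readings in millivolts; any BMS exposing per-cell voltages
# could supply these. The 30 mV threshold is an assumption for the demo.
cells_mv = [3498, 3502, 3489, 3511]

spread_mv = max(cells_mv) - min(cells_mv)  # high cell minus low cell
print(f"spread: {spread_mv} mV, high cell: {max(cells_mv)} mV")
if spread_mv > 30:
    print("cells still leveling out - let the balancer keep working")
```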
 
Thank you for this. This absolutely makes more sense than anything I’ve gotten from anyone or anywhere else.
Basically, there is little to no real-world benefit to charging past the cliff climb or discharging past the tail drop of the battery pack. Simply keep your battery in its nominal voltage range and your battery pack will last longer, without you having to worry about over-voltage/under-voltage or stressing the battery pack in other ways (which could cause bloated cells).
It sounds like you are also saying that saturation is the key to keeping your battery pack "balanced" or "equalized", and that your cells will take what they need and want.
I think this all makes perfect sense: take the slow and consistent approach, as opposed to the alternative, for the longevity of the cells. This is why so many of us have a difficult time knowing how to properly set up our SCC systems. It doesn't help when the market is saturated with chargers that say they are designed for LiFePO4 yet charge to 14.6V and use the fast and hard approach. It confuses people. I appreciate people like you and this forum who take the time to really explain things from common sense and practical use experience. Who better to learn from than someone with real-world experience?
That's why I had a hard time taking anything my economics professor had to say once I found out he had filed for bankruptcy and divorce and didn't have a retirement plan. He wasn't credible.
You, sir, are the very definition of credible, and I appreciate that!
 
Some of the befuddlement comes from OLD WIVES' TALES, some from FUD (Fear, Uncertainty, Doubt), and of course confusion between Chemistries & Classes & Grades. The last one is the absolute worst problem.

EV Grade cells are NOT ESS Grade cells, nor Utility Grade cells, nor "Device Grade" (electronics like phones), and there are variations within each of those classes as well.

ESS LFP cells, for example, typically have a Max Discharge Rate of 1C and a Max Charge Rate of 0.5C, whereas EV Grade LFP can deliver 5C Max Discharge & 2-4C Charge (depending on the chemistry variant). BOTH grades are "Burst Discharge" capable at 5X their Max C-Rate. Cross into LTO, Li-Ion NMC etc. and a whole new game starts.
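As a worked example of those C-rate limits (a quick sketch; the 100Ah cell capacity is hypothetical, the C-rates are the figures quoted above):

```python
# C-rate arithmetic: current limit = C-rate x capacity in Ah.
capacity_ah = 100  # hypothetical 100 Ah LFP cell

grades = {
    "ESS": {"discharge_c": 1.0, "charge_c": 0.5},
    "EV":  {"discharge_c": 5.0, "charge_c": 2.0},  # low end of 2-4C charge
}

for name, c in grades.items():
    print(f"{name}: max discharge {c['discharge_c'] * capacity_ah:.0f} A, "
          f"max charge {c['charge_c'] * capacity_ah:.0f} A, "
          f"burst {5 * c['discharge_c'] * capacity_ah:.0f} A")
# ESS: 100 A / 50 A / 500 A burst; EV: 500 A / 200 A / 2500 A burst
```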

Up until 5 years ago, Joe & Jane Public only knew AA, AAA, C, D, NiCd & NiMH, and of course the Lead Acid/AGM crowd. Virtually NONE of that knowledge transfers over to this tech; even the terms diverge depending on which chemistry is being discussed.

Batteries are like driving your car.
Your speedo goes from 0-200 km/h. The car CAN do up to 200 km/h, but should it? Can it do that daily?? Will it last 5 years or more if you do???
If you're reasonable, stay within limits (<110 km/h), and don't pedal-mash at every stop light, you'll get a long, low-maintenance life. Even better if you drive an EV! (10,000 fewer parts to fail LOL)...

Be good to your batteries, use them within THEIR Working Specs for the specific grade & type of cells and you will be happy for a long time.
 
I have had a bit more time to observe. I believe the fluctuating voltage is due to the BMS cutting out on an over-voltage on one cell. I've turned off charging for the time being and the cell voltages have settled into a tight range. I will have to take the pack apart to retest for capacity (potentially having to replace a cell for next season, or reduce the capacity of the whole pack to accommodate the one cell). I should move my discussion to the DIY LiFePO4 discussion.
 

Attachment: Screenshot_20210824-103815.png
All the weird chit seems to happen in the cell 1 and cell 4 positions. You might want to swap cell 4 to a different position to see if the problem follows the cell or if it stays at the position.

Just a thought. Otherwise lowering your max pack voltage is a decent solution.
 
Yeah. Cell 4 seems to be the weak link. I will swap the positions tomorrow, but tonight, after using only 29Ah, the voltage on the BMV is 12.63V and the BMS is reporting 13.1V, and cell 4 drops drastically when a load is drawn (i.e. 11A when the water pump runs): the other cells were at 3.3V but cell 4 went down to 2.9V.
Is there any explanation for the BMS reading a different voltage than the BMV? Both have leads to the same positive terminal, and the BMS is on the negative, whereas the BMV gets its negative after the BMS.

Update: based on another thread, I took the multimeter to the cells and did not find a cell that was much lower than the rest. I checked the busbars (not warm) and redid all the connections; the BMS is now happily showing a 0.002V cell diff, and the BMS and BMV are showing closer pack voltage readouts. Must have been a loose connection somewhere.
 
Multiple battery voltage charts are needed, varying by temperature and by charge and/or discharge current.
While charging, the full-charge voltage is over 3.45V, but while discharging, the full-charge voltage is around 3.35V.
Every battery manufacturer has a different voltage profile.
Also, you need to track the lowest-voltage cell while discharging, and the highest-voltage cell while charging.
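A minimal sketch of that tracking rule (plain Python; the helper name and the sample readings are mine):

```python
# Watch the lowest cell while discharging, the highest while charging.
def cell_to_watch(cell_voltages, charging: bool):
    """Return (index, voltage) of the cell that limits the pack."""
    pick = max if charging else min
    idx = pick(range(len(cell_voltages)), key=lambda i: cell_voltages[i])
    return idx, cell_voltages[idx]

cells = [3.31, 3.30, 3.32, 2.95]             # e.g. one cell sagging under load
print(cell_to_watch(cells, charging=False))  # -> (3, 2.95)
```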
 
I think it is better to use the battery voltage chart while charging, at a nominal 5-10A.
I can view the Overkill Solar BMS voltages and currents, as well as the Victron SmartShunt.
 
I am currently charging my LiFePO4 (16S) 48V battery using MPPT at 56V CC/CV.
After reading this thread and realising how little there is to be gained above 55V, I am considering dropping to 55V.
Would this also increase the amperage into the battery, and would that not speed up the charging process?
 
Using 56V for bulk / absorption (3.5V per cell) gets your cells to a fully charged state quicker. In absorption (CV) at that voltage, the current will drop off fairly quickly.

Using 55V for bulk / absorption (3.4375V per cell) can still get you to fully charged, but the current will take longer to drop down to 2-5%, which would indicate the cells are full.

So to answer your question: You have it exactly backwards. Charging at a lower voltage means slowing down the charging process. It will take longer to get to full charge.

I personally am staying with 3.5V-3.525V per cell for bulk / absorption (CC/CV). If I find that my cells start to get out of balance at that level, I will drop it down. However, I will strive to stay above 3.45V per cell.
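If it helps to see that 2-5% tail-current cutoff as a calculation, here is a sketch. I am reading the percentage as a fraction of rated Ah capacity, and the 280Ah figure is just an example:

```python
# End-of-absorption check, assuming "2-5%" means amps as a fraction
# of the battery's rated Ah capacity (my reading, not a spec).
def absorption_done(current_a: float, capacity_ah: float,
                    tail_fraction: float = 0.03) -> bool:
    return current_a <= tail_fraction * capacity_ah

print(absorption_done(current_a=3.8, capacity_ah=280))  # 3.8 A vs 8.4 A -> True
```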
 
Thanks for the answer. Oh dear, I am losing it again!!?? My thought process was that 2000 watts coming off the roof at a lower voltage = more amps, therefore a quicker time to get through CC. I am obviously seeing this all wrong. It does seem to increase the amps from the MPPT to the battery, but there is something I am missing - feel free to teach an 8-year-old (62-year-old) :D
 
Hey, I'm 63, so don't go talking about age. ;)

What you're seeing is not what the cells can absorb, but what your PV array can produce. Chances are your 16S battery can take as much current as the SCC can provide. So you are right that the SCC putting out a lower voltage will mean it can produce more current. But at a lower voltage the battery pack will continue to take all that current for a longer time. If it were at a higher voltage, the pack would take less current over time as the cells became fully charged.
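A quick worked version of that point, using the 2000W figure from the question (conversion losses ignored; a sketch, not an exact model):

```python
# A fixed-power source pushes more amps at a lower pack voltage: I = P / V.
pv_watts = 2000
for pack_v in (55.0, 56.0):
    print(f"{pack_v} V -> {pv_watts / pack_v:.1f} A")
# 55.0 V -> 36.4 A ; 56.0 V -> 35.7 A
```

Note the difference is under 2%, and the watts going into the pack are the same either way, so the extra amps at the lower setpoint don't translate into a shorter overall charge.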
 
Sorry to be a pain - I read that the time to charge a battery is H = AH/A.
My battery is 150AH, so at 15A the battery should charge in 10 hours; at 10A the battery should charge in 15 hours.
So reducing the voltage to increase the amperage should theoretically shorten the charge time?
Am I being stupid, or is there something I am missing? OR BOTH :D
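That rule of thumb in code, for what it's worth; it only models the constant-current phase and ignores the absorption taper, which is exactly the catch the replies below get into:

```python
# Naive charge-time estimate: hours = amp-hours to replace / charge amps.
def naive_charge_hours(ah_to_replace: float, charge_amps: float) -> float:
    return ah_to_replace / charge_amps

print(naive_charge_hours(150, 15))  # 10.0 h
print(naive_charge_hours(150, 10))  # 15.0 h
```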
 
The voltage differential between the charger and the battery determines the current flow.
More voltage differential means more current flow.

How charging works:
During the bulk phase (constant current) part of the charge, the charger controls the current flow into the battery.
It does this by adjusting the charge voltage down from the configured value to maintain the configured amperage.

During the absorption phase (constant voltage) part of the charge, the battery controls the current flow into the battery.
The charger no longer has to adjust the charge voltage down to keep the current flow at or below the configured level.
The current flow approaches zero as the voltage differential approaches zero.
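Here is a toy sketch of those two phases as a single control step (Python; the 0.05Ω effective resistance and all setpoints are invented for illustration, not how any particular charger is implemented):

```python
# Toy model of the bulk/absorption handoff. Current "demand" is driven
# by the charger-battery voltage differential; the charger caps it at
# the configured amperage.
def charger_step(batt_v, set_v, set_a, tail_a, r_eff=0.05):
    """Return (phase, amps) for one control step. r_eff is a made-up
    effective resistance standing in for the whole circuit."""
    demand_a = max(0.0, (set_v - batt_v) / r_eff)  # more differential -> more amps
    if demand_a >= set_a:
        return "bulk (CC)", set_a          # charger limits the current
    if demand_a <= tail_a:
        return "full / float", demand_a    # tail current reached
    return "absorption (CV)", demand_a     # battery limits the current

for v in (52.0, 55.0, 55.8, 55.95):
    print(f"{v:5.2f} V -> {charger_step(v, set_v=56.0, set_a=40.0, tail_a=2.0)}")
```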
 
I started to type in a response but @smoothJoey beat me to it. So I'm starting over.

Tying in to what @smoothJoey said: if you are setting your charge controller to charge at 3.5V per cell, the charger will be in constant-current mode until the voltage of the battery gets up to 3.5V per cell. That will take longer than it would if you set the charger to charge to 3.4V per cell. Like I said, I'm pretty sure your charge controller is putting out everything it can, so it takes longer for it to bring the battery up to 3.5V per cell than it would to bring it up to 3.4V per cell. Then it will switch to constant voltage; in the 3.5V-per-cell case, the battery is already closer to fully charged by that point.

Try watching this video, which specifically shows the charge times at 3.5V per cell and at 3.4V per cell:
I think that will clear it up for you. Compare the two charging tests that included absorption. If you are charging to 3.5V, the absorption time is almost zero to get to fully charged. If you charge to 3.4V, the absorption time will be much longer.
 
I've been watching my battery bank voltage along with my battery monitor kit, which seems to be a good judge of +/- AH. I reset my AH meter at full charge yesterday, and now my 345AH bank is down 110AH and showing 46.0V (my 14S configuration is fully charged at 51.0V). According to the voltage chart on the first page, I am at approx. 3.285V per cell, which is right around 60% SOC.

I understand the SOC % is not going to be nearly as accurate as a shunt reading, but it seems to be in the right ballpark.
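As a quick cross-check of those two estimates (numbers straight from the post above; the ~60% figure is the poster's chart lookup):

```python
# Shunt-style Ah estimate vs voltage-chart estimate.
capacity_ah, used_ah = 345, 110
pack_v, series_cells = 46.0, 14

print(f"Ah-based SOC:     {(capacity_ah - used_ah) / capacity_ah:.0%}")  # ~68%
print(f"per-cell voltage: {pack_v / series_cells:.3f} V")  # ~3.286 V -> ~60% on the chart
```

A gap of several points between the two is roughly what you would expect, given how flat the LFP voltage curve is through the middle of its range.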
 