‘Normal’ LiFePO4 post-charge settling?

To summarise: settling is different from self-discharge.
Remember, it's the coulombs.
Even after the voltage settles, the coulombs are still there.
 
I think it would be easier for you to come to terms with the fact that all batteries "settle", including Pb, and if there were actually something there to be discovered, it would have been figured out over 40 years of Pb usage. I am distinguishing between settling and long-term self-discharge over time. We do know that Pb self-discharges significantly over time, which is why they are often float-charged to maintain them. Lithium batteries have significantly less self-discharge.
Is Pb lead-acid (LA)?

Yeah, I’m not claiming to have discovered anything new - just responding to the ‘beer foam’ analogy.

I have a significant motivation to understand whether my first 90Ah 8S LiFePO4 battery is performing roughly as expected or not,

A modest motivation to understand whether it’s coming close to delivering the 90Ah I paid for,

And a mild motivation to understand how LiFePO4 cells behave in terms of repeated charge cycles (on a daily basis) without any discharge in between.

That last is because the 280Ah battery I’m planning to build next is intended to be used for home backup and high TOU self-consumption.

I’d planned on only draining the battery with self-consumption over the 4-month peak season with high TOU rates.

This would mean for 8 months a year, the battery gets charged daily without any discharge.

I was planning to do this to preserve charge-discharge cycles, but if this is actually worse for the battery, it sounds like I may be better off draining it to 50% and disconnecting the solar charger for 8 months...

Hold on, now I've gone ahead and gone off-topic in my own thread ;).
 
To summarise: settling is different from self-discharge.
Remember, it's the coulombs.
Even after the voltage settles, the coulombs are still there.

Assuming that is correct, the ‘head of beer’ analogy is perfect.

But this also means it should be possible to fill a LiFePO4 battery to more than 100% rated capacity (based on a single charge cycle to 3.64V).

I think everything I’ve read stated that it is the high voltage that damages LiFePO4 cells rather than excessive coulombs (more than rated at 100%).

Is there any reason to think that jamming additional coulombs into a full cell (without ever exceeding 3.65V) can cause damage?
 
Assuming that is correct, the ‘head of beer’ analogy is perfect.

But this also means it should be possible to fill a LiFePO4 battery to more than 100% rated capacity (based on a single charge cycle to 3.64V).

I think everything I’ve read stated that it is the high voltage that damages LiFePO4 cells rather than excessive coulombs (more than rated at 100%).

Is there any reason to think that jamming additional coulombs into a full cell (without ever exceeding 3.65V) can cause damage?
The battery will eventually not take any more current, but it is still subject to voltage stress until the charge is terminated or the charger goes to float mode.
 
The battery will eventually not take any more current, but it is still subject to voltage stress until the charge is terminated or the charger goes to float mode.

My 10A charger seems to switch to float when the battery reaches 28.6V (3.58V) and continues to provide some low level of current until the battery reaches 28.75V (3.6V).

The fan is on as long as it is in CC mode supplying 10A; then the fan turns off but the red light stays on while it supplies some low level of current in CV mode; then the light turns from red to green when the battery hits 28.75V (3.6V).

The cells don’t start settling down until I disconnect the charger or turn it off.

So is the ‘voltage stress’ I’m subjecting my cells to by holding them at 3.6V anything I should be worried about?
 
My 10A charger seems to switch to float when the battery reaches 28.6V (3.58V) and continues to provide some low level of current until the battery reaches 28.75V (3.6V).

The fan is on as long as it is in CC mode supplying 10A; then the fan turns off but the red light stays on while it supplies some low level of current in CV mode; then the light turns from red to green when the battery hits 28.75V (3.6V).

The cells don’t start settling down until I disconnect the charger or turn it off.

Your battery should not be charging on float.
cc ~= bulk
cv ~= absorb
Float is basically power-assist mode; the charger voltage should drop to <=3.4 volts per cell.
What is your float voltage set to?

So is the ‘voltage stress’ I’m subjecting my cells to by holding them at 3.6V anything I should be worried about?

Leaving your batteries subject to a voltage higher than their resting voltage accelerates their demise... not sure how to quantify it.
I wouldn't leave my batteries exposed to 3.6V for weeks or months.
 
Your battery should not be charging on float.
cc ~= bulk
cv ~= absorb
Float is basically power-assist mode; the charger voltage should drop to <=3.4 volts per cell.
What is your float voltage set to?
It’s a canned charger, no programmability. I’ve probably used the wrong terms.

When the red light is on and the fan is on, the charger is driving 10 amps at whatever voltage below 28.6 volts the battery pulls it down to. That's probably 'bulk' or constant current or CC, right?

Then once the battery is up close to 28.6V, the fan turns off, meaning the current has dropped below some threshold, but the red light stays on and the battery keeps charging up towards 28.6V - that's probably absorb or constant voltage or CV, right?

That continues until the light turns green with the battery reaching 28.75V (3.6V per cell), at which point, if I disconnect the charger and measure the charger output voltage, it reads 28V (3.5V per cell) - so that's 'float', right?


Leaving your batteries subject to a voltage higher than their resting voltage accelerates their demise... not sure how to quantify it.
I wouldn't leave my batteries exposed to 3.6V for weeks or months.

It’s only 3.5V but still, higher than 3.4V and I get your point.

My MPPT Charger will be fully programmable so once I’ve got a charger hooked up for ‘weeks or months’ I’ll be able to float at 3.4 or wherever I want.

This little 10A AC charger is just being used to characterize / test the battery and is disconnected soon after the green light comes on, so probably nothing to worry about, right?
 
Whoops! Looks like I forgot a '[/quote]' somewhere...

Sure miss the 'edit' function typical for these sorts of forums.
 
The formatting above is just too hard to deal with.

Float is power-assist mode.
If you disconnect the charger, it can't assist the battery.
I think I've already explained cc/cv charging pretty thoroughly earlier in the thread.
Gonna check.
 
The formatting above is just too hard to deal with.

Float is power-assist mode.
If you disconnect the charger, it can't assist the battery.
I think I've already explained cc/cv charging pretty thoroughly earlier in the thread.
Gonna check.

From Battleborn: https://battlebornbatteries.com/charging-battleborn-lifepo4-batteries/

“We recommend a bulk and absorption voltage of 14.4 V. A float is unnecessary, since li-ion batteries do not leak charge, but a floating voltage under 13.6 V is fine.”

So according to Battleborn, float is unnecessary, but in any case, my charger's float voltage of 3.5V exceeds their recommended maximum float voltage of 3.4V...
 
Chemistry is complicated, but think of it this way.

There are different electrical potentials required to move the lithium ions. Moving from the graphite anode to the cathode (discharge) produces around 3.4V when the anode is mostly full, and 2.5V when the anode is empty. To reverse this requires greater potential. The more filled the anode gets, the higher this potential needs to be. It's linear for most of the charge/discharge curve, but at either end it goes briefly exponential. Hence, around 3.65V is required to move the last couple percent of the ions, but 9X% will move at 3.45V.

For charge, you need to overcome the voltage/energy level required to free the ion from the cathode. Physics requires that this must be higher than the discharge voltage at that SOC. Chemical batteries are also capacitors (electrodes separated by a dielectric). For this (and other reasons), once the charge current stops, they will remain at a higher voltage than their discharge chemical potential for a short period. An easy way to bleed this charge off is to apply a 0.1C load for 10 seconds.

If we were dealing with capacitors instead of chemical batteries, the terminal voltage would always directly correspond to stored charge, and it would be perfectly linear from 0 to 100% SOC.

Battleborn's voltage specs are a bit high, likely driven by how they balance their cells.
 
Chemistry is complicated, but think of it this way.

There are different electrical potentials required to move the lithium ions. Moving from the graphite anode to the cathode (discharge) produces around 3.4V when the anode is mostly full, and 2.5V when the anode is empty. To reverse this requires greater potential. The more filled the anode gets, the higher this potential needs to be. It's linear for most of the charge/discharge curve, but at either end it goes briefly exponential. Hence, around 3.65V is required to move the last couple percent of the ions, but 9X% will move at 3.45V.

For charge, you need to overcome the voltage/energy level required to free the ion from the cathode. Physics requires that this must be higher than the discharge voltage at that SOC. Chemical batteries are also capacitors (electrodes separated by a dielectric). For this (and other reasons), once the charge current stops, they will remain at a higher voltage than their discharge chemical potential for a short period. An easy way to bleed this charge off is to apply a 0.1C load for 10 seconds.

If we were dealing with capacitors instead of chemical batteries, the terminal voltage would always directly correspond to stored charge, and it would be perfectly linear from 0 to 100% SOC.

Battleborn's voltage specs are a bit high, likely driven by how they balance their cells.

Thanks for the helpful explanation - it sure is complicated electro-chemistry.

I made the (uninformed) mistake of skipping top-balancing before building this 8S 90Ah battery and have been trying to do so since by injecting and draining charge through a 50W 2.5-ohm resistor.

The challenge is that it takes time for that injected or drained charge to settle as well, and so if you do it up in the hockey stick above 3.45V, where a measurement precision of 1mV means something, time becomes a variable, as everything is also settling / shifting while you are taking measurements.

I’m hoping to avoid ripping the whole battery apart to top-balance it - from what you’ve written above it sounds like I could drain ~0.03C after each charge cycle to greatly speed up the settling process...

I’ve got a 2kW 7.3ohm heating element that drains 3.7A / 0.04C when the battery is at 3.4V (27.2V), so I could connect that for ~24 seconds or better yet just monitor battery voltage and stop when it reaches 3.4V (27.2V) - would that work to turbo-settle the battery after charging?
 
I think everything I’ve read stated that it is the high voltage that damages LiFePO4 cells rather than excessive coulombs (more than rated at 100%).
I suspect you could run the voltage up to 3.8 volts and get a few more Ah into them. Maybe some more if you went to 4.0 volts, but why?
The cells get damaged because dendrites grow based on the time the cells spend at the top part of the charge curve. Hence the theory of storing them at 50%. There are no simple rules, but I have developed some practices that I use with my EVs that give me some comfort that I might get 8 to 10 years out of the batteries with no more than 10-15% degradation of the pack.
Based on your plans above I would be happy to have a dialogue about what has been working for me.
 
I’ve got a 2kW 7.3ohm heating element that drains 3.7A / 0.04C when the battery is at 3.4V (27.2V), so I could connect that for ~24 seconds or better yet just monitor battery voltage and stop when it reaches 3.4V (27.2V) - would that work to turbo-settle the battery after charging?
A one-ohm resistor across a cell at ~3V will drain about 3 amps, i.e. roughly 3Ah per hour. Two in parallel is 1/2 ohm, and that would be about 6 amps (6Ah per hour), per Ohm's law. My recommendation is not to use voltage as your target but to use something that will measure amp-hours or watt-hours. Then next cycle, watch that cell as it approaches the knee. Voltages below 3.4 are unreliable in terms of telling you if cells are in balance.
 
Charge until one cell hits 3.5V. Then put shunt resistors on that cell. As each cell reaches 3.5V, install a resistor. As needed bump the charge voltage up to keep the current at 1-1.4A. Eventually the pack will balance.
 
That last is because the 280Ah battery I’m planning to build next is intended to be used for home backup and high TOU self-consumption.

I’d planned on only draining the battery with self-consumption over the 4-month peak season with high TOU rates.

This would mean for 8 months a year, the battery gets charged daily without any discharge.

I was planning to do this to preserve charge-discharge cycles, but if this is actually worse for the battery, it sounds like I may be better off draining it to 50% and disconnecting the solar charger for 8 months...
Hold those thoughts for another topic about your new build. Include some information about your power company and the kind of NEM agreements they have. Do you have a history of bills so you can model your consumption by TOU period? I started out load shifting with an EV 8 years ago. I had solar panels then. Three years later, when the first Tesla Powerwall came out, I bought an Outback Radian and did some load shifting.
 
Charge until one cell hits 3.5V. Then put shunt resistors on that cell. As each cell reaches 3.5V, install a resistor. As needed bump the charge voltage up to keep the current at 1-1.4A. Eventually the pack will balance.
If you are charging with a charger with a fixed rate, it would be easy to figure out the shunt resistance needed to hold a cell at 3.5 volts while the others catch up. The charge amperage of my system changes over time as solar is generated, so I have chosen to use a couple of one-ohm resistors in parallel to give me an equivalent half ohm that will pull about 6 amps (roughly 6Ah per hour) out of a cell. There are a lot of good techniques out there. Thanks for sharing.
 
Hold those thoughts for another topic about your new build. Include some information about your power company and the kind of NEM agreements they have. Do you have a history of bills so you can model your consumption by TOU period? I started out load shifting with an EV 8 years ago. I had solar panels then. Three years later, when the first Tesla Powerwall came out, I bought an Outback Radian and did some load shifting.
Sounds like we’ve got a few things to discuss but I’ll save it for a separate thread.

My utility (PG&E) is changing the terms of the 20-year agreement I entered into in 2016 when I put in a 4kW grid-tied PV system, and I'll be screwed by 2022 if I don't do something.

That, and it gets old going days on end with no power due to fire outages and having to explain to family members why the beautiful solar system on the roof can't be used to run the refrigerator to keep the food from spoiling...

If you’ve got experience with the Radian, would love to pick your brain before I commit to an inverter - but let’s take it elsewhere.
 
I suspect you could run the voltage up to 3.8 volts and get a few more Ah into them. Maybe some more if you went to 4.0 volts, but why?
The cells get damaged because dendrites grow based on the time the cells spend at the top part of the charge curve. Hence the theory of storing them at 50%. There are no simple rules, but I have developed some practices that I use with my EVs that give me some comfort that I might get 8 to 10 years out of the batteries with no more than 10-15% degradation of the pack.
Based on your plans above I would be happy to have a dialogue about what has been working for me.

I’d appreciate that, but maybe after I’ve finished debugging this first battery.

For backup use, my plan will be to keep the battery stored (on standby) at ~50% SOC then fully charge it in ~5 hours when we get warned about a possible fire-related outage.

Then I’ll use it as needed for however long that lasts, hopefully with an MPPT Charger to replenish during the day.

Over the course of 3-4 days, I’m not too worried about cells getting out of balance and capacity dropping more than it needs to.

So for this first battery, I figure my worst-case outcome is needing to screw around with some manual rebalancing / top-balancing after a few days of use (assuming the cells hold charge during storage months as expected).

If I can figure out how to get this battery balanced and running fine without needing that manual intervention after use, I’ll learn a lot.

I’m going to be much more cautious about my 280Ah build and will spend a few months characterizing individual cells before making any decisions about BMS.
 
A one-ohm resistor across a cell at ~3V will drain about 3 amps, i.e. roughly 3Ah per hour. Two in parallel is 1/2 ohm, and that would be about 6 amps (6Ah per hour), per Ohm's law. My recommendation is not to use voltage as your target but to use something that will measure amp-hours or watt-hours. Then next cycle, watch that cell as it approaches the knee. Voltages below 3.4 are unreliable in terms of telling you if cells are in balance.

Totally agree these cells are hard to manage using voltage as the control.

There’s really only three things I care about for this first ‘learner’ battery:

1/ I’ve run two discharge cycles and identified 2 ‘weakest’ cells which hit 2.5V while the other cells appear to be at ~10% SOC.

2/ At full charge (28.75V / 3.6V avg), if these two cells are fully charged when the charger cuts out, I'll be getting as much capacity from the battery as possible (while if they are low by ~10% or more, I'll be sacrificing available capacity).

3/ Once I’ve got the cells balanced in a way to maximize capacity (meaning weakest / fastest cells are fully-charged with the others), the battery will either maintain that balance or will lose balance over time / cycles. If balance is getting lost, I believe that means either I have a bad cell or a bad BMS which is why I’m trying to nail down stability right now.

My thinking is that all cells reaching 3.65 together is a good approximation of balance, but if the cells settle to different levels after several days, that may not be so.

Measuring charge into each cell would be nice but seems complicated and anyway, the BMS controls cell balance using voltage, so it’s something worth characterizing to understand whether the BMS is behaving as expected or not...
 
If you’ve got experience with the Radian, would love to pick your brain before I commit to an inverter - but let’s take it elsewhere.
I can think of a couple of thread titles to frame this future discussion. Certainly a teaser headline with PG&E in the title would attract attention.
I was initially motivated by a presentation at the unveiling of the first Tesla Powerwall in 2015. Since then, I moved north and sold the Radian. I installed solar in the new home and put a deposit on a Powerwall. My NEM 2.0 agreement was not as favorable as my old NEM 1.0 with SCE at the previous home. I also realized I only needed the inverter capacity of one Powerwall but wanted the battery capacity of two Powerwalls. By then, Outback had come out with the Skybox, which was much better suited to my goals than the Radian.
 
Charge until one cell hits 3.5V. Then put shunt resistors on that cell. As each cell reaches 3.5V, install a resistor. As needed bump the charge voltage up to keep the current at 1-1.4A. Eventually the pack will balance.

Sounds intriguing but complicated. I have a parallel port to the sense wires which is how I’ve been injecting and draining charge from individual cells. Are you talking about a parallel ‘keeper’ to maintain individual cells at 3.5V using resistors?

I understand how I could do that with 8 individual 3.5V supplies (with independent grounds) but can't picture how to do it with only 1...
 
I can think of a couple of thread titles to frame this future discussion. Certainly a teaser headline with PG&E in the title would attract attention.
I was initially motivated by a presentation at the unveiling of the first Tesla Powerwall in 2015. Since then, I moved north and sold the Radian. I installed solar in the new home and put a deposit on a Powerwall. My NEM 2.0 agreement was not as favorable as my old NEM 1.0 with SCE at the previous home. I also realized I only needed the inverter capacity of one Powerwall but wanted the battery capacity of two Powerwalls. By then, Outback had come out with the Skybox, which was much better suited to my goals than the Radian.

I can see we've got a lot to discuss. I'm planning to leave my NEM1 agreement as-is (which means I basically can't touch the existing microinverter-based grid-tied system) and complement it with a DC backup system, which will do double duty powering many of the loads during peak TOU hours for the 4 months of the year when they jack rates up to double what they credit for the solar production.

Trying to do anything that sensibly pays for itself through savings within one lifetime is a challenge, but these low-cost LiFePO4 cells take a big bite out of the most expensive piece of the puzzle and make it seem doable...
 
Some new data on my 90Ah battery:

Cells have continued to settle over the past 24 hours, losing close to what they lost over the 12 hours the night before (241mV total, or ~30mV per cell, over the 24h).

Battery has now settled to 27.6V or an average of 3.45V per cell.

I think I'm going to let this settling continue for another 24 or 48h to see if it stops (or greatly slows down) as I get under 3.4V, as several have suggested.

Once it has stopped, I’ll reconnect the BMS sense wires to confirm that does not accelerate voltage drop for the ‘problem cell’ I’m struggling with.

If yes, the BMS is probably faulty (‘balancing’ that cell when it shouldn’t be).

If no, the BMS may be fine and I just need to better-balance the problem cell.

I ‘balanced’ the low cell in the past by injecting charge until it matched the voltage of the other cells, but if that injected charge also requires a similar timeframe of several days to ‘settle’ (meaning the voltage of the cell with injected charge will drop over the subsequent 2-3 days), then I falsely thought that cell was balanced when it wasn’t.

Unless anyone has a simpler idea, my approach at that stage may just be to inject 1% charge (~0.9Ah) into the low cell, give that charge 3 days to settle in, then run a charge cycle to see whether that cell reaches 3.6V first or in parallel with the other cells.

Draining off ~0.03% SOC (~0.03Ah) from all cells immediately after charging to eliminate the charge 'foam' and speed up the settling time is a good idea, but I'm not in a rush, so I'll keep it simple and just wait for now...
 
Sounds intriguing but complicated. I have a parallel port to the sense wires which is how I’ve been injecting and draining charge from individual cells. Are you talking about a parallel ‘keeper’ to maintain individual cells at 3.5V using resistors?

I understand how I could do that with 8 individual 3.5V supplies (with independent grounds) but can't picture how to do it with only 1...

Once the charge current drops to the current the resistor bypasses across that cell (I = V/R), the cell will stop charging. The current passing through the resistor will continue to charge the other cells. Eventually you will have all but one cell bypassed, and that cell will be charging at the resistor bypass current. Takes a little fiddling, but as long as your BMS is working, there isn't any risk of cell damage.
 
