diy solar

Charge acceptance vs temperature

KenDan

New Member
Joined
Mar 5, 2020
Messages
77
Location
Central Pennsylvania
This is my first winter with LiFePO4 batteries and I've noticed that cold cells seem to have much reduced charge acceptance. I suspect it is because the internal resistance is higher at colder temperatures. This makes them difficult to charge quickly, which is a must with the short winter days. I don't like the idea of jacking up the bulk charging voltage super high to get them to draw more current, because of the risk of overcharging and damaging them. The batteries are in my unheated basement, which is now down to about 45F. I'm wondering if using some of the battery capacity to gently raise the battery temperature to 60F or so is worthwhile - and is anyone doing this with some type of heating pad, etc.?
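For rough sizing of that idea, here is a minimal back-of-the-envelope sketch (my numbers, not the OP's): it estimates the energy needed to warm a bank from 45F to 60F and how long an assumed 100W heating pad would take. The bank mass, specific heat, and pad wattage are assumptions to adjust for your own setup, and heat loss to the basement is ignored.

```python
# Rough estimate (all assumptions, not measurements): energy and time to warm
# a LiFePO4 bank from 45F to 60F with a small heating pad, ignoring heat loss.

BANK_MASS_KG = 100.0                # assumed bank mass (~220 lb); adjust to your bank
SPECIFIC_HEAT_J_PER_KG_K = 1100.0   # rough specific heat of LiFePO4 cells
PAD_WATTS = 100.0                   # assumed heating pad power

def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

delta_k = f_to_c(60.0) - f_to_c(45.0)                              # temperature rise in kelvin
energy_wh = BANK_MASS_KG * SPECIFIC_HEAT_J_PER_KG_K * delta_k / 3600.0
hours = energy_wh / PAD_WATTS                                      # time at constant pad power

print(f"Temperature rise: {delta_k:.1f} K")
print(f"Energy needed (no losses): {energy_wh:.0f} Wh")
print(f"Time with a {PAD_WATTS:.0f} W pad: {hours:.1f} h")
```

Even allowing for losses, a couple hundred watt-hours against a multi-kWh bank suggests the idea is workable, especially if the bank is insulated as suggested later in the thread.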
 
What do you mean "reduced charge acceptance?"
What behavior are you observing?
What is your absorption voltage?

You don't alter the charge voltages of LFP ever.
 
By reduced charge acceptance I mean lower current draw at the same voltage than what is seen when the battery is warmer and at the same SOC.
Absorption voltage - or bulk charge voltage, as it is called on my LV5048 - is typically set to no more than 54.4V (16S, i.e. 3.4V/cell).
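As a minimal illustration of "lower current at the same voltage" (a toy model, not measured data from this system), a simple ohmic view I = (V_applied - V_cell) / R_internal shows how a higher cold-cell resistance cuts the accepted current at a fixed 3.4V/cell; the resistance and resting-voltage values below are placeholders.

```python
# Toy ohmic model of charge acceptance at a fixed per-cell charger voltage:
#   I = (V_applied - V_open_circuit) / R_internal
# Resistance and resting-voltage values are illustrative placeholders.

V_APPLIED = 3.40        # 54.4 V / 16S, per cell
V_OPEN_CIRCUIT = 3.33   # assumed resting voltage around 80% SOC

def charge_current(r_internal_ohm):
    """Current a single cell accepts at V_APPLIED in this simple model."""
    return (V_APPLIED - V_OPEN_CIRCUIT) / r_internal_ohm

WARM_R_OHM = 0.0005   # 0.5 mOhm warm (placeholder)
COLD_R_OHM = 0.0015   # 1.5 mOhm cold (placeholder, ~3x higher)

print(f"Warm cell: {charge_current(WARM_R_OHM):.0f} A")
print(f"Cold cell: {charge_current(COLD_R_OHM):.0f} A")
```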
 
I am running with 4 battery packs in parallel. They are on a shelving system in my powerhouse. Because of how the setup is arranged and how the heating of the building happens, I get some temperature difference between the pack closest to ground level and the one on the top. The differential is usually 2 degrees Celsius between packs, so 14, 16, 18 & 20 degrees respectively, as an example. I have not observed any significant difference during charge/discharge or even "storage" cycles.

That being said, I did do some tests with the building at 25°C but with the inverter dumping heat on pack 1 at the top of the rack, which pushed the top pack to 28°C. At that time I did notice a slight difference, with some cells running up faster than before, but it was not that significant.
 
If you're seeing lower current at voltages below bulk/absorption, then that has nothing to do with the battery. That's the input.

If you're seeing lower current at 3.4V/cell, that's your problem. 3.4V/cell is absurdly low for any temperature. You can count on extended charge times because the absorption voltage is hit much sooner. I ran some tests charging individual cells to 3.4V @ 0.3C. They hit absorption almost immediately. It took 6.5 hours to achieve 95% SoC with a 0.03C cutoff. This was at 75°F.
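Not from the test above, but a rough way to see why a low absorption setpoint stretches the charge: once the cell reaches CV, the current tapers toward the tail cutoff, and with an assumed exponential taper the CV time scales with ln(I_start / I_cutoff). The 280Ah capacity and the time constant are illustrative placeholders, not a fit to these cells.

```python
import math

# Toy CV-phase model: once the absorption voltage is reached, assume the
# charge current tapers roughly exponentially, I(t) = I0 * exp(-t / tau).
# Time in CV to reach a tail cutoff is then t = tau * ln(I0 / I_cutoff).
# Capacity and tau are illustrative placeholders, not fits to these cells.

CAPACITY_AH = 280.0
I_START_A = 0.3 * CAPACITY_AH   # current when CV begins (0.3C charge)
TAU_HOURS = 1.5                 # placeholder taper time constant

for cutoff_c in (0.05, 0.03, 0.01):
    i_cutoff = cutoff_c * CAPACITY_AH
    hours = TAU_HOURS * math.log(I_START_A / i_cutoff)
    print(f"Cutoff {cutoff_c:.2f}C ({i_cutoff:.1f} A): ~{hours:.1f} h in CV")
```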
 
If you're seeing lower current at voltages below bulk/absorption, then that has nothing to do with the battery. That's the input.

If you're seeing lower current at 3.4V/cell, that's your problem. 3.4V/cell is absurdly low for any temperature. You can count on extended charge times because the absorption voltage is hit much sooner. I ran some tests charging individual cells to 3.4V @ 0.3C. They hit absorption almost immediately. It took 6.5 hours to achieve 95% SoC with a 0.03C cutoff. This was at 75°F.
Yes - I'm talking about current at bulk/absorb CV and a relatively high SOC - 80% for example. There is plenty of current available, which can be easily demonstrated by increasing the charge voltage. It's not a limitation on the input. Below bulk/absorb voltage the charger is in CC mode so current is always input limited.
Under the same 80% SOC conditions, the current is noticeably lower at 54.4V when the cells are at 45F (7C) than when they are at 70F (21C).
I'm pretty sure this is because IR increases at low temps.
I think this characteristic is described in this white paper that Will quoted elsewhere in the forums.
My question was really about using some of the battery power to run a heating pad or something to keep the temperature up. Does anyone do that, and if so, what did you use?
 
Is it possible that your LV5048 is doing temperature compensation?

My batteries sit in my RV trailer year round. During the winter months they are still connected to the solar charger and I work in the trailer once a month, using 12v power each time. Consequently, the batteries cannot be allowed to go below 32° F for charging (per the battery manufacturer). I installed battery warmers to keep the batteries at a minimum of 35° F. When the battery warmer is active, there isn't much charge/discharge of the battery, so it's a very low C rate. If the battery was reluctant to charge/discharge at a high C rate I wouldn't know it.

When I'm actively camping in the trailer in freezing conditions, the furnace is on and the batteries are usually above 55° F.
 
My question was really about using some of the battery power to run a heating pad or something to keep the temperature up. Does anyone do that, and if so, what did you use?

If you're going to use a battery warmer then you also need to insulate the battery bank. Otherwise, you're just wasting battery amps to keep them warm.

My write-up of my battery warmer can be found here:

 
Just up your voltage to 3.5Vpc; that will solve the issue. Otherwise insulate the pack and keep it at 70F with a heat pad.

LFP does benefit from temp compensation, but with a much lower coefficient than lead.
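For reference, a minimal sketch of what charger temperature compensation typically looks like; the coefficients below are illustrative only (lead-acid is commonly a few mV per degree C per cell, and any LFP compensation would be much smaller), so check your charger and cell documentation rather than reusing these numbers.

```python
# Temperature compensation as chargers typically apply it: raise the
# absorption voltage when the battery is colder than a reference temperature.
#   V = V_base + cells * coeff * (T_ref - T_battery)
# The coefficients here are illustrative placeholders.

def compensated_voltage(v_base, cells, batt_temp_c, coeff_mv_per_degc_cell, t_ref_c=25.0):
    """Absorption voltage adjusted for battery temperature (simple linear model)."""
    return v_base + cells * (coeff_mv_per_degc_cell / 1000.0) * (t_ref_c - batt_temp_c)

V_BASE_16S = 54.4  # 3.4 V/cell x 16

# Battery at 7 C (45 F):
print("Lead-style 4 mV/degC per cell:", round(compensated_voltage(V_BASE_16S, 16, 7.0, 4.0), 2), "V")
print("Much smaller 1 mV/degC per cell:", round(compensated_voltage(V_BASE_16S, 16, 7.0, 1.0), 2), "V")
```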
 
Just up your voltage to 3.5Vpc; that will solve the issue. Otherwise insulate the pack and keep it at 70F with a heat pad.

LFP does benefit from temp compensation, but with a much lower coefficient than lead.
Pushing the voltage up to 3.5V/cell is not an option because of the capacity mismatch on these commodity cells. Even with top balancing limited to 3.35V or above, the lower-capacity cell(s) rather quickly exceed their maximum safe voltage when left on a float charge at a level that high.
I'll look into a way to try to keep them warm.
 
If you're going to use a battery warmer then you also need to insulate the battery bank. Otherwise, you're just wasting battery amps to keep them warm.

My write-up of my battery warmer can be found here:


Thanks - that is basically what I was looking for.
 
There are charts that show LFP has reduced capacity as the battery temp is lowered, but that the cells last longer. The more research I do, the more I come to the conclusion that each battery will only flow so much power over its life. When colder it flows less during charge/discharge, so it uses less of its "life span"; when warm it can flow more and uses more of its life span.

There is also info that charging between 50% and 80% will lengthen the life of the battery... but in the end you are charging less and using less of its life span, so it is going to last longer. Just my take on it all.

There are also many studies of a similar thing for your heart. Some say that on average everyone's heart only pumps so many times. If you have a fast heart rate because you don't take care of your body, then you die sooner.

Much like engines and water pumps: the more power you put through them and the more work they do, the less they last.
 
I would like to clarify something a bit.

We often say: " Push 3.5V charge at the cells" when talking about charging. That translates to 14V for a 12V battery pack.
The "entire" pack receives that 14.0V BUT how the cells take it depends on each individual cell within the pack, internal resistance & impedance being the factors that affect that.
Appreciate that the "Lowest Common Denominator" cells are the limiters. The runner that reaches HVD is the cell that will ultimately decide where the top is, because it will consistently trigger an HVD cutoff (HVD = High Volt Disconnect). The "lazy" cell is on the other end and is NOT necessarily the same cell as the "runner"; it is the cell that will sink to LVD (Low Volt Disconnect) before the rest of the cells within the pack.
* Temps if very high or low can affect the runners/lazies to some extent. LFP LOVES the same temps that humans are most comfortable at (25°C/77°F).

When "pushing" 14.0V it is obviously divided between the cells; the BMS cannot & does not control how many volts/amps any particular cell gets. The BMS can only monitor the "state" of each cell and act according to the parameters set, cutting off when conditions call for it. ANY form of balancing is outside of that entirely. As cells "fill up" their IR changes; stronger cells will take more charge while weaker cells will take less and therefore "run". Weaker cells will actually dip lower than the others when under load. Example: if all cells in the pack are at 3.100V and you then discharge at, say, 100A, the cells will all sag under load but NOT evenly; the weakling will sag lower faster than the rest and will hit the LVD cutoff point before the others, even if they are hundreds of millivolts above the weakling.
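A minimal numeric sketch of that sag example (the internal resistance values are made up to illustrate the point, not measurements):

```python
# Uneven sag under load: each cell drops by roughly I * R_internal, so the
# cell with the highest internal resistance sags furthest and trips LVD first.
# Resistance values are illustrative placeholders, not measurements.

LOAD_CURRENT_A = 100.0
RESTING_V = 3.100

cells_mohm = {"cell 1": 0.4, "cell 2": 0.5, "cell 3": 0.5, "weak cell": 2.5}

loaded_v = {name: RESTING_V - LOAD_CURRENT_A * r_mohm / 1000.0
            for name, r_mohm in cells_mohm.items()}
weakest = min(loaded_v, key=loaded_v.get)

for name, volts in loaded_v.items():
    note = "  <-- sags furthest, hits LVD first" if name == weakest else ""
    print(f"{name}: {volts:.3f} V under {LOAD_CURRENT_A:.0f} A{note}")
```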

I have done other tests at varied temps; I am fortunate that I can control the temps in the Powerhouse. I have had the PH at 0°C/32°F and observed the discharge & charge process. Discharge is not significantly changed down to that point (most LFP is fine discharging to -10°C/14°F); some cells can go lower still.
Charge, on the other hand, IS affected; it slows and takes longer. The charge C-rate also has to be lower than 0.5C (per manufacturer docs), which varies a bit depending on cell type (make & model).

A CATCH / GOTCHA
When a manufacturer states in their specs 1C Discharge & 0.5C Charge, that is the "optimal average" for calculating lifecycles / charge cycles. If you never charge at higher than say 0.2C that equates to lower "stress" on the chemistry. If you charge at 0.7C that will reduce the cycles as it is "pushing" the chemistry.

** Charge & Discharge rates are relative to Make & Model of the cells.
ESS cells are "typically" 1C Discharge & 0.5C Charge. EV Grade LFP can be as high as 5C Burst Discharge and 2.0C Charge rate capable. Utility Grade cells are most often rated between the ESS & EV grades. Utility cells being those used for Large Commercial Storage Systems.
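Since C-rates come up repeatedly in this thread, a quick conversion sketch; the 280Ah capacity is just an example, substitute your own cell or pack capacity.

```python
# C-rate is relative to capacity: amps = C-rate x capacity in Ah.
# 280 Ah is just an example; substitute your own cell or pack capacity.

def c_rate_to_amps(c_rate, capacity_ah):
    return c_rate * capacity_ah

CAPACITY_AH = 280.0

for label, rate in [("gentle charge", 0.2),
                    ("typical ESS charge limit", 0.5),
                    ("typical ESS discharge limit", 1.0)]:
    print(f"{label}: {rate}C = {c_rate_to_amps(rate, CAPACITY_AH):.0f} A")
```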

Hope this helps a bit.
Steve
 
Thanks Steve,

Your statement "Charge, on the other hand, IS affected [by low temperature]; it slows and takes longer" matches my 25+ years of experience with Li-Poly batteries. It is clearly also true of LFP cells (although perhaps to a lesser degree).
Cold Lithium cells have a higher internal resistance. Therefore more voltage must be applied to achieve the same charge current. The end result is that a cold battery will take longer to reach a full SOC if the exact same constant voltage charge source is used.

We can increase the voltage (as mentioned above) to get the charge current back up, but doing so puts the cells at risk if we don't back off on the charging voltage as the cell reaches 100% SOC. This can be particularly problematic in a series string of cells that are not well matched for capacity and IR.
The LV5048 does not have an adaptive way of handling this. I don't know what method other more sophisticated LFP chargers might use, but something a lot better than a simple timer is clearly needed. Solar charging is often time limited (only so many good sun hours in a day), so a way of taking maximum advantage of those limited hours is needed.
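One common alternative to a fixed absorption timer is tail-current termination: hold the absorption setpoint until the charge current falls below a chosen fraction of capacity. This is not a feature I can confirm on the LV5048; the sketch below just shows the general idea, and read_charge_current() / set_float() are hypothetical hooks for whatever charger or BMS interface is actually available.

```python
import time

# Sketch of tail-current absorption termination: hold the absorption setpoint
# until the charge current falls below a cutoff (e.g. 0.03C) instead of
# running a fixed timer. read_charge_current and set_float are hypothetical
# hooks for whatever charger/BMS interface you actually have.

CAPACITY_AH = 280.0                  # assumed bank capacity
TAIL_CUTOFF_A = 0.03 * CAPACITY_AH   # ~8.4 A for a 280 Ah bank
POLL_SECONDS = 60

def wait_for_absorption_done(read_charge_current):
    """Block until the tail current has dropped below the cutoff."""
    while True:
        amps = read_charge_current()
        if amps <= TAIL_CUTOFF_A:
            return
        time.sleep(POLL_SECONDS)

# Usage sketch (hypothetical charger object):
# wait_for_absorption_done(my_charger.read_current)
# my_charger.set_float()
```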
 
Thanks - that is basically what I was looking for.
I experienced something similar and can tell you what I did about it.

I did a first top-balance on 16 new 280Ah cells at 10A (0.0022C). This lasted weeks and was done indoors at 65-70F.

I then reconfigured into a 16S battery and performed a capacity test at 150W (3A / 0.011C). This allowed me to assess the capacity of my weakest cell and rank the others up to the strongest cell.

I charged the strongest 8 cells back to ~50% SOC using a 10A 8S LiFePO4 charger, but it had become cold by then, so charging was performed at 45-50F (and at 0.036C).

I then charged the strongest individual cell to 3.65V using a 10A supply (0.036C), rested for one hour, and discharged at 10A (0.036C). All of this was performed in a cold room at 45-50F.
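As a quick sanity check on the C-rates quoted above (treating the 16 paralleled cells as one 4480Ah bank for the top balance):

```python
# Sanity check of the C-rates quoted above: C-rate = current / capacity.

def c_rate(current_a, capacity_ah):
    return current_a / capacity_ah

print(round(c_rate(10.0, 16 * 280.0), 4))  # top balance, 16 cells in parallel -> ~0.0022C
print(round(c_rate(3.0, 280.0), 3))        # 150 W on the 16S pack, ~3 A -> ~0.011C
print(round(c_rate(10.0, 280.0), 3))       # a single 280 Ah cell at 10 A -> ~0.036C
```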

This strongest cell that delivered close to 280Ah in the 16S capacity test delivered only 160Ah of capacity when tested individually.

This was obviously very concerning, so I repeated the exact same sequence with the second-strongest cell, with the same result (<60% of the capacity from the 16S test charged at 65-70F).

After this second result, it occurred to me that the change in weather and colder temps could have impacted these results, so I reviewed EVE's specs and decided to recreate them as closely as possible using a temperature-controlled chamber.

My quick-and-dirty temp-controlled chamber involved placing a brewer's 40W heat pad on a layer of packing foam from the 280Ah cell packaging. The cell was placed on the heat pad, the heat pad connected to a terrarium thermostat, the thermostat probe taped to the cell, and a cardboard box lowered over the entire assembly.

I set the thermostat to 77F (25C) and waited for the thermostat to settle (meaning cycling between on and off) before charging.

A charge cycle @ 10A / 0.036C was performed to 3.65V, followed by a 1 hour rest, followed by a 10A / 0.036C capacity test (all at 25C).

This time both the strongest cell as well as the second strongest cell delivered a full 280Ah.

I was relieved that my cells delivered rated capacity and have moved on to other priorities, but once my 8S battery is assembled, I plan to circle back and compare capacity charged at 50F to capacity charged at 77F.

If capacity is as different as this initial experience indicated (~60%), I will build my battery into a closed and insulated container which I will keep heated to at least 65F if not 75F.

If I don’t see much difference when testing the full 8S battery charged at 50F, I’ll chalk it up to some newbie mistake I made when charging individual cells for the first time.

Reviewing the EVE datasheet in detail, note that they specify discharge performance at -20C (>=70%) but this was following ‘standard’ charge at 25C. There is absolutely no specification indicating capacity when charged at temperatures lower than 25C.

So I don't know whether my bargain grey-market 280Ah cells underperform Grade A cells when charged at temps below 25C (since EVE's spec says nothing about the performance to expect under those non-'standard' charge temperatures), but the fact that they delivered rated capacity when charged under the conditions EVE specified makes me satisfied with the performance of these cells.

If I confirm charge-capacity degradation at lower temps and I'm the only DIYer unfortunate enough to receive cells with that limitation, I'll use temperature control during use and chalk the hassle up to the cost of getting the bargain I did.

But putting together a temperature-control chamber for characterization is as easy as outlined below (a rough sketch of the thermostat's control loop follows the list):

-Insulation from below (foam, wood, or cardboard)

-40W heating pad & terrarium/vivarium thermostat

-Cardboard box to insulate the entire space from ambient air / temps
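And roughly what the terrarium thermostat is doing, for anyone who wants to replicate it in software; read_temp_f() and set_heat_pad() are hypothetical hooks for whatever sensor and relay you use.

```python
import time

# Bang-bang thermostat with hysteresis, roughly what a terrarium thermostat
# does: turn the pad on below (setpoint - band), off above (setpoint + band).
# read_temp_f and set_heat_pad are hypothetical hooks for your sensor/relay.

SETPOINT_F = 77.0     # 25 C target from the post above
HYSTERESIS_F = 1.0    # switching band so the relay doesn't chatter
POLL_SECONDS = 30

def run_thermostat(read_temp_f, set_heat_pad):
    heating = False
    while True:
        temp_f = read_temp_f()
        if temp_f < SETPOINT_F - HYSTERESIS_F:
            heating = True
        elif temp_f > SETPOINT_F + HYSTERESIS_F:
            heating = False
        set_heat_pad(heating)   # True = pad on, False = pad off
        time.sleep(POLL_SECONDS)
```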
 