Why you cannot charge LiFePO4 below 0 degrees Celsius

This is not a battery, it's a Powerwall-type product.
I would not have thought there was a difference. What are the defining features of a Powerwall?

When Will interviewed the main dude at Battleborn he said -20 was the new chemistry.
I think this was the video
 
I would not have thought there was a difference. What are the defining features of a Powerwall?
I think the distinction they are making is that a battery is 'just a battery', whereas a Powerwall can be a battery + other goodies, some of which might do things (like heat the cells, or regulate current) to make charging at low (ambient) temperatures more feasible. At least that is how I interpreted it.

When Will interviewed the main dude at Battleborn he said -20 was the new chemistry.

I think this was the video
In that video the Battleborn CEO mentioned +25°F as the lowest charging temperature, with the assumption that the rate is less than 1C. That is below the normal 32°F limit by 7 degrees but still 45 degrees above -20°F.
 
In that video the Battleborn CEO mentioned +25°F as the lowest charging temperature, with the assumption that the rate is less than 1C. That is below the normal 32°F limit by 7 degrees but still 45 degrees above -20°F.
[emphasis mine]
due to thermal mass of battery and placement of temperature sensor resulting in the temperature sensor leading rather than lagging the cell material temperature?

less than 1C = a charge current below 1 × the Ah capacity, in amperes?
 
[emphasis mine]
due to thermal mass of battery and placement of temperature sensor resulting in the temperature sensor leading rather than lagging the cell material temperature?
I'm not sure what the full reasoning is; the gist of it is that they feel that, based on what they've observed about their average customers' use profiles, a slightly lower low-temp cutoff is okay. The main reason discussed is that Battleborn deemed that their average customer did not charge anywhere near 1C, so they feel a slightly lower charging limit is acceptable in their context. Not sure how much you have looked into low-temp charging, but it's a 3-way relationship between (1) cell temperature, (2) C-rate, and (3) state of charge. We simplify that complex relationship down to the black/white 32°F hard limit, but that's a tiny bit arbitrary as a hard limit; BB just uses a slightly less conservative semi-arbitrary hard limit. (Note that BB makes a point of not being overly conservative or complicated, at the probable cost of some cycles; it's part of the value proposition to their customers, 'set and forget' LiFePO4 that you don't have to baby -- but if you use it hard within their parameters I'm sure you will see degraded cycle life.)

The way I prefer to look at it: don't charge at high C-rates at low temperature, and especially not at high C-rates, low temperature, and high SOC. 32°F isn't a magic number, damage can occur above or below it, but it's a reasonable line in the sand. Just know that 1C at 35°F could be worse than 0.1C at 28°F (numbers chosen just for example, not necessarily accurate).
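To make that three-way relationship concrete, here is a toy sketch of a charge-permission rule. All of the thresholds are made-up placeholders for illustration, not Battleborn's numbers or anyone's recommendation:

```python
def charge_current_limit_c(cell_temp_c: float, soc_percent: float) -> float:
    """Return a maximum charge rate (in C) for a given cell temperature and SOC.

    Every threshold here is a made-up placeholder, only meant to show how
    temperature, C-rate, and SOC can be combined into one decision.
    """
    if cell_temp_c < -5:
        return 0.0                                  # hard low-temperature cutoff (placeholder)
    if cell_temp_c < 0:
        return 0.05 if soc_percent < 80 else 0.0    # only a trickle, and only at low SOC
    if cell_temp_c < 10:
        return 0.2                                  # cool cells: keep the rate modest
    return 0.5                                      # "normal" operating range (placeholder)

# Example: with these placeholder limits, a 1C request at 2 °C gets capped to 0.2C,
# while a 0.1C request at the same temperature passes through unchanged.
requested_c = 1.0
allowed_c = charge_current_limit_c(cell_temp_c=2.0, soc_percent=60.0)
print("charging at", min(requested_c, allowed_c), "C")
```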

There are some good threads from 2019/2020 where we chewed up the low temp issue half to death, if you are interested in it. (Edit 2: I'm an idiot, this is the thread I was referring to, or one of them.)

edit: and your point/question about thermal mass and temp sensor placement is worth considering as well; not sure if it's a factor in BB's reasoning, but it seems to me the opposite could/would be true also. After a very cold night, a temp sensor could register warmth long before the battery warms up.
 
You guys (@Dzl and @curiouscarbon) have the thermal mass concept right: It just means that the temperature of the cells will change much more slowly than the ambient temperature, both up and down.

Last night the temp was going to get into the high 30s (°F), so I decided to test out my temperature and voltage loggers to see how well they did, and I also thought it would be a test of my heating mechanism. The insulated box and the cells had been sitting in the garage all day, and were resting at 68°F. I rolled the box out of the garage and into the driveway last night at about 9pm, and started the loggers.

This morning it was 42°F when I got around to checking on it all. I pulled the two temperature loggers (one on top of the cells and one on the aluminum sheet where the cells sit and where the heating pads are). First I checked the voltage logger, and it seemed to indicate that the heating pads were never turned on. I figured I had a bad connection that I would have to fix before tonight's test. Then I looked at the two temperature loggers. To my surprise, the temperature on both drifted down from 68°F but never even got to 61°F. So yeah, the heater never ran. Doh!

The 2" XPS in my battery box isn't THAT good. That was all about the THERMAL MASS of 8 230Ah cells.
 
I'm not sure what the full reasoning is; the gist of it is that they feel that, based on what they've observed about their average customers' use profiles, a slightly lower low-temp cutoff is okay. The main reason discussed is that Battleborn deemed that their average customer did not charge anywhere near 1C, so they feel a slightly lower charging limit is acceptable in their context. Not sure how much you have looked into low-temp charging, but it's a 3-way relationship between (1) cell temperature, (2) C-rate, and (3) state of charge. We simplify that complex relationship down to the black/white 32°F hard limit, but that's a tiny bit arbitrary as a hard limit; BB just uses a slightly less conservative semi-arbitrary hard limit. (Note that BB makes a point of not being overly conservative or complicated, at the probable cost of some cycles; it's part of the value proposition to their customers, 'set and forget' LiFePO4 that you don't have to baby -- but if you use it hard within their parameters I'm sure you will see degraded cycle life.)

The way I prefer to look at it: don't charge at high C-rates at low temperature, and especially not at high C-rates, low temperature, and high SOC. 32°F isn't a magic number, damage can occur above or below it, but it's a reasonable line in the sand. Just know that 1C at 35°F could be worse than 0.1C at 28°F (numbers chosen just for example, not necessarily accurate).

There are some good threads from 2019/2020 where we chewed up the low temp issue half to death, if you are interested in it.

edit: and your point/question about thermal mass and temp sensor placement is worth considering as well; not sure if it's a factor in BB's reasoning, but it seems to me the opposite could/would be true also. After a very cold night, a temp sensor could register warmth long before the battery warms up.

this research paper really caught my interest on the topic of low temperature charging for LiFePO4 cells.
Tc = temperature of charging : °C
Td = temperature of discharging : °C
DR = degradation rate : (Ah / cycle) / Ah
[attached image: table of degradation rates vs. charge (Tc) and discharge (Td) temperatures, from the paper]

this image paints a more nuanced picture of low temperature charging LiFePO4 than I have heard from most companies. (not that they are "hiding" anything, communicating more than a single cutoff temperature just complicates an already challenging communication task)

some tests are done at same Tc and Td and that is what most users will experience I assume.

the degradation rate of charging at -20°C was -0.00208 Ah/cycle per Ah of reference capacity
the degradation rate of charging at +12°C was -0.00315 Ah/cycle per Ah of reference capacity
the degradation rate of charging at +30°C was -0.00780 Ah/cycle per Ah of reference capacity

normalized, this is:

the degradation rate of charging at -20°C was 1x Rate
the degradation rate of charging at +12°C was 1.5x Rate
the degradation rate of charging at +30°C was 3.75x Rate

Many of the lowest DR measures occurred with cold cells. How representative this is of all LiFePO4 chemistries being sold and shipped today, I cannot comment, but it's interesting to me! :)
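A quick sanity check of that normalization, using nothing beyond the three DR values quoted above:

```python
# Re-deriving the "normalized" rates from the three DR magnitudes quoted above.
dr = {
    -20: 0.00208,   # Ah per cycle per Ah of reference capacity (magnitude)
     12: 0.00315,
     30: 0.00780,
}

baseline = dr[-20]
for temp_c, rate in dr.items():
    print(f"Tc = {temp_c:+3d} C : {rate / baseline:.2f}x the -20 C rate")
# -> 1.00x, 1.51x, 3.75x, matching the ~1x / 1.5x / 3.75x figures above
```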
 

this research paper really caught my interest on the topic of low temperature charging for LiFePO4 cells.
Tc = temperature of charging : °C
Td = temperature of discharging : °C
DR = degradation rate : (Ah / cycle) / Ah
[attached image: table of degradation rates vs. charge (Tc) and discharge (Td) temperatures, from the paper]

this image paints a more nuanced picture of low temperature charging LiFePO4 than I have heard from most companies. (not that they are "hiding" anything, communicating more than a single cutoff temperature just complicates an already challenging communication task)

some tests are done at same Tc and Td and that is what most users will experience I assume.

the degradation rate of charging at -20°C was -0.00208 Ah/cycle per Ah of reference capacity
the degradation rate of charging at +12°C was -0.00315 Ah/cycle per Ah of reference capacity
the degradation rate of charging at +30°C was -0.00780 Ah/cycle per Ah of reference capacity

normalized, this is:

the degradation rate of charging at -20°C was 1x Rate
the degradation rate of charging at +12°C was 1.5x Rate
the degradation rate of charging at +30°C was 3.75x Rate

Many of the lowest DR measures occurred with cold cells. How representative this is of all LiFePO4 chemistries being sold and shipped today, I cannot comment, but it's interesting to me! :)
I briefly skimmed that journal article, but I skipped my regular coffee this morning and am feeling a little slow. I am not sure I am interpreting the results correctly, but it sounds like this experiment had findings that were substantially different from, and opposite to, what most literature and most manufacturers seem to agree upon. But I have a feeling I am misinterpreting something, or there is a devil in the details of the test parameters or the methodology. The table appears to show a temperature of -20°C (-4°F) as having the least capacity degradation. I believe this was for charge/discharge and at a rate of 1C/1C. I can't reconcile this with everything else I've read, unless I am misunderstanding it, which I believe I am.

A couple of possibilities come to mind: (1) I am misunderstanding the findings, (2) negative effects of low-temp charging are delayed or slow to emerge -- this test was 100 cycles, (3) maybe the negative effects of low-temp charging manifest in ways other than clearly measurable and immediate capacity degradation, (4) some issue with the testing methodology.

What are your thoughts?
 
1) Me too, parsing the data in table form is hard for me, and the paper is difficult for me to parse. 2) Agree that 100 cycles might only capture the “beginning of the curve” of degradation and thus apply for the first 100 cycles but may not accurately represent 1000 cycles. 3) I wish to learn more about this concept. 4) I need to review the paper more thoroughly to identify issues with the testing methodology, but it did seem fairly rigorous to me at first read.

They seemed to assemble the cells from raw parts, place each cell in a temperature-regulated box, and force the temperature to stay constant, so any self-heating damage ought to be attenuated. I think the device regulated the temperature of the air inside the chamber; they state the model name of the test chamber in the paper.

Basically, I consider the methodology to be sound, but given the number of cycles is only 100, I would be careful not to automatically extrapolate these degradation rates to the full rated cycle counts of 1000-2000+.

Nevertheless it still has me looking for more knowledge and answers. Calendar aging from natural electrolyte decomposition is slower at lower temperatures too, and I wish to learn more about cold LiFePO4 chemistry.

TL;DR: I am under the impression that their findings are valid for 100 cycles, but I would not automatically extrapolate the results to 2000 cycles.
 
1) Me too, parsing the data in table form is hard for me, and the paper is difficult for me to parse. 2) Agree that 100 cycles might only capture the “beginning of the curve” of degradation and thus apply for the first 100 cycles but may not accurately represent 1000 cycles. 3) I wish to learn more about this concept. 4) I need to review the paper more thoroughly to identify issues with the testing methodology, but it did seem fairly rigorous to me at first read.

They seemed to assemble the cells from raw parts, place each cell in a temperature-regulated box, and force the temperature to stay constant, so any self-heating damage ought to be attenuated. I think the device regulated the temperature of the air inside the chamber; they state the model name of the test chamber in the paper.

Basically, I consider the methodology to be sound, but given the number of cycles is only 100, I would be careful not to automatically extrapolate these degradation rates to the full rated cycle counts of 1000-2000+.
Methodology can be (and probably is) sound and still have caveats or specific design parameters that can lead to misinterpretations or misunderstandings by laypeople like me, if we don't fully understand or we mistakenly misapply or over-generalize the significance of the findings. This is partially what the discussion section of the paper is for, giving context to the results, but I didn't find it to be that illuminating in this case (it was too dense for my quick read, and didn't contextualize the findings against the larger body of research, or at least if it did I skipped over it).

One thing that I think needs more unpacking is the 'recoverable' and 'non-recoverable' degradation, and the connection between this summary statement and the chart:
As a summary, it can be concluded that cycling at (-5 °C, -5 °C), (0 °C, -20 °C), (5 °C, 5 °C), (12 °C, -10 °C) and (15 °C, -20 °C) over 100 cycles led to almost no degradation. The samples tested at Td = -20 °C proved to be stable (recovery in capacity at +25 °C, Figure 4c), making these samples suitable for sub-room temperature applications. This capacity recovery is less impressive when increasing Tc. The behavior shown by this set of samples indicates that there is a big component of reversible degradation at low temperatures (kinetic component)

Also I'm struggling to understand (in the chart) CR longterm % vs DR.
If I understand correctly, DR is the degradation rate per cycle, and -20/-20 seems to be among the best here, but CR longterm is the capacity retention after 100 cycles (measured against the initial capacity), and it appears -20/-20 is among the worst in this regard (only better than 30/30). I don't see how both of these things can be true at the same time. Logically, the cells that end with the highest capacity loss would have to also have the highest degradation rate, would they not? I sense that I am missing an important point.
 
One thing that I think needs more unpacking is the 'recoverable' and 'non-recoverable' degradation, and the connection between this summary statement and the chart:
I think this is partially captured by the Reference Cycles. The Long Term Aging cycles would observe both recoverable and non-recoverable loss, but the Reference Cycles ought to test higher due to recovery of the recoverable portion, while of course still missing the unrecoverable portion.


Translating their paper..
They do 100 "long term aging" cycles
First a cell conditioning.
Then a reference cycle
Then 25 long term aging cycles
Then a reference cycle.. etc.

Summary of the different cycling protocols they use and refer to:

Electrochemical Cycling

Cell Conditioning:

  1. set temperature of chamber to 25°C
  2. perform Three charge/discharge cycles (steps 3-6 repeat three times)
  3. CC-CV charge at 0.1C to 3.7V (0.01C termination)
  4. wait 30 minutes
  5. CC discharge at 0.1C until 2.7V
  6. wait 30 minutes

Reference Cycle:

  1. set temperature of chamber to 25°C
  2. wait until temperature equilibration rate is below 1°C per hour
  3. perform Two CC charge/discharge cycles
  4. charge CC-CV at 0.3C rate until 3.6V (10mA cutoff)
  5. wait until temperature equilibration rate is below 1°C per hour
  6. discharge CC at 0.3C until 2.5V
  7. wait until temperature equilibration rate is below 1°C per hour

Long Term Aging:

  1. perform One Hundred charge/discharge cycles (steps 2-8 repeat 100 times)
  2. set temperature of chamber to Tc (charge temperature)
  3. if it's a new temperature, wait until <1°C per hour change; if same as the last temperature, wait 30 minutes
  4. charge CC-CV at 1.0C rate until 3.7V (0.1C/1hour cutoff)
  5. set temperature of chamber to Td (discharge temperature)
  6. if it's a new temperature, wait until <1°C per hour change; if same as the last temperature, wait 30 minutes
  7. discharge CC at 1.0C rate until 2.7V
  8. Every 25th Cycle, Perform A Reference Cycle (see above).
A total of 100 charge/discharge cycles were carried out. Every 25 cycles a reference cycle was performed to assess the reversible and irreversible capacity degradation.
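If it helps to see the flow rather than the list, here is the same protocol sketched as runnable Python pseudocode. The helper functions are stand-ins I made up (they just print each step); only the loop structure (conditioning-style reference cycles at 25 °C, 100 aging cycles at Tc/Td, a reference cycle every 25 cycles) comes from the summary above:

```python
# Sketch of the paper's cycling protocol. The four helper functions are
# hypothetical stand-ins for the chamber/cycler; they only print the step.

def set_chamber(temp_c):
    print(f"chamber -> {temp_c} C")

def wait_equilibrated():
    print("wait until temperature drift is below 1 C per hour")

def cc_cv_charge(rate_c, v_max, cutoff):
    print(f"CC-CV charge at {rate_c}C to {v_max} V (cutoff {cutoff})")

def cc_discharge(rate_c, v_min):
    print(f"CC discharge at {rate_c}C to {v_min} V")

def reference_cycle():
    set_chamber(25)                        # reference cycles always run at 25 C
    wait_equilibrated()
    for _ in range(2):
        cc_cv_charge(0.3, 3.6, "10 mA")
        wait_equilibrated()
        cc_discharge(0.3, 2.5)
        wait_equilibrated()

def long_term_aging(tc, td, cycles=100):
    for n in range(1, cycles + 1):
        set_chamber(tc)                    # charge at Tc
        wait_equilibrated()
        cc_cv_charge(1.0, 3.7, "0.1C / 1 h")
        set_chamber(td)                    # discharge at Td
        wait_equilibrated()
        cc_discharge(1.0, 2.7)
        if n % 25 == 0:                    # every 25th cycle: a 25 C reference cycle
            reference_cycle()

long_term_aging(tc=-20, td=-20, cycles=100)
```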
Method that kills the cell:
It was found that the temperature combination for charging at +30 °C and discharging at -5 °C led to the highest rate of degradation
Method that preserves the cell:
On the other hand, the cycling in a temperature range from -20 °C to 15 °C (with various combinations of temperatures of charge and discharge), led to a much lower degradation.
Various quotes about degradation:
The authors of this work described a higher fade in capacity at the higher temperature of discharge, which was attributed to solid electrolyte interface (SEI) layer growth and lithium plating(21) (ref 21 is : Jalkanen, K., et al. Cycle aging of commercial NMC/graphite pouch cells at different temperatures. Applied Energy. 154, 160-172 (2015).)
As a general rule, higher testing temperatures accelerate degradation(1,11,12), enhance the growth of the SEI(11,23,24), and promote variations in the SEI(11,23).
Wang et al.(8) published that the fade in capacity followed a power law relationship with the charge throughput (temperatures between 15 °C and 60 °C). Other authors have described a square-root of time relationship with fade in capacity(10,30,31,32,33,34). This is supposed to represent the irreversible capacity loss attributed to the growth of SEI where active lithium is consumed.(30,31)
Finally, some simulations of the fade in capacity at various temperatures were validated with experimental results and the data showed an exponential dependency of degradation and temperature.(8,10)
On calendar aging:
Capacity degradation also may have a share of linear degradation with time.(33,34,35)


Also I'm struggling to understand (in the chart) CR longterm % vs DR.
If I understand correctly, DR is the degradation rate per cycle, and -20/-20 seems to be among the best here, but CR longterm is the capacity retention after 100 cycles (measured against the initial capacity), and it appears -20/-20 is among the worst in this regard (only better than 30/30). I don't see how both of these things can be true at the same time. Logically, the cells that end with the highest capacity loss would have to also have the highest degradation rate, would they not? I sense that I am missing an important point.
They say Degradation Rate is calculated from Reference Cycle after 100 Long Term Aging cycles.
CR-longterm is "capacity retention relative to first cycle" but this feels ambiguous to me. first aging cycle?
I agree that if it's taken a certain way the numbers seem confusing.
Note that the Reference Cycle involves bringing the temperature of the cell to 25°C. This happens regardless of the Tc or Td.

Tried to compile their chart into a more readable format.
[attached image: the paper's results compiled into a table (Tc, Td, ∆T, C1, CR-longterm, Ci, CR-ref, DR)]

Term | Name | Unit
Tc | Charging Temperature | °Celsius
Td | Discharging Temperature | °Celsius
∆T | Charge-Discharge Delta Temperature (Td - Tc) | °Celsius
C1 | First cycle capacity of long term aging | Ampere hours
CR-longterm | Capacity retention relative to First Cycle | % Percent
Ci | Initial capacity calculated by the Reference Cycle | Ampere hours
CR-ref | Capacity retention relative to first Reference Cycle | % Percent
DR | Degradation rate calculated from Reference Cycle after 100 Long Term Aging Cycles | Ah per cycle per Ah
 
Try comparing C1 to Ci: the capacity observed from the first cycle of long term aging compared to the capacity observed by the Reference Cycle.

Note that the reference cycle (Ci) happens at 25°C every time and that the Long Term Aging cycle (C1) happens at Tc and Td which vary.

For cold temperatures, it can be observed that the long term aging cycle capacity in amp hours is wayyyy less than the Reference Cycle amp hours.

This is one type of recoverable capacity loss, if I am not misunderstanding the paper: temperature-related reversible capacity loss. Go to a cold temperature, lose capacity; go back to 25°C and regain it. For -20/-20 Tc/Td that translates to a working cycle of 3.0 amp hours when the cell tests at 5.6 amp hours! That's a 46% reversible capacity loss if I am somehow following their paper.
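The 46% figure is just the ratio of those two numbers; a quick check using the 3.0 Ah and 5.6 Ah values quoted above:

```python
# Reversible (temperature-related) capacity loss for the -20/-20 Tc/Td sample,
# using the two capacities quoted above.
c1_cold_ah = 3.0        # first long-term-aging cycle, measured at -20 C
ci_reference_ah = 5.6   # reference cycle, measured at 25 C

reversible_loss = 1 - c1_cold_ah / ci_reference_ah
print(f"apparent capacity at -20 C: {c1_cold_ah / ci_reference_ah:.0%} of the 25 C value")
print(f"reversible loss: {reversible_loss:.0%}")   # ~46%, recovered on warming back to 25 C
```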
 
The reversible capacity loss seems to be in line with other sources I have read. A similar capacity reduction can be seen in lead-acid batteries at low temperatures.

Another observation is that charging at low temperatures is very slow, with a fair bit of energy going into heat due to the higher internal resistance.

Low temperature charging is of less interest to me, simply because battery performance is reduced enough to keep the system from functioning normally in many situations. So adding a pack heater to keep them around 40°F seems to be a good balance between capacity reduction and heating requirements.
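For a rough idea of what holding ~40°F costs, a steady-state estimate is enough. The box heat-loss figure below is a guess for a small, well-insulated box (carried over from my earlier sketch), not a measurement:

```python
# Rough steady-state heater power to hold a pack at ~40 F in a cold ambient.
# The UA value is an assumption (roughly 1 m^2 of 2" XPS), not a measurement.
ua_w_per_k = 0.57      # guessed box heat loss, W/K
t_pack_c = 4.4         # ~40 F target
t_ambient_c = -6.7     # ~20 F night

heater_w = ua_w_per_k * (t_pack_c - t_ambient_c)
print(f"steady-state heater power ~ {heater_w:.1f} W")   # a few watts for a well-insulated box
```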
 
I would not have thought there was a difference. What are the defining features of a Powerwall?

When Will interviewed the main dude at Battleborn he said -20 was the new chemistry.
I think this was the video
A powerwall is a packaged system that includes the battery and all required support circuitry. Systems can include the inverter and chargers. It makes sense that a powerwall would include a heating system to permit charging below freezing.
 
I think you might want to check out that video
 
I found this thread interesting. My REC active BMS has factory default settings to charge down to -10C... at a reduced charging rate I am told.
 
I found this thread interesting. My REC active BMS has factory default settings to charge down to -10C... at a reduced charging rate I am told.

I can't speak for REC, but for other BMSs the charge "rate" is either on or off. There is no throttling of the charge.
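To make the "on or off" point concrete: a BMS that only switches a charge FET can allow or block charging, nothing in between. A toy sketch with a placeholder cutoff (not REC's or any other vendor's actual firmware logic):

```python
# Illustration of the point above: a BMS that only controls a charge FET can
# merely allow or block charging; it cannot taper the current itself.
LOW_TEMP_CHARGE_CUTOFF_C = 0.0   # placeholder cutoff, not a vendor setting

def charge_fet_enabled(cell_temp_c: float) -> bool:
    """Binary decision: charging is either permitted or blocked."""
    return cell_temp_c > LOW_TEMP_CHARGE_CUTOFF_C

for t in (-12, -5, 1, 10):
    print(t, "C ->", "charge allowed" if charge_fet_enabled(t) else "charge blocked")

# Any actual *reduction* of charge current has to come from the charger or
# inverter, e.g. via a communicating BMS requesting a lower current limit.
```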
 