diy solar

On Keeping LFP Warm

Horsefly · Solar Wizard
Joined: Dec 12, 2020 · Messages: 1,820 · Location: Denver, mostly
I’ve looked / searched a bit and didn’t find anyone addressing this specifically, so I thought I would put down some thoughts and results of my computations and hopefully others will correct and/or contribute to a better picture.

We all know that LFP batteries need to be kept above 32°F/0°C for charging, and in fact they behave much better if kept closer to 50°F or more. There are numerous great threads here (and YouTube videos) of people building insulated boxes with heating pads. It got me thinking: Is there a way to know - or at least estimate - the specifics of what you NEED in a given installation?

That’s where my quest began a few days ago. I’m a retired engineer, so I started down the rabbit hole. The below is my initial take on calculating some requirements.

I think there are really two aspects to keeping cells warm in a battery box: Heating up the LFP pack, and the heat loss to the outside of the box.

Item 1: Heating an LFP pack
I realize there are lots of secondary items to consider here. For example: How much air / dead-space there is in the box around the pack. I’m happy to hear some thoughts about how to incorporate such items into a solution, but for now I’m going to concentrate on the heating of the pack.

After lots of Google searches, I found a source that said an LFP battery has a specific heat capacity of 1306 Joules / (kilogram x °C). (I can provide the source if someone really wants it, but it was a doctoral dissertation from the University of Waterloo.) This is clearly a rough estimate, since any given battery is made up of varying proportions of several different materials, each with its own specific heat capacity, and the specific heat capacity of the cell depends on those proportions. The real number will differ for larger cells vs. smaller cells, and for aluminum cases vs. plastic. However, for this rough estimate, let’s use 1306. (Again - please jump in with suggestions.)

Now we can take the mass of the cells. For example, the aluminum-cased 280Ah cells weigh about 5.6kg each. A pack of 4 (for a 12.8V battery) would then weigh 4 x 5.6 = 22.4kg. Now if we multiply this weight by the specific heat capacity of 1306, we get:

22.4 x 1306 = 29,254.4 Joules per °C

A Watt is one Joule per second, so a Joule is one “Watt second”. To get to Watt-hours, we need to divide this number by 60 x 60 = 3600 seconds per hour:

29,254.4 / 3600 = 8.126 Wh per °C

So, ignoring all the inefficiencies that may be present, this means:

To heat one 4x280Ah pack one degree Celsius requires 8.126 watt-hours of energy

With no inefficiencies, you can raise the temperature of one pack from 32°F (0°C) to 50°F (10°C) in one hour if you use about 80W of heaters. You can use 40W of heaters and do the same in two hours. Because we know our box, heating pads, metal sheet etc. are not the perfect theoretical world, we could bump the number up by say 25% (?), so we could use 50W of heating pads, using around 4 amps continuously from our pack, until the thermostat turned them off in about two hours.
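For anyone who wants to play with the numbers, here’s a minimal Python sketch of the Item 1 arithmetic (the 1306 J/(kg·°C) specific heat and 5.6 kg cell mass are the rough estimates above; the function names are just mine):

```python
# Energy to warm an LFP pack, ignoring all losses.
SPECIFIC_HEAT = 1306.0  # J / (kg * degC) - rough literature value
CELL_MASS_KG = 5.6      # aluminum-cased 280Ah cell, approximate

def pack_heat_wh(num_cells, delta_c):
    """Watt-hours needed to raise a pack of num_cells cells by delta_c degC."""
    joules = num_cells * CELL_MASS_KG * SPECIFIC_HEAT * delta_c
    return joules / 3600.0  # 3600 joules per watt-hour

def heater_watts(num_cells, delta_c, hours, fudge=1.25):
    """Heater size to do that warm-up in `hours`, with a 25% fudge factor."""
    return pack_heat_wh(num_cells, delta_c) * fudge / hours

print(pack_heat_wh(4, 10))     # ~81.3 Wh for a 4 x 280Ah pack, 0C -> 10C
print(heater_watts(4, 10, 2))  # ~50.8 W of heating pads for a 2-hour warm-up
```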

Thoughts?

Item 2: Heat loss from a battery box
We also know that the warm insides of our battery box will over time cool down to approach the temperature outside the box. How fast this happens depends on temperatures inside and outside the box, the material we use to build and insulate the box and how much it resists the transfer of heat, and the exposed area of this material. The degree something resists the transfer of heat is the “R-value” (or RSI-value in metric) that is used to rate insulation. All material has an R-value.

The imperial / American R-value has the units of

Square feet (area) x °F x hours / BTU

I find this much less handy to use than the metric RSI value, which has the units of

Square meters x K / Watts (note that a temperature difference of 1 K is the same as 1 °C)

Either one is really expressing how much a material resists the transfer of power (watts) between two different temps (°C) for a given area (square meters). You can convert an R value to an RSI value by dividing the R value by 5.678.

As an example, I’ve been planning out a battery box which is built from ¾” plywood on the outside, with the walls, top, and bottom lined with 2” XPS insulation. 2” XPS insulation has an R value of 10. Plywood has an R-value of about 1.25 per inch of thickness, so my ¾” plywood would have an R value of 0.75 x 1.25 = 0.9375. Combining the two layers of my box, the R value would be 10.9375. The RSI value would then be 10.9375 / 5.678 = 1.92629.

Now I need to add up the surface areas on the inside of the box, including the top, bottom, and all four sides (again, I’m ignoring some subtleties of the bottom losing less heat than the top, etc). Let’s say my box has inside dimensions of 15” left-to-right x 12” front-to-back x 12” top-to-bottom. The total surface area would then be (2 x 12 x 12) (left and right sides) + (2 x 12 x 15) (front and back) + (2 x 12 x 15) (top and bottom) for a total of 1,008 - call it about 1,000 - square inches.

1 inch is 0.0254 meters, so 1 sq inch = 0.0254 x 0.0254 = 0.00064516 square meters

Thus my 1000 sq inches of inside surface area is 0.64516 square meters.

Now let's assume that we want the inside of our box at 50°F (10°C), and outside the box it will be 32°F (0°C). The difference between the inside and the outside is then 10°C. Then it’s just algebra:

RSI = sq meters x °C / Watts
Watts = sq meters x °C / RSI
Watts = 0.64516 x 10 / 1.92629 = 3.3492

This means that a constant “leak” of 3.3492W of power is escaping the box. If the temperature difference is larger between the inside and outside of the box, the watts go up.
If I chose to use the cheaper, thinner 1” XPS (with an R value of 5 instead of 10), the watts escaping the box almost doubles.
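And a similar sketch of the Item 2 box-loss math, using the example dimensions and R-values from above (function names are mine):

```python
# Steady-state heat leak through an insulated box.
IN_TO_M = 0.0254  # meters per inch

def rsi(r_imperial):
    """Convert an imperial R-value to a metric RSI value."""
    return r_imperial / 5.678

def box_area_m2(w_in, d_in, h_in):
    """Total inside surface area of a rectangular box (inches in, m^2 out)."""
    area_in2 = 2 * (w_in * d_in + w_in * h_in + d_in * h_in)
    return area_in2 * IN_TO_M ** 2

def loss_watts(area_m2, delta_c, rsi_value):
    """Continuous watts escaping the box for a given temperature difference."""
    return area_m2 * delta_c / rsi_value

# 15" x 12" x 12" box, R10 XPS + R0.9375 plywood, 10C inside-outside delta:
area = box_area_m2(15, 12, 12)              # ~0.650 m^2 (1,008 sq in exactly)
watts = loss_watts(area, 10, rsi(10.9375))  # ~3.4 W (rounding area down to 1,000 sq in gives ~3.35 W)
```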

Other Considerations?
My intent here was to define a way to at least estimate things. We don’t want to use heating pads that are too small, or the batteries will never really get / stay warm. On the other hand we don’t want them too big either. We should insulate our battery box, but how much?

I’ve admitted that what I’ve provided is theoretical, and in the real world there are other losses and inefficiencies. At the very least, what one builds should use these formulas only for a baseline, to which you add a fudge factor of +20% or +50% or ?

Hopefully I’ve got the formulas right, and hopefully they will help. What else can you guys think of to consider?
 
I did a similar set of calculations about a year ago. I was planning on a box with R30 all around (6" of insulating foam). It was surprising how little energy was needed to keep the batteries at an even temperature, even for a 30°C differential. I think I came up with something like 8Wh/day to keep the batteries above freezing with the outside at -30°C.
 
I'm sure you have already considered this, but it needs to be said out loud by somebody, so I'll be the one to say it. Usually when it's that cold you have the heater running. Something as simple as a small duct into the battery area would keep the batteries warm, no? I've also seen times where people locate the batteries near the water heater, or the inverter. That way there's always a little bit of heat in the immediate vicinity. Just a thought
 
Usually when it's that cold you have the heater running. Something as simple as a small duct into the battery area would keep the batteries warm, no?
In the case I was dealing with...No. I was dealing with a situation where I needed to store the cells in a mountain location over the winter. (No charging or discharging whatsoever)

The normal power source was Hydro and is completely shut down over the winter. I ended up with a super well insulated battery box and one small solar panel.

 
BTW: I have considered *many* ways to deal with keeping things warmer in cold climate. I even considered earth-tubes.

https://www.greenbuildingadvisor.com/article/all-about-earth-tubes

If I had a back-hoe at the mountain site I might have even tried it. The layout of the site would work well.

I thought about biomass heating.... but after I sobered up decided that was too far out to even consider. :)
 
What got me started on this is a similar situation. Our off-grid cabin is at 9,000 ft elevation in the mountains of Colorado, with 30 miles or so of dirt / unmaintained roads. We pretty much shut down the place in mid-November and don't get back up there until late-April or mid-May. In between our batteries (which currently are lead-acid, but aiming for LFP) are sitting in the basement of an unheated structure.

The current setup has four big lead-acid AGM batteries. The charge controller stays on all winter, and the panels are seldom covered with snow for more than a couple of days. Lead-acid batteries do fine down to something like -70°F, so it’s all been fine. If not for the inevitable decline of the lead-acid batteries, we could probably stay with it for many years.

I've got a temperature logger capturing and logging the temps on the battery shelf every hour all winter long. I predict there will be a few days that don't get above 0°F. In the end I may decide it is just too risky to go to LFP.
 
Gonna cost more, but maybe consider LTO .... I think @Craig is using his in a cabin?
 
Thoughtful analysis.

For some real-world data, I wanted to test the capacity of my cells charged under spec conditions, meaning 25°C / 77°F.

To do this I rigged up a simple temperature-controlled chamber consisting of:

-1” of packing foam to insulate from the floor (the foam the cells were shipped in).

-a 40W brewer’s heating pad (for warming carboys) placed on the foam

-a terrarium / vivarium thermostat connected to the heat pad with a temperature probe that could be suspended in the air of the chamber

-a 280Ah EVE cell placed on the heating pad and connected to charge cables

-an inverted ~1/4” cardboard box to cover everything and form a chamber

Ambient temps varied between 60 and 65F and as I said, I had the thermostat set for 77F.

After placing a fresh cell into the chamber, the thermostat would turn on and it would take a good hour before it started cycling.

Once temps stabilized at 77F, the duty cycle needed for the 40W heating pad to compensate for the heat loss through the thin cardboard ‘insulation’ to the surrounding air 12-17 degrees cooler was surprisingly low (~10%).

This was a temperature difference of only 12-17F (6.7-9.4C) but I was amazed at how effectively this simple 1/4” of cardboard could cut down heat loss to the ambient environment allowing a 40W heating pad to maintain temps with ease...
 
I've got a temperature logger capturing and logging the temps on the battery shelf every hour all winter long. I predict there will be a few days that don't get above 0°F. In the end I may decide it is just too risky to go to LFP.
Do you also have a logger for the outside temp? It would be interesting to see the difference between the outside and the basement. Depending on the construction and depth of the basement, you might find the basement is surprisingly warmer than the outside (but still cold)
 
This is great data! Thanks @fafrd ! I'd like to try and plug your data into my calculations and see what I get. That could establish how much of a fudge factor is needed to get from theoretical to real-world. Let me see if I understand correctly:
  • The bottom of the chamber was the 1" packing foam. I need to figure out what the R value is for that.
  • All four sides and top of the chamber just cardboard. I'm sure I can figure out the R value for that.
  • Do you remember the dimensions of the box / chamber?
  • We have the temp difference between outside the chamber and the desired temp inside.
  • It took an hour to initially heat the chamber with the single cell.
  • The thermostat probably has a start and stop temp. That is, when you set it to 77°F, it maybe turns on at 72°F and back off at 77°F. Do you have any specs that might indicate what the start-to-stop delta is for the thermostat?
  • I can go with the duty cycle, but do you recall roughly how long a "cycle" was? I.e., from the time the thermostat turned on to when it turned on again?
Obviously the original heating up was about 40Wh, as it was 40W for one hour. One cell would weigh 5.6kg, and multiplied by the 1306 specific heat capacity would give 7,313.6 J/°C, which is about 2.03Wh/°C. If the original heating was the full 9.4°C, that only works out to be about 19.1Wh, or about half of what you used. During that first hour, you'd also have an hour's worth of loss, and I'll have that when I figure out the R value of your box. The outside air wasn't much cooler than the inside, so I would think that even if cardboard has a really low R value (and I'm sure it does) the amount of loss through the walls of the box won't be much.
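Just to show my work, here's that single-cell arithmetic in Python (same assumed 5.6 kg mass and 1306 specific heat as before):

```python
# One 5.6 kg cell warmed 9.4 C, using the 1306 J/(kg*degC) estimate.
joules_per_c = 5.6 * 1306.0       # 7,313.6 J per degC for one cell
wh_per_c = joules_per_c / 3600.0  # ~2.03 Wh per degC
wh_total = wh_per_c * 9.4         # ~19.1 Wh - roughly half the ~40 Wh applied
print(wh_total)
```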

I think this probably means that my theoretical calculations are pretty far off from your real world experience. It may be that the 1306 specific heat capacity is wrong, or that I'm not accounting for something.

I'll think on it some more. Thank you again for the data!
 
Do you also have a logger for the outside temp? It would be interesting to see the difference between the outside and the basement. Depending on the construction and depth of the basement, you might find the basement is surprisingly warmer than the outside (but still cold)
That's a debate we've been having amongst the owners of the cabin (my brother, sister, and I). The floor of the basement is about 3 ft below grade. My sister and her husband think that the "solar room" where the batteries are doesn't get much below freezing, maybe to 20°F or so. We do have a little battery operated digital temp display in the cabin with an outside probe, and the display does keep track of the lowest and highest temp outside. My brother-in-law reset it before closing up the cabin, so I should at least have a guess at the lowest temp it got to outside over the winter.
 
This is great data! Thanks @fafrd ! I'd like to try and plug your data into my calculations and see what I get. That could establish how much of a fudge factor is needed to get from theoretical to real-world. Let me see if I understand correctly:
  • The bottom of the chamber was the 1" packing foam. I need to figure out what the R value is for that.
And I forgot, but I also lined the top of the box with another layer of that foam. Have a piece in my hand and it is actually 15/16” thick...
  • All four sides and top of the chamber just cardboard. I'm sure I can figure out the R value for that.
  • Do you remember the dimensions of the box / chamber?
This was one of the boxes 4 280Ah LiFePO4 cells shipped in.
W 9.5”
L 15.5”
H ~10” (don’t have a box with me, so estimating)

And those were made from pretty heavy shipping cardboard, so at least 1/4” thick and possibly a bit more...
  • We have the temp difference between outside the chamber and the desired temp inside.
  • It took an hour to initially heat the chamber with the single cell.
  • The thermostat probably has a start and stop temp. That is, when you set it to 77°F, it maybe turns on at 72°F and back off at 77°F. Do you have any specs that might indicate what the start-to-stop delta is for the thermostat?
I’m guessing that thermostat had a sensitivity of ~2F or 1C.
  • I can go with the duty cycle, but do you recall roughly how long a "cycle" was? I.e., from the time the thermostat turned on to when it turned on again?
I’d guess it might have cycled every couple minutes or so, but take that with a grain of salt - I really wasn’t paying attention to that.
Obviously the original heating up was about 40Wh, as it was 40W for one hour. One cell would weigh 5.6kg, and multiplied by the 1306 specific heat capacity would give 7,313.6 J/°C, which is about 2.03Wh/°C. If the original heating was the full 9.4°C, that only works out to be about 19.1Wh, or about half of what you used.
It may well have been only ~30 minutes to heat up to temp. I set it all up and went to eat dinner. When I came back everything had settled...

During that first hour, you'd also have an hour's worth of loss, and I'll have that when I figure out the R value of your box. The outside air wasn't much cooler than the inside, so I would think that even if cardboard has a really low R value (and I'm sure it does) the amount of loss through the walls of the box won't be much.

I think this probably means that my theoretical calculations are pretty far off from your real world experience. It may be that the 1306 specific heat capacity is wrong, or that I'm not accounting for something.

I'll think on it some more. Thank you again for the data!
The energy needed to heat the cell is really not all that material. It only matters once when you’ve got the transient of heating up cold cells to desired temp.

Once at temp, all that matters is Watts of heat out through the chamber walls and Watts of heat in through the (duty-cycled) heating pad. I estimated ~10% duty cycle with my 40W pad which translates to 4W of heat being added on a continuous basis. Take that with a grain of salt too. Could have been 20%, could have been 5%.
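The duty-cycle estimate converts to continuous watts trivially; here's the range of guesses above spelled out (40W pad, 5-20% duty cycle):

```python
PAD_WATTS = 40.0  # rated power of the heating pad

# Average continuous heat input = pad power x fraction of time it's on.
for duty in (0.05, 0.10, 0.20):
    avg_w = PAD_WATTS * duty
    print(f"{duty:.0%} duty cycle -> {avg_w:.0f} W continuous heat input")
```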

At this stage of my testing, I was only concerned with whether the 40W heating pad could ‘keep up with’ the heat lost through the cardboard (which it could, with ease).

I’m planning to house my cells in an insulated box likely with a heating pad as well, so I’m mulling some of the same issues you are raising here.

The one important factor you are overlooking is the heat being generated by the cells as they charge and discharge.

With Internal Resistance of 0.25 mOhm, busbar resistance of 0.05 mOhm, and (2) contact resistances on that same order of magnitude (I’m guessing), there is close to 1 mOhm of resistance per cell or ~8 mOhm of resistance for the entire pack.

Charging or discharging at 100A will generate ~80W of heat through the resistance of the battery (I^2R).

When planning your insulation, you do not want it to be too ‘perfect’. If you had perfect insulation (no heat loss), that 80W of heat during charge and discharge would raise battery temperatures too high.
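The I^2R numbers above work out like this (the 8 mOhm pack resistance is the estimate from two paragraphs up; the currents are just illustrative):

```python
PACK_RESISTANCE_OHMS = 0.008   # ~1 mOhm/cell x 8 cells, per the estimate above

def i2r_heat_watts(current_a, resistance_ohms=PACK_RESISTANCE_OHMS):
    """Resistive heat generated in the pack at a given current (P = I^2 * R)."""
    return current_a ** 2 * resistance_ohms

for amps in (20, 30, 100):
    print(f"{amps:>3} A -> {i2r_heat_watts(amps):.1f} W of heat")
```

That gives 3.2W at 20A, 7.2W at 30A, and 80W at 100A - the "3-7W" and "80W" figures quoted in this thread.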

Of course, you could control a fan with another thermostat to bring down temps if they get too high, but my point is that for whatever your average charge-discharge current is, there will be a certain number of Watt hours generated by your battery on a daily basis, and so losing about that same amount of Watt hours through your insulation is the best way to balance the needs of staying warm enough with the needs of staying cool enough (especially since better insulation generally involves more $$$).

100A is a high current and in my case, average charge/discharge current will be more like 20-30A, but still, that represents 3-7W of heat being generated over much of the day/night that needs to be accounted for.

If my rough estimate that 10% duty cycle with a 40W heating pad was able to compensate for heat lost through cardboard with a 10-15F temperature difference, that means with any real insulation at all, I’ll generally need to be more worried about cooling rather than heating my cells...

My case is a bit unusual because I’m in a warm climate that will be 65F most of the year and will dip into the high 40s / low 50s for only 2-3 months over winter. So planning my insulated box to maintain temps of 10-15F above ambient based on my average daily current charge/discharge cycle is the best way to have a balanced solution that rarely needs either active heating or active cooling...
 
All good points @fafrd. I had considered the heat generated by the cells themselves, but I decided it would be negligible for many of us, based only on my experience. Here's my thinking:

The energy budget that I calculated for our cabin has a max energy consumption of around 3000Wh per day, although it has a maximum instantaneous power draw of 3300W. That may seem a bit strange, but I went over it many times before I built the system in 2017.

Like most off-grid homes, most of my loads are pretty small: LED lights, phone chargers, laptops. We have two large loads: a 0.5hp well pump and an 800W microwave. The well pump runs when the pressure tank requires it to. For two of us visiting the cabin it generally runs between 5 and 8 times per day. Each run pulls about 55A from my 24V battery for about 2.5 minutes. The 800W microwave (which draws a bit over 1000W of power) pulls about 44A from my battery. It often doesn't get used at all, but some days it may get used for 15-20 minutes. The probability of both the well pump and microwave running at the same time has got to be extremely small, but I have to allow for it, and so plan on a 120A BMS. Also, we've found that somewhere between 25% and 50% of the pump / microwave runs are when there is solar, reducing the pull from the batteries.
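As a quick sanity check, the two big loads above tally up like this (currents, run times, and run counts are the ranges just quoted; nothing else is assumed):

```python
BATTERY_V = 24  # nominal battery voltage

# Well pump: ~55 A for ~2.5 minutes per run, 5-8 runs per day
pump_wh_per_run = 55 * BATTERY_V * (2.5 / 60)
pump_wh_day = (pump_wh_per_run * 5, pump_wh_per_run * 8)

# Microwave: ~44 A draw, up to 15-20 minutes on a heavy day
micro_wh_day_max = 44 * BATTERY_V * (20 / 60)

print(f"Pump: ~{pump_wh_per_run:.0f} Wh/run, {pump_wh_day[0]:.0f}-{pump_wh_day[1]:.0f} Wh/day")
print(f"Microwave: 0-{micro_wh_day_max:.0f} Wh on a heavy day")
```

So the pump is roughly 55 Wh per run (275-440 Wh/day) and the microwave tops out around 350 Wh/day - both comfortably inside a ~3000 Wh/day budget once the small loads are added.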

So you see, we are unlikely to ever have more than around 60A pulled from the batteries at any one time, and then it will only be for a few minutes each time. My charge controller is limited to 60A, and I've only seen it produce that much (for my current lead-acid battery bank) a couple of times, while the batteries were trying to charge and the well pump was running. For that reason I don't think the LFP pack will ever generate much heat of its own.

I know some folks have other medium-sized loads, such as a fridge or washing machine. These certainly would increase the daily consumption, but I would still say the pull from the pack will almost never be 100A, and if it ever is, it will be for a short amount of time.

Off-grid solar is a very different animal than EVs, where large charge and discharge currents can go on for quite a while. I don't want to dive into a religious war, but that's why I'm not too worried about compressing LFP cells. I think the repeated expand / contract of the cells is an issue for the EVs, but not for most solar off-grid. Certainly not for mine.
 
Interesting topic, I've been kind of following along. Something occurred to me: I have a bunch of 100Ah LiFePO4 batteries with internal temperature sensors and an empty chest freezer. I can leave one in there overnight and get it down to, say, -20°F, and then put loads of different sizes on it and record the temperature over time. That would be pretty close to a real-world test. It's been on my to-do list for a while, but I haven't gotten around to it. Maybe this is the time. If you have any ideas of what size load I could try, I will start there if you want.
 
-20°F may be a little low, as I think most cells are rated for discharge down to -30°C, which is -22°F. So at -20°F you'd be pretty close to the limit. Other than that though, I think it would be a great idea to do such a test. If you have the patience, it would be great if you could do some different loads, like maybe 10A for a long period, then 50A, then 100A. That would be 0.1C, 0.5C, and 1C for your 100Ah cells. Besides checking the temperature of the cell, if you had some precision calipers you could see if there is a measurable change in the size/shape of the cell. That may be a bridge too far though.

Let us know how it goes!
 
I’ve started to look at this a little bit. Now, I’m in the Bat Area so ‘cold’ for us is anything in the 40s and a drop to freezing is a once-a-decade event, so my design criteria is different than most of you.

My cellar temp is a fairly constant 65F year-round. In case we have a dip into the 40s, I want my battery compartment to remain close to 65F, and assuming I confirm capacity loss with my cells at cooler temps, I may want to keep the pack close to 24C / 77F throughout the warm part of the year.

So I figure an 18F / 10C temperature differential is a reasonable specification for what I will design my insulated chamber for.

R value of soft wood is 1.41 per inch of thickness, or 1.06 for 3/4” pine.

I’m estimating the surface area for my housing holding an 8S 280Ah battery will be ~0.76m^2, so watts of heat loss with a 10C temperature differential will be 10C x 0.76m^2 / 1.06 = 7.17W, or 172Wh/day.

I’m figuring my cells and busbars represent about 4 mOhms of resistance so discharging 280Ah at an average of 20A translates to 22.4Wh and charging 280Ah at an average of 30A translates to 33.6Wh, so I can probably figure on a total of 56Wh per day being contributed by the charge + discharge cycle, meaning the heating pad should only need to deliver ~116Wh over 24 hours.
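Those daily Wh figures check out; here's the arithmetic spelled out (the 4 mOhm pack resistance and the 20A/30A average currents are the assumptions from the paragraph above):

```python
PACK_R = 0.004       # ohms: cells + busbars, per the estimate above
CAPACITY_AH = 280    # full daily charge or discharge

def cycle_heat_wh(avg_current_a):
    """Wh of resistive heat to move the full capacity at a given average current."""
    hours = CAPACITY_AH / avg_current_a   # time to move the full 280 Ah
    watts = avg_current_a ** 2 * PACK_R   # I^2 * R heating during that time
    return watts * hours                  # simplifies to CAPACITY_AH * I * R

discharge = cycle_heat_wh(20)
charge = cycle_heat_wh(30)
print(f"Discharge: {discharge:.1f} Wh, charge: {charge:.1f} Wh, "
      f"total: {discharge + charge:.1f} Wh/day")
```

Which reproduces the 22.4 Wh + 33.6 Wh = 56 Wh/day figure. Note the simplification: the Wh scale linearly with the average current, since the time to move a fixed Ah is inversely proportional to it.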

The 1” foam insulation board has an R Value of 6, so using that instead of wood would cut the power needed to maintain a 10C temp differential down to 40.2W and the chamber could very possibly increase in temperature just from charging and discharging the battery.

But if we assume foam insulation limiting heat loss with a 10C temp differential to 40.2W and 56Wh of heat being contributed by the daily charge/discharge cycle, this would just mean that the chamber heats up to a 14C heat differential at which point heat loss through R6 insulation will have increased to 56W...

At warmest ambient temps of 65F (18.3C), a 14C heat differential would translate to 32.3C or 90.2F (not a disaster).

And there are obviously further ways to ‘tune’ the insulation (by using wood instead of foam for the top, for example) so that daily heat loss balances daily charge/discharge heat addition at a temperature differential closer to 10C than 14C.

So I’m thinking I can probably set this up so that the thermostat & heat pad are only used during cold snaps over winter, and that most of the year I can rely on daily heat injections from the charge & discharge cycles to maintain my target of a stable heat differential of ~10C...
 
Freudian slip? My daughter lives/works in Santa Rosa. I'm going to have to start telling her she lives in/near the "bat area". :ROFLMAO:
I think your heat loss calculation is off a bit. I'd need to double check this, but on first blush: You are using the American "R" value which doesn't have units that get you to Watt hours. You want to use the metric "RSI" value, which is simply R / 5.678. So your RSI value would be 1.06/5.678 = 0.1866854 °C*m²/Watt. Dividing your 10°C differential by this RSI value gives you 10/0.1866854 = 53.566 W/m². Multiply that by your area of 0.76m², and you get 40.7W, or 977Wh/day. That seems high to me, so maybe my calculation is wrong too!
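That R-to-RSI conversion can be written out as a short sketch (same numbers as the posts above; the formula is Q = ΔT × A / RSI):

```python
def heat_loss_watts(delta_t_c, area_m2, r_us):
    """Steady-state heat loss through insulation, converting US R-value to metric RSI."""
    rsi = r_us / 5.678                 # RSI in degC*m^2/W
    return delta_t_c * area_m2 / rsi   # Q = deltaT * A / RSI

wood = heat_loss_watts(10, 0.76, 1.06)  # 3/4" pine
foam = heat_loss_watts(10, 0.76, 6.0)   # 1" foam board
print(f"Pine box: {wood:.1f} W ({wood * 24:.0f} Wh/day)")
print(f"Foam box: {foam:.1f} W ({foam * 24:.0f} Wh/day)")
```

That reproduces the 40.7W (977 Wh/day) for 3/4" pine and ~7.2W (173 Wh/day) for R6 foam.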

You are right that increasing the R value to 6 by using foam will decrease the Wh by the ratio 1.06/6, so 40.7W would go down to 7.19W, or 172Wh/day.

You say "cellar temperature" is constant 65°F, but from all the rest I assume you are NOT putting your batteries below grade (in the cellar). If you are, you have no calculating to do!

Like I said, I haven't been counting the heat generated by the cells, and now you are pointing out heat generated by the bus bars. I'll have to noodle on that a bit. You are using 4 mOhm, and I would guess good bus bars are way less than that. I guess I need to go do some more research!
 