Just FYI, the spreadsheet predicts an internal cell temperature of 45 °C at a sustained 1 CA discharge rate. This is why I believe thick-electrode cells should not be used with a sustained discharge or charge current above 0.5 CA. Think about this when cells are tightly packed together like sardines in a can. The irony is that cell compression provides little benefit until the current is high enough to cause internal heating, yet tightly compressing the cells makes it harder for them to dissipate that heat.
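The temperature prediction above scales with the square of the current, which is why halving the C rate helps so much. Here's a minimal back-of-the-envelope sketch of that idea; the internal resistance and thermal resistance values are assumptions I picked so the numbers roughly match the 45 °C figure, not data from the spreadsheet:

```python
# Minimal steady-state estimate of internal cell temperature from I^2*R heating.
# All parameter values are illustrative assumptions, not measured data.

def internal_temp_c(capacity_ah, c_rate, r_internal_ohm=0.00025,
                    r_thermal_c_per_w=1.0, ambient_c=25.0):
    """Steady-state internal temperature for a sustained discharge.

    r_internal_ohm:     assumed DC internal resistance of one cell
    r_thermal_c_per_w:  assumed cell-to-ambient thermal resistance
    """
    current_a = capacity_ah * c_rate
    heat_w = current_a ** 2 * r_internal_ohm   # I^2 * R losses
    return ambient_c + heat_w * r_thermal_c_per_w

# Heating scales with the square of current, so halving the C rate
# cuts the temperature rise above ambient by a factor of four.
print(internal_temp_c(280, 1.0))   # ~44.6 C with these assumed numbers
print(internal_temp_c(280, 0.5))   # ~29.9 C
```

The point of the sketch is just the square law: going from 1 CA to 0.5 CA drops the rise above ambient from about 20 °C to about 5 °C under these assumptions.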
I never thought of that, but it's a very important point. Could it actually be unnecessary to compress cells, then? And would it be a much better idea to design a pack to always stay below 0.5C?
Don't cells expand when fully charged anyway, independently of the C rate at which they are charged? So if the pack is kept below 0.5C, compression is worthwhile, but if you exceed that, degradation will be driven by heat rather than delamination, and it's better to leave the cells uncompressed?
I'm not experienced enough and am just asking questions to try to better understand.
Case example: I'll have 280Ah cells, and my maximum amp draw (from my energy audit) will be 92A, so 0.33C, below the limit.
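That arithmetic can be checked in a couple of lines; the 280 Ah and 92 A figures are from the post above, and the 0.5 C limit is the one suggested earlier in the thread:

```python
capacity_ah = 280   # pack capacity from the post
max_draw_a = 92     # maximum draw from the energy audit

c_rate = max_draw_a / capacity_ah
print(round(c_rate, 2))   # 0.33
print(c_rate < 0.5)       # True: below the suggested sustained limit
```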
Thank you for taking the time to type this out!
I believe you!
The video references "LiPo," but the fundamental thermodynamic phenomenon of tab heating applies to LiFePO4 cells as well, in my estimation.
TL;DW: the video claims that cooling the tabs/terminals of the cell regulates the temperature of the entire cell material more effectively.
They say that cooling the cell wall caused the outer layers of the jelly roll to run cold and less efficiently, while simultaneously not directly cooling the hottest inner parts.
That's very interesting. A liquid-cooled system isn't practical, but a couple of fans blowing on top of all the terminals, activated by temperature, could actually help a little.
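The temperature-activated fan idea boils down to a simple on/off controller with hysteresis so the fans don't cycle rapidly around the threshold. A minimal sketch follows; the thresholds are assumptions, and `read_terminal_temp_c()` / `set_fan()` would be whatever sensor and relay/GPIO interface your pack actually uses (they are hypothetical stand-ins here):

```python
# Sketch of a temperature-activated fan controller with hysteresis.
# Thresholds are assumed values, not recommendations for any specific pack.

FAN_ON_C = 35.0    # assumed turn-on threshold at the hottest terminal
FAN_OFF_C = 30.0   # lower turn-off threshold, to avoid rapid fan cycling

def update_fan(temp_c, fan_running):
    """Return the new fan state given the hottest terminal temperature."""
    if temp_c >= FAN_ON_C:
        return True
    if temp_c <= FAN_OFF_C:
        return False
    return fan_running  # inside the hysteresis band: keep the current state

# Example of the hysteresis behavior: at 32 C the fan stays in whatever
# state it was already in.
print(update_fan(40.0, False))  # fan turns on
print(update_fan(32.0, True))   # fan keeps running
print(update_fan(25.0, True))   # fan turns off
```

The hysteresis band (here 30-35 °C) is the design choice that matters: a single threshold would toggle the fans constantly as the terminals hover around it.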