Best LiFePO4 charge controller settings known to man for maximum service life and minimum battery stress!!! 5,000-10,000+ cycles?

Yes, lots of data and published literature on this topic:


Starting at page 60:

Quick Excerpt pertaining to temp and calendar aging: "Over time, though, cells degrade and lose capacity in accordance with two different aging phenomena: cycling and calendar aging. It is imperative to understand how these degradation phenomena occur as the loss in capacity results in a loss in vehicle range. Through understanding how these phenomena occur, mitigation efforts can be designed to prevent or lessen their effects. This thesis will focus primarily on studying the effects of calendar aging on commercial LiFePO4 cells. Cells are aged at varying temperatures and states of charge (SOC) to determine the extent of capacity fade and degradation. Additional testing methods are then utilized to attempt to determine which aging phenomena are promoting the losses within the cell. Capacity loss in cells stored at high temperature and fully charged conditions resulted in faster degradation rates. Temperature had the most significant role in the degradation of the cell and then the cell’s SOC. Comparing capacity losses between cells stored at the same temperature, but with different SOCs, found that the cells with higher SOC experienced increased rates of degradation in comparison to their fully discharged counterparts. In addition, storage at high SOC and high temperatures promoted such severe losses that the cells in question were unable to recapture capacity that they had lost reversibly. The primary degradation mode for the cells was the loss of cyclable lithium, and was found to occur under all of the storage conditions. Cells stored at much more severe conditions, though, also demonstrated a loss of active material at the anode. The extent of the loss of the active material was largely predicated on whether or not the cell was stored at fully charged or discharged conditions. Storage of lithium-ion batteries at high temperatures has a dramatic effect on the continual usage of the cells after storage conditions have changed. Despite shifting temperatures or states-of-charge to a lower value, the initial storage conditions leads to increased degradation rates throughout the cell life. Thus, the history of storage for the cell must be also be taken into account when considering losses in capacity."
 
I love the information you can glean from these types of tests and data collection, but my actual use of lithium batteries is way outside the testing parameters of what I am reading. For instance, right now I am testing an
experimental heating system on my travel trailer that so far allows me to achieve a 68 degree inside temperature at -31 degrees Fahrenheit outside, and that uses a heat recovery system to keep the holding tanks and water pipes from freezing. So for this winter my batteries have been between 75 and 50 degrees. In storage, however, I could expect my batteries to reach close to -20 Fahrenheit at some point during winter. During regular use the highest continuous discharge I can reach is about 0.7C, and I have a theoretical max charge rate of 1.2C, but the highest continuous rate I have ever attained is 0.4C. At times the trailer is connected to shore power, and the batteries tend to hang out at around 80 to 90% charge with sporadic discharge and recharge events whenever we lose power. During dry camping the batteries spend most of their time between 20 and 80 percent charge, depending on how cloudy it is. I have not seen any long-term tests that emulate the real-world conditions that RVs are subjected to. Maybe I just don't know where to look. Here are the parameters I expect my batteries to see during their lifetime.
1. Storage temps between -30 and 100 degrees; usage temperatures between 32 and 100 degrees.
2. SOC in constant flux when off grid, and between 70 and 90 percent when connected to shore power.
3. Discharge rates typically less than 0.2C and averaging much less than that. Typical recharge rates of less than 0.2C, with an occasional 0.4C recharge using the generator.
4. During completely off-grid use, the batteries will reach and maintain full state of charge only during sunny days in the summer.
5. The batteries will tend to have multiple mini cycles during the day as power is sporadically used above, and then reduced below, the rapidly changing solar input. (We have a lot of cloudy and overcast days in our area.)

Much of the testing data is helpful, but I am basically using it as a means of taking the care of my batteries from a guess to a slightly educated guess, because no tests that I am aware of emulate the "real world" conditions I presently operate in. If I go even 100 miles south of my location, everything except the max C rates will change. If I want to use AC (more than a few hours) I would need to triple or quadruple my battery bank, and that would reduce my C rates even more. My purpose for this post is to explain why the aging data available is only marginally helpful, and to inquire if anyone is aware of testing that more accurately represents typical RV usage. (Vibration is another factor I'm wondering about.)
 
dear @Bobert,

thank you for the lovely overview of your system's operating parameters.

the delta-temperature performance sounds like the result of good insulation and an efficient structure.

i'll try to learn more about how this was done. thank you for sharing these results, cheers
 
I read through this whole thing...
No one mentioned One Obvious Point which is Real-Time, Real-World observed & observable. I am Deep North so it gets serious. I heat my powerhouse to 10C/50F, and my battery packs, because they are stacked ATM, will run from 9C (bottom shelf) to about 12C on the top shelf. It may only be a 3 degree spread BUT the Charging & Discharging Rates are observably different with Identical Packs! Note that in Summer the "Bottom" pack is also cooler than the uppers and similar things happen, and it gets "interesting" if the battery packs go over 30C/86F.

Because it is close it isn't wild or crazy, BUT it is noticeable: the colder packs will take Less Amps during charge than the warmer ones, and that can be up to a 3A difference per pack. As they take charge & warm up from it, they will increase the Amps taken and will level up at the midpoint typically. At the end of a long charge cycle (>4 hours) all of the packs are usually at the same temp +/- 0.1 Celsius.

ALSO I want to point out that Charge Settings are RELATIVE to the Installation. Every System is different/unique and each has its own Line Losses relative to the build... If someone has 15' long Inverter Cables to Battery & another has 6', the derating will obviously be different. IF I say charge to 28.0V based on my numbers on my SCC, that's for my system, with my losses compensated for, and as such that would actually put only 27.7V to the Battery Terminals!

In other words, we can all say "Use these Voltages" as a Generic Start Point BUT this always has to be tweaked for YOUR SPECIFIC BUILD; it isn't universal. If you have 0.3V Line Loss between SCC & Battery Terminal(s) then you have to correct the Voltage so that it is 28.0 at the Battery Terminal, so the SCC would be set to 28.3 for charging. But when Discharging only (no solar input) the Line drop CHANGES, so the Inverter settings have to be tweaked for cutoffs etc. accordingly. IF you want to cut off batteries at 21.0V but there is a 0.4V drop between Inverter & Battery(s), then when the Inverter sees 21.0V it may really be 21.4V @ Battery, so again you have to correct for it. Line Loss/Voltage-Amperage drop changes between Charging & Discharging as well! Measure it with a good DVOM and see for yourself: SCC Terminals to Batt Terminals, and Inverter/Charger Terminals to Batt Terminals. Do so when charging (without big output) and when discharging (without input).

Lastly.
Please remember every system will have its own "personality" and peculiarities that need to be adjusted & compensated for. Lithium Based Batteries, unlike FLA/AGM/Gel, ARE MilliVolt & MilliAmp sensitive, and the working voltage curve is very flat, so accuracy is required if you want to know the Real State of Affairs with the packs and their SOC. AND JUST TO COMPLICATE IT... AIO's have their quirks unique to them, just like component systems with a separate SCC, Inverter/Charger and all the stuff in between.
 
Thank you Steve

This is the type of tangible information I am looking for. I am personally aware of this BUT did not take into consideration that others may not be... and it needs to be added to the post because of this. There are several people on this thread saying this is wrong or that is wrong, BUT not actually bringing articulable, tangible data; you, my friend, did!!! Thank you, I will be adding it. In regards to your other data: it is good, but I am not sure how to summarize it into tangible data points for addition to the main post.


Going to summarize your post as the following [let me know if I missed anything]:

Every connector in your system, as well as wire length, contributes a certain amount of resistance, which results in voltage loss. In other words, if you set the charge controller to 3.5 volts, the battery may only see 3.2 due to this voltage loss, and you have to adjust the controller to compensate for the loss. You may also have different losses for different components in your system depending on where they are located, for example if your inverter is on a longer length of cable than the charge controller.

HOW TO RESOLVE: use a good quality meter. Measure at the terminals of the battery pack, measure at the terminals of the controller or inverter, take the difference, and add this difference to your charge controller or inverter settings. [When testing for the charge controller you want to be under charge; for the inverter you want to be under discharge.] Measure and test again to ensure this was sufficient compensation.
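
A minimal sketch of that compensation arithmetic in Python, using the example numbers from Steve's post above (0.3 V charge drop, 0.4 V discharge drop); these are illustrations, not measurements from any particular system:

# Offset compensation for line loss, as summarized above (example numbers).
# Measure both points with the same meter while current is actually flowing.

def charge_setting(target_at_battery, v_at_charger, v_at_battery):
    """Charger setpoint needed for the battery terminals to see the target."""
    drop = v_at_charger - v_at_battery      # loss between charger and battery
    return target_at_battery + drop         # charger must aim higher

def cutoff_setting(target_at_battery, v_at_battery, v_at_inverter):
    """Inverter cutoff needed to stop at the target battery voltage."""
    drop = v_at_battery - v_at_inverter     # during discharge the inverter reads low
    return target_at_battery - drop         # inverter must trigger lower

print(charge_setting(28.0, 28.0, 27.7))     # 0.3 V drop while charging -> 28.3
print(cutoff_setting(21.0, 21.4, 21.0))     # 0.4 V drop while discharging -> 20.6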
 
The Voltage Drop is an Amperage loss as well, and that is harder to measure, but with a Very Good SmartBMS with active monitoring that you can observe, it's not that hard.

Always check your components (SCC, Inverter/Charger) under BOTH Conditions, Charge & Discharge. Also when under "average" Load and when Idle (static normal load).

Every single component affects Voltage & Amperage, and those little 0.05 & 0.1 Volts add up FAST! People would be surprised (shocked even) at how much loss there is just going through a Fuse, and it is different for each type, IE MRBF versus T-Fuse for example; even different breakers at differing Amp ratings have a varying drop. The drop is different between a 100A and a 200A fuse, even for the same brand & type.

This is why I have a link in my signature on System Calibration but no one looks...
Anyways, I can't add more to this thread so back to lurk mode.
 
temperature dependence of internal resistance resulting in charge imbalance: thank you for bringing this issue up, interesting.

even if it is just a few degrees C, i'll try to keep this in mind.

one of the papers earlier in the thread shows internal resistance increasing by about 10% from room temperature to freezing.

thank you for sharing!
 
good point!

when setting MPPT parameters, check the raw battery bank voltage (excluding the BMS; just lowest negative terminal to top positive terminal) with a digital multimeter.

even with a Victron MPPT, the battery voltage shown will be slightly different for me.

scale factor = real value (multimeter at the battery) / measured value (what the controller shows)

divide your desired setting by the scale factor to get the value to enter (while charging, the controller reads high, so the corrected setting comes out higher)

current changes the drop, so doing this while charging at an appreciable rate is best

admittedly, i do not always do this calibration step, but without it, even entering the "optimal" parameters will be off by a bit.
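
a worked version of that scale-factor correction, with made-up readings (not from my system):

# Scale-factor calibration as described above (made-up readings).
# Take both readings while charging at an appreciable current.
real_v     = 27.7            # multimeter at the battery terminals
measured_v = 28.0            # voltage the MPPT itself reports

scale = real_v / measured_v  # ~0.989: the controller reads high

desired = 28.0               # what the battery should actually see
setting = desired / scale    # ~28.3: the value to enter in the MPPT
print(round(setting, 2))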

edit: clarify that current should be flowing when correcting for drop
 
Very good point raised by @Steve_S
Every System is different/unique and each has its own Line Losses relative to the build...

This is the cause of so many questions I receive, like "why does the charger go to float before the battery is full?" In addition, you must take into account that this voltage drop (loss) is not constant (it is proportional to the current)... and that makes it a nightmare to set all the chargers and regulators.

The original question posted by @Go2Guy is a very important one for balancing energy availability and battery life. I agree with most people on this thread who said it cannot be done with a voltage-based regulator.

Six years ago I asked myself the same question and I came up with those considerations:
  • it has to be managed as close to the battery as possible (the BMS is the closest to the battery - but these management functions must be separated from the safety/protection functions of the BMS, and cannot interfere with them)
  • accurate estimation of state of charge (< 3% error after 30 days) -> need for an accurate current measurement, most importantly when current is low, which is most of the time (a current measurement error of 0.1 A leads to a 72 Ah SOC error after 30 days: 0.1 A × 24 h × 30 days = 72 Ah)
  • many outputs to control different equipment (charger, inverter, contactors, boiler, cell heater, fan...), each with different control mechanisms (plus CAN bus)
  • ability to activate each output based on multiple criteria (low/high cell voltage, differential in cell voltage, low/high temperature, differential in cell temperature, SOC), with a user-defined reset value (for example, turn on the cell heater when the lowest cell temperature is below 5°C and stop it when it reaches 9°C; turn on the water boiler when SOC > 70% and off below 50%; a sketch of this on/off logic follows below)
  • an independent charge cycle management function to set the range of SOC to stop/start the chargers, and to make sure that from time to time it lets the chargers do a full charge and resets the SOC to 100%
... and that makes for a centralized Energy Management System... but the next question is "how do you know it will work as expected?" when a bad connection or a human error in the settings can make the whole system useless or even dangerous?
You need a way to simulate each activation/deactivation criterion and make sure it produces the expected outcome (a bit like the trip button on a circuit breaker, or any industrial system where you need to validate regularly that the safety features are operational).
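
As a minimal sketch of the activate/reset logic from the list above (the class and state names are illustrative, not from any actual EMS; the thresholds are the examples given above):

# Hysteresis-style output control: activate at one threshold, reset at another.
class Output:
    def __init__(self, on_when, off_when):
        self.on_when = on_when        # predicate that activates the output
        self.off_when = off_when     # predicate that resets it
        self.active = False

    def update(self, state):
        if not self.active and self.on_when(state):
            self.active = True
        elif self.active and self.off_when(state):
            self.active = False
        return self.active

# Cell heater: on below 5°C, off once the coldest cell reaches 9°C.
heater = Output(on_when=lambda s: s["min_cell_temp"] < 5.0,
                off_when=lambda s: s["min_cell_temp"] >= 9.0)

# Water boiler as a dump load: on above 70% SOC, off below 50%.
boiler = Output(on_when=lambda s: s["soc"] > 70.0,
                off_when=lambda s: s["soc"] < 50.0)

print(heater.update({"min_cell_temp": 3.2, "soc": 80.0}))   # True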

The last challenge I have not yet found a solution for is the discharge floor. Let's say you set the chargers to restart at 15% SOC and that happens at the end of the day... you probably will not have enough energy to last the night. Same if the next day is cloudy. So you need to take some margin to account for energy usage during the night and for times of low energy production.
I am thinking about the need to set different charge profiles (min / max SOC) and easily change from one to the other based on the situation. Something like:
  • summer full load, when you know your battery will be full by early afternoon
  • storage, when the cabin or boat is not used but still needs power and you want the battery to stay within 40-55% SOC
  • low energy production (bad weather or winter), where you may want to keep the battery a bit more charged
  • keep battery full without float - say between 95 and 100% SOC - in anticipation of running energy-hungry equipment (water maker, heater...)
  • "forced balance" profile where the SOC stays between 99 and 100%, to keep the battery close to the upper knee and give time for the balancer to do its work
 
I have a set of decade old cells that prove just about every statement made in this thread wrong.

It concerns me that there is so much misinformation around, especially on this forum, regarding LiFePO4.


I asked Tom two times to let me know what to correct on my main post and he straight up ignored me.
Tom, please do clarify what exactly is wrong; I wish to correct any errors in the main post. I want to take all data into consideration for the write-up in the first post. Again, my goal is a single place where others can go to have all the details on how to program their controllers IF maximum longevity and minimum cell stress is their goal. If that is not their goal, there are plenty of other resources, BUT on this exact goal, very little. I am aware each system is different, hence the write-up in the first post accounts for that by explaining and being flexible with the voltages you choose, all while keeping maximum longevity in mind.

Please let me know what exactly is wrong and I will update the first main post.

My goal here really is accuracy.
But ironically, he wrote this:

. . . Anyone that is keen to ask further questions feel free to do so via PM like most people do.

. . . At the end of the day we're just a little bit different; I'm not saying Tom doesn't have knowledge and experience. I actually would like his input on any issue I may have missed, but instead of contributing he just tells us that everybody on this thread is more or less wrong. But whatever; what matters is all the amazing people who did contribute. Not sure how to tag people yet, but again, thank you to all those who did contribute.
 
Yesterday for example I allowed my bank to drain to 20% with no sun (max was 75W input, so No Sun), so I fired up the Genset.
Powerhouse temp was at 10.2 Celsius. The Bottom Pack was 9.5C, the next up was 11.0C, and the next was 11.0C until charging (temporary config while I shift things around). The Bottom 280 was taking 22A, the Identical 280 above was taking 25A, and the next, a 175AH, 18A (it is always proportional to cell/pack AH). The Current Temp Config is 280AH Bulk-Cells on the Bottom 2 shelves & 1x 175 on the Top Shelf.

The Observable Effects: In the current configuration, the 175AH's are always taking their proportional charge, and that is Great; eventually they reach full & the BMS' start to cycle for cutoff, typically once they reach about 2A being taken. The bigger packs of course absorb the extra quite happily, which then helps fill them with a higher charge; meanwhile, the 175's are busy balancing & leveling up nice & full at MY SET FULL of 3.425 Volts per cell. The Bottom 280AH will be a bit behind because of the temps (catching up as it warms during charge) and takes a Higher Charge, so it is always the last pack to reach my 100% designation. As the second-last pack reaches full and starts the BMS Cycling, pretty much everything goes to the last man standing. At this point it sucks in everything given, but then it hits full pretty quickly after that (you can actually watch the resistance push back against the charge input), and then when it gets low enough the "Collective EndAmps/TailCurrent" of the bank hits 16A and the charging flips to Float. Now, because the Last Pack was hit with heavy amps which then get dropped by resistance, Internal Cell Deviation can get up to 200mV (it is a Bulk Cell Pack), but that is quickly eliminated with the QNBBM Active Balancer.
As soon as FLOAT is triggered it's weird, because you will see the packs within the bank balance up as all of the cells in all of the packs get closer to even, and little trickles come up from Float; in a fairly short timeframe (< 1 hour) All packs are at the same voltage, the cells have all settled & balanced up even, and I have to say, it sure is Pretty to look at.

* BMS Cycling: When you reach full and the amps drop, the BMS will eventually reach a state where the cells really don't take much, and so the BMS will cycle on/off as balancing processes while the other packs are still taking charge. Essentially the BMS opens to allow a trickle in, to level up during balancing, in pulses when it needs it. After a little while the full packs go into Storage Mode (built into the Chargery BMS') and will idle as everything is topped. This allows the balance charge amps to fill the others.

* Not all BMS' have a "Storage Mode" but may have a sleep or rest mode or nothing at all and may take charge all the way down to less than 1A being accepted.

EndAmps/TailCurrent:
The simple explanation: this is determined by the Largest Capacity battery in your battery bank. The formula is capacity × 0.05 (IE: 200AH × 0.05 = 10A; 280AH × 0.05 = 14A). Now with a Single Battery this is simple as pie, because once 14A is reached it triggers float and starts to top off the pack. When there are Multiple Packs in a Bank, then the "Collective" has to reach that point, as seen by the SCC/Charger, for it/them to switch over modes. This can look odd as the last pack finishes charging, but as soon as that stops, within 15 minutes or less they will all be levelled up and balancing out.
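
The rule above reduces to one line of arithmetic; a small sketch (the 0.05 fraction and both capacities are the ones given in this post):

# EndAmps/TailCurrent: 5% of the largest pack's Ah capacity.
def end_amps(largest_pack_ah, fraction=0.05):
    return largest_pack_ah * fraction

print(end_amps(200))   # 10.0 A
print(end_amps(280))   # 14.0 A
# With multiple packs in parallel, the SCC only sees the bank's collective
# current, so float triggers when the SUM of the pack currents drops to this.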

ATM, I am using Chargery BMS8T-300's & QNBBM-8S Active Balancers on all production packs, and this is the observable situation. I am switching everything over to a single model of JKBMS with 2A Active Balancing, Bluetooth etc. built into one unit. I do not think that much will change in essence, but as the conversion is completed and they are placed onto the One Big Shelf, they will more than likely be within 0.5C temp difference across them all, so that will change things up a little bit.

SERVER/TELECOM Rack-mounted batteries, or Battery Packs on shelves.
Again, remember heat rises, and the bottom will always be cooler than the top; while not a Huge Difference, it does make a difference. Now if you are in an area where it is always between 20-30 Celsius year-round then no worries EH! But if you live in an area with -30C to +40C temps, you're in the thick of it.

IR in the cold: During my Thrash Testing last spring, which pushed Max Amps at pack limits, one of those cycles was with the Batteries at 0C/32F. The IR was definitely higher, as they resisted and would not accept the full charge rate, but as they warmed during charge the amount of Amps taken went up. @ 0C/32F they took 40A at the start, but within an hour they were up to 100A, and in another 1/2 hour or so the 280 was taking 140A, but then it started to drop as it filled up. I did that with 2x 280's and 2x 175's, each independently. With the reconfig, I will have to do another Thrash Test Run and may document that whole process on here for everyone to see, but it is very time-consuming to do (worse now because the bank grew) and there is a lot in between; fortunately it is not too physically demanding, so it should not be a problem to do for the last time. I want to Doc it anyways to go with the Solar System Docs I am doing to go with the house when it gets sold, along with the House design & build documentation that started with the 1st tree being cleared off the land.
 
I've been through the PM thing with Tom. When we finished, I'll just say he was embarrassed (or should have been) and I felt like my time was completely wasted.

I have no idea if he actually has something to contribute here. I'm guessing not. He seems to just want to walk into a circle of people having a conversation, crap on the floor, and walk back out. Quite a sport.

Fortunately, you are getting some really good information, although it may be hard to coalesce into the neat instructions you had started out trying to fill out.
 
@Steve_S - I know I've heard repeatedly that if cells are at or near 0°C we need to charge at reduced current, if at all. That would imply a charger that would manage the charge current, which most will not. Your info is critical here, because you have found that the cells accept a much smaller current at low temps on their own. This is huge!

I know you have used BMS passive charge balancing in conjunction with the QNBBM active balancing, and from what you reported here it sounds like it works well. Does the JKBMS Active Balancing allow for charge-only balancing, similar to the passive balancing from other BMS's? Or is it more like the QNBBM, in that it is balancing all the time?
 
JK is programmable for its Balancing modes with the BT App.
My Thrash Tests are outside of spec to push the extreme edges, not something folks should do unless they are willing to accept the risk of making Big Magic Smoke and hearing their wallets scream. The Temperature Charge Rates vary between manufacturers and with the chemistry variations (the special sauce), of which each company has its own blend. Of course LFP Cells doped with Yttrium can take charge & operate in Below Zero temps, but that is a different bucket full of dollars there.

Also: Real/True EV LFP are NOT these Blue-skinned buggers; these can be used in LEVs (Light EVs, like low-speed neighbourhood rides) but not in a Real EV like a Tesla or VW Electric. The Chemistry and Packaging/Formats are quite different. The Chinese Vendors say they are for EV Use but NEVER MENTION that it's the Low-Speed stuff... and leave people to make the Assumptions that they do. I PLAY with E-Mud Trucks and other things of that kind of intensity, and believe me, the Blueies never even get close. It's a side thing I do not talk about here as this is not the place for it. A 1500HP Dual Motor MUDDER (latest) should be spewing up muck later this spring.
 
Thanks for that info. I'm not suggesting that anyone should do your thrash test as a normal course of business. But it is still interesting that the cells do somewhat self-regulate the charging current at lower temperatures. I've seen people argue for closed-loop coordination between the charging sources and the LFP batteries so that - among other things - the current out of an SCC can be kept low. Most SCCs can't do that though, and your experience says it probably isn't needed.

Since I will be keeping my cells generally between 10°C/50°F and 15°C/60°F it doesn't really affect me, but it is good to know and it is a great contribution to this thread. Thank you!
 
Thanks @Horsefly glad it's helpful.
You also touched on SCCs, but Inverter/Chargers also factor into it. None are the same, especially when you cross Tiers/Grades. A Midnite Controller can do more & be tweaked further than low-cost, lower-tier products; that's the difference between a $1000 SCC and a $500 SCC. Like trying to compare a Samlex EVO $2000 Inverter/Charger to a cheapo $500 Giandel... Very Few know how programmable & configurable a Samlex is. NOT Apples to Apples! Even the charging algorithms and flexibilities are vastly different. I have had "Value Grade" Inverters & SCC's, and it did not take long to step up to Tier-1 and Eat the loss; functionality/capability for me was key. Many AIO's (won't mention brands) use the lowest of the low non-IP-Restricted software/firmware, and the hardware is equivalent to "Value Grade" and lacks many advanced capabilities & functions, which all add $$$ to the MSRP. A $1200 AIO sounds great on the surface, but it is a Tired Chevette 1.4L-engine-powered dawg using out-of-date tech. "Compromises."
 
JUST A REMINDER: if anyone wants to contribute to the main post on the best controller settings, please try to format it into bullet points / digestible tidbits of information / cliff notes. There is so much great information here, BUT if it's too much it becomes very difficult to summarize and put into the main post for others.
 
Yeah, I don't think as much about the Inverter/Charger as I should. Not to make you arctic cats jealous, but we almost never have a need to run the genset. I personally have maybe run it once or twice since we installed the solar in 2017. In fact, my bigger concern is that we let it sit too long and if/when we do need it we may not be able to start it.

Both our SCC (Schneider MPPT60-150) and our Inverter/Charger (SW4024) are pretty good quality and highly configurable. I do know there are a couple of issues that drive me nuts about them:
  1. You can't actually configure the tail current. It is hard-coded at 0.02C. My AGMs really liked to stay in absorption longer, so I had to put in a false Ah rating for the battery bank to get the SCC to stay in absorption until 0.01C of the real capacity (the arithmetic is sketched below). I think this is a really stupid mistake by Schneider.
  2. There is some strange feedback between the SCC and the SW4024. My brother discovered that if he ran the genset to give a hard charge to the battery while the SCC was harvesting solar, the SW4024 got confused. The transfer switch would cycle back and forth every few seconds, almost like it couldn't qualify the input from the genset. But if the PV input to the SCC was shut down, the SW4024 went back to working perfectly, charging the battery and providing power to the loads in the house.
It may very well be that these two issues have been fixed in subsequent firmware revisions, but I don't currently have a way to update firmware. After reading about lots of people "bricking" their Schneider equipment trying to update firmware, I don't think I want to anyway.
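
The arithmetic behind the workaround in point 1, as a small sketch (the 400 Ah bank size is a made-up example; the hard-coded 0.02C constant is the one described above):

# The SCC exits absorption at a hard-coded 0.02C of the CONFIGURED capacity,
# so entering a smaller, false Ah rating lowers the real exit current.
HARD_CODED_TAIL = 0.02

def false_ah(real_ah, desired_tail_fraction):
    return real_ah * desired_tail_fraction / HARD_CODED_TAIL

print(false_ah(400, 0.01))   # -> 200.0: configure 200 Ah so the SCC exits
                             # at 0.02 * 200 = 4 A, which is 0.01C of 400 Ah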
 
see PM, we have hijacked this too much already.

My apologies to the readers for the Hijack, was not intentional. Back to your Regular Programming.
 