A SCC “Hack” For Charging Lithium to Less Than 100% State of Charge

Johncfii

Solar Addict
This is a modification to my solar charge controllers, for charging lithium batteries, that I haven’t seen mentioned elsewhere. My SCCs are Outback Solar FM80s, but in reading the manuals for other similar charge controllers, such as the Victron “SmartSolar”, the Schneider “Conext”, and the Morningstar “TriStar”, it appears that this modification might apply equally well to other brands of similar SCC.

This modification is just a bit complex to explain, so this is a rather long post.

I have a DIY 42 kWh lithium battery bank that is working wonderfully well for us. It is a lot nicer to rely on than our old AGM lead-acid bank was.

I know that some forum users here consider it “Pollyanna-ish” to want to limit the state of charge of our lithium batteries to something like between 15% and 85% of full capacity. But going the DIY route, building a large lithium pack from low-cost 280 Ah cells direct from China, allowed me to install much “extra” battery capacity, which means not needing to always draw 100% out of the batteries. Yet I still have the extra capacity available if a long cloudy period is expected. So, I figure, if it might get us an extra two or three years of use out of these batteries, why not limit the stress on the cells?

The problem is, with my Outback FlexMax charge controllers, how do you reliably control charging to terminate at something like 85% of full charge? Like Victron, Schneider, and others, the Outback FM80s are designed to terminate charge based on some combination of measured battery voltage, time in ABSORB mode, and declining tail current (which Outback calls Charged Return Amps).

As an aside, it happens that I don’t ever directly rely on my BMS to interrupt charge or discharge, and I don’t pass charging or discharging current through my BMS. I could not bear the aesthetics of inserting these “little” BMS boards (even the ones rated at 120 amps) in line with my carefully chosen, lovely and expensive, large-gauge welding cables that connect my battery bank. Instead, I use the alarm outputs of my much-less-expensive 30-amp JBD BMSs, via control relays, to stop charging or discharging as appropriate, if BMS alarm conditions are ever tripped.

One often-mentioned way that these charge controllers can be set up to charge lithium to something less than 100% charge is to carefully select a lower ABSORB mode voltage, and pick a tail current that terminates charge in the neighborhood of 85%. Then you set a “pseudo-FLOAT” voltage that keeps the SCC supplying current to satisfy your load, while allowing some small current to flow in or out of the battery bank. Of course, we don’t normally FLOAT lithium batteries, but we can use this slightly lower “FLOAT” stage voltage setting to keep the SCC working to satisfy your load without further charging the battery bank.

After weeks and weeks of trying to find ABSORB Voltage/Time/Tail Current/FLOAT voltage settings to optimize charging to about 85% of full charge, I found that to be an exercise in frustration. It was easy to find a combination of settings that would give good results when the batteries were being charged with no load on the system. And it was easy to find a different combination of settings that would give good charging results, while at the same time drawing a moderate load on the system. And it was easy to find a still different combination of settings that would give results if charging the batteries while drawing a heavy load from the system. But it was impossible, at least for me, to find a single combination of settings that gave good results under the full range of sun and load conditions. Since I have to travel for work for two weeks at a time, I needed a wife-proof scheme that would allow her to operate the system with minimal intervention.

The modification that I’m now using is possible because these charge controllers have an external temperature sensor input, intended to adjust lead-acid battery charging voltages, depending on battery temperature. Because we don’t use temperature compensation for lithium batteries, this temperature sensor input is available to utilize in a different way.

The temperature sensor, which manufacturers call either an RTS (Remote Temperature Sensor) or an RTD (Remote Temperature Detector), is simply an element whose electrical resistance changes with temperature. For the Outback SCC, both the ABSORB and FLOAT stage voltages are calibrated for room temperature. But if an RTS is connected to the system, then for every degree centigrade lower/higher than room temperature that is detected by the RTS, the stage voltages are increased/decreased by 0.06 volts (for a 24-volt system), up to a user-programmable limit.
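To make the compensation arithmetic concrete, here is a small sketch of the rule just described. It assumes a 25 °C reference temperature; the 0.06 V/°C slope and the 0.4 V clamp are the figures from this post, and both are user-programmable in the FM80.

```python
# Sketch of RTS temperature compensation for a 24 V system, as described above.
# The reference temperature and clamp are user settings, not fixed specs.

def compensated_voltage(base_v, sensed_temp_c, ref_temp_c=25.0,
                        slope_v_per_c=0.06, limit_v=0.4):
    """Raise (cold) or lower (hot) a stage voltage the way an SCC with an RTS would."""
    offset = (ref_temp_c - sensed_temp_c) * slope_v_per_c  # colder -> higher voltage
    offset = max(-limit_v, min(limit_v, offset))           # clamp at programmed limit
    return round(base_v + offset, 2)

print(compensated_voltage(27.1, 18.0))  # 7 C "colder" than reference: clamped to +0.4 -> 27.5
print(compensated_voltage(27.1, 25.0))  # at reference: unchanged -> 27.1
```

This is just the arithmetic; the actual SCC does the equivalent internally whenever an RTS (or a resistor pretending to be one) is connected.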

But instead of using the actual RTS, a fixed-value resistor can be switched in circuit across the RTS input terminals by a relay, to fool the solar charge controller into temporarily increasing or decreasing the charge stage voltage. By experimentation, I found that by switching a 15k ohm resistor across the RTS input terminals, I can increase the stage charge voltage by my pre-programmed 0.4-volt limit.
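For a rough feel of what a fixed resistor “tells” the controller, the sketch below inverts a thermistor curve. This is an illustration only: it assumes the RTS behaves like a common 10 kΩ-at-25 °C NTC thermistor with a beta of 3950, which may not match any particular manufacturer’s sensor, so the right resistor value for a given SCC is best found by experiment, as described above.

```python
import math

# Estimate the temperature an SCC would infer from a fixed resistor across its
# RTS input, assuming a 10 kOhm @ 25 C NTC with beta = 3950 (assumed values for
# illustration only; check your controller's sensor specifications).

def apparent_temp_c(r_ohms, r25=10_000.0, beta=3950.0):
    t25_k = 298.15  # 25 C in kelvin
    inv_t = 1.0 / t25_k + math.log(r_ohms / r25) / beta
    return 1.0 / inv_t - 273.15

print(round(apparent_temp_c(15_000), 1))  # 15k looks several degrees colder than 25 C
print(round(apparent_temp_c(1_500), 1))   # 1.5k looks far hotter than room temperature
```

Under these assumed thermistor constants, a 15k resistor reads as roughly 16 °C (boosting voltage) and a 1.5k resistor as roughly 75 °C (slashing it), which is consistent with the two uses described in this thread.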

To use this feature to help control charging of a lithium battery, I use it in conjunction with a coulomb-counting SOC meter having a programmable relay output. The SOC meter that I chose is the Victron BMV712, which I’ve found to be quite accurate, needing recalibration very infrequently once the Peukert exponent and Battery Charge Efficiency factors were optimized by trial and error. My Victron BMV712 relay output is wired so that the relay contacts are in series with both a 15k ohm resistor and the RTS input terminals. The Victron relay contacts are programmed to open when my desired upper-limit SOC is reached.

For example, my ABSORB stage voltage is set in the FM80 to 27.1 volts. FLOAT stage voltage is set to 26.8 volts. Below my desired SOC charge-termination setting, the 15k ohm resistor tells the FM80 to increase the ABSORB and FLOAT voltages to 27.5 and 27.2 volts respectively. When 85% SOC is reached, the 15k ohm resistor is switched out of circuit, the ABSORB voltage drops to 27.1 volts, charging current drops substantially, the SCC quickly evaluates the batteries as being fully charged, and the SCC output voltage drops to the FLOAT level, now providing current to any load without further charging the battery bank. With this FLOAT voltage setting carefully selected, and if the sun is still out, the solar panels continue to satisfy any load without draining the batteries. The Victron BMV712 can even be programmed to switch the 15k ohm resistor back in circuit if the battery SOC drops by as little as 1%, thereby slowly cycling the SOC between 84% and 85% if load and sun conditions change through the afternoon. This way, when the sun finally goes down, the batteries are always sitting between 84% and 85% SOC, as long as sun conditions were favorable.
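The relay behavior in this paragraph amounts to simple hysteresis. Here is a minimal sketch of the logic, using the 85% set and 84% reset points from the post; in practice these live in the BMV’s relay menu, and nothing below is actual Victron code.

```python
# Hysteresis rule for the 15k resistor relay: in circuit (voltage boosted)
# below the reset SoC, out of circuit at or above the set SoC, unchanged between.

def resistor_in_circuit(soc, currently_in, set_soc=85.0, reset_soc=84.0):
    if soc >= set_soc:
        return False        # target reached: voltages drop, SCC falls to float
    if soc <= reset_soc:
        return True         # SoC sagged under load: resume charging
    return currently_in     # between set points: hold previous state

# Walk the SoC through a changing afternoon of load and sun:
state = True
for soc in (80, 84.5, 85, 84.6, 84, 85):
    state = resistor_in_circuit(soc, state)
    print(f"{soc}% -> resistor {'in' if state else 'out of'} circuit")
```

The hold-previous-state branch is what prevents the relay from chattering between the two set points.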

Another big advantage of using this method to terminate charging is that charging current does not taper off as the batteries approach my desired SOC. For example, using the ABSORB voltage/time/tail current method, I have seen solar energy wasted when charging current starts to taper off at as low as 70% SOC. Bumping the ABSORB voltage up until exactly the desired SOC is reached prevents this inefficient tapering of charge current. This increases the efficiency of solar energy harvest under some combinations of load, solar conditions, and battery SOC. In my observation, limiting SOC to 85% allows the batteries to keep charging at the highest rate my system can deliver, right up to 85%. This might still be true at 90% SOC, but I have not tried it.

Yet another advantage is that I also use the RTS input to terminate charging in case of a BMS cell or pack high-voltage alarm. Because I don’t rely on the BMS directly to interrupt charging in the event of a battery or cell over-voltage condition, I needed another way to shut down charging current. In the event of a BMS over-voltage alarm, a different control relay switches a different resistor, this one a 1.5k ohm value, across the RTS input terminals. This fools the charge controller into thinking that the batteries are very hot, and the charge voltage drops to zero, without the potential harm that might occur if the charge controllers were just turned OFF while solar input is connected. Or a slightly different resistor value can be selected to reduce the charge voltage to some other low voltage.

I hope my long explanation is clear, and that it might be useful to other forum users interested in limiting the SOC of their lithium batteries to a maximum of something less than 100%.
 

snoobler

Solar Honey Badger
Summary:
  1. Spoof the FM80 temp sensor with a 15kΩ resistor so it thinks the battery is at a lower temperature, raising absorption voltage via the FLA/AGM/Gel voltage-compensation feature. The resistor is in series with the BMV's normally closed relay.
  2. Set the BMV relay to open @ X% SoC, taking the 15kΩ resistor out of circuit so the FM80 drops absorption voltage and falls to float.


FWIW, I don't see how the above can work with a Victron MPPT. They get their temperature from:
  1. Internal
  2. Smart battery sense via VE.Smart Network
  3. BMV via VE.Smart Network or GX device
In other words, they have no place to put a 15kΩ or other resistor; HOWEVER, I suspect you could just connect the BMV relay to the control circuit on the MPPT. Relay opens, MPPT stops charging.

How do you deal with SoC drift? I know you say you find it to be accurate, but Victron recommends fully charging "a couple times a month" to ensure it remains accurate.
 

Johncfii

Solar Addict

Snoobler,
Thanks very much for your summary. It is nicely concise compared to my wordy explanation! LOL

You are surely correct about the Victron unit. My familiarity is only from a not-so-careful reading of the user manual. Of course, a disadvantage of using the battery monitor relay to simply shut down the charge controller output is that suddenly any load is fully drawn from the batteries. That would still work if the relay reset value was set to an SOC value that is a few percent lower than the “full charge” set point. It would just cause faster cycling of the charge/discharge. Is there an external RTS in any of the Victron units that can be replaced by a fixed resistor?

About SOC monitor drift: I started out checking calibration once every couple of days. This led me to find that setting the Battery Charge Efficiency factor to 99%, and the Peukert exponent to 1.0, minimizes drift for my system. When I didn’t see any further significant drift, I started checking it once a week. I still didn’t see any significant drift, and lately I have checked it every two weeks. Fortunately, I still see very little drift, and this has encouraged me to think that it will stay sufficiently accurate for more than two weeks. I haven’t yet settled on a permanent schedule.
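As a side note on why a Peukert exponent of 1.0 suits a lithium bank: with k = 1, the standard Peukert runtime estimate reduces to capacity divided by current, so the coulomb counter books amp-hours one-for-one. The 200 Ah / 20-hour rating and 25 A draw below are made-up example figures, not my bank’s numbers.

```python
# Standard Peukert runtime estimate: t = H * (C / (I * H)) ** k, where C is the
# rated capacity (Ah), H the rated discharge time (h), I the draw (A), and k
# the Peukert exponent. The figures below are illustrative only.

def peukert_runtime_h(capacity_ah, rated_hours, current_a, k):
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

print(peukert_runtime_h(200, 20, 25, 1.0))             # k = 1: simply 200/25 = 8.0 h
print(round(peukert_runtime_h(200, 20, 25, 1.25), 2))  # a lead-acid-like k: shorter runtime
```

With a lead-acid-like exponent the same draw “costs” noticeably more than its face-value amp-hours, which is exactly the correction lithium mostly doesn’t need.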

My relatively low charge and discharge rates might be helping to limit drift. I never charge/discharge at rates higher than about C/7 or C/8.

Further, since I have a large “buffer” of 15% battery capacity on both ends, small errors probably will have no significant impact.
 

snoobler

Solar Honey Badger
As I indicated, there is no place to put a 15kΩ resistor or any kind of plug-in temperature sensor, and the only temp input options are Smart Battery Sense via Bluetooth or a BMV shared via Bluetooth or GX.

Agree it would result in "micro" cycling.
 

willo

Solar Addict
That’s a clever hack. YMMV per manufacturer but I can see the value there.
For Victron gear, I am just taking advantage of the remote input (on/off) and adjusting my charge voltage thresholds via an Arduino that reads status from my BMS, along with other things.
In my case I am adding a few modes, like storage/active/off-grid options. Mostly that means limiting SoC via voltage (not ideal, but it works) and then modifying my battery heating programming to account for energy savings vs performance.

Anyhow, another way you could do similar limiting with any SCC is to add an inline PV array cutoff, using a DSSR from Electrodacus (or similar) between the array and the SCC, and enforcing your own cutoff regime.
 

Johncfii

Solar Addict

The obvious disadvantage to either shutting the SCC completely off, or opening the PV array circuit, is that you reduce potential solar energy harvest. If you have a load on the system at that point, say charging an EV for example, more (or all) of the load suddenly starts being extracted from the batteries at a time when the sun might still be high in the sky. With the method I describe above, that doesn’t happen.

Additionally, if (for example) four days of stormy weather is forecast, and I decide that I want to temporarily raise my upper charge limit to 95%, I can change that target in about 5 seconds with one setting, and be sure of its accuracy. If, instead, we are charging to some target cell or pack voltage, or with built-in lead-acid charge termination provisions, a lot more fussing and uncertainty is involved.

Yes, equipment does vary. It is a weakness of these Outback SCCs that they don’t have any remote ON/OFF terminals or feature. Thus the need for the 1.5k ohm resistor hack.

It sounds like you’ve done a lot of clever work.
 

Mcgivor

Solar Addict

During my search for how to set up a Schneider 60-150 MPPT, I stumbled across information from SimpliPhi battery systems which provided settings for Schneider equipment. The recommended values have been working well; they are conservative so as to meet the 10-year warranty.

There is also the same information for Outback equipment in the attached PDF, which may help to fine-tune your understanding.

The RTS hack is similar to what some use to simply shut down the SCC by creating a fault condition via a BMS-powered relay’s NO contact.
 

Attachments

  • simpliphi-power-phi-battery-outback-integration-guide.pdf