diy solar

Question on solar charging termination

sunworshipper

New Member | Joined: Nov 13, 2019 | Messages: 33
Most of us on this forum know that battery voltage (especially while discharging or charging) is not a reliable indicator of a battery bank's SoC, and we recommend the Coulomb counting of battery monitors for this purpose. However, we seem to blindly accept chargers that decide whether the batteries are fully charged based on voltage. When a solar charger terminates charging, the chemical reaction is nearly, but not fully, reversed. Compounding the problem for lead acid batteries is the variation in recommendations from battery manufacturers - charge until the current drops below X% of nameplate capacity (1 < X < 5), stay in absorb for X hours (2 < X < 8), and so on. Why don't we have chargers that terminate charging based on the net charge delivered? I realize that such a charger would need to get the net charge from a monitor.
 
Nothing is truly perfect, especially with lead-acid. The voltage, time, and amp methods are really pretty good, and periodic equalization is enough to make up for any shortfall and ensure long life. Then it comes down to cost.

Lithium, BTW, is far easier to manage and isn't nearly as critical as lead-acid. Make the switch.
 
Most of us on this forum know that battery voltage (especially while discharging or charging) is not a reliable indicator of a battery bank's SoC, and we recommend the Coulomb counting of battery monitors for this purpose. However, we seem to blindly accept chargers that decide whether the batteries are fully charged based on voltage.

Because the manufacturers of these batteries define the fully charged state with a charged voltage and a tail current and/or absorption period.

There are two positions:

1) the one you are proposing.
2) the one the entire battery manufacturing industry has used for decades.

When a solar charger terminates charging, the chemical reaction is nearly, but not fully, reversed. Compounding the problem for lead acid batteries is the variation in recommendations from battery manufacturers - charge until the current drops below X% of nameplate capacity (1 < X < 5), stay in absorb for X hours (2 < X < 8), and so on.

Not a problem AT ALL. You simply use the manufacturer's recommendations.

Why don't we have chargers that terminate charging based on the net charge delivered?

Because that is a completely worthless way to determine whether a battery is full. For it to have any hope of being effective, the charge efficiency would either have to be 100%, or the exact efficiency through every phase of charging would have to be known and factored in for each and every brand/model of battery.

LFP: 3.65 V/cell, 0.05C tail current.
Lead acid: 14.0-15.0 V, 0.01-0.02C tail current or a defined absorption period, WITH temperature compensation.
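To make those two rules concrete, here is a minimal sketch of the voltage + tail-current / absorption-timer termination described above. The function name, voltage, and thresholds are illustrative defaults, not any vendor's firmware.

```python
# Minimal sketch of the standard termination logic: hold the charged (absorption)
# voltage, then drop to float once the tail current falls below a threshold OR a
# maximum absorption time has elapsed. Names and numbers are illustrative defaults.

def absorption_done(battery_volts: float,
                    battery_amps: float,
                    seconds_at_absorb: float,
                    absorb_volts: float = 14.4,        # lead-acid example; temp compensation applied upstream
                    capacity_ah: float = 100.0,
                    tail_fraction: float = 0.02,       # 0.02C tail current
                    max_absorb_seconds: float = 4 * 3600) -> bool:
    """Return True when the charger should transition from absorb to float."""
    at_voltage = battery_volts >= absorb_volts - 0.05      # small tolerance for measurement noise
    tail_reached = battery_amps <= tail_fraction * capacity_ah
    timed_out = seconds_at_absorb >= max_absorb_seconds
    return (at_voltage and tail_reached) or timed_out

# Example: 100 Ah lead-acid bank held at 14.4 V, charge current now down to 1.8 A
print(absorption_done(14.42, 1.8, 2.5 * 3600))   # True: below the 2 A (0.02C) tail
```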
 
Most of us on this forum know that battery voltage (especially while discharging or charging) is not a reliable indicator of a battery bank's SoC, and we recommend the Coulomb counting of battery monitors for this purpose.
Battery voltage STILL IS the primary indicator of whether a battery is fully charged or fully discharged.
It is only in between that we struggle to correlate voltage with SOC.
But as long as the approximations are good, hardly anybody cares.

Why don't we have chargers that terminate charging based on the net charge delivered? I realize that such a charger would need to get the net charge from a monitor.
Why should we, when we know that voltage is the primary measure of whether a battery is fully charged?

The simpler solution is more theoretically sound than the messy solution.
 
Charging is normally terminated based on amps rather than voltage, I'd say. Otherwise the charge cycle would just be bulk-float.

I do think charge termination is a messy business, which is why I'm exploring running bulk, absorb, and float all at the same voltage, so that charging never needs to terminate.
 
Terminating charging based on Amp-hrs returned (after allowing for Coulombic inefficiency) is the best way. After all, the electrons are part of the chemical reaction. In my view, all other methods are compromises.
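To make the proposal concrete, here is a rough sketch of amp-hours-returned termination. The 95% charge efficiency and one-second sampling interval are assumptions for illustration, not measured values.

```python
# Rough sketch of the amp-hours-returned idea: integrate shunt current, discount
# charge by an assumed Coulombic efficiency, and stop once the net amp-hours
# removed since the last full charge have been repaid.

class NetChargeTerminator:
    def __init__(self, charge_efficiency: float = 0.95):   # assumed, not measured
        self.eff = charge_efficiency
        self.ah_deficit = 0.0                # amp-hours still owed to the battery

    def sample(self, battery_amps: float, dt_seconds: float = 1.0) -> None:
        """battery_amps > 0 while charging, < 0 while discharging."""
        ah = battery_amps * dt_seconds / 3600.0
        if ah > 0:
            self.ah_deficit -= ah * self.eff   # only ~95% of charge in is stored
        else:
            self.ah_deficit += -ah             # everything drawn out counts in full

    def charging_complete(self) -> bool:
        return self.ah_deficit <= 0.0

# Example: after drawing 20 Ah out, roughly 21 Ah (20 / 0.95) must go back in
term = NetChargeTerminator()
term.sample(-20.0, 3600)            # 20 Ah discharged over one hour
term.sample(+21.1, 3600)            # ~21.1 Ah charged back over one hour
print(term.charging_complete())     # True
```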
 
Because that is a completely worthless way to determine whether a battery is full. For it to have any hope of being effective, the charge efficiency would either have to be 100%, or the exact efficiency through every phase of charging would have to be known and factored in for each and every brand/model of battery.
I don't agree.
 
Operational management by voltage is easier for some chemistries than others. LiFePO4 has a notoriously 'flat' charge/discharge curve except for the high/low tails. On the other hand, I run an 18650 INR chemistry (3.0 V -> 4.2 V) powerwall, and the curve is steeper... and perfectly adequate to manage by voltage.

And even if you had a perfect charger - in an active solar system you have variable (multiple) MPPT charging plus inverter loads all at the same time, plus voltage sag under load... so it can be a bit messy. But that's the nature of batteries, and a little fussiness doesn't take away from them being a really great way to store and retrieve power.
 
Operational management by voltage is easier for some chemistries than others. LiFePO4 has a notoriously 'flat' charge/discharge curve except for the high/low tails. On the other hand, I run an 18650 INR chemistry (3.0 V -> 4.2 V) powerwall, and the curve is steeper... and perfectly adequate to manage by voltage.

And even if you had a perfect charger - in an active solar system you have variable (multiple) MPPT charging plus inverter loads all at the same time, plus voltage sag under load... so it can be a bit messy. But that's the nature of batteries, and a little fussiness doesn't take away from them being a really great way to store and retrieve power.
Not that messy at all. Solar chargers & battery monitors already communicate with each other - controllers communicate battery temp to the monitor, monitors communicate the true battery terminal voltage to the controller, etc. It would be a simple task to communicate the net, efficiency-adjusted charge returned to the controller and drive it until the reaction is reversed.
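For illustration only, the kind of data a shunt monitor would have to hand a controller might look like the following. The message fields and names are hypothetical; this is not any real shunt's or controller's protocol (Victron, Midnite, or otherwise).

```python
# Hypothetical sketch of the data a shunt monitor could hand a charge controller
# so it could "drive charging until the reaction is reversed". Field names are
# made up for illustration; this is not any real product's protocol.

from dataclasses import dataclass

@dataclass
class MonitorReport:
    battery_volts: float          # voltage measured right at the pack terminals
    battery_temp_c: float         # pack temperature for compensation
    net_ah_returned: float        # efficiency-adjusted Ah put back since last full
    ah_removed_since_full: float  # Ah drawn out since the battery was last full

def controller_should_stop(report: MonitorReport) -> bool:
    """Stop charging once the monitor reports the removed charge has been repaid."""
    return report.net_ah_returned >= report.ah_removed_since_full

print(controller_should_stop(MonitorReport(13.5, 24.0, 20.2, 20.0)))   # True
```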
 
Not that messy at all. Solar chargers & battery monitors already communicate with each other - controllers communicate battery temp to the monitor, monitors communicate the true battery terminal voltage to the controller, etc.
Will respectfully disagree here. None of my components communicate with each other - multiple Midnite Classic Charge Controllers (and a single Wiz Bang Jr shunt), AIMS/SGP Inverters, a Batrium Shunt, a Chargerverter, and the DIY Powerwall. There is no actual 'true voltage' shared between these systems - they all see the powerwall voltage a little differently from each other, for various reasons.

While I agree the recent trend is for components to communicate / be smarter, there are plenty of systems out there where this is not the reality.
 
Can you help me understand how you know better than every battery manufacturer on the planet?
A career in physics & electro-chemistry (vanadium redox flow batteries). I can assure you that when it comes to lead acid batteries, the manufacturers don't have a team of chemists; I'm personally aware of a well-known US battery mfg with no chemist on their payroll. The field is so mature that anyone can set up a battery manufacturing operation without an advanced team of scientists, a lot like opening a McDonald's franchise without needing a background in food science.

The lead acid battery manufacturers have been recommending the same charging profile even as our understanding has improved and better measuring & control techniques have become available.
 
Will respectfully disagree here. None of my components communicate with each other - multiple Midnite Classic Charge Controllers (and a single Wiz Bang Jr shunt), AIMS/SGP Inverters, a Batrium Shunt, a Chargerverter, and the DIY Powerwall. There is no actual 'true voltage' shared between these systems - they all see the powerwall voltage a little differently from each other, for various reasons.

While I agree the recent trend is for components to communicate / be smarter, there are plenty of systems out there where this is not the reality.
Victron products talk to each other.
 
Hmmm...
So say a person sets their 100% FULL point for LFP at 3.400 volts per cell (13.6/27.2/54.4), lets their charger (any) charge to 3.420 VPC, allows the amps taken to drop to EndAmps/tail current, and then transitions to float (allowing the cells to settle to 3.400, with or without active balancing) - that IS FULL @ 100%. Keep in mind the working range for LFP is 3.000 to 3.400, without tipping too far into the allowable range.

EndAmps/tail current is calculated as capacity × 0.05: 100 Ah × 0.05 = 5 A, or in the case of 280 Ah, 14 A. Once the packs reach this point at the set voltage, they are in essence saturated & full.

If, however, you stop charging the moment the pack reaches 3.400 VPC, the LFP will settle and likely stop at 3.340 +/-, depending on cell state (wear & tear, temps, etc.), and you will never see 100%.

AND SORRY, BUT VOLTAGE IS THE ONLY READING THAT COUNTS!
Sadly, most folks do not correct for voltage offsets between their gear & battery packs, and that will ALWAYS skew readings. One MUST account for any voltage deviation between the charging & discharging devices and the battery packs themselves, measured at each item's terminals. Depending on the hardware (SCCs, inverter/chargers, etc.), some allow a setting for battery efficiency (Midnite Solar gear as one example), and that should be set to 99% efficient, because LFP IS! Case in point: when my SmartShunt says my bank is at 85% and I go look at the BMSs, they always agree! Then, when looking at actual cell voltages within the packs, they also agree. It's all in the config & tweaking to ensure accuracy.
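The offset correction itself is trivial once the deviation has been measured with a trusted meter right at the pack terminals. A sketch, with a made-up 0.08 V offset:

```python
# Sketch of the voltage-offset correction: measure the difference between what a
# charger/inverter reports and what a trusted meter reads at the pack terminals,
# then apply it before acting on any voltage thresholds. The 0.08 V offset is a
# made-up example (cable drop, sense error, etc.).

MEASURED_OFFSET_V = 0.08   # device reads 0.08 V higher than the pack terminals

def corrected_pack_volts(device_reading_v: float,
                         offset_v: float = MEASURED_OFFSET_V) -> float:
    return device_reading_v - offset_v

# Charger claims 27.28 V; the pack is actually at about 27.20 V (3.400 V/cell x 8)
print(round(corrected_pack_volts(27.28), 2))   # 27.2
```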

Voltage != accurate is a LEFTOVER from FLA, AGM, and other chemistries, where too many factors affected the actual capacity & voltage readings... Low electrolyte would skew the numbers, as would temps, etc. Aging FLA and its losses over time are also a contributing factor, because deterioration starts the moment you start using the batteries - this is NOT the case with lithium-based cells.

LFP & temps: LFP is happiest at 25°C/77°F, but in lower temps such as 15°C/59°F it takes charge more slowly, so charging takes longer, though it will still reach EndAmps/tail current (full) - it just takes longer to get there. In high temps, 30°C/86°F, it can charge faster and reach EndAmps quicker, BUT if there is a weaker cell in the pack, it could also RUN to higher voltages once EndAmps is reached. This is all reproducible & observable!

Coulomb counting what goes in & out works in a general sense, BUT it fails to account for losses, and that results in growing calculation errors and erroneous results after a short time. LOTS OF US HAVE SEEN THIS IN REAL TIME. Depending on what device is doing the coulomb counting and what its software/firmware is capable of, some are more accurate than others. Victron SmartShunts and the Midnite WizBangJr can reset themselves when they reach programmed points to remain accurate, while others do not (and sadly that also includes many BMSs).
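A quick sketch of why those resets matter: a counter that never re-syncs carries its small per-cycle error forever, while one that resets to 100% at the charged-voltage + tail-current point discards the error every full charge. All numbers below are illustrative.

```python
# Sketch of drift vs. re-sync: a coulomb counter that never re-syncs accumulates
# its small per-cycle error forever; one that resets to 100% whenever the charged
# voltage + tail current condition is met throws that error away each full charge.

class CoulombCounter:
    def __init__(self, capacity_ah: float, resync: bool):
        self.capacity_ah = capacity_ah
        self.resync = resync
        self.soc = 1.0                       # start full

    def apply_ah(self, ah: float) -> None:
        """ah > 0 charging, ah < 0 discharging."""
        self.soc += ah / self.capacity_ah

    def full_charge_detected(self) -> None:
        """Called when the charged voltage is held and the tail current is reached."""
        if self.resync:
            self.soc = 1.0                   # discard accumulated error

# 100 cycles where the counter under-counts the charge leg by 0.2 Ah each cycle
for resync in (False, True):
    counter = CoulombCounter(capacity_ah=100.0, resync=resync)
    for _ in range(100):
        counter.apply_ah(-30.0)              # discharge 30 Ah
        counter.apply_ah(+29.8)              # charge leg as the counter sees it
        counter.full_charge_detected()       # battery is actually full again
    print(resync, round(counter.soc, 2))     # False -> 0.8, True -> 1.0
```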

BTW, in case you are not aware: some folks in here have more qualifications in this tech than you can shake a stick at. They lurk, respond, and do it quietly without drawing attention to the facts... Consider who you are arguing with. PS: there are even a few manufacturers who lurk in here and answer when needed.
 
A career in physics & electro-chemistry (vanadium redox flow batteries).

Are the criteria for full charging of these devices consistent with your approach? If so, is there something different about these batteries that makes your method more appropriate?

Is this possibly a situation where you have a hammer, and everything that stores and releases charge looks like a nail to you?

I can assure you that when it comes to lead acid batteries, the manufacturers don't have a team of chemists; I'm personally aware of a well-known US battery mfg with no chemist on their payroll. The field is so mature that anyone can set up a battery manufacturing operation without an advanced team of scientists, a lot like opening a McDonald's franchise without needing a background in food science.

The technology is so mature, but a major change is needed to establish full charge?

Are you claiming NO battery manufacturers employ chemists?

Are there any chemists at Rolls-Surrette?

The lead acid battery manufacturers have been recommending the same charging profile even as our understanding has improved and better measuring & control techniques have become available.

Okay. Let's set lead acid aside. Why do LFP cell manufacturers specify a charge voltage and tail current to establish full if your method is better? I would expect that at least some LFP manufacturers employ chemists.

Victron products talk to each other.

Sure do. They whisper sweet nothings to me.
 
Are there any chemists at Rolls-Surrette?
Giggle, LOVE IT! I've been in the factory & offices, and they have a LOT of serious engineers. They even have a special team just for the LFP battery systems, and ohhh boy, they are up to things! BTW: Canadian-made LFP to boot; unfortunately more $, but WOW, the quality.
 