Behavior while AC running in heater mode

I understand the battery a bit better now. 'Tubular' refers to how the positive plate is constructed. Its primary benefit is reducing oxidation corrosion of the positive plate grid, which otherwise increases the battery's internal impedance. The other notable thing about these particular batteries is that they use a more dilute acid concentration for the electrolyte (28% versus the normal 33%), which is the primary way to increase the life of a lead-acid battery, but not without downsides: about 20% less capacity for the size, higher cell impedance, and, biggest of all, battery capacity will drop significantly with any sulfation, which locks up sulfate so it cannot be converted back to sulphuric acid on recharge and leaves a very low-density sulphuric acid electrolyte.

Assuming your measurement of about 30-35 A of battery current is accurate, getting only about 1.5 hours of discharge before the battery is depleted says the battery is only yielding about 50 Ah. My guess is that the specific gravity of the electrolyte has dropped to a very low level due to insufficient charging and the resulting sulfation of the negative plates. If you can, get a float-based hydrometer and measure it, and make sure the electrolyte level is at its recommended level above the top of the plates. The SG spec for these batteries is 1.24 (28.5% acid concentration) when fully charged, which is lower than most flooded lead-acid batteries at about 1.28 (33% acid concentration).
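For reference, a quick sketch of the arithmetic behind that ~50 Ah figure, using the midpoint of the reported 30-35 A draw and the ~1.5 h runtime (both taken from the posts above):

```python
# Capacity actually delivered before the low-voltage cutoff, from the
# measurements reported above (values are the poster's, not mine).
avg_current_a = 32.5   # midpoint of the reported 30-35 A draw
runtime_h = 1.5        # roughly 1.5 hours before depletion

delivered_ah = avg_current_a * runtime_h
print(f"Delivered before cutoff: about {delivered_ah:.0f} Ah")   # ~49 Ah
```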
[Attached image: hydrometer]
This is why the representative probably recommended equalization, in an attempt to reduce the sulfation. Equalization will burn off a lot of electrolyte water, so make sure you check the electrolyte level and add distilled water as necessary.

I have doubts that the batteries can be recovered.

If equalization does not improve the battery, a last-ditch option would be to increase the electrolyte acid concentration by removing some of the existing electrolyte and replacing it with concentrated sulphuric acid to get the SG up to 1.24-1.27. This will give the battery a little 'adrenaline' shot.
[Attached image: specific gravity vs. acid percentage chart]
 
I have a 4S2P bank of large SLA batteries (Group 8D, 12V, 225Ah; the bank is 48V, 450Ah) running a 4400W inverter. I installed an 18,000 BTU mini-split AC last year and have been disappointed with the performance. I know that one of the two 4S strings is older and has reduced capacity, so I am now ready to switch from SLA to LFP battery technology. The primary reason is depth of discharge and the accompanying voltage drop during discharge. Lead-acid batteries don't tolerate deep discharge very well, and asking for up to ten hours of AC service from what on paper looks like a quite capable system is too much. I can throttle the AC system back and improve the run time, but that comes at the expense of being colder or warmer than if the AC ran at full power.

What I see is that my inverter cuts out due to low voltage long before it ought to (based on a back-of-the-envelope calculation). When the AC is running, it is drawing ~30A from the batteries. LFP should tolerate a much deeper discharge since the voltage stays relatively constant until the pack is nearly exhausted. Perhaps the OP could consider a switch to LFP from lead acid as well.
 
What I see is that my inverter cuts out due to low voltage long before it ought to (based on a back-of-the-envelope calculation). When the AC is running, it is drawing ~30A from the batteries. LFP should tolerate a much deeper discharge since the voltage stays relatively constant until the pack is nearly exhausted.
Yes, Peukert's law is going to affect Pb batteries more than LFP. It is one of the things that some people do not figure into the long-term cost of operating Pb batteries.
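For context, here is a rough sketch of that effect on paper runtime, assuming typical ballpark Peukert exponents (about 1.25 for flooded lead-acid, about 1.05 for LFP; these are general figures, not measured values for your bank) and the 450 Ah / 30 A numbers quoted above:

```python
def peukert_runtime_h(capacity_ah, rated_hours, load_a, k):
    """Peukert's law: t = H * (C / (I * H)) ** k."""
    return rated_hours * (capacity_ah / (load_a * rated_hours)) ** k

# 450 Ah bank (20-hour rate) under a 30 A load; exponents are assumptions.
for chemistry, k in (("lead-acid", 1.25), ("LFP", 1.05)):
    hours = peukert_runtime_h(450, 20, 30, k)
    print(f"{chemistry:9s} (k={k}): ~{hours:.1f} h to 100% discharge")
# Lead-acid loses more of its nominal 15 h (450 Ah / 30 A) than LFP does,
# and in practice lead-acid should also stop near 50% depth of discharge.
```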
 
I am going to switch the system to LFP down the road, but I still can't find many of those in the local market. I am hoping this changes (and I am starting to see some LFP making their way to the market nowadays) by the time the current battery banks are dead. It's a disappointment for now because as you said, on paper the story is different.

What confused me further is the SmartShunt: it was still showing SoC around 80% when the issue happened... Shouldn't it be more accurate than that?
 
I am going to switch the system to LFP down the road, but I still can't find many of those in the local market. I am hoping this changes (and I am starting to see some LFP making their way to the market nowadays) by the time the current battery banks are dead. It's a disappointment for now because as you said, on paper the story is different.

What confused me further is the SmartShunt: it was still showing SoC around 80% when the issue happened... Shouldn't it be more accurate than that?
You likely have not set the battery capacity in setup. There is also a setting for the peak charge voltage at which the monitor resets itself to 100%, and a setting for charge efficiency. For LFP batteries, set charge efficiency to 99%. Many monitors default charge efficiency to about 85%, which is appropriate for lead-acid batteries; some just default it to 100%. Some monitors also apply a Peukert correction, a fudge factor on the measured discharge current based on its level relative to the battery Ah capacity you set, to account for higher internal battery losses at high discharge currents.

The way a coulomb counter works is that it cumulatively sums discharge current and charge current over each time increment, with some corrections to the measured current for charge and discharge efficiency.

All this summation accumulates some error over time. For example, your charge efficiency factor may not be exactly correct, and charge efficiency varies with the battery's state of charge at the point where charge current is applied.

A lead-acid battery, for example, has better charge efficiency below about 80% state of charge and poorer recharge efficiency above 85% state of charge, where some of the recharge energy goes into making oxygen and hydrogen gas by electrolysis of the electrolyte water.

LFP cell recharge and discharge losses are caused by internal battery impedance and are very low: about 99.6% efficient at current rates below 0.2C, dropping to about 97% at a 0.5C rate. As LFP cells age and accumulate use, these numbers degrade due to higher internal cell losses.

To periodically clear the cumulative error, the battery must be fully charged up to the voltage you set as the full charge point. At that voltage the monitor clears the summation and resets the counter to 100% full. Usually you set the full charge voltage just slightly below your charger's absorption voltage, which ensures the monitor resets to 100% whenever you fully recharge.

If you don't fully recharge periodically to trigger the 100% reset, the errors will keep accumulating and the monitor will get less accurate over time. Many monitors have a way to manually reset the reading to 100%.
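A minimal sketch of the coulomb-counting and reset logic described above, assuming a simple fixed-interval sampler (the efficiency figures and thresholds are illustrative assumptions, not the defaults of any particular monitor):

```python
# A toy coulomb counter. Positive current = charging, negative = discharging.
# The efficiency figures and thresholds below are illustrative assumptions,
# not defaults from any particular battery monitor.

def step(soc_ah, capacity_ah, current_a, dt_h, voltage_v,
         charge_eff=0.85,        # typical lead-acid assumption; ~0.99 for LFP
         charged_voltage=55.0,   # "full" voltage, set just below absorption
         tail_current_a=4.8):    # e.g. 1% of a 480 Ah bank
    """Advance the state-of-charge estimate by one sample interval."""
    if current_a >= 0:
        # Charging: only part of the current is stored as usable charge.
        soc_ah += current_a * dt_h * charge_eff
    else:
        # Discharging: count the full current out. A Peukert correction
        # could scale this up for lead-acid at high discharge rates.
        soc_ah += current_a * dt_h

    # Periodic 100% sync: voltage at/above the charged setpoint while the
    # charge current has tapered below the tail current clears accumulated
    # error by snapping the estimate back to full.
    if voltage_v >= charged_voltage and 0 <= current_a <= tail_current_a:
        soc_ah = capacity_ah

    return max(0.0, min(soc_ah, capacity_ah))

# Example: a 480 Ah bank discharging at 35 A for one 10-second sample.
soc = step(soc_ah=400.0, capacity_ah=480.0, current_a=-35.0,
           dt_h=10 / 3600, voltage_v=51.0)
print(f"SoC estimate: {soc:.2f} Ah")
```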

'Smart' monitors have algorithms that try to set the normally manual parameters by watching the battery's maximum and minimum voltages and their rate of change for the given battery type. They will initially be very inaccurate until they have had enough time to experience the various states of charge. Personally, I would rather set the parameters myself, but smart monitors are trying to simplify setup for inexperienced users.
 
Actually those voltages are configurable in the inverter: Bulk Charging Voltage (was 57.5V, increased yesterday to 58.4V) and Floating Charging Voltage (55V). Solar can charge up to 80A and the grid up to 40A. I see that solar does hit the bulk charging voltage, but the grid is always limited to 55V. There is an option in the inverter for "battery stop charging voltage when grid is available", which is set to 55V; maybe that needs to match solar...
You want to leave the grid at 55v, or it will do all the charging. The idea is for the grid to just get you by while you are waiting for the sun to come back up. If you set the grid higher, it will do the bulk of the charging and your solar will be wasted. Of course, that might be a good option to get you topped off short term, but I would not leave it that way.
 
You likely have not set the battery capacity in setup. There is also a setting for the peak charge voltage at which the monitor resets itself to 100%, and a setting for charge efficiency. For LFP batteries, set charge efficiency to 99%. Many monitors default charge efficiency to about 85%, which is appropriate for lead-acid batteries; some just default it to 100%. Some monitors also apply a Peukert correction, a fudge factor on the measured discharge current based on its level relative to the battery Ah capacity you set, to account for higher internal battery losses at high discharge currents.

The way a coulomb counter works is that it cumulatively sums discharge current and charge current over each time increment, with some corrections to the measured current for charge and discharge efficiency.

All this summation accumulates some error over time. For example, your charge efficiency factor may not be exactly correct, and charge efficiency varies with the battery's state of charge at the point where charge current is applied.

A lead-acid battery, for example, has better charge efficiency below about 80% state of charge and poorer recharge efficiency above 85% state of charge, where some of the recharge energy goes into making oxygen and hydrogen gas by electrolysis of the electrolyte water.

LFP cell recharge and discharge losses are caused by internal battery impedance and are very low: about 99.6% efficient at current rates below 0.2C, dropping to about 97% at a 0.5C rate. As LFP cells age and accumulate use, these numbers degrade due to higher internal cell losses.

To periodically clear the cumulative error, the battery must be fully charged up to the voltage you set as the full charge point. At that voltage the monitor clears the summation and resets the counter to 100% full. Usually you set the full charge voltage just slightly below your charger's absorption voltage, which ensures the monitor resets to 100% whenever you fully recharge.

If you don't fully recharge periodically to trigger the 100% reset, the errors will keep accumulating and the monitor will get less accurate over time. Many monitors have a way to manually reset the reading to 100%.

'Smart' monitors have algorithms that try to set the normally manual parameters by watching the battery's maximum and minimum voltages and their rate of change for the given battery type. They will initially be very inaccurate until they have had enough time to experience the various states of charge. Personally, I would rather set the parameters myself, but smart monitors are trying to simplify setup for inexperienced users.

These are my settings and it gets to 100% automatically at some point when it meets the 3 criteria (charged voltage, tail current and CDT):
Battery capacity: 480 Ah
Charged voltage: 55 V
Discharge floor: 50%
Tail current: 1%
Charged detection time: 5
Peukert exponent: 1.25
Charge efficiency factor: 90%
Current threshold: 0.1
Time-to-go averaging period: 3

I think I got those right? Float voltage is now around 55.3-55.4V on the inverter, and the current goes down to 2-3A at some point when "fully" charged.
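As a quick sanity check on those numbers (a sketch based on my reading of how these shunts sync to 100%; treat the exact rule as an assumption):

```python
# Settings from the post above, plus the observed "full" condition.
capacity_ah = 480
charged_voltage_v = 55.0
tail_current_pct = 1.0           # of capacity

tail_current_a = capacity_ah * tail_current_pct / 100    # 4.8 A

observed_v, observed_a = 55.3, 2.5   # float voltage and taper current seen

voltage_ok = observed_v >= charged_voltage_v
tail_ok = 0 <= observed_a <= tail_current_a
print(f"tail-current threshold: {tail_current_a:.1f} A")
print(f"sync conditions met: voltage={voltage_ok}, tail current={tail_ok}")
# Both True -> after the charged-detection time elapses the monitor
# should reset itself to 100%, which matches what you are seeing.
```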
 
You want to leave the grid at 55v, or it will do all the charging. The idea is for the grid to just get you by while you are waiting for the sun to come back up. If you set the grid higher, it will do the bulk of the charging and your solar will be wasted. Of course, that might be a good option to get you topped off short term, but I would not leave it that way.
Aha, that's a very good point. I could change that if we're going through several days of rain, then change it back once the storm passes.

The grid is only available for maybe 2-4 hours a day (best-case scenario), though, so I rely on backup generators for charging the batteries when needed. So matching the voltage with PV so that the grid does the bulk charging might still be a good idea?
 
If you only have grid 2-4 hours a day, I would set it to charge the whole time (or at least until the batteries are full) if the grid power is cheaper than the fuel for the generator.
 
With 480 Ah of lead-acid batteries you should not be having problems with 35 amps of load current.

If the batteries are fully charged and in good condition, you should easily get 7 hours of run time at 70% discharge.
I derated the rated Ah capacity to 75% (360 Ah) for a 35 amp load (100% would apply at the 24 amp, 20-hour discharge rate), then used 70% of that as usable capacity: 360 Ah × 0.7 = 252 Ah, and 252 Ah / 35 A = 7.2 hrs.
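The same estimate as a quick calculation, just restating the derate and depth-of-discharge assumptions above:

```python
rated_ah = 480            # 20-hour rate
derate_at_35a = 0.75      # effective capacity at ~35 A vs the 24 A / 20 h rate
usable_fraction = 0.70    # stop at 70% depth of discharge
load_a = 35

usable_ah = rated_ah * derate_at_35a * usable_fraction    # 252 Ah
print(f"usable: {usable_ah:.0f} Ah -> ~{usable_ah / load_a:.1f} h at {load_a} A")
```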

Check their specific gravity at full state of charge. Unless you have a bad connection somewhere, the batteries are in very poor condition.
 