What is the limiting factor in charging large battery arrays?

JakAHearts · New Member · Joined Sep 27, 2023 · Messages: 33 · South Carolina
I've been researching and learning a ton, but I can't quite find the information (or I just don't understand what I'm reading) on this subject. I recently used a calculator that claimed one could charge six 48V 100Ah batteries with an 8,000W PV system in around 3.5 hours of peak sun. Since I assume one just can't charge a battery as fast as possible... let's say I have a large-wattage array of panels: what is the limiting factor in how fast you can charge a battery? Is it just the battery characteristics? The output of the inverter or MPPT?
 
It's multiple things.

1: How much energy you collect with the panels
2: How much of that energy the MPPT can accept/convert
3: How much of the accepted/converted energy the charge controller can push to the batteries.
4: How much energy the batteries can safely absorb over a period of time.

For example:

If you produced 8000W, got exactly that much all the way through the MPPT to the charge controller, and assume the charge controller can convert 100% of that energy to 48V for battery charging, you're talking about roughly 166 amps (8000 / 48 ≈ 166).

6x 100Ah 48V batteries make for a 600Ah battery bank. Pushing 166 amps into your bank would typically be distributed evenly across all 6 batteries at about 27.6 amps per battery.

If each battery were 100Ah and completely empty, it would take about 3.6 hours at a constant 27.6 amps to charge it from zero to 100Ah.

There are a lot of oversimplifications in what I provided. For one, you never run your batteries to zero; a 20% floor is typical for LiFePO4, so technically you're starting with the battery already at 20 of its 100Ah, which means the charge time would be less. I also assumed 100% efficiency, with all 8000 watts passed straight to the batteries, which is never true: the panels don't perform at 100%, the MPPT doesn't perform at 100%, the inverter needs power to make power, and the charge controller never converts 100% of the energy it takes in. Each component needs some energy or has some loss in its role.
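To put that arithmetic in one place, here is a minimal sketch (plain Python, my own, not from any poster) that folds in a single overall efficiency factor and the 20% floor. The 0.85 efficiency figure is a placeholder assumption, not a measured number:

# Rough charge-time estimate for the example above.
# Assumption (mine): one efficiency factor lumps together panel, MPPT, and charge-controller losses.
PV_WATTS = 8000          # array output at peak sun
BANK_VOLTS = 48          # nominal battery voltage
BATTERIES = 6
CAPACITY_AH = 100        # per battery
SOC_FLOOR = 0.20         # 20% floor typical for LiFePO4
EFFICIENCY = 0.85        # assumed overall conversion efficiency (placeholder)

charge_amps = PV_WATTS * EFFICIENCY / BANK_VOLTS       # total amps into the bank
amps_per_battery = charge_amps / BATTERIES             # split evenly across 6 batteries
ah_to_refill = CAPACITY_AH * (1 - SOC_FLOOR)           # 80 Ah when starting from a 20% floor
hours = ah_to_refill / amps_per_battery

print(f"{charge_amps:.0f} A total, {amps_per_battery:.1f} A per battery, "
      f"~{hours:.1f} h from a 20% floor to full")

With the losses folded in, the per-battery current drops from the ideal 27.6 A to about 23.6 A, and the refill time from a 20% floor comes out around 3.4 hours.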

As the other posters have mentioned, a 0.25C rate is preferable for most consumer-grade 100Ah 48V batteries. What is a C rate, you might ask? Simple: it's the current you're pushing into (or pulling out of) the battery divided by the battery's rated capacity.

For example, a 100Ah battery charging at 27.6 amps is at roughly 0.28C: take 27.6 / 100 and that gives you the C rate. Typically 0.25C or lower is best for most LiFePO4 batteries, both charging and discharging.
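The C-rate calculation is just a division; here is a tiny sketch of it (Python; the function name is mine, not anything standard):

# C-rate: charge (or discharge) current divided by rated capacity in Ah.
def c_rate(current_a: float, capacity_ah: float) -> float:
    return current_a / capacity_ah

print(c_rate(27.6, 100))   # ~0.28C, the per-battery figure from the example
print(c_rate(166.0, 600))  # same thing at the bank level, also ~0.28C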
 
It's the battery characteristics, primarily. The LiFePO4 prismatic cells used in most typical consumer-grade solar storage batteries are recommended to be charged at 0.2C up to 0.5C, with the lower end of that range being best practice. For a 100Ah battery, that translates to 20A to 50A.

There are specialty batteries that are designed for higher currents and wider operating temperature ranges.
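For completeness, the 0.2C to 0.5C range above maps to charge current like this (a trivial sketch, same arithmetic as the earlier C-rate example):

# Convert a recommended C-rate range into charge current for a given capacity.
CAPACITY_AH = 100
for c in (0.2, 0.5):
    print(f"{c}C on {CAPACITY_AH} Ah -> {c * CAPACITY_AH:.0f} A")
# 0.2C -> 20 A, 0.5C -> 50 A, matching the numbers above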
 
PV output is my limiting factor 80% of the time.

150kWh battery, 32kW of PV, and MPPT/SCC output totaling 490A.

My batteries could handle a lot more than the 45A per battery that's typical here.
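A quick back-of-envelope on those figures, assuming a 48V-nominal bank (my assumption; the post doesn't say), shows the charge controllers rather than the batteries capping the charge rate at roughly 0.16C:

# Back-of-envelope check on the numbers above.
# Assumption (mine, not stated by the poster): a 48V-nominal bank.
BANK_KWH = 150
BANK_VOLTS = 48
SCC_AMPS = 490            # combined MPPT/SCC output

bank_ah = BANK_KWH * 1000 / BANK_VOLTS          # ~3125 Ah
bank_c_rate = SCC_AMPS / bank_ah                # ~0.16C at full SCC output
scc_kw = SCC_AMPS * BANK_VOLTS / 1000           # ~23.5 kW of charge power

print(f"{bank_ah:.0f} Ah bank, ~{bank_c_rate:.2f}C max charge, {scc_kw:.1f} kW into the bank")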
 

Thanks! I figured as much but wasn't sure. So basically, no matter how big or powerful a system is, the battery is limited by how fast you're supposed to charge it to make it last. Obviously you can't charge a battery from 20% to 100% in 5 minutes and expect a prolonged lifespan.

This is off the subject, kind of, but I can only fit 20-22 panels on my roof. Looking at your numbers, 22 450-watt panels should put me around optimal charging with losses and such, right? Obviously that would only be during perfect sunlight angles, etc. I'm just trying to make sure I don't get too small an array while also not wasting money by over-paneling. My household uses very little power during daylight hours and a lot from 4-9pm.
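If it helps to check the numbers, here's the same arithmetic applied to 22 x 450W panels, reusing the 6-battery 48V 100Ah bank from earlier and the same placeholder 0.85 efficiency (both assumptions on my part, not measured values):

# Sanity check on the proposed roof array.
PANELS = 22
PANEL_WATTS = 450
EFFICIENCY = 0.85          # placeholder system efficiency (assumption)
BANK_VOLTS = 48
BATTERIES = 6
CAPACITY_AH = 100

array_watts = PANELS * PANEL_WATTS                    # 9900 W nameplate
amps_per_battery = array_watts * EFFICIENCY / BANK_VOLTS / BATTERIES
c_rate = amps_per_battery / CAPACITY_AH

print(f"{array_watts} W array -> {amps_per_battery:.1f} A per battery (~{c_rate:.2f}C)")

At peak sun that lands around 0.29C per battery, just above the 0.25C figure discussed above, and real-world losses will usually pull it lower.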
 
Over-paneling is actually a good thing. Remember, in the mornings and evenings your panels produce less, and the additional panels will give you more power, extending your usable PV generation time. That's why many inverters list the ability to handle more PV than they can convert into energy for the batteries or the household: for the reason I stated above, and to compensate for losses.

As an example, I have a Sol-Ark 15K with 17kW of panels attached to it. In the mornings and evenings I get a little boost from the additional panels, giving me about another full hour of higher energy production than if I had only put 15kW of panels on the house. It also ensures that at peak solar time I actually get 15kW of solar, because, again, the panels aren't 100% efficient. Panels are relatively cheap compared to other system components, at around $400 or less; I have 550W panels, so in my opinion, for another $1,600 it was worth it to over-panel the system and get more energy: more at peak in the daytime and more in lower-light times of day.
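Here's a toy sketch of the clipping effect being described; the hourly irradiance curve and both array sizes are made-up illustration numbers, not measurements from the Sol-Ark system:

# Toy illustration of over-paneling (all numbers invented for the sketch).
# "irradiance" is the fraction of nameplate the array actually produces each hour.
INVERTER_PV_LIMIT_W = 15000
hourly_irradiance = [0.10, 0.30, 0.55, 0.75, 0.90, 0.95, 0.90, 0.75, 0.55, 0.30, 0.10]

def daily_kwh(array_watts: float) -> float:
    # Output each hour is the array's production, clipped at what the inverter will accept.
    return sum(min(array_watts * irr, INVERTER_PV_LIMIT_W)
               for irr in hourly_irradiance) / 1000

print(f"15 kW array: {daily_kwh(15000):.1f} kWh/day")
print(f"17 kW array: {daily_kwh(17000):.1f} kWh/day")
# The larger array gains energy in the low-sun hours even though midday output is clipped.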


Here is an older article from Victron; it gives some of the basic reasons why you should over-panel.

 