How to handle Surplus PV Power?

Keep in mind... the mppt doesn't SEND power...

Loads DRAW power
The mppt will sit there and allow it to be pulled.

The battery will only pull what it needs and can use.

Unless the battery is pretty big, 30A isn't going to flow into it, just 0.2C, so a 10Ah battery will pull 2A max, etc...
If a battery draws/pulls power, then I don't fully understand why connecting two batteries at different SOCs in parallel can end badly (I've never witnessed this personally, but I've been warned about parallel battery connections in many videos, forum posts, articles, etc.). Why would the battery at low SOC pull more current than it can handle?
 
There are several misconceptions in this thread. First off, MPPT means Maximum Power Point Tracking. It is incorrect to keep using it when what you mean is a solar charge controller, or SCC. An SCC can be either PWM or MPPT in terms of how it takes PV power in and converts it to the right voltage for the battery and the common DC bus.

Batteries are chemical devices that develop electricity and are used as storage, so it is important to understand them. They are loads when they need charging and supplies when the common DC bus cannot be maintained by the SCC. There is an excellent resource for understanding batteries over at Battery University. https://batteryuniversity.com/

Loads drive supply. Insufficient supply means loads do not function properly or at all.
 
If a battery draws/pulls power, then I don't fully understand why connecting two batteries at different SOCs in parallel can end badly (I've never witnessed this personally, but I've been warned about parallel battery connections in many videos, forum posts, articles, etc.). Why would the battery at low SOC pull more current than it can handle?

When you connect 2 batteries at different voltages (see note below), you run the risk of high surge currents on the wires. Depending on the batteries, this can be in excess of 500 amps. Depending on your wiring, this can lead to fires, arcing, and other issues. Additionally, depending on those currents, you may damage the batteries, because you can far surpass the rated charge/discharge rate of the individual cells. The batteries will try to "equal out", so until they reach the same voltage, that high amperage will be seen on the wires or busbars.
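
For a rough feel for where numbers like that come from, here is a back-of-the-envelope sketch; the internal resistance and wiring resistance figures are just illustrative guesses, not measurements of any particular setup:

```python
# Rough estimate of the initial surge current when two batteries at
# different voltages are paralleled. All values below are illustrative
# guesses, not measurements of any particular battery or cable run.
v_high = 13.4             # volts, battery near full charge
v_low = 12.9              # volts, battery at a low state of charge
r_internal_each = 0.0005  # ohms per battery (big LiFePO4 packs can be under a milliohm)
r_wiring = 0.0005         # ohms of cable, lugs, and busbar between them

# Ohm's law around the loop: the only thing limiting the current is the
# sum of the resistances in the path.
surge_amps = (v_high - v_low) / (2 * r_internal_each + r_wiring)
print(f"Initial surge current: {surge_amps:.0f} A")  # ~330 A with these guesses
```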


(You'll notice I said voltage, not SOC. SOC doesn't necessarily matter, because different battery chemistries have different voltage characteristics over the charge/discharge cycle. As such, it's possible to have two batteries, one at 80% SOC and one at 20% SOC, that still sit at similar voltages. In that case, it is the voltage that matters when connecting them.)
 
When you connect 2 batteries at different voltages (see note below), you run the risk of high surge currents on the wires. Depending on the batteries, this can be in excess of 500 amps. Depending on your wiring, this can lead to fires, arcing, and other issues. Additionally, depending on those currents, you may damage the batteries, because you can far surpass the rated charge/discharge rate of the individual cells. The batteries will try to "equal out", so until they reach the same voltage, that high amperage will be seen on the wires or busbars.
Point taken regarding Voltage vs SOC. But the question remains... if batteries pull/draw then why is there a risk of high current? Wouldn't they only draw what they can use? I'm pretty novice at all of this, but I think that batteries don't "draw" power. Current moves from high voltage to low voltage. This is why applying 1V to a battery at 12V won't charge it. I imagine connecting a 12V battery with C@200A to a 24V battery with C@100A wouldn't charge the 24V battery either.
 
Voltage is how you determine SOC. Capacity is the amount of energy a battery can provide. Batteries of the same voltage in parallel can have different capacities, but their state of charge will be approximately equal. There is no shortage of voltage-based SOC charts for each battery chemistry on the 'net.

SOC is not a measurement of capacity, though if your battery is in good condition it serves as an approximation based on the battery's ratings.
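
As a rough illustration, here is what a voltage-to-SOC chart for a 12V LiFePO4 looks like in code; the resting-voltage figures are ballpark values that vary between sources and batteries:

```python
# Very rough resting-voltage -> SOC points for a 12V LiFePO4 battery.
# Ballpark chart values only; check your battery maker's own chart.
soc_chart = [
    (13.6, 100), (13.4, 99), (13.3, 90), (13.2, 70),
    (13.1, 40), (13.0, 30), (12.9, 20), (12.8, 10), (12.0, 0),
]

def approx_soc(resting_voltage):
    """Return the SOC of the first chart point at or below the given voltage."""
    for volts, soc in soc_chart:
        if resting_voltage >= volts:
            return soc
    return 0

print(approx_soc(13.5))  # 99 (the flat middle of the LiFePO4 curve makes this imprecise)
print(approx_soc(12.9))  # 20
```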

The resource I linked, Battery University, is something you should explore. Good fortune.
 
There are several misconceptions in this thread. First off, MPPT means Maximum Power Point Tracking. It is incorrect to keep using it when what you mean is a solar charge controller, or SCC. An SCC can be either PWM or MPPT in terms of how it takes PV power in and converts it to the right voltage for the battery and the common DC bus.

I used MPPT because I was assuming that's the device I would use in the above scenario, and I was hoping for a correction if I needed a different type of device. After watching Will's video on solar without a battery, I realized a DC-to-DC converter might be a better choice. Perhaps a PWM is the best choice for this scenario? I only have a blurry idea of how they work and don't know how one would help in the above scenario.

After digging through Battery University quite a bit, I'm still having trouble connecting the dots. Perhaps answers to these questions will help...

I have 2 batteries.
A: 12V 100Ah LiFePO4 currently reading 13.6V
B: 12V 200Ah LiFePO4 currently reading 13.5V

If I connect these two batteries positive to positive and negative to negative, I believe current will move from A to B until the voltages equalize. Is this correct?

I have 2 different batteries.
A: 12V 100Ah LiFePO4 currently reading 12.9V
B: 12V 100Ah LiFePO4 currently reading 13.6V

This is a larger voltage difference than the previous scenario, larger than I've seen recommended for connecting in parallel, so I shouldn't connect them. But if I did anyway, and assuming nothing melts or explodes, current should move from B to A until the voltages equalize. Correct?

Finally....
I have a 12V 100Ah LiFePO4 battery currently reading 12.9V. I also have a solar panel reading 14.4V and 100A. If I connect these two things, what bad things will happen, if any?
 
I've been warned several times in several places to hook up the battery before the panels. This seems contrary to that. If the MPPT controller doesn't have a battery to read, how will it be able to do its fancy math to determine the proper power point?

The "hook up the battery before the panels" thing is controversial. Some say it's necessary and some say not. Some say they've hooked up panels first for years without issue and then one day it fried their MPPT. Read your owner's manual and follow the instructions there.

As for inverters that don't need a battery, I believe most of these are grid tied. The grid acts as a battery should a cloud block the sun for a bit.

Inverters need constant voltage, which a battery can provide, but an MPPT can only do so when the sun isn’t blocked by tree shade or clouds. Some inverters deal better with inconsistent voltage than others.
 
Reading a bit more, I think I finally had that "light bulb" moment...

Voltage dictates which direction current will flow -- from high voltage to low voltage. The component receiving the current determines how much it will try to pull. The component supplying the current determines how much is available. Is that right?

So, considering my original scenario (with a few more details)...

I have a nearly empty 12V 18Ah battery at 12.9V. The safe charging rate is 0.2C, or 3.6A.
I also have a 150V solar array at 10A going into an MPPT controller (specifically an MPPT, not a PWM charge controller). The MPPT controller will output 14.4V at up to 104A. Because the battery's voltage is lower than what the MPPT controller provides, the battery will draw current. However, the battery has very low impedance and will therefore likely pull all 104A available, which is unsafe.

If my understanding above is correct, are there SCCs available that will limit current to a value I set? Or, what devices are available to place between the MPPT controller and the battery to limit the current available?
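
Checking my arithmetic with a quick sketch (this assumes a lossless controller, which it won't quite be):

```python
# Quick check of the numbers above, assuming a lossless MPPT conversion.
array_voltage = 150.0        # volts at the array's maximum power point
array_current = 10.0         # amps from the array
charge_voltage = 14.4        # volts at the battery terminals while charging
battery_capacity_ah = 18.0
safe_c_rate = 0.2

pv_power = array_voltage * array_current               # 1500 W
available_amps = pv_power / charge_voltage              # ~104 A at 14.4 V
safe_charge_amps = safe_c_rate * battery_capacity_ah    # 3.6 A

print(f"{available_amps:.0f} A available vs {safe_charge_amps:.1f} A safe")
```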
 
Point taken regarding Voltage vs SOC. But the question remains... if batteries pull/draw then why is there a risk of high current? Wouldn't they only draw what they can use? I'm pretty novice at all of this, but I think that batteries don't "draw" power. Current moves from high voltage to low voltage. This is why applying 1V to a battery at 12V won't charge it. I imagine connecting a 12V battery with C@200A to a 24V battery with C@100A wouldn't charge the 24V battery either.
A battery at a low state of charge will take as much current as the supply can provide when the supply voltage is above the battery's voltage, limited only by the battery's internal resistance.

Just as a piece of wire will draw as much current as the battery can provide when the wire has little resistance...
 
Reading a bit more, I think I finally had that "light bulb" moment...

Voltage dictates which direction current will flow -- from high voltage to low voltage. The component receiving the current determines how much it will try to pull. The component supplying the current determines how much is available. Is that right?

So, considering my original scenario (with a few more details)...

I have a nearly empty 12V 18Ah battery at 12.9V. The safe charging rate is 0.2C, or 3.6A.
I also have a 150V solar array at 10A going into an MPPT controller (specifically an MPPT, not a PWM charge controller). The MPPT controller will output 14.4V at up to 104A. Because the battery's voltage is lower than what the MPPT controller provides, the battery will draw current. However, the battery has very low impedance and will therefore likely pull all 104A available, which is unsafe.

If my understanding above is correct, are there SCCs available that will limit current to a value I set? Or, what devices are available to place between the MPPT controller and the battery to limit the current available?
In this scenario, the SCC will have 3 charge settings: bulk, absorb, and float. Each setting does something different.

Bulk is the first mode, which (while not entirely accurate, I'll simplify) will match your current battery voltage (actually, slightly above it) and allow current to flow. The amount of current will be up to a setting you adjust, or the maximum that the SCC/array can deliver (based on hardware limitations, available energy from the sun, etc.). Eventually, the internal resistance of the battery will "slow the charge" down as it becomes more fully charged, at which point the SCC will usually switch to absorb mode.

So yes, if your small battery can't handle the max charge rate available, you'll want to change the setting to limit it.
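
Here is a very simplified sketch of that bulk/absorb/float logic with a user-set current limit applied; the voltages, tail current, and limit are example settings, not any particular SCC's defaults:

```python
# Very simplified three-stage charge logic. Real SCCs also use absorb
# timers, temperature compensation, re-bulk thresholds, and so on.
ABSORB_VOLTAGE = 14.4      # example setting for a 12V LiFePO4 bank
FLOAT_VOLTAGE = 13.6       # example setting
TAIL_CURRENT = 1.0         # amps; "battery is basically full" signal
USER_CURRENT_LIMIT = 3.6   # amps, e.g. 0.2C of an 18Ah battery

def charge_step(stage, battery_voltage, battery_current, available_current):
    """One control step: pick the next stage and the charge current to allow."""
    if stage == "bulk" and battery_voltage >= ABSORB_VOLTAGE:
        stage = "absorb"     # target voltage reached, hold it and let current taper
    elif stage == "absorb" and battery_current <= TAIL_CURRENT:
        stage = "float"      # current has tapered off, drop to the float voltage
    target = FLOAT_VOLTAGE if stage == "float" else ABSORB_VOLTAGE
    allowed = min(available_current, USER_CURRENT_LIMIT)
    return stage, (allowed if battery_voltage < target else 0.0)

print(charge_step("bulk", 12.9, 0.0, 104.0))  # ('bulk', 3.6)
```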


*Edited to add*:

BTW, when people talk about sizing a battery system for solar, they usually only think about "how much capacity do I need to provide X power for Y amount of time". However, they often forget "what is the safe charge and discharge rate of the battery" as a criterion. Luckily, the "provide X power over Y time" calculation usually makes the bank big enough that the safe charge/discharge rates fall within acceptable limits.
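
Here is a quick sketch of both checks side by side; the load, runtime, and C-rate numbers are made-up examples:

```python
# Two sizing checks side by side: enough capacity for the loads, and a
# charge source that stays within a safe C-rate. Numbers are examples.
load_watts = 500.0         # average load to support
runtime_hours = 10.0       # how long it must run
system_voltage = 12.0
safe_charge_c_rate = 0.5   # conservative figure; check your battery's datasheet
charge_source_amps = 30.0  # what the SCC can push into the bank

required_ah = load_watts * runtime_hours / system_voltage  # ~417 Ah
max_safe_charge_amps = required_ah * safe_charge_c_rate    # ~208 A

print(f"Need roughly {required_ah:.0f} Ah of capacity")
print("Charge rate OK" if charge_source_amps <= max_safe_charge_amps else "Bank too small for this charger")
```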

Using your example scenario above, there are other reasons you likely wouldn't want an overly small battery. As an example, I have a Schneider system, which recommends a minimum of 440Ah of 48V batteries per inverter. One reason is that as the inverter ramps up to satisfy the power demands of the loads, it is drawing in available current from the SCCs. However, if a large load shuts off suddenly, it takes a split second for that inverter (and the SCCs) to ramp down accordingly. In the meantime, that excess power needs to go somewhere, which often shows up as a large spike in charge current to the batteries. With a small battery (18Ah, in your example), that would again far exceed the safe charge/discharge currents.
 
It is true that loads draw power, but when you are dealing with a current-limited and voltage-limited device (an MPPT controller) and you are bumping into that current limit, you might describe it as feeding or sending. Sorry if I confused you there. As mentioned, without a limiting device, a battery suddenly connected in parallel will draw as much as it can, as much as the source can provide, based on its internal resistance, if its state of charge/voltage level allows it to take the current in. So it is possible for a battery or cell to take in too much power and damage itself.

The same goes for DC-to-DC converters without current limiting: you have to use the voltage setting to limit the current or they will blow. I've managed to blow all of mine pushing their limits, so now I only use converters that have both current and voltage limiting, even though they may be a little less efficient.

On the DC load terminals (of an MPPT controller) you can program the current limit to a lower setting, which can be very handy.

I also assumed that the SCC would be a (Bluetooth) MPPT and not a PWM. MPPT controllers are now cheap enough, so incredibly useful, and so often typed about that the terms get used interchangeably and as shorthand. I'd discourage any DIYer from using PWM unless you already have the equipment and no funds to upgrade. MPPT offers voltage flexibility, high conversion efficiency, quick and easy Bluetooth programmability, an adjustable current limit on the DC load output, and of course maximum power point tracking. Simply amazing gear. PWM and lead-acid batteries are arguably a false economy at this point.
 
A 4.8kW load sounds like the new hydro Antminer? (They might come with a low-power mode as well, for the ☁️ days.)
 
The "hook up the battery before the panels" thing is controversial. Some say it's necessary and some say not. Some say they've hooked up panels first for years without issue and then one day it fried their MPPT. Read your owner's manual and follow the instructions there.
Many MPPT controllers require the battery first (all of my ipanda ones do) because the controller is automatically sensing for a 12V or 24V battery. So if you connect 36V panels first, followed by a 12V battery, it can confuse the system.
 
If my understanding above is correct, are there SCCs available that will limit current to a value I set? Or, what devices are available to place between the MPPT controller and the battery to limit the current available?
Yes, you have it all correct. A 100A MPPT is "large", but as mentioned you could limit the charging current to 3 or 4 amps, which is about 0.2C in this case, to accommodate your 18Ah battery. You can also connect a load to the DC load terminals to use more of the available PV power coming in, but that output will probably be limited to 30 amps. This is a screenshot from the settings of a 60A MPPT:
 

Attachment: Screenshot_20230912_142023.jpg
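
Putting rough numbers to that split, using the 1500W array and 18Ah battery from your scenario, a 3.6A charge limit, and an assumed 30A cap on the load output:

```python
# How the ~1500 W array could be split with a 3.6 A charge limit and a
# 30 A cap on the controller's DC load output, all on a 14.4 V bus.
bus_voltage = 14.4
pv_power = 150.0 * 10.0            # watts available from the 150 V / 10 A array
charge_power = 3.6 * bus_voltage   # ~52 W into the 18Ah battery at 0.2C
load_power = 30.0 * bus_voltage    # ~432 W max through the load terminals

unharvested = pv_power - charge_power - load_power
print(f"Battery: {charge_power:.0f} W, loads: {load_power:.0f} W, "
      f"left unharvested on the panels: {unharvested:.0f} W")
```
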
Voltage dictates which direction current will flow -- from high voltage to low voltage. The component receiving the current determines how much it will try to pull. The component supplying the current determines how much is available. Is that right?
All components in an electrical circuit determine what happens with the direction and quantity of current flow. Attempting to isolate one component ("how much it will try to pull") is meaningless. It needs to be considered in the context of the entire circuit.

Solar charge controllers are called controllers because they have the ability to adjust the circuit's parameters to control when and how much current flows in the circuit. You choose the voltage range within which they will charge and you set thresholds on how much current they can maximally deliver. They will adapt the circuit's parameters accordingly.

Batteries also often come with a battery management system, which provides another layer of protection, both in terms of limiting current flow to a safe level if needed, or terminating the flow of current in one direction or the other if voltages fall outside of set thresholds. This is why I can connect two of my batteries, one nearly completely discharged, to one fully charged. The BMS automatically recognises the "danger" and places a limit on the current which can flow in the circuit. It's not recommended practice of course, but many modern well-built batteries have these capabilities.
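
As a toy illustration of the kind of checks such a BMS performs; the threshold values are placeholders, not any specific BMS's settings:

```python
# Toy BMS-style protection check: block charging/discharging if any
# measurement is outside its window. Thresholds are placeholders only.
LIMITS = {
    "cell_voltage": (2.5, 3.65),   # volts per LiFePO4 cell
    "charge_current": 50.0,        # amps into the pack
    "discharge_current": 100.0,    # amps out of the pack
    "temperature_c": (0.0, 55.0),  # charging temperature window
}

def bms_allows(cell_voltages, current_amps, temp_c):
    """current_amps: positive = charging, negative = discharging.
    Return False (open the FETs) if anything is out of bounds."""
    lo_v, hi_v = LIMITS["cell_voltage"]
    lo_t, hi_t = LIMITS["temperature_c"]
    if any(v < lo_v or v > hi_v for v in cell_voltages):
        return False
    if current_amps > LIMITS["charge_current"] or -current_amps > LIMITS["discharge_current"]:
        return False
    return lo_t <= temp_c <= hi_t

print(bms_allows([3.30, 3.31, 3.29, 3.30], current_amps=120.0, temp_c=25.0))  # False
```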
 