Quick question regarding voltage/current

kentdavidge

If I set the charging voltage to 14.2 V on an MPPT charge controller, and then measure the voltage across the terminals of my 12 V LiFePO4 battery, will it read 14.2 V? Also, will the rate at which energy enters the battery be 14.2 V times whatever current the MPPT outputs?
 
If I set the charging voltage to 14.2 V on an MPPT charge controller, and then measure the voltage across the terminals of my 12 V LiFePO4 battery, will it read 14.2 V? Also, will the rate at which energy enters the battery be 14.2 V times whatever current the MPPT outputs?

Depends.

First, the MPPT and your meter would have to measure the same voltage in exactly the same way. That's pretty rare, as all instruments have some error.

When current is flowing, there is a voltage drop between them set by the wire and connection resistance (V_drop = I × R).
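To put a number on it, here's a minimal sketch in Python, assuming a 2 m one-way run of 10 AWG wire, some lug/breaker contact resistance, and a 20 A charge current (all illustrative values, not anyone's actual setup):

```python
# Rough voltage-drop estimate: V_drop = I * R over the round-trip path.
AWG10_OHMS_PER_M = 0.00328   # approx. resistance of 10 AWG copper, ohms/m

current_a = 20.0             # assumed charge current, A
wire_length_m = 2.0          # assumed one-way run from SCC to battery, m
connection_ohms = 0.001      # assumed total lug/breaker contact resistance

# Round trip: out on the positive wire, back on the negative.
r_total = 2 * wire_length_m * AWG10_OHMS_PER_M + connection_ohms
v_drop = current_a * r_total

print(f"Total resistance: {r_total * 1000:.1f} mOhm")
print(f"Drop at {current_a:.0f} A: {v_drop:.2f} V")
# -> about 0.28 V, so 14.2 V at the SCC reads roughly 13.9 V at the battery
```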

With the application of current, the battery voltage will rise, but it will only rise as high as that amount of current demands, and that's dictated by the SoC of the battery.

The voltage you measure at the battery terminals will always be lower than what you measure at the MPPT terminals if current is flowing.
 
A good test (if you have meter leads long enough) is to measure the voltage drop from the positive lead of the SCC to the positive battery terminal.

If you have any fuse or breaker in the path, it'll measure that voltage drop too. Do you have multiple lugs/bus bars between the two? Each adds some resistance, so there's a higher chance of more voltage drop.
 
Thank you @sunshine_eggo and @740GLE. I already have all the necessary "parts" to measure the current and voltage (including a clamp multimeter), except the wires and breakers. I should get them by next month to finally put my system to work. Meanwhile I'm trying to make a mathematical simulation of the input power when the battery is being charged by the two 280 W solar panels that I bought. I was just setting 14.2 V * 30 A * 0.98 = 417 W since my MPPT is 30 A max and has an efficiency of 0.98. But now (thanks to you guys) I understand this is wrong because the input power decreases as the battery gets charged.
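For that simulation, here's a minimal sketch of the correction, assuming the controller delivers the lesser of what the array can supply and what its 30 A limit allows at the present battery voltage (all numbers illustrative):

```python
# Sketch: SCC output is capped both by the array and by the current limit.
def scc_output_w(pv_available_w, v_batt, i_max=30.0, efficiency=0.98):
    """Power actually delivered to the battery, W (assumed model)."""
    current_limited = v_batt * i_max          # ceiling set by the 30 A limit
    pv_limited = pv_available_w * efficiency  # ceiling set by the array
    return min(current_limited, pv_limited)

# Two 280 W panels rarely make full rating; assume ~85% of STC here.
pv_now = 2 * 280 * 0.85
for v in (12.8, 13.2, 13.6, 14.2):
    print(f"V_batt = {v:4.1f} V -> {scc_output_w(pv_now, v):5.1f} W")
```

With these assumed numbers the 30 A limit is the binding cap throughout bulk, so delivered power actually rises with battery voltage rather than sitting at a fixed 417 W.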
 
The voltage you measure at the battery terminals will always be lower than what you measure at the MPPT terminals if current is flowing.
I have noticed that... not much, but a little. Thanks for explaining why.

J.
 
Thank you @sunshine_eggo and @740GLE. I already have all the necessary "parts" to measure the current and voltage (including a clamp multimeter), except the wires and breakers. I should get them by next month to finally put my system to work. Meanwhile I'm trying to make a mathematical simulation of the input power when the battery is being charged by the two 280 W solar panels that I bought.

Fun with details!

The 280W panels will almost never put out 280W. They'll typically put out 80-90% of rated, because real-world conditions rarely match the STC conditions the rating is based on.

I was just setting 14.2 V * 30 A * 0.98 = 417 W since my MPPT is 30 A max and has an efficiency of 0.98.

The 0.98 applies between the PV terminals and the battery terminals of the MPPT, and it's likely specified for one exact scenario, like 80% of rated output or some such (I have no idea exactly where), so that 0.98 will change depending on the output.

But that doesn't factor in the wiring losses.
  1. V_pv * I_pv = input power at the MPPT's PV terminals
  2. V_mppt * I_batt = output power at the MPPT's battery terminals
  3. V_batt * I_batt = input power at the battery terminals
(1) > (2) > (3). The losses between (1) and (2) heat the MPPT. The losses between (2) and (3) heat the wire.
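A quick loss-accounting sketch with made-up but plausible numbers:

```python
# Where the power goes, using the three measurement points above.
v_pv, i_pv = 36.0, 12.0      # assumed readings at the MPPT's PV input
v_mppt, i_batt = 13.9, 30.0  # assumed readings at the MPPT's battery output
v_batt = 13.7                # assumed reading at the battery terminals

p1 = v_pv * i_pv      # (1) input power at the PV terminals
p2 = v_mppt * i_batt  # (2) output power at the MPPT's battery terminals
p3 = v_batt * i_batt  # (3) input power at the battery terminals

print(f"(1) {p1:.0f} W  (2) {p2:.0f} W  (3) {p3:.0f} W")
print(f"MPPT heat: {p1 - p2:.0f} W, wire heat: {p2 - p3:.0f} W")
# -> (1) 432 W  (2) 417 W  (3) 411 W; 15 W heats the MPPT, 6 W heats the wire
```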

But now (thanks to you guys) I understand this is wrong because the input power decreases as the battery gets charged.

Not necessarily true at all UNTIL the battery attains Absorption voltage. With LFP, this happens very late in the charge. The MPPT will deliver the maximum power the PV can supply, provided the battery can accept it.

Three stage charge profile:

[attached image: three-stage charge profile]

Note that if you actually run the numbers, the above says stage 1 actually INCREASES in input power as the battery takes on charge, because the current is constant while the voltage rises. That's only true if the MPPT can output its maximum rated current through the entire stage. If your PV can't support the MPPT's maximum output, you'll instead see V_batt * I_batt stay roughly constant through the charge: V is lower at the beginning and I is higher, and as V reaches its peak, I reduces such that the power stays constant.

Input power only starts decreasing as the charge becomes voltage limited in stage 2.
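Sketch of that PV-limited case, with an assumed fixed array output:

```python
# PV-limited bulk: delivered power ~constant, so current tapers as V rises.
p_available = 380.0  # assumed array output after losses, W (below the 30 A cap)
for v_batt in (13.0, 13.4, 13.8, 14.2):
    i = p_available / v_batt
    print(f"V = {v_batt:.1f} V  I = {i:.1f} A  P = {p_available:.0f} W")
# Current falls from ~29.2 A to ~26.8 A while power holds at 380 W.
```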
 
I was going to say, the MPPT always outputs very slightly higher than the current battery voltage, not its "final charge voltage."

If the SCC tried to put 14.2V across the terminals of a discharged battery sitting at 12V, the battery would try to draw... many thousands of amps, right? I can't remember the resistance, but IIRC a dead short can hit tens of kA.
And if the charger could actually provide those amps, everything would quickly melt.
Same reason connecting a full battery and an empty battery in parallel gets sparky and melty.
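Back-of-envelope version of that, with an assumed milliohm-class pack resistance (pick your own number; the point is the scale):

```python
# Why a charger can't just hold 14.2 V across a 12 V battery (assumed values).
v_charger = 14.2
v_batt = 12.0
r_internal = 0.001  # assumed total pack + wiring resistance, ohms

i_demanded = (v_charger - v_batt) / r_internal
print(f"Battery would try to draw {i_demanded:.0f} A")  # -> 2200 A
# A real SCC instead limits current (30 A here) and lets its output voltage
# sit just above the battery's present voltage.
```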

So the MPPT is always pushing out the max current it can handle, 30A, at the battery's voltage. When the battery's at 12V, that means only 360W going in; when it's at 14V, it's getting 420W.
 
So the MPPT is always pushing out the max current it can handle, 30A, at the battery's voltage. When the battery's at 12V, that means only 360W going in; when it's at 14V, it's getting 420W.

This last bit is true only if the input to the MPPT can support 30A through the whole charge, and this is rarely the case unless the array is significantly larger than the max power output of the MPPT.
 
I've had my 100/30 3x overpaneled all spring lol, it's 30A from shortly after dawn to shortly before dusk

Assuming that's a Victron... they love to be over-paneled, especially on 12V systems. My 250/100 maxes out at 5800W @ 48V, but the input limits allow for over 10kW of my panels in a 4S8P config.

I'm contemplating supplementing with some E/W arrays as the 285W panels are almost a perfect voltage match with my 3S6P 330W panels, but I'm a long way from needing the 6kW and exceeding the 40kWh the high desert AZ sun can provide daily... :)
 
So the MPPT is always pushing out the max current it can handle, 30A, at the battery's voltage. When the battery's at 12V, that means only 360W going in; when it's at 14V, it's getting 420W.
When charging a LiFePO4 battery with constant current (bulk mode), you'll notice the voltage rise decays roughly exponentially: it quickly goes from 13.0v to, say, 13.8v, then tapers to almost a plateau of little gain, and it seems like the current going into the battery isn't doing anything, but that's just the nature of the LiFePO4 voltage curve. Once you get higher in SOC, the voltage hits its second knee, and over the last 5% of charge it climbs very fast from 13.8v to the 14.xx of your SCC absorption setting.
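A crude piecewise sketch of that two-knee shape (the breakpoints are made up, just to show the plateau):

```python
# Very rough LFP terminal-voltage shape under modest constant current.
def lfp_charge_voltage(soc):
    """Approximate voltage vs state of charge (0..1); illustrative only."""
    if soc < 0.10:   # first knee: fast rise out of the low end
        return 13.0 + (13.3 - 13.0) * (soc / 0.10)
    if soc < 0.95:   # long, nearly flat plateau
        return 13.3 + (13.8 - 13.3) * ((soc - 0.10) / 0.85)
    # second knee: last ~5% climbs quickly toward the absorption setting
    return 13.8 + (14.2 - 13.8) * ((soc - 0.95) / 0.05)

for soc in (0.05, 0.25, 0.50, 0.90, 0.97, 1.00):
    print(f"SOC {soc:4.0%}: {lfp_charge_voltage(soc):5.2f} V")
```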

That 13.8v is just an example; the voltage depends on the charge current. So that's one issue people face with a high C-rate of charging and a dumb SCC: the voltage at the battery terminals and the SCC's terminals rises too quickly (while the battery still needs the kWh of charge), the SCC thinks it's done and flops over to float, and meanwhile the battery is only 50% charged.

This is where having an SCC and a shunt that work together is a great feature and well worth the extra $. The shunt shares the actual battery-terminal readings with the SCC, and the SCC adjusts its output as needed.
 
and has an efficiency of 0.98.
Which you will never see. That might be the maximum efficiency of a very good quality converter at one specific voltage step-down and specific current draw, for example if the charge current is 2A and the panel voltage is 18V. In practice you can be happy if you get 95%, likely less.
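What the difference means in heat, for a controller moving roughly 400 W (sketch numbers):

```python
# Controller heat at 98% vs 95% conversion efficiency, ~400 W delivered.
p_out = 400.0
for eff in (0.98, 0.95):
    p_in = p_out / eff
    print(f"eff {eff:.0%}: input {p_in:.0f} W, heat {p_in - p_out:.0f} W")
# -> about 8 W vs 21 W dissipated in the controller
```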
 
