ebrummer
New Member
There are so many great resources here, and I've watched a lot of videos. Forgive me if this is covered somewhere and I missed it.
Electrical calcs are typically Watts = Volts * Amps.
When sizing out a system, if you look at the specs on a lot of off-grid inverters, there will be a max voltage, a max current, and a max wattage. In strict math terms, without factoring in reality, one of those numbers is over-constraining the problem.
So, if I'm sizing out a system, let's use the EG4 6000XP as an example:
Max PV input: 8000W (4000W per MPPT, 2x MPPTs), although elsewhere on the spec sheet it recommends 10000W max...
Max voltage: 480V,
Max usable current: 17A,
Max short circuit current: 25A
So, if I size my system by using the Voc of the panel and the Isc to meet the current and voltage requirements, that's often over the listed wattage limit.
For example, imagining we have a single 480V panel at 17A, you could theoretically get 8160W on a single MPPT, which is double the 4000W per-MPPT rating. That's a very simplified example, but I often find that if I size based on Voc and then, depending on the panel current, run 2 strings in parallel (say 4S2P or 6S2P, etc.), I can meet the current and voltage requirements but will be slightly over on the total wattage. (And that's before considering the spec sheet listing 8kW as the spec while also mentioning 10kW.)

A lot of the information I've read on planning systems doesn't talk as much about the total wattage limit. Then you factor in that Voc, as long as you're not in the cold, is higher than I'll ever see, so the actual wattage of the panels will be much lower than the wattage calculated from Voc and Isc. Should I be using max power voltage (Vmp) and max power current (Imp) when comparing the wattage of the array to the wattage of the inverter's PV input? I'm trying to max out my PV wattage input based on the inverter and panels I'm looking at.

Then there's over-paneling, where once you exceed a certain current, you're not adding usable current to the system (so long as the system won't fail with that extra current).
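To make the check I'm describing concrete, here's a rough sketch of how I've been comparing a string layout against the 6000XP's per-MPPT limits. The panel numbers and the cold-temperature Voc factor below are made up for illustration, not from any real datasheet:

```python
# Hypothetical string-sizing check against one EG4 6000XP MPPT.
# All panel values below are illustrative, not real datasheet numbers.
MAX_VOC = 480.0          # V, max PV input voltage
MAX_IMP = 17.0           # A, max usable (operating) current per MPPT
MAX_ISC = 25.0           # A, max short-circuit current per MPPT
MAX_W_PER_MPPT = 4000.0  # W, rated PV input per MPPT (8000W / 2)

# Example ~530W panel (hypothetical values)
voc, isc = 49.5, 13.7    # open-circuit voltage, short-circuit current
vmp, imp = 41.5, 12.9    # max-power-point voltage and current

def check_string(n_series, n_parallel, cold_factor=1.12):
    """Check an nS x nP array on one MPPT.
    cold_factor scales Voc up for the coldest expected morning."""
    v_cold = n_series * voc * cold_factor     # worst-case string voltage
    i_sc = n_parallel * isc                   # worst-case fault current
    i_mp = n_parallel * imp                   # normal operating current
    p_mp = n_series * n_parallel * vmp * imp  # realistic max power (Vmp*Imp)
    ok = v_cold <= MAX_VOC and i_sc <= MAX_ISC and i_mp <= MAX_IMP
    return ok, {
        "v_cold": round(v_cold, 1),
        "i_sc": i_sc,
        "i_mp": i_mp,
        "p_mp": round(p_mp),
        "over_paneled": p_mp > MAX_W_PER_MPPT,
    }

print(check_string(8, 1))  # one 8S string on one MPPT
```

The point of the sketch: voltage and short-circuit current limits are checked with worst-case numbers (cold-corrected Voc, Isc), while the wattage comparison uses Vmp * Imp, since that's the most power the array will actually deliver; over_paneled just flags that the array's nameplate exceeds the MPPT rating, which may be fine if the inverter clips gracefully.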