Big inverters vs smaller inverters - power usage

DreaminFred

Wondering: if you have a cumulative intermittent load of 1500 watts being powered by an inverter, would you burn battery-stored energy faster with a 3000W inverter than with a 2000W inverter of the same brand/quality?

I’m sure there is some overhead/idle energy usage, but how significant would the extra standby consumption be? Are there other issues I’m missing?

I’ve seen it recommended recently to downsize to a smaller inverter, but, other than the cost, it seems to me that the extra capacity allows for future load increases without having to re-engineer an upgrade.


BTW, I’m hooking up a Renogy 3000W open-box inverter I got as a refurb on eBay for $275, for a marine solar system.
 
Generally, yes.

Inverters have an idle power draw. A Victron 48/5000 burns 30W just by being powered on. That's 0.72kWh/day, or 60Ah of 12V battery capacity - it would kill a medium-size car battery in 24 hours even if no loads are supplied.
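
A quick sketch of that math in plain Python (numbers from the Victron example above; the 12V Ah figure is just for the car-battery comparison):

```python
# No-load (idle) energy burned by an inverter that is simply left switched on.
IDLE_W = 30                       # Victron 48/5000 no-load draw from above
HOURS = 24

idle_wh = IDLE_W * HOURS          # 720 Wh per day
idle_kwh = idle_wh / 1000         # 0.72 kWh per day
ah_at_12v = idle_wh / 12          # 60 Ah of 12V capacity per day

print(f"{idle_kwh:.2f} kWh/day, about {ah_at_12v:.0f} Ah of 12V capacity")
```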

The MPP Solar/Growatt units and most all-in-ones are notorious for high idle energy consumption.

This consumption does NOT go away when the inverter is under load. It is the energy the inverter needs just to perform its function, and conversion inefficiencies are in addition to it. Net efficiency at low power draws is horrifically bad: when you include the inverter's idle power consumption along with its conversion inefficiency while powering small loads (50-150W), 55-70% efficient is a realistic number.
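
A rough illustration of why small loads net out so poorly (a sketch with assumed round numbers: 30W idle draw and 90% conversion efficiency under load):

```python
# Net efficiency of a small AC load once the inverter's idle draw is included.
IDLE_W = 30          # assumed no-load draw
CONV_EFF = 0.90      # assumed conversion efficiency while under load

for load_w in (50, 100, 150, 1000):
    battery_w = IDLE_W + load_w / CONV_EFF      # total draw from the battery
    net_eff = load_w / battery_w
    print(f"{load_w:4d}W load -> {net_eff:.0%} net efficiency")
# ~58% at 50W, ~71% at 100W, ~76% at 150W, ~88% at 1000W
```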

Many units have a "low power" option where idle power consumption is decreased; however, those are only useful if you have NO loads whatsoever on the unit. If you need AC loads of any kind, you can't use any "low power" function.

You have to answer the question for yourself. Do you want to size for the "final" system, which may require a larger inverter, and eat the additional power consumption, or do you want to do it incrementally at a higher $ cost?

from:


"No Load Power Consumption: Normal: <30W; Power Saving: <15W"

You should count on 60Ah of 12V capacity being consumed by the inverter in 24 hours.

It's often overlooked, but including the inverter's idle power consumption in your energy audit helps.
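
For example, a minimal audit sketch that treats the inverter's idle draw as just another 24-hour load (all the load figures below are placeholders):

```python
# Count the inverter's no-load draw like any other 24-hour load in the audit.
daily_loads_wh = {
    "fridge":        600,        # placeholder daily Wh figures
    "lights":        150,
    "laptop":        200,
    "inverter idle": 30 * 24,    # 30W no-load draw, powered all day = 720 Wh
}

total_wh = sum(daily_loads_wh.values())
print(f"Daily total: {total_wh} Wh ({total_wh / 12:.0f} Ah @ 12V)")
```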
 
Nothing that can't be solved by throwing more $$$ at it.

As an example from the high end, here's a 6kW inverter (11kW surge) that consumes 25W at idle and 4W in standby:


The difference between brands is larger than the difference between various wattage models from a given brand.
A standby mode which wakes up periodically to see if loads are present makes a big difference.
Some inverters have a connector for remote enable, so they can be turned on/off with a switch, thermostat, etc. for zero consumption when not needed.

By shopping specs of inverters that meet anticipated future loads, you may find one that is good enough for present low-power needs as well.
 
Great responses, thanks guys, but is lower standby power consumption the only reason to go with a “right-sized” inverter vs “overkill?”

1500 watts of load is going to consume 1500W/12V = 125 amps whether coming out of a 3000W inverter or a 2000W inverter, yes?
 
1500 watts of load is going to consume 1500W/12V = 125 amps whether coming out of a 3000W inverter or a 2000W inverter, yes?
Not if one unit is 80% efficient and the other is 95% efficient. You need to check the specs (whether they are accurate is another story).
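
A quick sketch of how much that efficiency spread matters at the 1500W load being discussed (efficiencies taken from the reply above):

```python
# Battery-side current for a 1500W AC load at different conversion efficiencies.
LOAD_W = 1500
BATT_V = 12

for eff in (1.00, 0.95, 0.80):
    battery_amps = LOAD_W / eff / BATT_V
    print(f"{eff:.0%} efficient -> about {battery_amps:.0f} A from the battery")
# 100% -> 125 A (the ideal figure), 95% -> ~132 A, 80% -> ~156 A
```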
 
Great responses, thanks guys, but is lower standby power consumption the only reason to go with a “right-sized” inverter vs “overkill?”

1500 watts of load is going to consume 1500W/12V = 125 amps whether coming out of a 3000W inverter or a 2000W inverter, yes?
Either size inverter would handle the load.
It really depends on how much "operating overhead" your battery bank can afford.
When not in use, it can be turned off.

Just my opinion, based on no real facts, but I prefer a larger-than-needed inverter.
I just feel it's better to work a unit lightly than to push the limits of a smaller unit.
Less wear and tear, and besides, I'm ready to increase my loads as the need arises.
 
I think it’s worth factoring in the way the loads are going to be used and how often.

E.g. an AC fridge will be powered constantly and needs the inverter to be on all the time, but, say, a microwave will only be used occasionally.

In these cases I would give the fridge its own small inverter sized to just do the job. Percentage inefficiencies decrease as inverters get closer to their top end.

Then I would have a slightly smaller "big" inverter for occasional higher loads. I would keep this turned off most of the time, only powering it up when needed.

On 12V systems, where efficiency is king, it tends to be most efficient to run any load off DC whenever possible. Fridges and device charging are both good examples. Inverter power should be reserved for occasional high-power use.
 
This is a complicated question with some simple rules of thumb, which of course have exceptions.

First, there are two types of efficiency to consider: (1) "no load" (idle) consumption and (2) conversion efficiency. The first is the energy overhead being burned regardless of whether the inverter is working hard or just sitting there idle (there are various 'eco' or 'standby' modes that can dramatically reduce this when there is no load). The second is the ratio between energy in and energy out, i.e. the efficiency penalty incurred converting low-voltage DC to higher-voltage AC.

Some rules of thumb:
1. More expensive inverters will tend to have higher conversion efficiency and lower no-load draws, watt for watt, compared to similar budget models.
2. Most quality inverters will have low power 'eco' modes, but there are caveats to these modes from what I've heard
3. Higher power inverters tend to have higher no load draw
4. Inverters do not have uniform efficiency across their whole power range (most but not all will be most efficient at or near their limit)
5. 'No inverter' is more efficient than the most efficient inverter, so the more you can run directly from DC, the less of an efficiency penalty you get hit with (rough numbers in the sketch after this list).

There are exceptions and caveats to almost all of these generalizations.
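
To put rough numbers on point 5, here is a sketch with assumed figures (a 60W DC-capable load running 24 hours, 20W inverter idle draw, 88% conversion efficiency; none of these numbers come from a specific product):

```python
# Daily energy for a 60W load run straight from DC vs through an inverter.
LOAD_W = 60          # e.g. a small DC-capable fridge (assumed)
IDLE_W = 20          # assumed inverter no-load draw
CONV_EFF = 0.88      # assumed conversion efficiency at this load level
HOURS = 24

dc_direct_wh = LOAD_W * HOURS
via_inverter_wh = (IDLE_W + LOAD_W / CONV_EFF) * HOURS

print(f"DC direct:    {dc_direct_wh:.0f} Wh/day")
print(f"Via inverter: {via_inverter_wh:.0f} Wh/day")
# ~1440 Wh vs ~2120 Wh: the inverter path costs roughly half again as much energy
```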
 
5. 'No inverter' is more efficient than the most efficient inverter, so the more you can run directly from DC, the less of an efficiency penalty you get hit with.

There are exceptions and caveats to almost all of these generalizations.
Conversely, if your usage model requires your inverter to be on 24/7, why waste the money and time trying to get DC versions of your loads and appliances?
This becomes more true the greater the capacity of the system.
 
Conversely, if your usage model requires your inverter to be on 24/7, why waste the money and time trying to get DC versions of your loads and appliances?
This becomes more true the greater the capacity of the system.
True! If you are building an off-grid home or something and at most 2%, 10%, or 20% of your consumption could be DC, it's probably not worth the trouble. I was speaking purely from the perspective of conversion efficiency, not necessarily what is practical or economical or 'worth the effort', but those are certainly important considerations that may or may not overshadow efficiency.
 
Conversely, if your usage model requires your inverter to be on 24/7, why waste the money and time trying to get DC versions of your loads and appliances?
This becomes more true the greater the capacity of the system.
Agree

The ‘sticking to DC loads’ advice is primarily for small off-grid systems like RVs and cabins with limited space for solar panels and therefore small batteries.
 
This is a complicated question with some simple rules of thumb, that of course have exceptions to the rule.

4. Inverters do not have uniform efficiency across their whole power range (most but not all will be most efficient at or near their limit)

PV inverters are expected to do their best work near full load, while battery inverters normally run at a fraction of full output.
This link for Sunny Island shows peak 96% efficiency (4% loss) at 20% load, dropping to 92% (8% loss) at 100% load.


The efficiency curve seems to be a baseline wattage consumed by the inverter while operating, plus I^2R losses in the transistors and inductors/transformers. That "square" term starts driving losses higher faster as current increases, instead of being proportional to load.
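
A toy version of that two-term loss model (the constants below are made up for illustration, not Sunny Island data):

```python
# Toy loss model: a fixed baseline draw plus an I^2*R term in the power stage.
BASELINE_W = 25       # assumed operating overhead of the inverter
R_EFFECTIVE = 0.058   # assumed lumped resistance (ohms) seen by battery current
BATT_V = 48

for load_w in (500, 1000, 2500, 5000):
    battery_amps = load_w / BATT_V                       # ignoring losses in the current estimate
    loss_w = BASELINE_W + battery_amps**2 * R_EFFECTIVE  # baseline + I^2*R
    eff = load_w / (load_w + loss_w)
    print(f"{load_w:5d}W -> loss {loss_w:5.0f}W, efficiency {eff:.1%}")
# ~94%, ~95%, ~93%, ~88%: efficiency peaks at partial load and sags toward
# full output because the I^2*R term grows with the square of the current.
```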
 
PV inverters are expected to do their best work near full load, while battery inverters normally run at a fraction of full output.
This link for Sunny Island shows peak 96% efficiency (4% loss) at 20% load, dropping to 92% (8% loss) at 100% load.


The efficiency curve seems to be a baseline wattage consumed by inverter while operating, and I^2R losses in the transistors and inductors/transformers. That "square" term starts driving losses higher faster as current increases, instead of being proportional to load
I will use lower case i for current here since upper case looks too much like a one.

Help me think about Power = i^2R losses. The losses are power losses and Power = Vi in one equation and Power = i^2R in another.

As far as “That 'square' term starts driving losses higher faster as current increases, instead of being proportional to load” goes, I am not so sure. It sounds as though it would be exponential, but consider this:

The load (L) is power measured in Watts. So doubling L1 = X watts gives L2 = 2X watts.

From a Load = L = V * i perspective:
L1 = V1 * i1
L2 = V2 * i2
L2 = 2 * L1 (doubling the load)
L2 = 2(V1 * i1)
V2 * i2 = 2(V1 * i1)
Let V2 = V1 (Hold voltage constant and let current vary)
V1 * i2 = 2(V1 * i1)
i2 = 2(V1 * i1) / V1
i2 = (V1 * 2 * i1) / V1
i2 = (V1 / V1) * 2 * i1
i2 = 2 * i1

So the current increase is the same as the load increase value of 2, thus linearly proportional to the load.

From a load = L = i^2R perspective:
L1 = i1^2 * R1
L2 = i2^2 * R2
L2 = 2 * L1 (doubling the load)
L2 = 2(i1^2 * R1)
i2^2 * R2 = 2(i1^2 * R1)
Let R2 = (R1)/2 (because the parallel combination of twice the load is half the resistance)
(i2^2) * ((R1)/2) = 2(i1^2 * R1)
i2^2 = (2(i1^2) * R1) / ((R1)/2)
i2^2 = ((2(i1^2) * R1) * 2) / R1
i2^2 = 4(i1^2)
Take square root of both sides
i2 = 2 * i1

And again the current increase is the same as the load increase of 2.

HLB
 
Help me think about Power = i^2R losses. The losses are power losses and Power = Vi in one equation and Power = i^2R in another.

As far as, “That "square" term starts driving losses higher faster as current increases, instead of being proportional to load.”, goes, I am not so sure. It sounds as though it would be exponential, but consider this:

I think your equations after this are referring to power delivered to the load, and the resistance of the load.
I'm referring to power dissipated in the inverter (and wiring), which is the loss in efficiency.

Power = i^2R is the behavior of current forced through a resistance.
Assume R is constant and comes from wire, a fuse, or a MOSFET when on.
Current i is whatever the inverter draws from the battery to power your AC load (which of course increases as battery voltage drops).

If load wattage is doubled, battery current is doubled, and (i x 2)^2 * R goes up by a factor of 4 due to the 2^2 term.

The battery inverter efficiency curves seem to follow that, added to the no-load power consumption.
For one brand, this matched to three decimal places, making me suspect it was calculated rather than measured.

Inverter wattage limits likely come from the continuous current its MOSFETs and inductors/transformers can handle without overheating. Copper has more mass and takes longer to heat up. Surge ratings likely come from the MOSFETs, which operate at a junction temperature well above the package and heatsink.

e.g. consider a MOSFET with a 150C max rating operating at 125C, with the heatsink at 65C. That's a 60C rise. If the surge rating is 2x the operating current, MOSFET power dissipation would be 4x, for a steady-state temperature rise of 4 x 60C = 240C; 240C + 65C = 305C (ignoring heatsink rise due to heat transfer to air, just treating the heatsink as an infinite thermal sink). Obviously the MOSFET can't take that; it can only deliver that much current for an instant, until its own thermal mass has heated too much.
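
The same back-of-envelope thermal math as a sketch (the 60C rise per 1x dissipation is implied by the 125C/65C example above; everything else follows from those numbers):

```python
# Junction temperature estimate: the rise above the heatsink scales with
# dissipation, and dissipation scales with current squared.
T_HEATSINK_C = 65
RISE_AT_1X_C = 60        # 125C junction at normal load = 60C above the heatsink
T_MAX_C = 150

for surge_multiple in (1, 2):
    dissipation_factor = surge_multiple ** 2       # I^2*R scaling
    t_junction = T_HEATSINK_C + RISE_AT_1X_C * dissipation_factor
    verdict = "OK" if t_junction <= T_MAX_C else "transient only"
    print(f"{surge_multiple}x current -> ~{t_junction}C steady-state junction ({verdict})")
# 1x -> 125C, 2x -> 305C: far past the 150C rating, so a 2x surge can only be
# carried briefly on the MOSFET's own thermal mass.
```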
 
The dominant idle consumption in inverters should come from the power consumed switching the high-frequency power MOSFET gate input capacitance. The larger the inverter's VA rating, the greater the sum total of MOSFET input capacitance to chop on and off.

There are a couple of other things impacting idle power, primarily because of poor design. A poorly designed low-pass L-C filter for filtering the high-frequency PWM out of the sinewave output can put too much reactive load on the inverter, causing it to dissipate more idle power.

For 24V and 48V inverters there should be a small DC-to-DC buck converter to efficiently produce 12-15V to run the MOSFET drivers. Some cheap inverters just use a linear regulator to drop the battery input voltage.

Some inverters cut back switching of some of the parallel MOSFETs when the AC load is low. This improves idle power, but it can be a problem if the partial MOSFET shutdown is carried too far: when sudden large surge currents show up, MOSFETs can blow out before the shut-down MOSFETs are brought back online.

The inverter should be sized to your needs to minimize overhead power. The toughest thing to figure out is what power capability the inverter needs in order to handle your highest turn-on surge loads.

Single-phase motors have the greatest surge current, typically lasting for about 0.5 seconds after activation. An inverter's surge spec needs to cover at least a half second in duration. Many HF inverters spec surge for 1 msec; this is useless for most applications.
 
My 18kW inverter can handle 3X the surge (54kW) for 20 seconds. This is the most important spec to me.
It also uses about 300 watts of idle power, but the 12kW model, which is 33% smaller, still uses 270 watts.
I just figure one extra solar panel to power the inverter. If you have a lot of heavy loads, it's not a big deal.
But not everyone is running a central air heat pump, microwave, water heater and large air compressor. :)
 
There are really three issues to consider when selecting an inverter.
1/ zero load idling power.
2/ continuous max power rating, and efficiency.
3/ peak short term surge capacity.

The importance of each varies considerably depending on the nature of the load.

If an inverter drives only a single, infrequently used, high-power load - a well pump or air conditioner, for example - and the inverter is switched off when the load is not running, very high idling power is of no significance.

Motor start-up loads are a big problem for inverters that have very limited surge capacity. It might take 5kW to start a 500 watt motor. A 5kW-rated inverter might be required, unless a much smaller inverter is specified that CAN reliably provide high short-term surge current.
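
A rough sizing sketch along those lines (the 10x start multiple comes from the 500W/5kW example above, and the 0.5 second duration from the earlier post on motor surge; real motors vary):

```python
# Surge-driven sizing: for motor loads, the start surge rather than the
# running watts can dictate the inverter requirement.
MOTOR_RUNNING_W = 500
START_MULTIPLE = 10            # from the 500W -> 5kW example above (varies by motor)
SURGE_W = MOTOR_RUNNING_W * START_MULTIPLE

def inverter_can_start(continuous_w: float, surge_w: float, surge_seconds: float) -> bool:
    # Must carry the running load continuously and the start surge for ~0.5s.
    return (continuous_w >= MOTOR_RUNNING_W
            and surge_w >= SURGE_W
            and surge_seconds >= 0.5)

print(inverter_can_start(continuous_w=1000, surge_w=2000, surge_seconds=1.0))    # False: surge rating too small
print(inverter_can_start(continuous_w=1000, surge_w=6000, surge_seconds=1.0))    # True: surge covers the start
print(inverter_can_start(continuous_w=5000, surge_w=5500, surge_seconds=0.001))  # False: a 1 msec surge spec is useless here
```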

And on it goes. The highest efficiency point is never at full flat-out power, as mentioned above by previous posters.
It's probably at about 15% to 20% of rated power, which can fit in very well with normal domestic load profiles.
Many inverter manufacturers lie a little bit and quote the max rated power and the efficiency at the highest efficiency point.
That's like saying my car can do 150 mph and gets 30 mpg. It probably can, but not both at the same time!
 
If you are building an off-grid home or something and at most 2%, 10%, or 20% of your consumption could be DC, it's probably not worth the trouble
For those with small systems - either intentionally or defaulted to by the square footage available - DC convenience lighting and device charging often balance out battery inverter idle consumption for at least part of the day. 120VAC LED ‘bulbs’ often use as much as or more than direct 12VDC LEDs in watts per lumen, because the "120V" bulb has internal circuitry converting AC to DC, losing some watts as heat. So for the small system (tight-budget folks are often here), some DC makes sense.

To me, however, the efficiency lost to the inverter - which one panel or two makes up for - to run a smaller 120V fridge 24/7 is a small ‘price’ to pay vs a $1600+ DC fridge. The loss isn't a loss, really; it's just what it takes to run a fridge, imho.
The inverter should be sized to your needs to minimize overhead power. The toughest thing to figure out is what power capability the inverter needs in order to handle your highest turn-on surge loads.
As per the OP's question, and in the spirit of your statement, I wouldn't run a 3000W inverter when all I need is a 1200W inverter (that's assuming 800W is the typical use peak).
When you get to needing a 3000W inverter, there are AIOs and other components that start sounding like better spends with better dependability, imho.
I just figure one extra solar panel to power the inverter. If you have a lot of heavy loads, it's not a big deal.
People think about ‘wasting’ money on more panels - that confuses me. If it lets you do what you need to, it's merely providing the system architecture to accomplish the goal successfully. The cost is one-time; anything unharvested isn't "wasted", it's merely not used - like a river running by or the rainwater from your gutters. It doesn't cost anything to let it run away.
 
My 18kW inverter can handle 3X the surge (54kW) for 20 seconds. This is the most important spec to me.
It also uses about 300 watts of idle power, but the 12kW model, which is 33% smaller, still uses 270 watts.
I just figure one extra solar panel to power the inverter. If you have a lot of heavy loads, it's not a big deal.
But not everyone is running a central air heat pump, microwave, water heater and large air compressor. :)
In reality, a 270W drain requires over four 270W solar panels to cover the drain of operation, since 270W x 24 hours is 6,480Wh and you only get about 5.5 sun hours of production time.
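
The same arithmetic as a quick sketch:

```python
# How many panels it takes just to feed a 270W continuous idle drain.
IDLE_W = 270
PANEL_W = 270
SUN_HOURS = 5.5                             # assumed usable production hours per day

daily_wh = IDLE_W * 24                      # 6480 Wh burned per day
array_w_needed = daily_wh / SUN_HOURS       # ~1178 W of panels
panels = array_w_needed / PANEL_W           # ~4.4 panels

print(f"{daily_wh} Wh/day -> {array_w_needed:.0f} W of array -> {panels:.1f} x {PANEL_W}W panels")
```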
 