Matching inverter to battery - amps DC to amps AC

jwoo (New Member, joined Apr 3, 2021, 3 messages)
I'm looking for some clarity on AC amps usage as it correlates to the DC amps coming out of the battery. I want to make sure I do not go over my battery's max discharge rating. I don't see where this is discussed much at all. Say I have a 2000 watt inverter with a load pulling 2000 watts at 120 volts AC. That would mean I am pulling 16.6 amps AC from the inverter. So how many amps am I pulling from a 12 volt battery that's powering that inverter? From what I can understand, I would be pulling 166 amps DC from my battery??? Therefore exceeding a battery with a 100 amp discharge rating. Am I going about this right? So if I have a 12 volt battery rated to 100 amps max discharge, the max inverter I should use would be..... a 1200 watt inverter???

AND
Would a battery bank in parallel change the max discharge in any way? Say two 12 volt batteries in parallel. Max discharge on each battery is 100 amps. Would the battery bank max discharge still be 100 amps?
 
A rough rule is 10 to 1.

16.6 amps at 120VAC will be about 166 amps from a 12 volt battery. For a 2000 watt inverter, a 200 amp discharge rate would be the minimum battery rating, and closer to 300+ is what I recommend.
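To put those numbers in one place, here is a quick back-of-the-envelope sketch in Python using the figures from this thread (2000 W load, 120 V AC, 12 V battery). It ignores inverter losses for the moment; the posts below add those in.

# Rough numbers from this thread: 2000 W load, 120 V AC out, 12 V battery in.
# Inverter losses are ignored here; see the efficiency discussion below.
ac_watts = 2000
ac_volts = 120
battery_volts = 12

ac_amps = ac_watts / ac_volts        # ~16.7 A on the AC side
dc_amps = ac_watts / battery_volts   # ~166.7 A on the DC side

print(round(ac_amps, 1), round(dc_amps, 1))   # 16.7 166.7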
 
When you convert the DC to AC, you will have conversion loss, so to get 2000W on the AC output of the inverter, the input power to the inverter will be higher than the output power. The typical number we use around here is around 15%, so, as stated above, it will be around 200A of DC current.
 
The key thing to remember is that Watts out of the inverter is roughly equivalent to Watts into the inverter.

So if you have 2000W coming out of the inverter, you will have slightly more than 2000W going in. All you have to do is convert from Watts in to Current in. To do that you use Watts Law: P = I*V (Power = Current * Voltage). A LiFePO4 12V nominal battery runs at around 12.8V, but you have to plan for when the battery is nearly empty and running at near 12V, so the first estimate is 2000W = I * 12V, or I = 2000W/12V = 166.7A.

Now, if you want to get more accurate, you have to consider the efficiency of the inverter. If it is a 90% efficient inverter, then Watts in * 0.9 = 2000, or Watts in = 2000/0.9 = 2222.2W, and Current in is 2222.2/12 = 185.2A.
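For anyone who wants to plug in their own numbers, here is a small sketch of that Watts Law step. The 2000 W load, the near-empty 12 V figure, and the 90% efficiency are the assumptions from this post, not the specs of any particular inverter.

# Watts Law plus inverter efficiency: the inverter draws more watts than it delivers.
def battery_current(ac_watts, battery_volts, efficiency):
    dc_watts = ac_watts / efficiency    # input power the inverter must pull in
    return dc_watts / battery_volts     # P = I * V  ->  I = P / V

print(round(battery_current(2000, 12.0, 1.00), 1))   # 166.7 A with an ideal inverter
print(round(battery_current(2000, 12.0, 0.90), 1))   # 185.2 A at 90% efficiency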
 
If I use @Bud Martin's 85% efficiency for the inverter, I get 2000/0.85 = 2352.9W and 2352.9/12 = 196A.

Notice that most of the time the cells will be operating closer to 12.8V. Consequently, most of the time it will be 2352.9/12.8 = 183.8A at full power.
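If it helps, here is the same arithmetic run across both efficiency guesses (90% and 85%) and both battery voltages (near-empty 12.0 V and the more typical 12.8 V). These are the thread's assumptions, not measurements.

# 2000 W load at the two efficiency guesses and two battery voltages used above.
for efficiency in (0.90, 0.85):
    for volts in (12.0, 12.8):
        amps = 2000 / efficiency / volts
        print(f"{efficiency:.0%} efficient, {volts} V battery: {amps:.1f} A")
# 90% efficient, 12.0 V battery: 185.2 A
# 90% efficient, 12.8 V battery: 173.6 A
# 85% efficient, 12.0 V battery: 196.1 A
# 85% efficient, 12.8 V battery: 183.8 A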
 
I edited the original post, but just to put it down here as well.
Would a battery bank in parallel change the max discharge in any way? Say two 12 volt batteries in parallel. Max discharge on each battery is 100 amps. Would the battery bank max discharge still be 100 amps?
Also, of course, in series I would up the voltage, therefore lowering the amps being pulled out, but how about in parallel? Any effect?
 
If you put two batteries in series, then you double the voltage, but you cut the current in half for the same amount of power as parallel batteries.
If you put two batteries in parallel, then the voltage stays the same, but you need twice the current for the same power as two series batteries.
 
Say two 12 volt batteries in parallel. Max discharge on each battery is 100 amps. Would the battery bank max discharge still be 100 amps?
If the max current out of one battery is 100 amps, the max current out of two batteries in parallel is 200A. The max current out of two batteries in series is still only 100 amps, but the voltage is doubled, so the max power remains the same as the two batteries in parallel. However, by going in series you can use smaller wires, because the wire size is determined primarily by amperage... not voltage.
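Here is a quick sketch of those two wiring options with the 12 V, 100 A max-discharge batteries from the question (illustrative numbers, not a real datasheet):

# Two 12 V batteries, each rated for 100 A max discharge.
volts, max_amps = 12, 100

parallel_amps = 2 * max_amps               # 200 A available from the parallel bank
parallel_power = volts * parallel_amps     # 2400 W

series_amps = max_amps                     # still 100 A through the series string
series_power = (2 * volts) * series_amps   # 2400 W -- same max power, half the current

print(parallel_power, series_power)        # 2400 2400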
 
Perhaps you are confusing a few different terms...

filterguy said the most important thing to think about at the start:
"The key thing to remember is that Watts out of the inverter is roughly equivalent to Watts into the inverter."

Watts are the instantaneous demand for power.

The "roughly" part is because everything that gets converted from one thing to another loses something in that conversion.

So here is a real simple way to look at it:
the watts going INTO my system are equal to the watts I want OUT of my system PLUS some additional watts to handle the conversion.
This helps you not get caught up in all the potential units of measure; just convert everything into watts for the first step.

The maximum wattage of your inverter has NOTHING to do with the energy capacity (watt-hours) of your batteries, as long as you have enough of it.

How much power (this is watts) your inverter can deliver at its maximum is the most it will use (you can ask the inverter for anything UP TO this maximum amount). Your inverter has no idea what is providing it the power; it just wants enough to meet whatever you are asking for on the output side.

How much energy (usually talked about in watt-hours) you have stored in your batteries is how LONG it can feed the inverter. You could have a 10,000 watt inverter run off a battery pack you hold in your hand... of course it will only run for a fraction of a second hehe.

Notice that storage (batteries) involves how much energy is stored (watt-hours = battery voltage * battery amp-hours); inverters are described in terms of the most watts they can handle.
This is how you figure out how long your inverter can provide power.
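As a rough sketch of that runtime idea (the 12 V, 100 Ah battery and 85% efficiency below are example assumptions, not the OP's actual gear):

# Runtime = stored energy / load power, derated for inverter losses.
def runtime_hours(battery_wh, load_watts, inverter_efficiency=0.85):
    return battery_wh * inverter_efficiency / load_watts

battery_wh = 12 * 100                               # 12 V * 100 Ah = 1200 Wh example battery
print(round(runtime_hours(battery_wh, 2000), 2))    # ~0.51 hours at a full 2000 W load
print(round(runtime_hours(battery_wh, 60), 1))      # ~17.0 hours running a 60 W bulb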

As to your question of how huge that amperage number looks for a 12V battery system, the bottom line is yes: as the voltage goes down on the input, the system will need to draw more current to make up the needed POWER.
That is why increasing your input voltage to inverters and solar panels really helps deliver more power without the penalty of more amps.
I am sure if you have been looking into this, you know the power loss in a system's wires is almost entirely due to heat losses... which come from the flow of current.
If you have an electric stove or resistive space heater... those are just wires with too many amps going through them!!
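A tiny sketch of that current-squared heating effect; the 0.01 ohm wire resistance below is an arbitrary example, not a wire-sizing recommendation.

# Wire heating goes as I^2 * R, so halving the current quarters the loss.
wire_resistance_ohms = 0.01   # arbitrary example cable run
load_watts = 2000

for system_volts in (12, 24, 48):
    amps = load_watts / system_volts
    heat_watts = amps ** 2 * wire_resistance_ohms
    print(f"{system_volts} V system: {amps:.0f} A, ~{heat_watts:.0f} W lost as heat in the wire")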

Batteries in series means you ADD the voltages, which is how you get more energy. Current stays the same, voltage increases.
Batteries in parallel means you ADD the amps, which is also how you get more available energy; the voltage stays the same.

if you have two 12v 100Ah batteries:
in series you would have 24v * 100Ah = 2400Wh (Wh ==watt hours)
in parallel you have 12v * 200Ah = 2400Wh

but your series setup would only need to feed the inverter HALF as much current (amps) for the same amount of power.
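The same comparison in a few lines, using those two 12 V, 100 Ah example batteries and the 2000 W load from earlier in the thread:

# Two 12 V, 100 Ah batteries: same stored energy either way,
# but the series bank supplies the same power at half the current.
volts, amp_hours, load_watts = 12, 100, 2000

series_wh = (2 * volts) * amp_hours      # 2400 Wh at 24 V
parallel_wh = volts * (2 * amp_hours)    # 2400 Wh at 12 V

print(series_wh, parallel_wh)            # 2400 2400
print(round(load_watts / (2 * volts)))   # ~83 A from the 24 V series bank
print(round(load_watts / volts))         # ~167 A from the 12 V parallel bank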

Well, this turned into a rather long-winded write-up...

Ok, here is the simple simple simple answer:
The inverter will only draw in enough power to supply whatever is needed on the output side.
If there is a 1 million amp-hour battery on the input and you plug a 60 watt bulb into the output, the inverter will just draw 60 watts of power (plus the little bit of conversion penalty extra that was mentioned).
 