Hello everyone! I am a total noob in all senses of the word. I tried two different approaches for estimating my total load as a first step.
The first method was to calculate amp-hours. I did this by taking the listed wattage of each appliance, dividing it by the voltage it receives, and multiplying by the number of hours the device would be in use. For example, the water heater I plan to install is a 12,000 W system running off 240 V power. I figured 20 minutes of showering for two people and a few minutes to fill the sink with hot water should be plenty for a day's use, which gave me 25 amp-hours a day for this unit. I repeated the process for every other major power draw in my system. The total came out to 247 amp-hours, even running the 710 W AC at 110 V for 24 hours a day. Adding a bunch of smaller DC appliances that would run much more often, plus charging computers and such, brought the total up to roughly 400 amp-hours. Based on this I calculated that I should be fine running an 800 amp-hour battery bank at 24 volts, charging it with 2,200 W of panels in roughly 8 hours of sunshine.
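For reference, here is a quick sketch of the amp-hour arithmetic described above, using the wattages and run times from this post (the run-time estimates are mine, copied from the text). Note that each result is in amps at that appliance's own voltage:

```python
def amp_hours(watts, volts, hours):
    """Amp-hours at the appliance's own voltage: (W / V) * h."""
    return watts / volts * hours

# Water heater: 12,000 W at 240 V for roughly 30 minutes a day.
heater_ah = amp_hours(12_000, 240, 0.5)
print(heater_ah)  # 25.0 Ah (at 240 V)

# AC unit: 710 W at 110 V running 24 hours a day.
ac_ah = amp_hours(710, 110, 24)
print(round(ac_ah, 1))  # 154.9 Ah (at 110 V)
```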
THEN I found another method of estimating energy usage: watt-hours. It's simple, they said! Multiply the wattage by the hours used. Bing bang boom. So I did this for the major appliances and found that it would require an astonishing 18 kWh, even using the AC only 6 hours a day. The exact same battery setup only provides 19,200 Wh (800 Ah × 24 V). So which is it? Can I power my system with the aforementioned setup, or will I struggle to get through a 12-hour summer scorcher alive?
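The watt-hour bookkeeping above can be sketched the same way; the two big loads alone come to about 10 kWh (the 18 kWh total presumably includes the other loads not itemized here):

```python
# Watt-hour method: energy = watts * hours, summed directly.
heater_wh = 12_000 * 0.5   # ~30 min of hot water per day
ac_wh = 710 * 6            # AC for 6 hours a day
print(heater_wh + ac_wh)   # 10260 Wh before the other loads

# Nominal capacity of the proposed battery bank and daily panel harvest.
battery_wh = 800 * 24      # 800 Ah bank at 24 V = 19,200 Wh
panel_wh = 2_200 * 8       # 2,200 W of panels over 8 sun-hours = 17,600 Wh
print(battery_wh, panel_wh)
```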
The only thing I can think of is the inverter. The first scenario imagines that each amp is released from the inverter at no cost, at high pressure, to do large amounts of work; perhaps one amp will produce 240 watts instead of 24. This would make it really hard to calculate the watt-hour potential of the battery, since each amp is spent individually, but it could potentially increase the capacity of the same battery bank to 192 kWh if every amp were discharged, at no loss, at 240 volts. The second scenario imagines that all the power in the system is delivered at a 24-volt pressure. Is it true that the inverter doesn't really increase the power of a watt? Does it just suck up the extra amps and use them to increase the voltage, or is that energy lost somehow? Please help!