No sunny days

cho

New Member
Joined
Oct 21, 2021
Messages
21
I have a hybrid inverter with two lead-acid batteries (260 Ah) hooked up to 6 x 285 W panels. In the house I have separate electrical wiring: one circuit for grid power and one for solar power, both at 220 V. When there is sun I can run everything on solar power without using grid energy. But where I live there can be very long periods without sun during autumn and winter (a few weeks). For those sunless periods my idea was to charge the batteries during the night, when electricity is 50% cheaper, and then use them during the day. What I found by measuring the power into and out of the batteries is that charging them at night costs twice as many kWh as I get back during the day. Over ten days of measurements I got about 10 kWh delivered during the day and 20 kWh spent charging the batteries at night. Does anyone have advice on how I can reduce that big difference between daytime use and nighttime charging? Or how to use a solar system effectively during periods with no sunny days?
 

Short_Shot

Photon Sorcerer
Joined
Jul 13, 2021
Messages
1,820
I'm confused.

If you're only measuring what goes into or out of the battery you might be missing the total picture. Where are you taking the measurements from?

10 kWh out and 20 kWh in suggests your batteries are not being fully charged each night, but your stated battery capacity seems far too low to accept 20 kWh unless the bank is 48 V or higher.

What is your battery voltage so we can identify how much capacity you have in watt hours?
 

cho

New Member
Joined
Oct 21, 2021
Messages
21
I'm measuring the power that goes from the grid to the inverter, and the power that goes from the inverter to the house circuit that is only for solar power.
During the day the inverter is not connected to the grid, and I measure the power that goes from the inverter to that circuit using only solar/battery power.
During the night the inverter is connected to the grid, and I measure the power that goes into the inverter from the grid. During that time the inverter is only charging the batteries (no output).
The batteries are 12 V 260 Ah; the hybrid inverter is 24 V.
The inverter is connected to the grid from 9 pm until 7 am, when electricity is 50% cheaper, and it almost always charges the batteries to full.
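A quick sanity check of the pack size, assuming the two 12 V 260 Ah batteries are wired in series to feed the 24 V inverter (series is the usual arrangement here, but it is an inference, not something stated outright):

```python
# Nominal capacity of two 12 V, 260 Ah lead-acid batteries in series (24 V bank).
volts = 24        # series pack voltage
amp_hours = 260   # Ah is unchanged by a series connection

nominal_wh = volts * amp_hours   # total nominal energy in watt-hours
usable_wh = nominal_wh * 0.5     # rough 50% depth-of-discharge limit for lead acid

print(nominal_wh, usable_wh)     # 6240 3120.0
```

So the bank holds about 6.2 kWh nominal, roughly 3 kWh usable. The ~2 kWh drawn per night for charging fits comfortably inside that, which supports reading the 20 kWh figure as a 10-day total rather than a nightly one.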
 

Supervstech

Administrator
Staff member
Moderator
Joined
Sep 21, 2019
Messages
7,892
Location
Belmont, NC
I have a hybrid inverter with two lead-acid batteries (260 Ah) hooked up to 6 x 285 W panels. In the house I have separate electrical wiring: one circuit for grid power and one for solar power, both at 220 V. When there is sun I can run everything on solar power without using grid energy. But where I live there can be very long periods without sun during autumn and winter (a few weeks). For those sunless periods my idea was to charge the batteries during the night, when electricity is 50% cheaper, and then use them during the day. What I found by measuring the power into and out of the batteries is that charging them at night costs twice as many kWh as I get back during the day. Over ten days of measurements I got about 10 kWh delivered during the day and 20 kWh spent charging the batteries at night. Does anyone have advice on how I can reduce that big difference between daytime use and nighttime charging? Or how to use a solar system effectively during periods with no sunny days?
Either there are 10 kWh of loads on the battery, or there is a mistake in the wiring, or in the interpretation of what energy is being used.
 

Short_Shot

Photon Sorcerer
Joined
Jul 13, 2021
Messages
1,820
24 V @ 260 Ah absolutely does not account for 20 kWh of charging, so that can't be what you're measuring.

There's a mistake somewhere in how you're figuring the energy use.
 

cho

New Member
Joined
Oct 21, 2021
Messages
21
10 kWh out of the inverter (battery only) / 20 kWh into the inverter (charging the batteries) is the total over 10 days,
so one day is 1 kWh out / 2 kWh in: I used 1 kWh and then needed 2 kWh to recharge the batteries.
My conclusion is that the inverter uses too much energy charging the batteries from AC power because of the AC-to-DC conversion.
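Taking the measured 10-day totals from the posts above at face value, the implied round-trip efficiency is just the ratio of energy out to energy in (a simple sketch of that arithmetic):

```python
# Measured 10-day totals from the thread.
kwh_out = 10.0   # delivered from the inverter to the loads
kwh_in = 20.0    # drawn from the grid to charge the batteries

round_trip_efficiency = kwh_out / kwh_in
print(f"{round_trip_efficiency:.0%}")   # 50%
```

A 50% round trip is poor for a well-loaded system, but as later replies point out, it is not implausible for lightly loaded lead acid once charger, battery, and inverter losses stack up.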
 

circus

Solar Addict
Joined
Jul 8, 2021
Messages
379
two acid batteries.... In that no sun period my idea was to charge batteries during the night when electric power is 50% cheaper and then using it during the day.
Better to consume grid power and just idle the batteries, spending 10¢ a month floating them on a battery maintainer. There are losses in the charger and the battery, not to mention cycle deterioration.
 

cho

New Member
Joined
Oct 21, 2021
Messages
21
Of course there are some losses and inefficiencies, but they can't be that big.
I was very surprised by the test results, but the numbers don't lie. By converting AC to DC and storing the energy in the batteries, the inverter spends about 1.7x the energy that you get back out.

Example: to put 1 kWh into the batteries (charging 12 V lead acid from a 220 V AC source), the inverter spends an additional 0.7 kWh.
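Expressed as an efficiency, that 1.7x figure works out as follows (a sketch; note it is somewhat better than the 50% implied by the 10 kWh / 20 kWh totals earlier in the thread):

```python
# cho's example: 1 kWh stored costs an extra 0.7 kWh in conversion losses.
stored_kwh = 1.0
overhead_kwh = 0.7
drawn_kwh = stored_kwh + overhead_kwh   # 1.7 kWh drawn from the grid

charge_efficiency = stored_kwh / drawn_kwh
print(f"{charge_efficiency:.0%}")       # 59%
```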
 

MattiFin

Solar Addict
Joined
Dec 31, 2020
Messages
446
Of course there are some losses and inefficiencies, but they can't be that big.
The OP has very small loads if he uses 10 kWh in 10 days, which puts all of the components in a really inefficient operating range.
With a larger load, a sort of best-case efficiency would be:
Charger 90%
Lead-acid battery 70% (70% energy efficiency as measured in kWh; coulombic or charge efficiency measured in Ah would be better than that)
Inverter 90%
0.9 x 0.7 x 0.9 = 56.7%

With low loads and a lead-acid battery nearing full charge it might be a lot worse:
Charger 80%
Battery 60%
Inverter 80%
0.8 x 0.6 x 0.8 = 38.4%

Lead acid just sucks for an application like this, and even with a Li battery the idea is dead on arrival, as battery cycle costs typically exceed the electricity saved.
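The per-stage efficiencies above simply multiply into an overall round-trip figure; a sketch of both scenarios:

```python
def chain_efficiency(*stages):
    """Multiply per-stage efficiencies into an overall round-trip efficiency."""
    result = 1.0
    for eff in stages:
        result *= eff
    return result

# Best case: charger, lead-acid battery, inverter.
best = chain_efficiency(0.90, 0.70, 0.90)
# Lightly loaded / near-full case.
worst = chain_efficiency(0.80, 0.60, 0.80)

print(f"{best:.1%} {worst:.1%}")   # 56.7% 38.4%
```

The OP's measured 50% round trip lands squarely between these two bounds, which is consistent with light loads on a lead-acid bank.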
 

upnorthandpersonal

Administrator
Staff member
Moderator
Joined
Dec 25, 2019
Messages
3,259
Location
63° North, Finland
The OP has very small loads if he uses 10 kWh in 10 days, which puts all of the components in a really inefficient operating range.
With a larger load, a sort of best-case efficiency would be:
Charger 90%
Lead-acid battery 70% (70% energy efficiency as measured in kWh; coulombic or charge efficiency measured in Ah would be better than that)
Inverter 90%
0.9 x 0.7 x 0.9 = 56.7%

With low loads and a lead-acid battery nearing full charge it might be a lot worse:
Charger 80%
Battery 60%
Inverter 80%
0.8 x 0.6 x 0.8 = 38.4%

Lead acid just sucks for an application like this, and even with a Li battery the idea is dead on arrival, as battery cycle costs typically exceed the electricity saved.

Yeah, I missed the 10-day aspect; I thought it was a test over just one day...
 