diy solar

Dual Inverters for low and high wattage

  1. It's a 26 foot boat. I can put (2) 100 watt panels on it. Not 3, not 4. There is no usable/clear location! I suspect a van would have similar constraints.

Hope you're using the highest-efficiency panels to pack the most watts you can into that space. My older ones are 12% to 14%, newer are 20%, so by replacing them I get about 50% more in the same space.

Also, does/would an MPPT charge controller harvest the most power for you? With PWM, panel voltage is pulled down from Vmp = 17V or so to 13.5V or whatever battery is, and the delta voltage is lost power. If the panels are oriented identically so they can be wired in series to get the voltage headroom needed for MPPT, it will extract the most power over a wide range of conditions.
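To put rough numbers on that PWM-vs-MPPT delta, here's a minimal sketch. The Vmp, battery voltage, and converter efficiency below are illustrative assumptions, not measurements of any particular panel or controller:

```python
# Rough PWM vs. MPPT harvest comparison for a nominal 100 W panel.
# All constants are illustrative assumptions, not measured values.
VMP = 17.0        # panel voltage at its max power point (V)
P_MAX = 100.0     # panel rated power at Vmp (W)
V_BATT = 13.5     # battery voltage during charging (V)
MPPT_EFF = 0.95   # assumed DC-DC conversion efficiency of the MPPT stage

i_mp = P_MAX / VMP          # panel current at the max power point (A)
p_pwm = V_BATT * i_mp       # PWM drags the panel down to battery voltage
p_mppt = P_MAX * MPPT_EFF   # MPPT holds the panel at Vmp, minus losses

print(f"PWM:  {p_pwm:.1f} W")   # ~79 W
print(f"MPPT: {p_mppt:.1f} W")  # ~95 W
```

Even after converter losses, holding the panel at Vmp recovers most of the "delta voltage" power that PWM throws away.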

My whole motivation for this thread was reading that inverters are most efficient near their rated output, and that a large inverter running at low output would be far more wasteful than a low-wattage inverter. That's all that prompted it. I was hoping that someone here could definitively confirm whether this was solid engineering or just marketing hype.

Is that what the data shows for the inverters you're looking at? The ones I know of peak around 25% load.
(Which would make 80W from the 300W inverter you're looking at ideal, except for the 85% actual efficiency you determined.)

The 65% figure was for Snoobler's 5kW inverter delivering 71W. (Not bad, that's almost 4x as big as your 1500W.)
Your interpretation of marketing "up to" numbers calculated out to 108W/84W consumed for large/small inverters.
Since your measured figure for the small inverter differed considerably, it would be interesting to see how the large inverter measures.

What I've seen for multiple models/brands is that power wasted is power consumed at zero output plus a term which is proportional to the square of output power. Resistive losses in the output transistors and copper windings go as the square of current.

For my large battery inverters, here's the efficiency curve.
At 100% (6000W), efficiency is 91% to 92% depending on battery voltage.
At 25% (1250W), it is 95% to 96%
At 10% (600W), it is 93%
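That two-term loss model (idle draw plus a term proportional to output power squared) is easy to sketch. The idle draw and loss coefficient below are made-up values picked to land near the curve above, not specs for any real inverter:

```python
# Quadratic inverter-loss model: wasted power = idle draw + k * P_out^2.
# P_IDLE and K are illustrative assumptions, not measured figures.
P_IDLE = 40.0  # zero-output consumption (W), assumed
K = 1.4e-5     # resistive-loss coefficient (1/W), assumed

def efficiency(p_out: float) -> float:
    """Output power divided by input power under the loss model."""
    p_loss = P_IDLE + K * p_out ** 2
    return p_out / (p_out + p_loss)

for p_out in (600, 1250, 6000):
    print(f"{p_out:>5} W: {efficiency(p_out):.1%}")
```

With these assumed constants the model lands near 93% at 600 W, 95% at 1250 W, and 92% at 6000 W, reproducing the shape above: efficiency peaks at partial load and rolls off at both extremes.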

[Attachment: SI 6048 efficiency.jpg (SMA Sunny Island efficiency curve)]
 

I just knew that if we got on the same page, I'd learn a lot from you. I am an engineer, but the wrong kind for this forum: I can do a DC circuit, but push much beyond Ohm's law and I go blank. The fact that you have a graph is a shocker, considering how long I have been dumpster diving on the Internet. Not one Inverter I looked at (Amazon, company sites, manuals) had one stinking graph. I can think of two reasons for that...

(1) I'm looking at low-end inverters. As mentioned above, I went away from the gold standard of Victron. But that's Amazon's fault: they showed no hits for tiny Inverters, and I falsely assumed that Victron had gotten out of the rat race of the low-end market.

(2) The type of Inverters you are familiar with are large, stationary home units. I can easily imagine that for homes, the design might be optimized toward the usage statistics of a household, so that efficiency at 25% of max saves more over the long run. Whereas the low-end stuff I'm looking at is geared to hang off a car battery and optimized for peak efficiency at the max rating. Just guessing, but I'd do that if I couldn't make them 95% efficient across the board.

Either way, I cannot point to some graph or website that confirms my stipulation of efficiency at the high end of a unit's capability. I do recall several posts on this forum that led me to that conclusion. ALSO, the test I did shows a significant gain toward the high end. Noteworthy that at 65 watts it had the worst efficiency, even though the fan was not turned on! The best efficiency was at the high end, just below its maximum output, with the fan going full tilt.

Looking at your numbers and graph... I think #2 is more likely: the average buyer of your devices is sophisticated enough to recognize the long-term advantage of efficiency, and to recognize that the unit is rarely pushed to its maximum capacity. Whereas these car inverters are optimized for their maximum rating and for production cost.
 
The manuals I'm finding for small Victron inverters don't have efficiency curves


The manual for my SMA Sunny Island is where I got that curve


Their grid-tie inverters have a similar curve, but it doesn't roll off as steeply


As you noted, an inverter powering a house sees a modest load most of the time, but can deliver more when needed. A surge of 2x rated wattage only runs for seconds (that's 4x the heating in the transistors).
PV inverter isn't running as many hours, and is near peak for a while - its size is matched to the array.

The Outback sealed 2012 model might be interesting for your application. But no graphs there, either.

 
I segregated this one out as I don't want to water down this thread's primary purpose and make it about my system only. The primary focus is the comparison of low- versus high-wattage Inverters. If you read my posts, I'm trying to educate as quickly as I'm learning from you experienced members.

I do fully understand that to be truly efficient, all aspects need to be evaluated. Using an incandescent bulb would totally defeat the purpose of better solar panels. It is a complete optimization: energy creation and energy storage are just as important as (this topic) energy transformation.

For this thread, I do want to avoid the influence of price on the equation, just so I (and others) can know the "best". For my personal situation... obviously price does come into the equation. IOWs, I want to personally decide if something is "worth" purchasing.

But... to answer your question...

Panels - I took Will's videos as gospel and have purchased (2) Rich Solar 100 watt panels. The output vs. efficiency was certainly best in his video. I'm sure some might be better, but not significantly. Although typically a monocrystalline panel is more efficient, it is less tolerant of obstructions. In the dynamic environment of a sailboat, the panels are constantly changing in angle relative to the sun and in the amount of obstructions like sails, mast, booms, and lines. Poly is more tolerant.

Charge Controller - I'm not using a Victron. I found on their own forum that they are prone to interfere with marine VHF radios. Having to turn off the charge controller to hear/send on the radio is totally unacceptable to me. Life threatening, deal breaker. I found a small vendor specializing in this field. Fortunately, efficiency is not sacrificed. Although these are marketing numbers, they claim... 197% over competitor PWM solutions and even 17% over the best of the MPPT competitors. But only for small systems... 12/24V, 300 watts max. Perfect for me, but not your level of systems.
 


Thank you for the suggestions. I really need to get my cells from China (group buy thread) and do the test with the big 2200 watt Inverter (big in my case, tiny in yours) to see if there is a significant efficiency advantage. As stated... adding the 300W Inverter to my boat ONLY costs about $100 (inverter, wiring, and my time to install). If it doesn't justify even that piddling amount, I'll investigate your better solutions. Thanks.
 
As a follow up, what is the highest load that you have had successfully running on the Bestek?
I've not approached the "300" watt level. More like 145 watts to date (TV, refrigerator with the compressor off, etc.).
 

I think it is highly dependent on not just the inverter, but the devices being loaded. On my 300 watt inverter, there are many things that shut it off immediately even though on paper they should not have... but I got up to 294 watts using various bulbs. Point being... check out the "surge" or "in-rush" current of various devices. Anything using a motor is BAD, anything using a transformer (microwave) is BAD. Creeping up on the rating with low-surge loads is what the manufacturers seem to be doing in their specs.
 
I'm sounding like a grumpy old man... snapping at other forum members, questioning marketing data, wary of "standards".
Maybe it's time to unplug and sail away to tropical oblivions.
It's right to be wary; tech companies (cough cough, Apple especially) have an atrocious record of pushing their own standards over universal open standards, which hurts consumers and hurts the environment. USB-C PD, if it is successful, is a potential break from that: it's an open standard, not just another alternative from Apple, Qualcomm, Samsung, etc., and it is catching on. I believe even Apple is migrating toward it, at least with their thin/light laptops. The Pixel has used the standard for a few years now, and I have read that Google will soon mandate that downstream OEMs support the standard (which means virtually every smartphone maker outside of China and Apple). This is why I am excited about it: it's (1) open/universal, (2) actually catching on, (3) able to work with a wide range of devices, and (4) I believe it has a fallback mode to normal USB charging (5V 2A?) for unsupported devices.

But there are devils in the details and in implementation, and I'm still pretty ignorant of the power-efficiency side of things (though this would apply to AC and DC chargers alike, I think), and pretty ignorant of how battery management works in devices like laptops and smartphones. I believe with my laptop the power management and battery management reside in the laptop, not the charger. As for the phone, I'm unsure, but Google has said (for the Pixel 4):
“Pixel 4 is compatible with all chargers and cables that comply with the USB Type-C Cable and Connector Specification”
 
DC-DC USB-C chargers seem to lag in technology compared to their AC versions, and even if current, whose standard are they current with? I see some that claim they have electronics for multiple standards... like this one. You use one port for Pixels and one for another "standard". And heaven forbid if you don't have the right cable and have to charge your Pixel with the other standard. Does it abuse the battery... etc.?
Yes, this is quite common, and probably will be for some time, since USB-A ports and alternative standards aren't going anywhere quickly. However, one thing I would point out: the charger you show doesn't have 'one port for the Pixel, one port for another standard'; it has 'one port for the open USB-C PD standard and one port for another proprietary/legacy standard.' This might sound like semantics, but I think it's not. The USB-C port on that charger has nothing to do with Pixel or Google specifically; the standard was developed by the organization behind USB and is meant to be open/universal. That organization has been horrible at branding and communication (USB 3.0, 3.1, 3.2, Gen1, Gen2... it's a mess). Let's hope they do a better job communicating this standard, and let's hope it catches on.
 
Thank you for clarifying! I thought PD stood for "Pixel Domination" :p

I bought this device... and hoped I'd always have a USB-C to USB-C cable.
 
The only two caveats to be mindful of with PD is
(1) Not all chargers support all voltages / power profiles. I believe high-power chargers are always compatible with the lower power profiles, but not vice versa; so a laptop charger could charge a phone, but usually not the other way around.
(2) There are many off-brands and non-brands that sell USB-C cables and chargers that cut corners or don't fully or properly implement the standard. From what I understand, cables/chargers certified by the USB-IF can be trusted, but certification costs money, so those chargers tend to cost more; uncertified ones are more of a crap shoot, with some meeting the technical standard and others not. The Pixel phones apparently can detect this and self-protect (refuse to charge or refuse to quick charge); I'm not sure if other devices do as well.
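For point (1), the fixed-voltage levels usually quoted for USB-C PD sources look roughly like the table below. Treat the exact numbers as my reading of spec summaries rather than authoritative; check the USB-IF documents before relying on them:

```python
# Commonly cited USB-C PD fixed-voltage source levels (my reading of spec
# summaries; verify against USB-IF documentation before relying on these).
PD_FIXED_PROFILES = {
    5.0: 3.0,   # 5 V  up to 3 A ->  15 W
    9.0: 3.0,   # 9 V  up to 3 A ->  27 W
    15.0: 3.0,  # 15 V up to 3 A ->  45 W
    20.0: 5.0,  # 20 V up to 5 A -> 100 W (needs a 5 A e-marked cable)
}

def max_watts(volts: float) -> float:
    """Max power a source advertising this fixed voltage can deliver."""
    return volts * PD_FIXED_PROFILES[volts]

# Backward compatibility, per point (1): a 20 V laptop charger also offers
# the lower levels for a phone, but a 5 V-only phone charger can't offer 20 V.
print(max_watts(20.0))  # 100.0
```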

edit: sorry I'm dragging this conversation off topic.
 
I just saw a video of the new Jackery with a PD port. It can charge at almost 60W through the wall wart. If you had a DC source for that, it makes a pretty incredible supply through a USB cable.
 
As Enya said, "Sail away, sail away, sail away". I did, on a Taswell 43. Unfortunately, I had to come back... LOL
 

Actually... no fret. Until I actually am able to get my cells, build my battery, get my 2200 watt Inverter, run the test... there is a temporary lull on this thread...

Party on!
 
Hi, I have two MPP 5kW inverters in parallel, but I want to conserve power and only rarely need both units on.
I have installed a current-sensing relay to turn on the second unit when the current gets over a specific threshold, plus an additional timer relay to keep the second inverter on for a minimum time after the current threshold was last sensed.
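The relay logic described above (turn on above a current threshold, hold on for a minimum time after the last trigger) can be sketched as a tiny state machine. The threshold and hold time here are made-up placeholders, and a real install does this with hardware relays, not software:

```python
# Toy model of the current-sensing + timer relay scheme for a second inverter.
# THRESHOLD_A and HOLD_S are illustrative values, not recommendations.
THRESHOLD_A = 30.0  # load current that brings the second inverter online (A)
HOLD_S = 300.0      # minimum on-time after the last over-threshold reading (s)

class SecondInverterSwitch:
    def __init__(self):
        self.on = False
        self.last_trigger_s = None

    def update(self, current_a: float, now_s: float) -> bool:
        """Feed one current sample; returns whether the second inverter is on."""
        if current_a >= THRESHOLD_A:
            self.on = True
            self.last_trigger_s = now_s          # restart the hold timer
        elif self.on and now_s - self.last_trigger_s >= HOLD_S:
            self.on = False                      # hold time expired
        return self.on

switch = SecondInverterSwitch()
switch.update(35.0, 0.0)    # heavy load: second unit comes on
switch.update(10.0, 100.0)  # load dropped, but still inside the hold window
switch.update(10.0, 400.0)  # hold expired: second unit shuts back down
```

The timer relay is what prevents the second unit from chattering on and off when a load hovers around the threshold.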
 
I'm looking at low-end inverters.
Here is an alternate consideration to the low- vs high-wattage question.

High-frequency inverters are cheap, lightweight, and very efficient, especially on standby. My Reliable inverter seems to be "reliable"; however, others have had issues. It will be fine for powering your minor electronics.

Low-frequency inverters are expensive, heavy, and not efficient, especially on standby. They are usually very reliable. Your microwave may require a LF inverter to run. UPS units (yes, I have a bias) are LF and can be purchased inexpensively used. I bought a used 800W APC that has plenty of surge capacity to start my full-size fridge. Look for "XL" versions rated for continuous operation, easily identified by the external battery connector.

I would recommend dedicating a LF/UPS inverter to the microwave and a HF for everything else. Turn off when not using the microwave. If the Reliable inverter fails you still have the LF/UPS as a backup.
 
Some good ideas and points to ponder in the above. I love discussions like this.

The only thing I would point out is that idle power draw =/= efficiency. It is one (important) part of the efficiency picture, but conversion efficiency can be an equally or more important factor depending on the context. I'm not sure (genuinely unsure) that HF inverters have any advantage in that context, and cheap HF inverters usually perform worse on peak conversion efficiency (~80-90% as opposed to the low-to-mid 90s for quality inverters).
 
I wanted to close the circle on this topic... at least to my satisfaction. :)

I finally got the Lishen cells and finished all the top balancing and capacity testing. I returned to do this test and check efficiencies.

Theory - That small Inverters might handle smaller loads more efficiently than a large Inverter trying to support that same small load. In general, pure sine wave Inverters tend to be most efficient toward their stated maximum values. Companies' engineers assume that if you purchased a 2200 watt Inverter, it'd be better for it to be efficient there than at 70 watts, so they optimize at the 2200 watt number. This held true in the earlier 300 watt Inverter test: EVEN at near 300 watts output with the fan going full tilt, the efficiency was significantly better than at 70 watts with no fan running.

Test - Using the same LiFePO4 battery, use two different Pure Sine Wave Inverters (300 watt and 2200 watt units) to power a 70 watt load. Calculate their efficiency. Each Inverter ran for three hours to improve the accuracy of the test equipment.
[Attachments: 2200W Test.jpg, 300W Test.jpg]

Results
[Attachment: 1617409576736.png (results table)]

Interpret
For example - If you have a usable 100 Ah battery, you can run the 70 watt device for:
300 watt Inverter: 100 Ah * 12 volts / 70 watts * 86% = 14.7 hours
2200 watt Inverter: 100 Ah * 12 volts / 70 watts * 78% = 13.4 hours
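The runtime arithmetic above checks out; spelled out as code (same numbers as the example, nothing new assumed):

```python
# Runtime estimate: usable battery energy divided by the load, scaled by
# the measured inverter efficiency at that load.
CAPACITY_AH = 100.0  # usable battery capacity (Ah)
V_BATT = 12.0        # nominal battery voltage (V)
LOAD_W = 70.0        # AC load (W)

def runtime_hours(inverter_eff: float) -> float:
    return CAPACITY_AH * V_BATT / LOAD_W * inverter_eff

print(f"300 W inverter  (86%): {runtime_hours(0.86):.1f} h")  # 14.7 h
print(f"2200 W inverter (78%): {runtime_hours(0.78):.1f} h")  # 13.4 h
```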

Summarizing
  1. Is the efficiency improvement valid? - For these specific Inverters, yes. For all Inverters... probably for higher-end models; for modified sine wave Inverters, probably not. My only other data point, for top-of-the-line Inverters, is a $2000 Victron 5kW Inverter that was only 65% efficient at 70W. It is probably safe to say that as the maximum capacity increases, the efficiency at low wattage gets worse.
  2. This test is certainly not valid for modified sine-wave Inverters. These are also harmful to sensitive electronics and motors.
  3. Should you use a two-Inverter solution? For myself, I already have both, and having two gives a level of redundancy. If I were back before the purchase, I certainly would not do it. In fact, I will probably only mount the large one and simply expose a 12 VDC socket in case the first one goes out OR I'm in a critical situation where I need to be more efficient.
 
Another solution is to use a current sensor to turn on the second inverter: reduce the standby losses and still have ample power when you need it.
 
