Is there a rule of thumb or standard assumption for estimating the actual (average real-world) watt-hours supplied to the battery bank from nominal panel wattage, for horizontally mounted panels?
- I remember years ago, the conventional wisdom on an expedition vehicle forum I frequented was [nominal panel wattage] x 0.7 (70%) = expected panel output in average clear conditions.
- I know I've heard Will mention he assumes a ballpark 50% efficiency from panel wattage to actual power delivered to devices (that figure accounts for all losses, including inverters, converters, etc., and isn't what I'm interested in here).
- I believe AM Solar uses [total array wattage] x 0.9 (90%) / battery voltage = charging current to size the output side of the MPPT controller (this also isn't what I'm looking for, and I believe it assumes close to full panel output). Rough numbers for all three rules are sketched below.
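For concreteness, here's how I understand those three rules in code form. The 400 W array and 12 V bank are made-up example figures, and the factors are just the ones quoted above, so treat this as a sketch rather than anything authoritative:

```python
# Hypothetical example: 400 W nameplate array, 12 V battery bank.
NOMINAL_W = 400
BANK_V = 12

# 1) Expedition-forum rule: ~70% of nameplate in average clear conditions.
clear_sky_watts = NOMINAL_W * 0.70           # -> 280 W at the panels

# 2) Will's ballpark: ~50% of nameplate actually reaching the devices,
#    i.e. after charge controller, battery, and inverter losses.
watts_to_devices = NOMINAL_W * 0.50          # -> 200 W at the loads

# 3) AM Solar sizing rule (as I understand it): array watts x 0.9,
#    divided by bank voltage, to size the MPPT output current rating.
mppt_output_amps = NOMINAL_W * 0.9 / BANK_V  # -> 30 A controller

print(f"Clear-sky panel output : ~{clear_sky_watts:.0f} W")
print(f"Power reaching devices : ~{watts_to_devices:.0f} W")
print(f"MPPT output sizing     : ~{mppt_output_amps:.0f} A")
```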
Obviously any formula would only give a rough estimate and would vary with temperature and other factors. But I'm sure there must be a formula people use to determine how many watts of panels are needed to supply a certain number of watt-hours, and I would imagine there's a version that pertains specifically to horizontally mounted panels.
Can anyone enlighten me or point me towards what I'm looking for?
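For what it's worth, the kind of formula I'm picturing would look something like the sketch below. The peak-sun-hours and derating values are placeholders I made up for illustration, not numbers I'm confident in; getting the right values is exactly what I'm asking about:

```python
# Back-of-envelope daily yield for a horizontally mounted array.
# Both defaults below are assumptions, not measured or recommended values:
#   peak_sun_hours - horizontal-surface insolation for the site/season
#   derate         - lumped losses (temperature, wiring, charging, soiling)

def daily_wh_to_battery(nominal_w: float,
                        peak_sun_hours: float = 4.5,
                        derate: float = 0.70) -> float:
    """Estimate watt-hours per day reaching the battery bank."""
    return nominal_w * peak_sun_hours * derate

# Example: 400 W of panels, 4.5 peak sun hours, 70% derate -> 1260 Wh/day.
print(f"{daily_wh_to_battery(400):.0f} Wh/day")
```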