Reactive power compensation with hybrid inverter (no grid sell)

Perhaps my inverter has been doing it and I never knew. Interesting.

Thank you very much for this, it makes a whole lot of sense now.

Basically what you are saying is that instead of letting my loads pull power from the grid, where the meter can easily account for the active/reactive power, in low-PV-production scenarios the loads should be powered from the battery. This works because the grid power drawn to charge the battery does not generate reactive power costs, right?

If so, then it's pretty intuitive that everything can work like this with a conventional off-grid inverter.
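If it helps, here is a rough back-of-the-envelope sketch of that idea (all numbers invented, loosely matching the ~28 kWh/day mentioned below): the meter registers the load's kVArh when the loads are fed directly, but essentially none when the grid only charges the battery at PF ≈ 1.0.

```python
import math

def reactive_kvar(p_kw: float, pf: float) -> float:
    """Reactive power (kVAr) drawn by a load of p_kw at the given power factor."""
    return p_kw * math.tan(math.acos(pf))

# Invented shop profile: 1.2 kW average at PF 0.80, 24 h/day (~28.8 kWh/day).
p_load, pf_load, hours = 1.2, 0.80, 24

# Case 1: loads fed straight from the grid -> meter registers the load's kVArh.
kvarh_direct = reactive_kvar(p_load, pf_load) * hours      # ~21.6 kVArh/day

# Case 2: grid only charges the battery at PF ~1.0; battery feeds the loads.
# Slightly more kWh are drawn (assumed ~90% round trip), but ~0 kVArh register.
kwh_direct = p_load * hours                                # 28.8 kWh/day
kwh_buffered = kwh_direct / 0.90                           # 32.0 kWh/day
kvarh_buffered = 0.0

print(f"direct:   {kwh_direct:.1f} kWh/day, {kvarh_direct:.1f} kVArh/day")
print(f"buffered: {kwh_buffered:.1f} kWh/day, {kvarh_buffered:.1f} kVArh/day")
```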

BUT, the issue arises when the batteries reach the minimum SoC. At this point, with grid charging enabled, would the current flow in series, like Grid > Battery > Load, or would it rather run in parallel, like Grid > Battery (until max SoC) and Grid > Load?
As others said, in parallel, but there is nothing stopping you from using a separate charger. At 28 kWh per day that's below what my house uses. A single extra AIO connected to the same battery bank could do all the charging, or, as you mentioned, if there are AIOs with double conversion, that sounds like a good idea.

However, consider the below
Regarding loads, it's a local shop full of ice, food and drink refrigerators, and occasionally air conditioning, with around 850 kWh monthly consumption.
So these are all inductive loads (I assume no inverter refrigerators, perhaps wrongly so?)

If so, a problem you might face is much higher startup power than running power, so you may end up having to oversize your system. How much is the important question. Do you plan expansion in the future? If so, probably choosing AIOs that can be paralleled will be important too.
 
I’m kind of going back and forth on whether you have a conceptual gap, or have some really good ideas. Probably a combination of both.

When you operate in parallel with the grid and don't feed any fancy VAr-correction input to the inverters, your active and apparent power draw will go down (active heading toward zero, reactive unchanged), since the inverter will push at PF 1.0 as measured at its output. Active power can actually go negative if you're not using zero export and have enough solar, so the resulting PF at the meter is unknown, but often it will be worse.
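To make that concrete, here is a small sketch (made-up load numbers) of what the meter sees as a PF-1.0 solar injection ramps up against an inductive load:

```python
import math

def meter_view(p_load_kw: float, q_load_kvar: float, p_solar_kw: float):
    """Net active/reactive/apparent power and PF at the meter when the
    inverter injects p_solar_kw at PF 1.0 (no reactive contribution)."""
    p_net = p_load_kw - p_solar_kw   # solar offsets active power only
    q_net = q_load_kvar              # reactive demand is unchanged
    s_net = math.hypot(p_net, q_net)
    pf = p_net / s_net if s_net else 1.0
    return p_net, q_net, s_net, pf

# Load: 4 kW, 3 kVAr (PF 0.8). Watch PF degrade as solar ramps up;
# at 6 kW we are exporting and the PF flips negative.
for p_solar in (0, 2, 4, 6):
    p, q, s, pf = meter_view(4.0, 3.0, p_solar)
    print(f"solar={p_solar} kW -> P={p:+.1f} kW  Q={q:.1f} kVAr  "
          f"S={s:.1f} kVA  PF={pf:+.2f}")
```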
Most likely it's lack of knowledge and not a conceptual gap. :cry: Now your answer made things even worse for me, lol.

We kind of established so far that the best would be to use an off-grid inverter, while you are referring to "not using zero export". Are you saying that even with an off-grid inverter (with a battery bank) the PF, and hence the reactive power consumption, can get worse? (Stupid question, I know.)

To put it in easier terms for myself, my question is mainly whether nowadays off-grid inverters are "recognized" by the grid as an inductive load while pulling power to satisfy loads (when solar/battery is not sufficient). All I want to know is if an off-grid inverter is able to prevent the consumption of reactive power when pulling current from the grid.
 
@Luk88 I think I found a workaround for this, haha.

I am thinking of going for a Deye hybrid inverter and simulating an off-grid scenario in this way:

- connect loads to LOAD port (full UPS mode)
- connect grid to GEN port
- enable Time of Use settings and select Gen Charge (in reality it will take power from the grid wires)

When the grid is connected to the Deye inverter through the Grid port, it uses a bypass relay, which does not fix my issue: I end up charging the battery + powering loads with no double conversion. So I'm looking more into connecting the grid to the GEN port.

The Deye manual states that when the battery reaches a certain SoC which you set, a signal is given to the generator in order to charge the battery.
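As a rough illustration of that behaviour, here is a minimal sketch of SoC-threshold generator-start logic with hysteresis; the threshold values and the function interface are hypothetical, not Deye's actual settings:

```python
# Hypothetical SoC thresholds; check the Deye manual for the real
# setting names and behaviour.
GEN_START_SOC = 20   # signal the "generator" (grid on the GEN port) at/below this SoC (%)
GEN_STOP_SOC = 80    # drop the signal once the battery has recharged (%)

def gen_signal(soc: float, gen_running: bool) -> bool:
    """Hysteresis between start and stop thresholds, so the gen/grid
    input doesn't rapidly cycle around a single SoC value."""
    if soc <= GEN_START_SOC:
        return True
    if soc >= GEN_STOP_SOC:
        return False
    return gen_running  # in between: keep the current state
```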

Another alternative would be to use the GEN port as a micro-inverter input, but that's another option which I need to explore further.
 
@Luk88 I think I found a workaround for this, haha.

I am thinking of going for a Deye hybrid inverter and simulating an off-grid scenario in this way:

- connect loads to LOAD port (full UPS mode)
- connect grid to GEN port
- enable Time of Use settings and select Gen Charge (in reality it will take power from the grid wires)
This sounds like double conversion on a budget. The same power electronics are used as a charger to transfer energy from grid to battery (at PF = 1), and then at a later time to transfer energy from battery and solar to loads (at whatever PF the loads need).

my question is mainly whether nowadays off-grid inverters are "recognized" by the grid as an inductive load while pulling power to satisfy loads (when solar/battery is not sufficient)
I think the answer is that the inverter won't prevent the consumption of reactive power when it is feeding power through to loads from its grid or generator inputs. In this mode, all of the reactive power of the loads comes straight from the grid or generator inputs.

I don't have experimental proof of this, but when I stated it, others agreed with me. I think your cheap double conversion idea will only work if the inverter doesn't supply loads at the same time as charging the battery from the Gen port.
 
We kind of established so far that the best would be to use an off-grid inverter, while you are referring to "not using zero export". Are you saying that even with an off-grid inverter (with a battery bank) the PF, and hence the reactive power consumption, can get worse? (Stupid question, I know.)
Not using zero export means on-grid and pushing all generated power. I wanted to distinguish this case because it can flip the apparent power into a different quadrant.
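For reference, a tiny sketch of one common four-quadrant convention (sign conventions differ between meters, so treat this as illustrative only):

```python
def power_quadrant(p_kw: float, q_kvar: float) -> int:
    """Classify net power at the meter into four quadrants
    (one common convention; meters vary):
      Q1: importing active, importing reactive (typical inductive load)
      Q2: exporting active, importing reactive (solar export + inductive load)
      Q3: exporting active, exporting reactive
      Q4: importing active, exporting reactive (capacitive load)
    """
    if p_kw >= 0:
        return 1 if q_kvar >= 0 else 4
    return 2 if q_kvar >= 0 else 3

print(power_quadrant(4.0, 3.0))    # 1: normal inductive consumption
print(power_quadrant(-2.0, 3.0))   # 2: exporting at PF 1.0 flips the quadrant
```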

No statement is being made about the off-grid inverter. It should be clear that an off-grid inverter in double conversion exposes the charger's power factor, and the inverter stage is not involved.

If the off-grid inverter is running off the battery, it does nothing to the grid.

If the off-grid inverter is in bypass, the inverter is off and loads are passed through to the grid.

I don't think grid AC coming in on the Gen input or on the Grid input will be any different, except maybe for CT sensing positions and algorithms.

Maybe there is a misalignment of your terminology with mine, or of your country's terminology with mine.
 
Are we talking about loads below the AC-out/EPS port on a hybrid, or above AC-in?

A hybrid/off-grid inverter has a better shot at managing reactive power in the first config, because the reactive power has to pass through it. The second config is what the SMA white paper on reactive power control covers, with a battery-less grid-tie inverter. The reactive power will not pass through it.

I was thinking of the second config.

For the first config, I would suggest measuring it or asking the manufacturer.
 
In the old days a power plant operator had a big rheostat that they would adjust to control the voltage level output of their respective plant. For rotating generators this would increase or decrease the excitation current, which would have the same effect on the output terminals of their generator. Nowadays most if not all power plants (thermal, hydro, wind, solar and BESS) have automatic voltage regulators capable of regulating output in several ways.

A lot of this would have to do with the mode of voltage regulation being used. Most larger utilities run in Voltage Regulation mode. For my solar plant I tie into a 161 kV line and my voltage regulator is set to maintain 167 kV per my transmission operator's request. If the voltage goes below 167 kV my system will try to push VARs onto the system, and if it goes above, we absorb VARs. This is the way most utilities, probably 99%, run their systems.
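As a toy illustration of that behaviour (the droop gain and VAr limits below are invented, not my plant's actual settings):

```python
def var_command(v_meas_kv: float, v_set_kv: float = 167.0,
                droop_mvar_per_kv: float = 50.0,
                q_limit_mvar: float = 100.0) -> float:
    """Toy volt-VAr regulator: push VARs when voltage is below the set
    point, absorb when above, proportional to the error (with limits)."""
    q = (v_set_kv - v_meas_kv) * droop_mvar_per_kv
    return max(-q_limit_mvar, min(q_limit_mvar, q))

print(var_command(165.0))  # +100 MVAr (clipped): inject to raise voltage
print(var_command(168.0))  # -50 MVAr: absorb to pull voltage down
```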

Two other modes of operation are Power Factor Control and Reactive Power Control. Utilities are not allowed to run in these modes without an exception from their transmission operator. These types of regulation put the responsibility of maintaining a given system voltage on others and can be used to maximize MW production. I could definitely see a utility's interest in making sure a large number of residential solar systems are not run in these modes, as it could cause grid instability at the 120/240 VAC level.

On the user's problem of being charged for VAR usage: I am unsure of how a grid-supplemented system is going to react, but I am going to go out on a limb here and say your system is probably set to a stable 120/240 VAC output. The problem is that your tie point at your meter will vary throughout the day, and depending on how close you are to the substation feeding you or any other generating assets, your system will still use VARs if your set point is below the grid's current level. While technically your system is still not using watts, the grid still wants to boost you to its level as long as you are connected. So, if it is running at 122/244 VAC, it will try to push you up.

A possible solution, if your system will allow it, is to change your voltage regulator to output 121/242 or 122/244 VAC; that should reduce VAR usage as long as system voltage remains below your chosen set point. You walk a fine line with this, as you don't want to go too high and possibly damage your in-house equipment by putting too much voltage to it. If you were truly grid-tied with the ability to back-feed the grid, this settings change would have to be discussed with your power company, as they wouldn't want you boosting system voltage in a way they have no control over. Granted, it is fairly common for utility generators to run a slight VAR output to make up for line losses.
 

It seems to already be at the same stage in Europe. Note that there is no Europe-wide national grid from a regulatory point of view. European countries are like the US states in this respect: interconnected electrically, but they would each have to ratify EN50589.
I finally found time to go through that article. It is from 2019, and it is pretty amazing to see my country (Poland) barely register in that report: we had 0.5 GW of PV power in 2019 (which works out to 14 W per person, significantly more than the 2 W the article claims, but maybe they only took large plants into account) versus over 18 GW now. So installed power grew 36 times in the last 5 years...

Coming back to the topic, my tiny on-grid inverter meets the NC RfG requirements, but it seems power factor correction is only required for "type B" sources, i.e. over 1 MW. Definitely not residential systems (yet?).

In the old days a power plant operator had a big rheostat that they would adjust to control the voltage level output of their respective plant. For rotating generators this would increase or decrease the excitation current, which would have the same effect on the output terminals of their generator. Nowadays most if not all power plants (thermal, hydro, wind, solar and BESS) have automatic voltage regulators capable of regulating output in several ways.

A lot of this would have to do with the mode of voltage regulation being used. Most larger utilities run in Voltage Regulation mode. For my solar plant I tie into a 161 kV line and my voltage regulator is set to maintain 167 kV per my transmission operator's request. If the voltage goes below 167 kV my system will try to push VARs onto the system, and if it goes above, we absorb VARs. This is the way most utilities, probably 99%, run their systems.

Two other modes of operation are Power Factor Control and Reactive Power Control. Utilities are not allowed to run in these modes without an exception from their transmission operator. These types of regulation put the responsibility of maintaining a given system voltage on others and can be used to maximize MW production. I could definitely see a utility's interest in making sure a large number of residential solar systems are not run in these modes, as it could cause grid instability at the 120/240 VAC level.

On the user's problem of being charged for VAR usage: I am unsure of how a grid-supplemented system is going to react, but I am going to go out on a limb here and say your system is probably set to a stable 120/240 VAC output. The problem is that your tie point at your meter will vary throughout the day, and depending on how close you are to the substation feeding you or any other generating assets, your system will still use VARs if your set point is below the grid's current level. While technically your system is still not using watts, the grid still wants to boost you to its level as long as you are connected. So, if it is running at 122/244 VAC, it will try to push you up.

A possible solution, if your system will allow it, is to change your voltage regulator to output 121/242 or 122/244 VAC; that should reduce VAR usage as long as system voltage remains below your chosen set point. You walk a fine line with this, as you don't want to go too high and possibly damage your in-house equipment by putting too much voltage to it. If you were truly grid-tied with the ability to back-feed the grid, this settings change would have to be discussed with your power company, as they wouldn't want you boosting system voltage in a way they have no control over. Granted, it is fairly common for utility generators to run a slight VAR output to make up for line losses.
There are lots of overvoltage issues around here, mainly because it is common to have entire streets on a single (3-phase) transformer. So on that street, if every 4th house has ~7 kW of solar on average, the local voltage very quickly goes above the cutoff point of 253 V on one or more phases, and everyone's inverters switch off. Then the voltage falls and the cycle repeats.

It is purely a localised thing, so it is not accompanied by any increase in frequency. People have been asking for seasonal switching of transformer taps (to drop summer's voltage by a few volts) for ages. However, stopping everyone's production is in the utility's interest, so nothing is done about it.

I wonder if there is some mechanism by which inverters "sinking VARs" can contribute to lowering the voltage while maintaining the ability to push active power across an undersized transformer? I doubt it, as the overvoltage is a result of too much active power rather than anything to do with the power factor, but I'd love to be wrong on this.
 
I wonder if there is some mechanism by which inverters "sinking VARs" can contribute to lowering the voltage while maintaining the ability to push active power across an undersized transformer? I doubt it, as the overvoltage is a result of too much active power rather than anything to do with the power factor, but I'd love to be wrong on this.
...
If the voltage goes below 167 kV my system will try to push VARs onto the system, and if it goes above, we absorb VARs. This is the way most utilities, probably 99%, run their systems.
It should basically work the same way, if you assume the bottleneck is the transformer's leakage reactance, as opposed to the resistance.
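A rough sketch of why, using the usual feeder voltage-rise approximation dV ≈ (R·P + X·Q)/V. The impedances below are invented, but with X dominating, absorbing VArs (negative Q) offsets the rise caused by exported active power:

```python
# Rough per-phase voltage-rise estimate: dV ~ (R*P + X*Q) / V.
# Impedances and powers are made up; the point is the sign of the X*Q term.
R, X = 0.05, 0.25        # ohms: reactance-dominated transformer + line
V = 230.0                # volts at the transformer

def voltage_rise(p_export_w: float, q_var: float) -> float:
    """Approximate voltage rise at the customer end; p_export_w is the
    exported active power, negative q_var means absorbing VArs."""
    return (R * p_export_w + X * q_var) / V

print(voltage_rise(7000, 0))       # ~+1.5 V rise from active export alone
print(voltage_rise(7000, -2000))   # absorbing 2 kVAr claws the rise back
```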

The problem with rooftop PV is really that the system wasn't designed for reverse power flow. The voltage at the customer's terminals is already near the maximum 253V when unloaded. Nobody ever thought of negative load.

People have been asking for seasonal switching of transformer taps (to drop summer's voltage by a few volts) for ages. However, stopping everyone's production is in the utility's interest, so nothing is done about it.
I heard of a friend of a friend (who must remain nameless) who took advantage of an outage to climb the pole and crank the transformer to a lower voltage tap, in order to increase his PV production. He turned it the wrong way and killed his production completely.
 
There are lots of overvoltage issues around here, mainly because it is common to have entire streets on a single (3-phase) transformer. So on that street, if every 4th house has ~7 kW of solar on average, the local voltage very quickly goes above the cutoff point of 253 V on one or more phases, and everyone's inverters switch off. Then the voltage falls and the cycle repeats.

It is purely a localised thing, so it is not accompanied by any increase in frequency. People have been asking for seasonal switching of transformer taps (to drop summer's voltage by a few volts) for ages. However, stopping everyone's production is in the utility's interest, so nothing is done about it.

I wonder if there is some mechanism by which inverters "sinking VARs" can contribute to lowering the voltage while maintaining the ability to push active power across an undersized transformer? I doubt it, as the overvoltage is a result of too much active power rather than anything to do with the power factor, but I'd love to be wrong on this.
The same happens here in Romania; we have a similar infrastructure issue. We installed a system at home 2 months ago and the grid is always up at 243 V with no power being exported to the grid, so we have that much less room for selling before reaching the 253/264 V voltage ceiling.
 
I think I should've named this topic "How to prevent reactive power consumption by installing the right inverter".

Since some off-grid inverters take power in from the grid and output it to the loads as a pure sine wave, this being advertised for some inverters as double conversion, would it be safe to assume that this can solve my problem and prevent me from registering reactive power consumption?
 
I heard of a friend of a friend (who must remain nameless) who took advantage of an outage to climb the pole and crank the transformer to a lower voltage tap, in order to increase his PV production. He turned it the wrong way and killed his production completely.
I was surprised how easy it is to change. From my ground-mount transformer:
[Screenshots: tap-changer instruction label from the ground-mount transformer]
 
Yup, the only two things you can get wrong are turning it the wrong way, and turning it while the transformer is energised.
 
