Ripple in Double Conversion UPS

3-phase double conversion wouldn't (or might not) show that effect, at least not if all phases are loaded equally, and maybe not even otherwise, depending on the charge circuit. But the input of the charge circuit is probably just PF corrected and doesn't try to track/prevent battery voltage ripple from a non-constant load. If the loads have poor PF (e.g. cheap rectifier/capacitor front ends), that ripple would probably still be experienced by the battery even with 3 phase in and out.
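
As a quick numerical sanity check of that (nothing rigorous, just illustrative single-phase vs balanced three-phase instantaneous power, 230 V / 10 A / 50 Hz assumed):

```python
# Quick numerical check of the point above: with a balanced, unity-PF
# three-phase load, the summed instantaneous power is constant, so the DC bus
# (and the battery behind it) sees no 2x-line-frequency ripple, while a
# single-phase load of the same kind pulsates at twice line frequency.
# The values (230 V, 10 A, 50 Hz) are purely illustrative.
import numpy as np

V, I, f = 230.0, 10.0, 50.0            # RMS volts, RMS amps, line frequency
w = 2 * np.pi * f
t = np.linspace(0, 2 / f, 2000)        # two line cycles

def phase_power(shift):
    """Instantaneous power v(t)*i(t) of one phase at unity power factor."""
    v = np.sqrt(2) * V * np.sin(w * t + shift)
    i = np.sqrt(2) * I * np.sin(w * t + shift)
    return v * i

p1 = phase_power(0.0)                                        # single phase
p3 = sum(phase_power(k * 2 * np.pi / 3) for k in range(3))   # balanced 3-phase

print(f"single-phase: min {p1.min():.0f} W, max {p1.max():.0f} W")  # pulses 0..2*V*I
print(f"three-phase : min {p3.min():.0f} W, max {p3.max():.0f} W")  # essentially flat
```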
 
It's interesting/real enough for scientists to get grants to investigate and write research papers on this. Remember the ripple will cause charging and discharging chemical reactions if current alternately goes into and out of the battery. I don't have the background/time right now to look into whether this is just a jobs program or very high social value research.

I would imagine that for any new chemistry (or major update to the geometry/recipe/whatever) someone will want to create test cells and see what happens when they are subjected to this workload.


You don't need to make it sit at 100%; you can use a current shunt to stop at 80% and periodically charge to 100% to calibrate, without compromising UPS endurance.
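
As a rough sketch of that idea, the decision logic might look like this (hypothetical names only, not any real charger's API):

```python
# Hypothetical sketch of the policy above: hold the pack around 80% SoC
# (tracked by a shunt/coulomb counter) and only run a full charge to 100%
# every so often to re-anchor the SoC estimate. These names don't map to any
# real charger API; it's only the decision logic.
CAL_INTERVAL_S = 14 * 24 * 3600       # full-charge calibration every ~2 weeks
NORMAL_TARGET = 0.80                  # day-to-day SoC ceiling
last_full_charge_s = 0.0              # time of the last calibration charge

def choose_target_soc(now_s: float) -> float:
    """Return the SoC ceiling the charger should aim for right now."""
    if now_s - last_full_charge_s >= CAL_INTERVAL_S:
        return 1.00                   # calibration cycle: go all the way up
    return NORMAL_TARGET

def on_full_charge_detected(now_s: float) -> None:
    """Call when the shunt shows charge current tapering at full voltage."""
    global last_full_charge_s
    last_full_charge_s = now_s        # SoC estimate re-anchored at 100%
```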

Interesting. There are a lot of efforts to improve power supply efficiency and reduce component counts in electrical systems. On the consumer end, designers are always looking for ways to cut out capacitors and inductors to save $0.02 per build, especially in server and desktop power supplies.

I follow a professor in the UK on LI whose team does some very interesting work on grid level and higher end commercial switching power supplies and motor control.

One of his long term projects was a way to use high frequency switching to improve / support grid level sine wave quality and voltage sag by injecting just the right amount of power - real time - into sine wave imperfections and sort of moving around spikes / sags to make it all work.

It is similar in concept to how a 3 phase, variable frequency motor controller works, but with another level of sensing, real time control and power injection.

To be honest, for a long time I thought that it was just a hypothetical concept that only an EE prof would find useful, but it has made its way into the real world.
 
To be honest, for a long time I thought that it was just a hypothetical concept that only an EE prof would find useful, but it has made its way into the real world.
That is pretty cool. It is the much fancier version of the basic power factor shifting that large scale inverters have done for a while. And there is real money in it because large users are billed on power quality.

I wonder how much it works in real time and how much it relies on building a model (of various kinds) of the power harmonics. You could probably get pretty far by assuming a lot of the problems are periodic, repeating things from constantly running equipment, before you try to predict transient changes that always look the same once started but can start at random times.
 
One possible way that you might see ripple in a system is similar to what I ran into on a test stand experiment.

I was trying to test a power system built for a Sprinter, and part of it was a DC-DC charger pulling power from the starter battery using a Sterling 12-48 volt unit (one of the older models).

For the starter battery power source, I was using a size 24 Lifeline AGM fed by a 120 VAC Iota charger (as I had it sitting around), and the van battery pack was set up as a 48 volt AGM system.

The Sprinter's max pull on that model was 40 amps. The BTB charger was rated for a higher current, but it has a mode where it can be set to pull at 50%, which was within the limits.

So the interesting challenge was that when the Sterling was turned on, it would go from "off" to "full power" more or less instantly, with no soft start of any kind. It pulled so hard that the size 24 would drop below the low voltage cutoff of the Sterling and shut it off, and the 120 VAC charger could not respond fast enough to make up for the voltage drop and hold it stable.

Then it would all go into oscillation as the low voltage cut off turned on / off - a real pain.

I had to go to a larger battery on the input side to make it stable, and this gave me a real "hats off" to the people at Ford, MB and Promaster who are somehow able to make their alternator charging systems respond quickly to these highly variable power draws in vehicles.

_________

So I guess the simple condensed version of all of this is that if your battery pack is large enough for the load, the ripple will be small just by essentially electrochemical inertia. If the battery pack is relatively small compared to the load, then ripple is more likely to happen.

My brute force approach, based on somewhat simplistic testing, is to use size 27 batteries (which tend to have fairly fast response) and no more than 500-600 watts of load per battery. With this, ripple is very low.

A lot of the commercial portable power stations try to run at double this - so 1-2 kW from a battery like this - so clearly they are working those batteries a lot harder, and I would not be surprised if that is in fact damaging the batteries.
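
Putting rough numbers on that rule of thumb, assuming a ~90 Ah group 27 battery and ~90% inverter efficiency (both just illustrative):

```python
# Back-of-envelope numbers behind the rule of thumb above. The 90 Ah figure is
# a typical-ish group 27 rating assumed for illustration; adjust for your
# actual battery and inverter efficiency.
CAPACITY_AH = 90.0
NOMINAL_V = 12.0
INVERTER_EFF = 0.90

def c_rate(load_w: float) -> float:
    """Approximate discharge rate (in C) for a given AC load in watts."""
    battery_amps = load_w / (NOMINAL_V * INVERTER_EFF)
    return battery_amps / CAPACITY_AH

for load in (500, 600, 1000, 2000):
    amps = load / (NOMINAL_V * INVERTER_EFF)
    print(f"{load:>4} W load -> ~{amps:.0f} A from the battery, ~{c_rate(load):.2f} C")
# ~0.5-0.6 C at 500-600 W vs roughly 1-2 C at 1-2 kW: the pack is being
# worked about two to four times harder.
```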
 
Implying that cycling around 80% or perhaps 60% might not be as destructive?
If so this would be because the chemical state at those levels is more amenable to switching back and forth.

To reiterate: if you set up the charge controller so that it tries to have 0 A into the battery while also sending some current to serve another load, I'm going to need reasons explaining why maintaining this balance at different SoC is easier. It might even be harder in a part of the voltage vs SoC curve where voltage is very flat and the charger is better controlled in constant current mode rather than trying to target a voltage.

And keep in mind the typical charge controller does not have access to separate shunt information about how much current is flowing into battery and into the load. It only sees the current drawn by the combination. So it’s hard pressed to implement a clever control loop to try to drive battery contribution and charging to zero.
 
So I guess the simple condensed version of all of this is that if your battery pack is large enough for the load, the ripple will be small just by essentially electrochemical inertia. If the battery pack is relatively small compared to the load, then ripple is more likely to happen.
Yes, I trust a big battery to be able to handle these surge loads better than a random SMPS. Because it’s a big dumb pile of chemical reactions.

So that means the load is well buffered from a sagging power source.

However it also means that the load induced an extra discharge/charge cycle on the battery. That may not have been expected. My belief was that this is the kind of ripple under discussion for the impact of double conversion on battery life.
 
That is pretty cool. It is the much fancier version of the basic power factor shifting that large scale inverters have done for a while. And there is real money in it because large users are billed on power quality.

I wonder how much it works in real time and how much it relies on building a model (of various kinds) of the power harmonics. You could probably get pretty far by assuming a lot of the problems are periodic, repeating things from constantly running equipment, before you try to predict transient changes that always look the same once started but can start at random times.

It was based on both. There is a fairly sophisticated algorithm and real time measurements.

Their mathematical model and real time controls are able to also deal with harmonics remarkably well.

I am not really qualified to explain it further, but it works better than I ever imagined, and was feasible due to the availability of high speed local processors and the switching speeds of modern GaN and SiC switching elements.

As renewables are added to the grid, and rotating mass is reduced, the grid in some parts of the world has become less and less stable and less able to deal with loads turning on / off. I believe that his team's work was at least in part to work on this issue.

There are some posters on LI from AUS who are involved with power management and buying/selling power to the grid there. That grid is less stable than you might think, while interestingly the most stable power grid in the world appears to be Russia's - probably due to it being powered mostly by large rotating generators (coal/hydro/nuke). The difference in stability was kind of amazing.
 
Yes, I trust a big battery to be able to handle these surge loads better than a random SMPS. Because it’s a big dumb pile of chemical reactions.

So that means the load is well buffered from a sagging power source.

However it also means that the load induced an extra discharge/charge cycle on the battery. That may not have been expected. My belief was that this is the kind of ripple under discussion for the impact of double conversion on battery life.

A voltage sag under load does not necessarily mean that the SOC has changed significantly.

Batteries are electrochemical systems. These reactions are fairly fast, but not infinite. Every chemical reaction is dependent on diffusion of the reactant to the reaction site, kinetics (speed of the reaction) and other reactive effects such as concentration, surface area and temperature.

Under a heavy instantaneous load, the ability of a battery pack to keep up (voltage stability) might fall temporarily below the short term capability, but still be within the long term capability. Part of this shows up in the design of starter batteries vs marine vs deep discharge and the discharge curves.

Batteries have effects like surface charges, some capacitance, and other interesting and complex effects that are not completely linear. They often appear to be linear to us as users because there has been so much optimization done over the past 100+ years and the volumes are so high.

One of the more interesting "applications focused optimizations" is the difference between a Lifeline marine AGM and a Rolls AGM battery even though they have similar sizes and general ratings. The Lifeline marine batteries tend to be capable of much higher charge and discharge currents, as they were really designed for the Navy and coast guard use. The Rolls batteries are built in Canada and have less capability in terms of charge / discharge rates, but work better in very cold weather than the Lifelines. It all makes sense when you think about the customer base and goals, but isn't really obvious unless you really dig into the data sheets.

Many Inverters have fairly large capacitors on the input stage to help deal with this effect.

If a person wanted to eliminate these input stage capacitors and have an absolute minimalist battery pack, one way to deal with this would be if the inverter had a built in soft start on the output. The downside of this is that it would probably trip off the appliance. I have seen this on coffee makers attached to inverters with idle / sleep modes, they started to go, and then the coffee maker would trip off before the inverter went to full power.
 
Batteries are electrochemical systems. These reactions are fairly fast, but not infinite. Every chemical reaction is dependent on diffusion of the reactant to the reaction site, kinetics (speed of the reaction) and other reactive effects such as concentration, surface area and temperature.
If the battery reacts faster than the charger when current is called for, then I think it would carry the load until the charger catches up. Actually, I'm not sure how a dumb charger is guaranteed to realize that it needs to supply more current in preference to the battery providing it. I can imagine both voltage based and SoC based (via comms) control lagging and letting the battery carry some of the load.
 
If so this would be because the chemical state at those levels is more amenable to switching back and forth.
I'm imagining this: that it is preferable to "hover" (my term for light discharge/charge cycling) within the sweet spot. I mean, it seems that cycling 5% of an LFP at 90-100% is harder on the chemistry than cycling around 50-60%, if one could manage that. But even 85-95% is still likely easier on the chemistry than 90-100%.
To reiterate: if you set up the charge controller so that it tries to have 0 A into the battery while also sending some current to serve another load, I'm going to need reasons explaining why maintaining this balance at different SoC is easier. It might even be harder in a part of the voltage vs SoC curve where voltage is very flat and the charger is better controlled in constant current mode rather than trying to target a voltage.
I'm not sure. Not sure what you mean by "easier". Easier on the chemistry? Anyway, perhaps it's not possible with voltage settings, as you say. But that's what I meant when I said I need to add a shunt. I need a reliable way to know my state of charge and find out where I can get it to hang out when using a "storage" voltage.

It seems that this idea works when we set up our solar charge controller to a certain float voltage. Like 13.3-13.4 on my 12V, somewhere in there, the battery tends to sit after settling from 14.x setting and discharging a bit as well. The SCC will show 0 or very low current. But if the sun is shining and some load kicks in, the current flow goes up. I'm nowhere near my RV but that's what I remember? Or am I hallucinating?

Now can I set that V and reliably land at 60% or 80% just by a voltage setting? Never really checked. V control within the flat zone seems unlikely, but there is a setting - maybe around 85% - where it's resting and not pulling amps in float, yet the charge controller responds to loads... I think. I hate to act certain when I'm going on memory.
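
For what it's worth, here is roughly why a voltage setpoint struggles in that flat zone, using approximate numbers assumed from typical published LFP resting-voltage curves (your cells will differ):

```python
# Rough illustration of the "flat zone" problem. These resting-voltage numbers
# are approximations assumed from typical published LFP open-circuit curves;
# real cells differ, and voltage under charge or load shifts further. The
# point is only how little the voltage moves between roughly 20% and 90% SoC.
APPROX_LFP_OCV = [   # (SoC fraction, approximate resting volts per cell)
    (0.10, 3.20),
    (0.20, 3.25),
    (0.50, 3.29),
    (0.80, 3.32),
    (0.90, 3.34),
    (0.99, 3.40),
]

for soc, v in APPROX_LFP_OCV:
    print(f"SoC {soc:4.0%}: ~{v:.2f} V/cell (~{4 * v:.2f} V for a 4S '12 V' pack)")
# Between ~20% and ~90% the spread is only a few hundredths of a volt per
# cell, which is why a shunt/coulomb counter, not a voltage setpoint, is the
# practical way to park the pack at 60% or 80%.
```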

With my new double-conversion AC-DC-AC system, I will observe what is going in and out of the BMS along with the charger over time. I might not be able to see the fine ripple from the SMPS push-pull, but I should be able to track the broad kind of ongoing discharge/recharge by watching the BMS, shunt and charger. I want to call this charge state "hovering" to distinguish it, but maybe low frequency (LF) ripple works as a term. It may happen over timescales of fractions of a second to fractions of an hour. I'll see if I can find a stable voltage setting. It seems like a constant pull/push of 10-20% SoC might be bad. This is the first time I've had a smart BMS, so it's cool to have a bit more of a window into what the battery pack is seeing.
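
If it's useful to anyone, a hypothetical logging sketch for that kind of observation, with the readout function left as a placeholder since every BMS/shunt exposes data differently:

```python
# Hypothetical logging sketch for watching that low-frequency "hovering".
# read_battery_current() is a placeholder: wire it to whatever your BMS or
# shunt actually exposes (serial, Bluetooth, etc.). Positive = amps into the
# battery, negative = amps out. Sign flips within a window are the slow
# charge/discharge ripple being looked for.
import csv
import time

def read_battery_current() -> float:
    raise NotImplementedError("connect this to your BMS/shunt readout")

def log_hovering(path="battery_current.csv", interval_s=1.0, hours=24.0):
    """Append timestamped battery current readings to a CSV for later review."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["unix_time", "battery_amps"])
        for _ in range(int(hours * 3600 / interval_s)):
            writer.writerow([time.time(), read_battery_current()])
            time.sleep(interval_s)
# Counting sign changes per hour in the CSV afterwards gives a feel for how
# often the pack is nudged through micro charge/discharge cycles.
```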

It's surprising to me that this is still not clearly known. I mean people must have been building LFP backup systems for a few years with some running them online without transfer switches.

Ultimately, if this is problematic I will just buy a power conditioner and a transfer switch and forget trying to use the backup power system as a conditioner. But that adds a couple hundred more in cost so if this can work for 8-10 years, great! Would rather put that into panels.
 
Not sure what you mean by "easier". Easier on the chemistry?
Easier on the charger control and balancing between charger and battery supply of current to load.

Looks like you are looking for a shunt for the normal value that a shunt provides (SoC monitoring for an LFP chemistry). Reading the shunt in the BMS can do this too if it's compatible with your SCC.

I don’t think it will float over such a big range of voltage if you have a SCC that can talk to a shunt to get the very high level SoC information and use the SoC level to control charge profile instead of voltage threshold.

I was talking about using the current reading from a shunt to try to dynamically zero out the charging completely. This is a hypothetical application, I doubt the components can do this.
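
Just to make the hypothetical concrete, the control idea would be something like this, assuming (unlike most real hardware) a charger whose output current can be adjusted on the fly plus a readable shunt on the battery branch:

```python
# Purely hypothetical, matching the caveat above: this assumes a charger whose
# output current can be adjusted on the fly and a readable shunt on the
# battery branch, which most off-the-shelf gear can't do. A slow integral-style
# nudge drives the battery current toward zero so the charger carries the load.
def zero_battery_current_step(charger_setpoint_a: float,
                              battery_shunt_a: float,
                              gain: float = 0.1,
                              max_charger_a: float = 15.0) -> float:
    """One control step. battery_shunt_a > 0 means current flowing into the battery."""
    # Battery charging -> charger is pushing too hard, back off a little.
    # Battery discharging -> charger isn't covering the load, push a little more.
    new_setpoint = charger_setpoint_a - gain * battery_shunt_a
    return min(max(new_setpoint, 0.0), max_charger_a)
```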
 
The Victron document was simplistic enough to be helpful to me.

One thing it pointed out, not even as directly as it could have, is that with poor battery wiring or an undersized battery bank, the lifetime of the input capacitors on the inverter will be reduced.

It also suggests, again not even stated as directly as possible, that if you simply put a meter on AC volts and measure your battery bank while the inverter is under full load, you would ideally see <0.8v of ripple. That gives us a simple test.

It also implies (via the words 'locked out') that the inverter will shut down if ripple on the battery circuit exceeds 1.5v. I'm curious as to whether that is because it is dangerous for the inverter, because the inverter cannot reliably correct the input conditions to the desired output conditions, or whether the ripple is simply a very accurate stress indicator on the battery circuit.
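
That does suggest a trivial way to encode the test, using the thresholds as I read them out of the Victron note (check your own inverter's documentation, these aren't a universal spec):

```python
# A trivial helper encoding the thresholds as read out of the Victron note
# above: under ~0.8 V AC across the battery terminals at full inverter load is
# treated as acceptable, and around 1.5 V is where lockout is described.
# Check your own inverter's documentation; these are not a universal spec.
def classify_ripple(ripple_vac: float) -> str:
    if ripple_vac < 0.8:
        return "OK: ripple within the suggested limit"
    if ripple_vac < 1.5:
        return "HIGH: check cable size, crimps, terminals and battery bank sizing"
    return "SEVERE: at or past the level where lockout is described"

print(classify_ripple(0.35))   # e.g. a well-cabled bank under load
```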

Pretty interesting discussion, although without some quantification of the effect on either battery or inverter life, it still seems to me that the only thing I've picked up that I can implement is 'test for it based on what Victron says and try to improve your wiring if the result is bad'. 🤷‍♂️
 
Easier on the charger control and balancing between charger and battery supply of current to load.

Looks like you are looking for a shunt for the normal value that a shunt provides (SoC monitoring for an LFP chemistry). Reading the shunt in the BMS can do this too if it's compatible with your SCC.
Yes, that's why a shunt. It appears that the BMS uses voltage to determine the charge state displayed. At least the JBD does, but I understand that JK and Seplos are the same. Someone suggested that it/they may incorporate measurement of power movement into the displayed charge state calculation, but that's speculation. What's certain is that if I change the parameters on the BMS' charge state to cell V, then my displayed charge state will revise instantly upon save.
I do have a budget shunt I bought for this system. Waiting for some parts to fab up cables & terminals on my final build and fit it into the circuit.
I don’t think it will float over such a big range of voltage if you have a SCC that can talk to a shunt to get the very high level SoC information and use the SoC level to control charge profile instead of voltage threshold.

I was talking about using the current reading from a shunt to try to dynamically zero out the charging completely. This is a hypothetical application, I doubt the components can do this.
Yes. That would be cool. Perhaps that can be done with the Victron VE bus (I think it's called) system. I have their BMV 712 on my vehicle and it has a relay output which I think can be triggered with charge state, among other variables.
 
The Victron document was simplistic enough to be helpful to me.
Yes. I found it easy to read.
One thing it pointed out, not even as directly as it could have, is that with poor battery wiring or an undersized battery bank, the lifetime of the input capacitors on the inverter will be reduced.
#1 takeaway. Low resistance is desirable for yet another reason.
It also suggests, again not even stated as directly as possible, that if you simply put a meter on AC volts and measure your battery bank while the inverter is under full load, you would ideally see <0.8v of ripple. That gives us a simple test.
Yes, nice tip. I intend to do that myself: an AC voltage check with the battery charging and a significant load on the inverter, then again with the battery not charging and the load still on.
It also implies (via the words 'locked out') that the inverter will shut down if ripple on the battery circuit exceeds 1.5v. I'm curious as to whether that is because it is dangerous for the inverter, because the inverter cannot reliably correct the input conditions to the desired output conditions, or whether the ripple is simply a very accurate stress indicator on the battery circuit.
Good catch! I skimmed over this part but guess what?

Last night, fooling around, I had my 2k PSW inverter (Voltworks) go into beeping flashing error mode claiming "overvoltage". The inverter showed 14.1 VDC when in error mode. But it's supposed to accept up to 16V! BMS and charger showed battery at 13.6. Hard to believe voltage was higher at the end of the cable to the inverter. WTF?

This started when I unplugged the charger and then when it came back on it went from "storage" into a new charge cycle. The charger was kicking out its full 15 A and the AC load was also drawing maybe 10 A. Removing the load eliminated the "overvoltage" error at the inverter and the displayed DC V dropped to the same as the BMS reading. Restoring the load put me into error again and a 14.1 V display. I pulled the load and waited until the charge current into the battery dropped down. At that point, I could add the load again without triggering an inverter error.

I was totally perplexed, but now I wonder if what Victron is warning about might be what happened! I should add that my setup is provisional. I have not cleaned the terminals and have not optimized the wiring for low resistance yet. I am in test mode until I can make up the shorter, thicker final cables. Hopefully that will fix the issue. It's the first time it's happened in ten days of test use.

In the troubleshooting section of the inverter manual, one listed solution for the high voltage error is "Do not use it when the battery is charging"
WHAT? How many systems can afford to switch off their loads in order to charge the battery? This is crazy, I thought.
I am now thinking that they bury this fatal requirement in troubleshooting because it's only an occasional issue for some cases due to....RIPPLE !?
As per Victron's warning of "lockout".

Pretty interesting discussion, although without some quantification of the effect on either battery or inverter life, it still seems to me that the only thing I've picked up that I can implement is 'test for it based on what Victron says and try to improve your wiring if the result is bad'. 🤷‍♂️
Well, here you go! My recent experience, in the light of Victron's paper, seems to be bringing some of this theoretical talk home to an immediate issue in the field.

Effect on battery life/chemistry remains the more difficult question to answer.
 
It appears that the BMS uses voltage to determine the charge state displayed. At least the JBD does, but I understand that JK and Seplos are the same. Someone suggested that it/they may incorporate measurement of power movement into the displayed charge state calculation, but that's speculation. What's certain is that if I change the parameters on the BMS' charge state to cell V, then my displayed charge state will revise instantly upon save.
Did some research on this. As often happens, the more I learn, the less I know!
The summary is that it's a mix of things; it can sometimes be managed well enough to be accurate, and it's much more likely to work if you get some full cycles in.

"It isn’t really coulomb counter. It is a calculation with available numbers for current, voltages and time to simulate what a real coulomb counter does."
" I had to calibrate the idle, charge, and discharge currents to get the SOC during coulomb counting to be accurate."

https://diysolarforum.com/threads/jbd-overkill-bms-state-of-charge.28329/ Older longer, deeper
https://diysolarforum.com/threads/bms-that-doesnt-mess-with-the-coulomb-counting.72224/
https://diysolarforum.com/threads/wrong-percentage-jbd-overkill.75839/ new and more concise.
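
For anyone following along, a minimal sketch of what those threads describe, not the JBD/Overkill firmware, just the idea of current integrated over time and re-anchored at full charge:

```python
# Minimal sketch of what the quoted posts describe: a coulomb counter is just
# current integrated over time, re-anchored to 100% whenever a genuine full
# charge is detected. This is not the JBD/Overkill firmware, only the idea.
class CoulombCounter:
    def __init__(self, capacity_ah: float, soc: float = 1.0):
        self.capacity_ah = capacity_ah
        self.soc = soc                      # 0.0 .. 1.0

    def update(self, amps: float, dt_s: float) -> float:
        """amps > 0 charging, amps < 0 discharging; dt_s is seconds since last call."""
        self.soc += (amps * dt_s / 3600.0) / self.capacity_ah
        self.soc = min(max(self.soc, 0.0), 1.0)
        return self.soc

    def recalibrate_full(self) -> None:
        """Call when charge current tapers off at the absorption voltage."""
        self.soc = 1.0
# The drift comes from current-measurement offsets and charge/discharge
# efficiency (the things the quoted posts had to calibrate), which is why
# periodic full charges matter for keeping the displayed SoC honest.
```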
 
This started when I unplugged the charger and then when it came back on it went from "storage" into a new charge cycle. The charger was kicking out its full 15 A and the AC load was also drawing maybe 10 A. Removing the load eliminated the "overvoltage" error at the inverter and the displayed DC V dropped to the same as the BMS reading. Restoring the load put me into error again and a 14.1 V display. I pulled the load and waited until the charge current into the battery dropped down. At that point, I could add the load again without triggering an inverter error.

Firstly, I totally understand not wanting to make everything 'nice' when still in the proof of concept or testing phase. I'm still there after a year and a half with my house, but I'm getting CLOSE to wanting to 'tidy it up'.

As far as the issue you're experiencing, first thing I'm curious about is whether you unplugged the charger from the battery or from AC power? I could see unplugging it from AC resulting in a brief 'spike' on the output if the controls part of things turns off while the charger is still 'storing energy' in all its various capacitors and inductors and since the input is all diodes (in the rectifier section), the only way that stuff gets out without dedicated 'bleed off' (flyback?) circuits is to go out the output. Kinda making a lot of assumptions there but it's plausible to me. If you unhooked it from the battery I wouldn't see any way for a voltage spike to be seen by the inverter.

So, I'm stretching my personal knowledge maybe close to the limit here, but think of it this way: the battery charger takes a wavy AC input and makes a flat-ish DC output. The way it does this is by putting the AC through a full wave rectifier, which by itself would result in very lumpy DC that is wavy at almost exactly 120 Hz (it flips the waves from the bottom of the AC waveform back over onto the top, resulting in twice as many waves on the top half). From there it feeds into a bunch of capacitors. The capacitors charge up from the 'peaks' of the waves and then discharge to fill the 'valleys'. If the current drawn from this 'reservoir' is small, the capacitors are not exhausted and the output remains fairly flat. However, as the output grows the capacitors are less able to completely fill the valleys, and lumpiness returns to the output. This is the basic thing we're dealing with, right? I'm completely ignoring the whole output section of an SMPS and pretending the whole thing is basically just diodes followed by capacitors, because that's all you need to recognize 'the issue'.
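
To put a very rough number on how the 'valleys' win as load grows, here is the standard full-wave estimate with a made-up capacitor value (real chargers regulate downstream of this stage):

```python
# Rough numbers for the "diodes followed by capacitors" picture above, using
# the standard full-wave estimate V_ripple ≈ I / (2·f·C). The capacitor value
# is made up for illustration; real chargers regulate downstream of this stage.
def ripple_volts(load_amps: float, line_hz: float = 60.0, cap_farads: float = 4700e-6) -> float:
    """Approximate peak-to-peak ripple on the reservoir capacitor."""
    return load_amps / (2.0 * line_hz * cap_farads)

for amps in (1, 5, 10, 20):
    print(f"{amps:>2} A draw -> ~{ripple_volts(amps):.1f} V peak-to-peak on the reservoir")
# ~1.8 V at 1 A but ~35 V at 20 A: the 'valleys' stop getting filled as the
# load grows, which is exactly the lumpiness described above.
```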

Ok, phase two of this explanation: for a charger to function it has to output a voltage above the battery's voltage. A side effect of this is that if you apply a 'load' to this charger+battery circuit, by nature you are going to 'load' the higher voltage source first, which is the charger. As you approach the current limit of the charger, its output will get wavier and wavier. Another way to think of it is that, from the charger's point of view, it is hooked up to a lower resistance circuit than just one battery, because to the charger the battery and the inverter appear in parallel. So each time a transistor or whatever closes the circuit between the 'reservoir' and the 'load', more current flows. However, it cannot flow more than it can flow (total stored energy for that cycle), so if you watch this purely on DC amps, the difference between this 10 amps vs 10 amps flowing only to the battery might look the same... still 10 amps, right? Just the current vs time plot looks different, and measuring amps with a meter would not pick this up (you would need a scope, or at minimum a meter with an 'inrush' setting). So in this case, the ripple will be worse than if the charger had been hooked to the battery alone, and it will also be worse than if the battery alone were hooked to the inverter.
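
If it helps to see the 'load the higher voltage source first' point in numbers, here is a toy two-source model with made-up values; it ignores all the switching and just solves the DC current split:

```python
# Toy two-source model of the charger and battery in parallel. All values are
# made up for illustration: a 14.4 V charger behind a small output resistance
# with a 15 A current limit, and a resting battery behind its internal
# resistance. It only shows the current split, not any switching behaviour.
V_CHG, R_CHG, I_LIMIT = 14.4, 0.05, 15.0   # charger model
V_BAT, R_BAT = 13.6, 0.02                  # battery model

def split_currents(load_a: float):
    """Return (charger_amps, battery_amps, bus_volts) for a DC load in amps."""
    # Unclamped solution of the node equation i_chg + i_bat = load
    bus = (V_CHG / R_CHG + V_BAT / R_BAT - load_a) / (1 / R_CHG + 1 / R_BAT)
    i_chg = (V_CHG - bus) / R_CHG
    if i_chg > I_LIMIT:                    # charger hits its current limit
        i_chg = I_LIMIT
        i_bat = load_a - i_chg             # battery supplies the remainder
        bus = V_BAT - i_bat * R_BAT
    else:
        i_bat = load_a - i_chg             # negative = battery being charged
    return i_chg, i_bat, bus

for load in (0, 5, 10, 20, 40):
    c, b, v = split_currents(load)
    print(f"load {load:>2} A: charger {c:5.1f} A, battery {b:+6.1f} A, bus {v:5.2f} V")
# Light loads are carried mostly by the (higher voltage) charger while the
# battery still charges; once the charger current-limits, the battery picks
# up the rest and the bus voltage starts to sag.
```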

When dealing with a PWM'd source into a PWM'd load you're always going to run into some super brief weirdness, because even though they both might be switching their respective transistors at thousands of hertz, there are going to be SOME times when the 'on' transistors 'line up' all the way from the input caps of the source to the thing being powered by the inverter, and the load appears as a dead short to the source and pulls it way down out of its 'comfort zone' of the kind of ripple it would have made if hooked ONLY to a battery. I believe I experience a symptom of this in that when my house inverters are powering ONLY inverter-controlled air conditioners, they report a rapidly fluctuating 'load percentage', which I believe is down to this sporadic 'overlap' of transistor on-times from the PWM going on at both ends.
 
first thing I'm curious about is whether you unplugged the charger from the battery or from AC power?
AC power.
I could see unplugging it from AC resulting in a brief 'spike' on the output if the controls part of things turns off while the charger is still 'storing energy' in all its various capacitors and inductors and since the input is all diodes (in the rectifier section), the only way that stuff gets out without dedicated 'bleed off' (flyback?) circuits is to go out the output. ...
I don't think it was a spike from the charger losing power, because the error did not start until I plugged it back in and it reset to a new charge cycle feeding 14 V to the battery instead of the much lower storage/float voltage. It then kept up the alarm state until I unplugged the load on the other side. It seemed clear that having the charger at 14 V pumping full amps and having the inverter with a decent (not BIG) load was the condition that caused the alarm. I could replicate it: unplug AC (source or load) at either end of the double conversion chain and it would stop; replug both and it would start.
So, I'm stretching my personal knowledge maybe close to the limit here, but think of it this way: the battery charger takes a wavy AC input and makes a flat-ish DC output. The way it does this is by putting the AC through a full wave rectifier, which by itself would result in very lumpy DC that is wavy at almost exactly 120 Hz (it flips the waves from the bottom of the AC waveform back over onto the top, resulting in twice as many waves on the top half). From there it feeds into a bunch of capacitors. The capacitors charge up from the 'peaks' of the waves and then discharge to fill the 'valleys'. If the current drawn from this 'reservoir' is small, the capacitors are not exhausted and the output remains fairly flat. However, as the output grows the capacitors are less able to completely fill the valleys, and lumpiness returns to the output. This is the basic thing we're dealing with, right? I'm completely ignoring the whole output section of an SMPS and pretending the whole thing is basically just diodes followed by capacitors, because that's all you need to recognize 'the issue'.
This has been explored and explained by some smart folks earlier in this thread and it seems similar to what you are saying. I'm sorta glossing over the details for sanity and accepting that there are ways this happens and that it probably happens at multiple frequencies as a result of the SMPS sections of both the charger and the inverter. My electronics knowledge is weaker than yours and others here so I'm just trying to absorb the "executive summary" and staying out of the weeds!
Ok, phase two of this explanation: for a charger to function it has to output a voltage above the battery's voltage. A side effect of this is that if you apply a 'load' to this charger+battery circuit, by nature you are going to 'load' the higher voltage source first, which is the charger. As you approach the current limit of the charger, its output will get wavier and wavier. Another way to think of it is that, from the charger's point of view, it is hooked up to a lower resistance circuit than just one battery, because to the charger the battery and the inverter appear in parallel. So each time a transistor or whatever closes the circuit between the 'reservoir' and the 'load', more current flows. However, it cannot flow more than it can flow (total stored energy for that cycle), so if you watch this purely on DC amps, the difference between this 10 amps vs 10 amps flowing only to the battery might look the same... still 10 amps, right? Just the current vs time plot looks different, and measuring amps with a meter would not pick this up (you would need a scope, or at minimum a meter with an 'inrush' setting). So in this case, the ripple will be worse than if the charger had been hooked to the battery alone, and it will also be worse than if the battery alone were hooked to the inverter.
YES. I think that this is what various folks including Victron are saying. And there is a push-pull effect due to offset in the waveforms? Uhoh, entering the weeds! I need to reread the Victron one because it's so nicely dumbed down.
When dealing with a PWM'd source into a PWM'd load you're always going to run into some super brief weirdness, because even though they both might be switching their respective transistors at thousands of hertz, there are going to be SOME times when the 'on' transistors 'line up' all the way from the input caps of the source to the thing being powered by the inverter, and the load appears as a dead short to the source and pulls it way down out of its 'comfort zone' of the kind of ripple it would have made if hooked ONLY to a battery. I believe I experience a symptom of this in that when my house inverters are powering ONLY inverter-controlled air conditioners, they report a rapidly fluctuating 'load percentage', which I believe is down to this sporadic 'overlap' of transistor on-times from the PWM going on at both ends.
Yep. From what I can tell from Victron and the kind of error my inverter threw, there are times when - due to combined ripples (?) - the inverter will see a V surge that throws it off, and it doesn't have the "inertia" (would it be capacitors?) to produce a clean output, so it locks out rather than delivering a messy AC waveform. Or rather than stressing other circuit components, perhaps.

Anyway, I thought that this issue was timely and in the context of ripple discussion plus Victron's paper, I will definitely work on minimizing resistance and just see if it comes up again, before and after. I don't know if a larger load would help or hurt and same with charger capacity. I do have the option of charging at a lower amperage or lower voltage so can play with those if the error state returns after addressing resistance.
 
I think I'm with you. I'm gonna keep my non-EE brain out of the weeds on this issue from here on out. I've already used more words than I have qualifications to back them up. :ROFLMAO: :ROFLMAO:
 