Update. Replacement BMS? What's next?

Hedges

I See Electromagnetic Fields!
Joined: Mar 28, 2020 · Messages: 8,351
Dirty surfaces or low torque on a nut could make a contact resistive.
In my AC breakers, I find the specified torque provides very little compression. If I just wiggle/rotate wire, it works loose and I can turn the screw further, so that's what I do.
Loose/bad connection would get hot, pushing breaker closer to tripping.

The breakers I'm used to can take 50% excess current for 10 minutes or so. I think 100% excess might trip in 1 minute.
Try to find current/trip curves for yours, see what would explain 1 minute trip.


Battery should supply approximately constant current. Capacitors in inverter should smooth out 60 Hz cycles, and the higher frequency PWM used to synthesize sine wave.
If the capacitors aren't enough to smooth out 60 Hz, each pulse of current will go through the breaker.
Try using AC Volts scale to measure voltage across breaker. Also across a shunt if you have one. Across a length of battery cable (you can look up resistance per foot and use it as a shunt).
Try DC Volts too.

The AC Volts would tell you how much ripple.
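A minimal sketch of the cable-as-shunt arithmetic, assuming 2/0 AWG copper (the resistance-per-foot figure is an assumed round number; substitute the value from a wire table for your actual gauge):

```python
# Rough sketch: using a known length of battery cable as a makeshift shunt.
# The resistance-per-foot figure below is an assumed value for 2/0 AWG copper
# at room temperature -- look up the number for your actual wire gauge.

OHMS_PER_FOOT = 0.0779 / 1000  # 2/0 AWG copper, approx. 0.0779 ohm per 1000 ft

def cable_current(mv_drop, length_ft):
    """Current (A) implied by a measured millivolt drop along the cable."""
    resistance = OHMS_PER_FOOT * length_ft
    return (mv_drop / 1000.0) / resistance

# Example: 15 mV measured across a 2 ft length of 2/0 cable
print(round(cable_current(15.0, 2.0)))  # about 96 A
```

The same measurement on the AC Volts scale, divided by the same resistance, gives the ripple current.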

People say to derate the BMS, plan to use it for continuous current maybe half its rating.
If there is very high ripple current (zero to 200A instead of steady 100A), that would increase heating of both the breaker and the BMS (once the BMS is back in the circuit).

Maybe added capacitors at the inverter would reduce ripple. They would also increase the current surge when closing the DC breaker, so a precharge resistor might be a good idea.
My thinking is that the inverter vendor might skimp on capacitors.
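As a rough back-of-the-envelope for the precharge idea (every value here is an assumption, not a spec; check your inverter's actual input capacitance):

```python
# Rough precharge sizing sketch -- all three values below are assumptions,
# not specs; check your inverter's actual input capacitance.

v_batt = 12.8       # battery voltage, assumed
capacitance = 0.02  # 20,000 uF inverter input capacitance, assumed
r_pre = 100.0       # precharge resistor, ohms, assumed

peak_inrush = v_batt / r_pre    # worst-case current at the instant of contact
t_99 = 5 * r_pre * capacitance  # ~5 RC time constants to ~99% charged

print(peak_inrush)  # 0.128 A through the resistor, instead of a hard surge
print(t_99)         # 10.0 seconds before it's safe to close the breaker
```

The resistor limits inrush to battery volts divided by R; after a few RC time constants the capacitors are charged and the breaker can close with little spark.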

Bigger breaker/fuse would be a work-around. Just like getting a BMS rated twice what you need.

"I^2 R" or "I squared R" is the power dissipated in a resistor.

Pulses of 200A half the time would deliver same power as 100A all the time, but would cause more heating in the wires.
If a wire, breaker, or another component is 0.001 ohm, the power dissipated in it is:
100A x 100A x 0.001 ohm = 10W all the time
200A x 200A x 0.001 ohm = 40W half the time, 20W average.

Obviously if the breaker dissipates 20W instead of 10W, it'll trip sooner. It can only carry 1/sqrt(2) = 0.7 times as much current:
140A x 140A x 0.001 ohm = 20W half the time, 10W average
So 70A average current (140A half the time) would heat breaker same as 100A continuous.
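The arithmetic above, in a few lines of Python using the same example numbers:

```python
# The I^2 * R arithmetic above, using the 0.001 ohm example resistance.

R = 0.001  # ohms -- example resistance of a breaker, connection, or cable

def avg_power(current_a, duty, r=R):
    """Average watts when current_a flows for fraction `duty` of the time."""
    return current_a ** 2 * r * duty

print(avg_power(100, 1.0))  # 10.0 W: steady 100 A
print(avg_power(200, 0.5))  # 20.0 W: 200 A pulses, same 100 A average
print(avg_power(140, 0.5))  # ~9.8 W: 140 A pulses, 70 A average
```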
 

jesfl

Solar Enthusiast
Joined: May 17, 2020 · Messages: 80

Hedges,

I owe you an apology for the long delay in responding. Please accept my heartfelt "I am sorry." I've had several travel issues with my RV over the past few weeks.

I've read your last message at least a half-dozen times over the past couple of weeks, hoping a read on a different day would help me understand it better. I sincerely appreciate your effort to educate me. I think some of it is beginning to stick, but I have such low base knowledge, much is still beyond my grasp.

Since I have now picked up the replacement BMS I ordered (AliExpress), I am hoping I can get one of the old batteries working again, and that the extra power will solve the system deficiency that keeps the converter/microwave from operating.

Of course, the new BMS came with no installation instructions. Further, it looks like I'm going to have to learn to solder again. The last time I soldered anything was around 1968, assembling a Heathkit receiver.

Again, thank you for your continuing support and information. I'll let you know when I get to the next stage of my restore-my-solar-system process.

jesfl
Jim
 

Hedges

No problem, we all work on various other things, can sometimes get by with a system that's halfway working.

Breaker tripping could be one of the following, maybe something else I haven't thought of:

1) Just too much current
2) Breaker bad, not able to carry its rated current
3) External heating, especially connection to wire getting hot
4) Current cycling high/low rather than being constant.

That last one is tricky. I've realized that calculating inverter battery current based on wattage isn't sufficient. Battery current has a large ripple because the AC current produced varies as a sine wave between peak, zero, opposite peak.

Power delivered from battery to inverter is voltage x average current. Heating of the breaker goes as the square of the RMS current, so the current peaks cause disproportionately more heating.

After thinking about your problem I decided to measure my system. I have current transformers which clamp around wires and show the AC (but not DC) current through them. I have an oscilloscope which can capture a waveform. I was expecting some ripple current but was surprised at how large it was.
I found that for an average 217A battery current, it showed a ripple going as low as 125A and as high as 309A.
As large as that ripple sounds, I calculated it was 227A RMS, only 5% more current, 10% more heating.
This was with particularly high-end inverters, which probably have more capacitors than less expensive ones.
The load was only 40% of inverter rating. If running at full load the ripple would have been higher, RMS current and heating maybe much higher.
If there were essentially no capacitors at all, the entire AC current would come directly from the battery; my calculations indicate 12% higher RMS current, 26% more heating.
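A quick sanity check of the measurement above (217 A average, ripple 125-309 A), assuming the ripple is roughly sinusoidal around the DC average:

```python
import math

# Sanity check of the measurement above (217 A average, ripple 125..309 A),
# assuming the ripple is roughly sinusoidal around the DC average:
# RMS of (Idc + A*sin) is sqrt(Idc^2 + A^2 / 2).

i_avg = 217.0
amplitude = (309 - 125) / 2  # 92 A ripple amplitude

i_rms = math.sqrt(i_avg ** 2 + amplitude ** 2 / 2)
extra_heating = (i_rms / i_avg) ** 2 - 1

print(round(i_rms))                # about 227 A, matching the measurement
print(round(extra_heating * 100))  # roughly 9-10% more heating
```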

Our usual practice of oversizing breakers/fuses 25% might cover that, but the 12% effect of ripple uses up half that safety margin. I think the fuse/breaker for inverter battery connection ought to be increased another 12% due to ripple.

That alone isn't enough to explain yours tripping, but contributes to it.


Let me know what you find measuring voltage drop and checking for hot connections.
 

jesfl


Since I have none of your (sophisticated-to-me) tools, I have not tried the tests and checks you suggested above. But, I kinda' understand what you are saying.

And, since I am now in an RV park with 30 amp power, I am not microwave-less.

But, I will test this weekend with shore power off -- at least for connection heating and the breaker flipping off when I try to use the microwave on battery only.

Thank you once again.

Jim S.
 

Hedges

I See Electromagnetic Fields!
Joined
Mar 28, 2020
Messages
8,351
Got a DMM?
DC voltage drop across a connection (e.g. battery terminal to cable clamp of a car battery) is often where I find a problem at high current.
Heating takes time, but voltage shows up immediately.

If the drop is not at a connection right at the breaker/fuse, it doesn't contribute directly to heating of the OCP (overcurrent protection). But if it drops a percentage of battery voltage, the inverter draws that much more current to produce the same watts, and current squared is what trips the OCP.

If inverter terminals are much lower voltage than battery terminals (e.g. 11V vs. 12V) then you chase down where the drop occurred, like 0.5V across a single connection.
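A small sketch of why the drop matters; the 1000 W load here is an assumed example, and only the voltage ratio matters:

```python
# Sketch of the voltage-drop effect: lower voltage at the inverter means
# more current for the same watts. The wattage is an assumed example.

watts = 1000.0  # hypothetical load

def battery_current(volts, power_w=watts):
    return power_w / volts

i_12 = battery_current(12.0)
i_11 = battery_current(11.0)
print(round(i_12, 1))  # 83.3 A at 12 V
print(round(i_11, 1))  # 90.9 A at 11 V
print(round(((i_11 / i_12) ** 2 - 1) * 100))  # 19% more heating at 11 V
```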
 