Battery pack seemingly getting out of balance

Aero
Hello,

I assembled a battery pack from four Lishen 272 Ah cells.
I top-balanced them with a PSU in the summer of 2021.

The BMS in use is a JBD 150A.
It's currently in place and in use in an RV.

Today the battery was at a low state of charge. It's connected to a MultiPlus combo inverter/charger.
I attempted to charge at 60 A, but the cells appear out of balance under charge and OVP from the BMS quickly triggers.

I can't easily access it, so I started with a thermal image.

How would you diagnose what may be the issue?

The second cell from the left in the thermal image is hotter.
 

Attachments

  • Screenshot_20220228_145353_com.jiabaida.bms.jpg
  • IMG_20220228_145400.jpg
Without charge or load, the voltage readings from the BMS show a maximum cell difference of 0.035 V.
 

Attachments

  • Screenshot_20220228_152814_com.jiabaida.bms.jpg
In the meantime I was doing a load test on a new pack. Why would the voltages measured by the BMS be different from what I measure with a multimeter?

I set a load of 30 A and measured each cell individually with the multimeter (in place, while connected as a pack). All are similar, at around 3.33 V. But when I turn on the load, the BMS measurements are quite different.

I just crimped the BMS wiring harness again.

Will measure the voltages directly at the connector to check.
 

Attachments

  • IMG_20220228_145910.jpg
  • Screenshot_20220228_162621_com.jiabaida.bms.jpg
Why would voltages measured by the BMS be different than what I measure with a multimeter?
Meter out of whack or BMS out of whack or both out of whack.
A BMS is more complex than a meter so I'd say the BMS is at fault.

It's relatively easy to check the meter.
A working car battery that is not being charged or discharged should read about 12.8 V DC.
 
I may not have been precise or clear enough.

For my pack under test, the DMM measured at the battery terminals:

At rest

Cell    BMS        DMM
1       3.360 V    3.350 V
2       3.359 V    3.349 V
3       3.360 V    3.350 V
4       3.361 V    3.351 V
Diff    0.002 V    0.002 V

Under load (30 A)

Cell    BMS        DMM
1       3.238 V    3.272 V
2       3.317 V    3.267 V
3       3.315 V    3.263 V
4       3.320 V    3.267 V
Diff    0.082 V    0.009 V
 
3.360 V (BMS)
and
3.350 V (DMM)

The average is
3.355 V.

So the readings are within about
+/- 0.15% of each other.
This is probably within spec.

What is a typical accuracy spec for a DMM on the 10 V or 20 V scale [you can't use the 2 V scale]?


But... I have to say interpreting accuracy specs is not easy.

For 1.00000 V and a 1% meter accuracy, what range of voltages will you read 95% [or 99.7%] of the time?
You need to know the meaning of a normal [Gaussian] distribution, the bell-shaped curve.
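To put a number on that, here is a minimal sketch in Python (the tolerance figures are assumptions for illustration, not values quoted from either manual):

Code:
# Two readings of the same cell, from the posts above
bms = 3.360   # V, reported by the BMS
dmm = 3.350   # V, measured with the DMM

avg = (bms + dmm) / 2
print(f"each reading is within +/-{abs(bms - dmm) / 2 / avg * 100:.2f}% of the {avg:.3f} V average")

# Assumed accuracy specs, purely for illustration -- check the actual datasheets
dmm_band = 0.001 * dmm + 2 * 0.001   # e.g. +/-(0.1% of reading + 2 counts) on a range with 1 mV resolution
bms_band = 0.005 * bms               # e.g. +/-0.5% for the BMS cell-voltage measurement

if abs(bms - dmm) <= dmm_band + bms_band:
    print("the 10 mV disagreement is inside the combined tolerance bands")
else:
    print("the 10 mV disagreement exceeds the combined tolerance bands")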

Like another OP, I feel like chewing thru my restraints. :)
 
When voltage differences between a BMS and a multimeter are observed, I consider the following points:
  1. Where are you placing the multimeter probe? The best location is on the cell terminal itself, not the bus bar or the stud/nut.
  2. At least half of the voltage issues seen on the forum are due to poor connections. Poor connections can be due to bad crimps, dirty surfaces between items (cell terminal and bus bar) or loose fasteners.
The thermal picture is rather telling. Something is certainly out of whack.
 
Here’s how I would tackle the problem of deciding whether my costly batteries are being damaged.

Assuming both your meter & your BMS are working properly, my spreadsheet tells me that the average voltage of all 16 readings is 3.319 V.

What does your BMS flag as an abnormally low voltage & an abnormally high voltage, given your charge or discharge rate?
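For reference, here is a minimal sketch that reproduces that average from the 16 readings posted above:

Code:
# All 16 readings posted earlier (BMS and DMM, at rest and under a 30 A load)
readings = [
    3.360, 3.359, 3.360, 3.361,   # BMS, at rest
    3.350, 3.349, 3.350, 3.351,   # DMM, at rest
    3.238, 3.317, 3.315, 3.320,   # BMS, under load
    3.272, 3.267, 3.263, 3.267,   # DMM, under load
]
print(f"average of all 16 readings: {sum(readings) / len(readings):.3f} V")   # -> 3.319 V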
 
Sorry for the confusion.

There are two different battery packs discussed in this topic.

Pack N°1: The one in my RV. At the moment I can't do much with it, so I just took the thermal picture and BMS screenshot seen in my first post.
Pack N°2: The one in my garage that I'm testing/charging/load testing.

The measurement voltage difference I've been talking about since post #6 is from Pack N°2.

Pack N°2
Attached below is a picture of two measurements made with the DMM while charging at 10 A.

Measured on the terminal of the cell: 3.380 V.
Measured on the pin of the connector at the BMS: 3.410 V.
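A quick Ohm's-law sanity check on those two numbers (a sketch, assuming the full 10 A charge current flows through whatever joint sits between the two measurement points):

Code:
# Voltage difference between the BMS connector pin and the cell terminal while charging at 10 A
v_pin = 3.410        # V, at the BMS connector pin
v_terminal = 3.380   # V, directly on the cell terminal
i_charge = 10.0      # A, charge current

r_extra = (v_pin - v_terminal) / i_charge
print(f"implied extra resistance in the sense path: {r_extra * 1000:.1f} mOhm")   # -> 3.0 mOhm

A few milliohms is a lot for a bolted bus bar joint, which would be consistent with the connection issues discussed above.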
 

Attachments

  • IMG_20220228_204014.jpg
You did not state the amount of load being applied, but I assume you are loading with a sinewave inverter. Based on the loaded cell voltage slump with a 272 Ah cell, I would guess you are putting about a 40 amp load on the battery.
The BMS is likely averaging out any 120 Hz ripple on the battery voltage caused by the 120 Hz ripple load current.
The DVM might be averaging the voltage, or if it is a true-RMS meter it might report the true RMS of the rippling DC voltage, or it may simply not handle the 120 Hz ripple voltage riding on top of the DC voltage.

With no load on the inverter, the DC current is just the inverter's idle current, which is primarily the power required to run the drivers and is supplied by a high-frequency DC-to-DC converter, so the DC current is almost constant, smooth DC with little to no 120 Hz ripple current.

When you load a sinewave inverter, the 120 Hz ripple current dominates the inverter current, and the readings depend on how the DC voltmeter reacts to the 120 Hz ripple voltage that appears on the batteries (the product of the inverter's 120 Hz ripple current and the battery impedance). The greater the cell impedance (the poorer the cell condition), the more ripple voltage there will be on that cell due to its internal impedance.

The delta between the lowest and highest cell in this voltage range is not as important as the delta between rested and loaded for each cell. That indicates the matching between cells (not to be confused with SOC balance). You should use a cell load current of 0.2 to 0.4 C to get a good test of cell matching. That would be 50 to 100 amps for a 272 Ah cell.

On a per-cell basis, your data rearranged (see the attached table image).

Conclusions:
1) The no-load voltage difference between the meters is only 0.3% (no 120 Hz ripple current is present at no load).
2) One of the meters doesn't like the 120 Hz ripple voltage present on the cells under load. Based on the slump voltage, I would guess the DVM doesn't like the 120 Hz ripple voltage.
3) The BMS reading of the first cell's voltage slump with load is disturbing. It shows a higher cell voltage slump under load. Check the cell terminal connections. It could also be a terminal connection issue: you probed the battery with the DVM directly on the terminals, so the DVM was not subjected to the bus bar connection drop and is showing a higher voltage reading.
4) The first cell's voltage slump does not show up in the DVM reading. This might be due to the 120 Hz ripple voltage's effect on the DVM, where the DVM measures close to the peak of the ripple voltage. This is typical of a cheaper DVM.

The big question is whether the first cell, with the greater BMS voltage slump under load, is also the same cell you indicated has the greater temperature rise. Greater internal impedance, and therefore greater terminal voltage slump under load current, would cause a cell to heat up more. This would indicate a cell matching issue, with the first cell being in poorer condition than the other three.

A greater cell internal impedance will also affect the charging voltage rise on the cell, causing it to show a greater cell voltage under charge current.

Hopefully all you have is a bad bus bar connection.
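To illustrate the rested-versus-loaded delta described above, here is a small sketch using the BMS readings posted earlier (assuming the 30 A load from that test and ignoring any 120 Hz ripple effect on the readings):

Code:
# Per-cell voltage slump and implied resistance from the BMS readings at a 30 A load
rested = [3.360, 3.359, 3.360, 3.361]   # V, no load
loaded = [3.238, 3.317, 3.315, 3.320]   # V, under load
i_load = 30.0                            # A

for n, (vr, vl) in enumerate(zip(rested, loaded), start=1):
    slump = vr - vl
    r_mohm = slump / i_load * 1000   # cell plus connection resistance seen by the BMS sense leads
    print(f"cell {n}: slump {slump * 1000:.0f} mV -> about {r_mohm:.1f} mOhm")

Cell 1 comes out at roughly three times the apparent resistance of the others. As noted above, a bad bus bar joint inside the BMS sense path would look the same as a weak cell in this test, so checking the connections first is the cheap move.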
 
I'm going to redo the measurements and detail what DMM and PSU I have, so we can rule out possible mistakes on my end.

Thanks RCinFLA for the detailed answer.

DMM is a Uni-T UT61E
PSU is a Riden RD6018
Load tester is East Tester ET5410
Thermal camera is a Uni-T Uti260B
 

Attachments

  • IMG_20220228_225911.jpg
  • IMG_20220228_225855.jpg
  • IMG_20220228_225845.jpg
You should have cell load current of 0.2 to 0.4 C(A) current rate to get good test on cell matching. That would be 50 to 100 amps for a 272 AH cell.
I suppose the load test should be done at the cell level, so out of the pack?

The only load option I have at the moment is the load tester I mentioned in the previous post.

For a pack, I have the inverter in my RV, but it's not convenient to use for load tests as it's in use by the RV itself.
 
The Uni-T 61E is a fairly good meter. I don't think it does RMS in the DC voltage function, though.

Make sure you put at least a 20 amp load on the load tester. That is the minimum amount of load needed to detect voltage slump accurately.

The load tester should have a four-wire connection so the voltage reading is not affected by the drop in the high-current test leads. The remote voltage sense should be directly on the cell terminals.

Inverter loading is okay; just realize there will be 120 Hz ripple current, with a peak current of about twice the DC average.

You should check the bus bar connections first before going to all the trouble of pulling a cell out to test with the load tester.

Make sure your BMS voltage sense wire lugs are all on top of the bus bars and on a consistent side of the bus bars (you don't want one BMS voltage sense pair to have two bus bars in series with a cell).
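To give a feel for why the load needs to be that large, here is a small sketch with assumed round-number resistances (illustrative only, not figures from your cells):

Code:
# How visible is a weak cell or bad joint at different load currents?
r_good = 0.0015   # Ohm, assumed healthy cell plus clean connection (~1.5 mOhm)
r_bad = 0.0040    # Ohm, assumed weak cell or poor bus bar joint (~4 mOhm)

for i_load in (5, 20, 54, 100):   # A; 54 A is about 0.2C and 100 A about 0.37C for a 272 Ah cell
    delta_mv = (r_bad - r_good) * i_load * 1000
    print(f"{i_load:>3} A load -> {delta_mv:.1f} mV extra slump on the weak cell")

At 5 A the extra slump is only about 12 mV and easy to miss; at 20 A it becomes clearly measurable; at the 0.2 to 0.4 C rates suggested above it stands out.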
 
Load tester should have four wire connection so voltage reading is not affected by high current test leads drop. Remote voltage sense should be directly on cell terminals

I have a 4 wire Atorch DL24P.
My other load tester is unfortunately only 2 wires.
 

Attachments

  • 16460877494053057134613032525457.jpg