How do you set your cut-off voltage when running a discharge test?

OnTheRoadAgain
Let's say you are testing a LiFePO4 bank of cells that together make up a 12V 100Ah battery. Each cell is a standard 3.2V cell with a max voltage of 3.65V.

Fully charged you will be at 3.65V per cell, so times 4 that is 14.6V for the pack. So let's say that at the moment it reaches full SOC you connect it to a dummy-load discharge capacity tester. In my case I use a DL24P, a 180-watt device.

What do you set the cut-off voltage to? If you set it to 2.5V (or whatever the data sheet specifies), you will never reach that discharge voltage.
At least not on this discharge tester. Does your discharge tester go by the voltage at the battery terminals, or by the lower voltage it sees internally after the cable drop?

You have your actual bank voltage at the battery terminals, and then you have your loaded voltage internally at the discharge device, which will be significantly lower due to voltage drop and cable resistance. Even if you use 0-gauge cables you will still have some voltage drop.

So do you set the cut-off at the battery's specified cut-off voltage, knowing the battery will never actually reach that voltage because the discharge device reacts to its loaded internal voltage rather than the actual cell voltage, resulting in the appearance of reduced capacity?

Or do you set the cut-off voltage so that the voltage at the battery terminals reaches the data sheet minimum discharge voltage?
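
To put rough numbers on the drop I'm describing, here's a quick Python sketch. The pack voltage, cable lengths, and lead resistances are just assumed placeholders, not measurements from my setup:

```python
# Quick numbers on the cable-drop problem. Everything here is an
# assumption for illustration: a DL24P pulling its full 180 W from a
# 12.8 V pack, through either heavy 1/0 AWG cable or thin tester leads.

P_LOAD = 180.0    # W, DL24P at full load
V_PACK = 12.8     # V, nominal LiFePO4 pack voltage (assumed)

i = P_LOAD / V_PACK   # ~14.1 A of discharge current

# assumed loop resistances (cable out + back, plus contact resistance)
scenarios = {
    "1/0 AWG, 3 ft each way + lugs": 6 * 0.0000983 + 0.002,  # ~2.6 mOhm
    "thin tester leads + clips":     0.020,                   # ~20 mOhm (assumed)
}

for name, r_loop in scenarios.items():
    v_drop = i * r_loop   # Ohm's law: the tester reads V_PACK - v_drop
    print(f"{name}: drop = {v_drop:.3f} V")

# So a 10.0 V cutoff set at the tester can trip while the terminals are
# still v_drop above 10.0 V -- small with heavy cable, much bigger
# through the tester's own leads.
```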
 
dear @OnTheRoadAgain,

i’m not sure exactly how the DL24P determines voltage. it has an option for changing the voltage cutoff, right?

random idea: begin a test and compare the on-screen voltage with a digital multimeter held safely at the battery terminals?

if the DL24P reads significantly differently from the DMM, maybe working in an offset would be wise.

hope this helps
 
Or do you set the cut-off voltage so that the voltage at the battery terminals reaches the data sheet minimum discharge voltage?

I set the cutoff voltage based upon what is measured at the terminals of the battery.

This is what a BMS will do for you, since the BMS leads connect to each cell right at the cell terminals.

The best battery monitors of any kind will have a "sense" wire hooked to the battery terminal, in addition to a shunt in line with the current-carrying battery cables. These sense wires can be very small gauge because they are the leads for a very high-impedance voltmeter, with 10 to 50 million ohms of input impedance, for exactly this purpose: so that the meter does not place any significant load on the battery and distort the voltage reading through ohmic losses in the wire.
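
To put numbers on how little the sense wire matters, here's a quick sketch. The 1-ohm sense-wire figure is deliberately pessimistic and purely an assumption:

```python
# Why a thin sense wire works: the meter's input impedance limits the
# current, so even a hair-thin wire drops almost nothing.

V_BATT = 12.8      # V, pack voltage (assumed)
R_METER = 10e6     # ohm, voltmeter input impedance (low end of 10-50 MOhm)
R_SENSE = 1.0      # ohm, a deliberately pessimistic sense-wire resistance

i_sense = V_BATT / (R_METER + R_SENSE)   # current through the sense lead
v_error = i_sense * R_SENSE              # voltage lost in the sense lead

print(f"sense current: {i_sense*1e6:.2f} uA")   # ~1.28 uA
print(f"reading error: {v_error*1e6:.2f} uV")   # ~1.28 uV -- negligible
```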

If you have a BMS, you can use it to terminate the discharge test.
 
If you have a BMS, you can use it to terminate the discharge test.

This. Set your BMS to stop at 2.5V and you are done.
If you are capacity testing a battery (not a single cell), you want it to stop when the first cell hits 2.5V.
If you are running a capacity test on a battery and don't have a BMS, I say "DON'T".

At higher C rates the FETs might get upset, but under 20 amps you should be fine; just verify that your BMS cutoff voltages work first.
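
If you want to see the "first cell" rule in code form, here's a minimal sketch. read_cell_voltages() and stop_load() are hypothetical stand-ins, since every BMS exposes this differently; the logic is the point:

```python
# Minimal sketch of "stop when the first cell hits 2.5 V".
import time

CELL_CUTOFF_V = 2.5   # LiFePO4 low-voltage cutoff from the datasheet
POLL_SECONDS = 1.0

def run_capacity_test(read_cell_voltages, stop_load):
    """Poll cell voltages and end the test on the FIRST low cell, not on
    pack voltage -- a 10.0 V pack could hide one cell at 2.2 V with the
    other three at 2.6 V."""
    while True:
        cells = read_cell_voltages()       # e.g. [3.21, 3.19, 3.20, 3.18]
        low = min(cells)
        if low <= CELL_CUTOFF_V:
            stop_load()
            return cells.index(low), low   # which cell tripped, and at what
        time.sleep(POLL_SECONDS)
```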
 
Yeah, since I posted this I've come to discharge to battery terminal voltage on non-BMS batteries like lead-acid, and on LiFePO4 I let the BMS cut off the discharge.

My main point of contention was terminal voltage vs. the voltage indicated by the dummy load.
 
I’ve just done a capacity test on a 280Ah 12V battery in 3 phases, with recorded voltages. I had a 247-watt load AFTER my 90%-rated inverter.
I recorded loaded voltage drops of 0.14V-0.21V, with likely significant error. After a shorter, roughly 15-minute rest, the load drop was 0.10V-0.13V.
My inverter is set with 10.0/10.6 lead-acid-based triggers, so I did the multi-stage test to avoid triggering the inverter or testing the BMS. I only went to 12.84V in testing, ~80%.

I did not test the voltage differential at 100%, and my resting voltages were not impeccably controlled.
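
For what it's worth, here's the arithmetic behind those drops as a quick sketch. It assumes the 247W figure is on the AC side and the pack sat near 12.8V under load, so treat the resistance numbers as ballpark only:

```python
# Rough estimate of pack + wiring resistance from the numbers above.

P_AC = 247.0       # W, measured load after the inverter
EFF = 0.90         # inverter efficiency, per its 90% rating
V_LOADED = 12.8    # V, approximate pack voltage under load (assumed)

p_dc = P_AC / EFF        # DC power drawn from the battery, ~274 W
i_dc = p_dc / V_LOADED   # battery current, ~21 A

for dv in (0.14, 0.21):  # observed load drops
    r = dv / i_dc        # Ohm's law: R = dV / I
    print(f"drop {dv:.2f} V -> ~{r*1000:.1f} mOhm effective resistance")

# Prints roughly 6.5 and 9.8 mOhm, which is why the "significant error"
# caveat is fair: cable, BMS, and cell resistance all get lumped together.
```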
 