Testing Amp Hour ratings of a battery

fblevins1
I have a 12 volt, 36 amp hour AGM battery. If I pull a constant 9 amps, I should be able to pull that current for 4 hours before the battery reaches a discharge depth of 50 percent (12.3 volts), right? And if I am doing amp hour testing, should I use a smaller current and increase the duration, say 4.5 amps for 8 hours? What is the standard load when doing Ah testing? Thanks, folks.
 
Actual lead acid (including AGM) battery capacity depends on the discharge rate: higher discharge current yields lower usable capacity. Battery Ah ratings are usually specified for a 20-hour discharge. Check your battery's datasheet for exact numbers.

And please note that your 36 Ah rating is for a full discharge, not for 50%.

[Attachment: Peukert-Losses.png]
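For anyone who wants to see the effect numerically, here is a minimal Python sketch of Peukert's law. The exponent k = 1.15 is an assumption (a typical AGM ballpark), not this battery's datasheet value:

```python
# Peukert's law sketch: runtime shrinks faster than linearly as current rises.
# t = H * (C / (I * H)) ** k, where H is the rated discharge time in hours,
# C the rated capacity in Ah, I the load in amps, and k the Peukert exponent.

RATED_AH = 36.0     # nameplate capacity at the 20-hour rate
RATED_HOURS = 20.0
K = 1.15            # assumed exponent, a typical AGM ballpark (not from a datasheet)

def runtime_hours(current_a: float) -> float:
    """Hours to full discharge at a constant current, per Peukert's law."""
    return RATED_HOURS * (RATED_AH / (current_a * RATED_HOURS)) ** K

for amps in (1.8, 3.6, 9.0):
    t = runtime_hours(amps)
    print(f"{amps:4.1f} A -> {t:5.1f} h -> {amps * t:4.1f} Ah delivered")
```

With that assumed exponent, a 9 A load delivers only about 28 Ah of the rated 36 Ah before the battery is flat.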
 
36 Amp Hour battery (AGM) If I pull a constant 9 Amps, I should be able to pull that current for 4 hours before the battery reaches a discharge depth of 50 percent (12.3 Volts)
9 A for 4 hours will completely drain your battery (theoretically, anyway) and likely kill it.
The rule of thumb bandied about here is that 50% is roughly 12 V, and you should get 18 Ah out of it.
So 9 A for 2 hours is what you should expect with everything operating optimally.

But as brewmatic points out, the rate at which you discharge makes a significant difference (it varies by battery type and quality, and probably more things beyond what I know or care about).
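To put that arithmetic in one place (and ignoring the Peukert losses brewmatic mentioned), here is the 50% rule of thumb as a tiny sketch:

```python
capacity_ah = 36.0
max_dod = 0.50        # stop at 50% depth of discharge
load_a = 9.0

usable_ah = capacity_ah * max_dod   # 18 Ah available before hitting 50%
hours = usable_ah / load_a          # 2.0 hours, not 4
print(f"{usable_ah:.0f} Ah usable -> {hours:.1f} h at {load_a:.0f} A")
```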
 
9 A for 4 hours will completely drain your battery (theoretically, anyway) and likely kill it. …
This.
A 36 Ah battery is OK for a 1-hour 9 amp draw, with room to spare. That size battery is more suited to a 3 amp load or less.

Now, if 200 W of solar is feeding it during those 4 hours, it's fine.
 
Sounds like I am better off starting the test in the afternoon and checking the results the next day. I am not too concerned about killing off the battery I am using for testing; it will die with honor. I am just trying to get a feel for the science, and much of the time googling things tends to overwhelm me with information that may or may not be correct. I am also (sort of) trying to figure out if my amp hour meter follows any kind of standard. But I think I have enough to start testing at lower power levels; it might take longer, but I think that is the point. I can calibrate voltage and current all day, but amp hours was one of those things that I was just scratching my head over. Thanks, folks. I still have to absorb what you indicated above, but in the end I will figure it out.
 
trying to figure out if my amp hour meter follows any kind of standard.
Which amp hour meter?
Amps and (earth) hours are pretty standard.
but amp hours was one of those things that I was just scratching my head over.
It's kind of like miles, miles per hour, and hours.

miles per hour (the rate of travel) x hours (time) = miles (quantity)

amps (the rate of charge flow) x hours (time) = amp hours (quantity)
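That multiplication is really all an amp hour meter does, just repeated many times per second: sample the current, multiply by the elapsed time, and accumulate. A rough sketch of that coulomb counting loop, where read_current_a is a stand-in for whatever sensor is attached:

```python
import time

def count_amp_hours(read_current_a, duration_s: float, interval_s: float = 1.0) -> float:
    """Integrate current over time: accumulate amps * hours for each sample."""
    amp_hours = 0.0
    elapsed = 0.0
    while elapsed < duration_s:
        amps = read_current_a()                    # one current sample
        amp_hours += amps * (interval_s / 3600.0)  # amps * hours for this slice
        time.sleep(interval_s)
        elapsed += interval_s
    return amp_hours

# With a fake, constant 25 A source this accumulates ~25 Ah over one hour:
# count_amp_hours(lambda: 25.0, duration_s=3600.0)
```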
 
MisterSandals, it's a cheap hall effect unit that measures voltage, current, and power, plus watt hours and amp hours (I might have one of those wrong). I can calibrate the voltage and current, and of course the wattmeter is easy to check because it calculates power from what it measures for voltage and current. But them pesky watt hour and amp hour values are still a learning curve. I guess in theory if I can control the voltage and current values, everything else should fall into place. Unless, of course, the hall effect sensor and meter consume more than 50 milliamps, because my multiproduct calibrator trips at 50 milliamps on its voltage output. Most of the time that is not a problem with VAW meters because they are self powered. Eventually you guys will get me to understand the science; just keep punching away at it. Maybe instead of using a battery for testing I should use simulated conditions. The meter won't know the difference... maybe.
 
I guess in theory if I can control the voltage and current values, everything else should fall into place.
You're clearly more on top of this than you originally indicated.

But if you have watt hours or amp hours, it's easy to find the other if you know the voltage.
The amps are only really meaningful if you know the volts.

Ah x V = Wh
or
Wh / V = Ah

(apologies to Georg Ohm)
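Or as a quick sketch with this battery's numbers at a nominal 12 V:

```python
def wh_from_ah(ah: float, volts: float) -> float:
    return ah * volts

def ah_from_wh(wh: float, volts: float) -> float:
    return wh / volts

print(wh_from_ah(36.0, 12.0))   # 432.0 Wh for the 36 Ah battery at nominal 12 V
print(ah_from_wh(432.0, 12.0))  # 36.0 Ah back again
```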
 
You're clearly more on top of this than you originally indicated. …
I edited my comment a bit to clarify where my strengths are. I get stupid when the science ventures outside my knowledge base, and clearly amp hours cause my brain to pitch a hissy fit. Not so much the math but rather the conditions for the test. Today I just hooked the battery up to an electronic load, set it for 36 amps, and expected it to run for an hour, because I am a dimwit. I just don't understand battery science. But thanks to you guys and a painful birth... I am getting it.
 
So I think I am going to step back from amp hour testing until I am more coherent in my understanding of the specification and how to test it. One problem I am having at the moment is that I think my cheap little hall effect current and voltage meter is crap. Right now I am simulating 12 V @ 25 A, and it is very inaccurate; I had to manually adjust my standard to make it read as close to those values as its stability would allow, but close. In reality I am just pushing 12 V @ 5 A with 5 loops through the hall effect coil. So I will just let it run and see what happens after a while; my crude understanding suggests that in an hour I should read 25 amp hours. I guess it's not really simulating, but there is no battery, just a multiproduct calibrator being used to test the unit.

I will probably purchase a new meter that does not require a PhD to operate. This one has so many user defined settings that if you muck them up, the Chinese-to-English translated instructions make it impossible to get things back to default, and if there is a default button, I sure the heck can't find it. It is still good as a basic monitor meter (volts, current, and watts), so not a total loss, but I do need something a tad more accurate and hopefully plug and play. Oh well... moving on.
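The multi-loop trick works because the hall sensor responds to the total flux through its aperture: N passes carrying I amps read as N x I. A quick sanity check with the numbers above:

```python
actual_amps = 5.0
loops = 5                              # passes through the hall effect aperture
indicated_amps = actual_amps * loops   # the sensor sees the sum of all passes
print(indicated_amps)                  # 25.0 A, which is what the display should show
```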
 
my crude understanding suggests that in an hour I should read 25 amp hours.
With a 36 Ah battery, draining it past the recommended point (50% SoC, roughly 12 V) will damage it. The voltage drop may also accelerate as the battery approaches depletion. Is there anything to learn from discharging below the standard recommendations?
 
With a 36 Ah battery, draining it past the recommended point (50% SoC, roughly 12 V) will damage it. …
At the moment there is no battery involved, just a calibrator hooked up to the hall effect meter. It has no way of knowing what the source is, just that it is reading 12 V @ 25 A. So far it is tracking those values well, including the watts value. The watt hour and amp hour displays are still showing zero, so I guess at some point in the next hour or so it should update those values. The battery I was testing had no hope of really meeting specifications anyway, since it had been stored in a box for over 5 years before I even opened it up new. It was more of a sacrificial lamb (blow it up instead of an expensive Vmax AGM battery). I decided that simulating a battery was a safer, more controlled way to test the meter and further my understanding of this damned voodoo magic science. Let's see where this goes. Peace, MisterSandals.
 
Actual lead acid (including AGM) battery capacity depends on the discharge rate: higher discharge current yields lower usable capacity. …
Sounds like doing a full 36 amp hour test on a 36 amp hour battery is not too healthy for the battery. But that's OK; I am testing my hall effect meter, with all the measurement units, without using a battery, just to see how the pesky thing works. My first test was a flop: I pulled the current wire, which tripped the calibrator, and the voltage went to standby. While the meter did give me all the values, as soon as the calibrator shut off it erased the data on the meter (doh!). So I am running the test again, except this time I will manually set the current to zero as soon as the meter reaches 1 hour. Apparently the meter is designed so that it stores data until you remove the load and then presents all the values. I will post the results below.
 
For your purposes (education) you could discharge at C/10 (36/10 = 3.6 amps) at room temperature. Because the battery is AGM, it is safe to discharge it to 20% state of charge without damaging it. So, here's what you could do:

1. Fully charge the battery
2. Connect a 3.6 amp load and begin timing and monitoring voltage
3. When voltage reaches 11.35 V, disconnect the load and note the time the load has been on the battery
4. Continue to monitor voltage
5. When you disconnect the load from the battery, the voltage will begin to rise
6. Multiply the time (in hours) by 3.6 to tell you how many amp hours you have extracted from the battery
7. When voltage stops rising, use that number to determine the battery's true state of charge
8. Use the numbers obtained in #6 and #7 to extrapolate the battery capacity at a discharge rate of 0.1C

Since you are doing this for educational purposes, you could repeat the experiment to see the effect different discharge rates have on capacity, then repeat the process to test the effects of temperature. (A code sketch of steps 2 through 6 follows below the link.)

Here is one resource: https://sunonbattery.com/agm-battery-voltage-capacity/
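If it helps, here is a rough sketch of steps 2 through 6 as a logging script. The read_voltage callback is a hypothetical stand-in for whatever meter is on hand; the cutoff and load values come straight from the steps above:

```python
import time

CUTOFF_V = 11.35   # step 3: stop voltage under load
LOAD_A = 3.6       # step 2: C/10 for a 36 Ah battery

def run_capacity_test(read_voltage, interval_s: float = 10.0) -> float:
    """Time the discharge until cutoff, then return amp hours extracted (step 6)."""
    start = time.monotonic()
    while read_voltage() > CUTOFF_V:   # keep the load on until 11.35 V
        time.sleep(interval_s)
    hours = (time.monotonic() - start) / 3600.0
    return LOAD_A * hours              # Ah delivered at this discharge rate

# After disconnecting the load (steps 4-5), let the voltage rest and recover,
# then map the rested voltage to state of charge (step 7) and extrapolate
# the battery's capacity at 0.1C (step 8).
```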
 
For your purposes (education) you could discharge at C/10 (36/10 = 3.6 amps) at room temperature. …
Thank you. As soon as I complete testing on the hall effect meter, to determine how it performs Ah and Wh measurements, I will move on to the next test. Right now I am using the wrong calibrator to test it, because removing the current lead trips the calibrator and the meter resets to zero. There is a slight delay in the trip, so I figured out that I can quickly insert a short on the output of the calibrator; the output stays on and the meter then calculates the values. It is a dual output calibrator, so shorting the current output does not affect the voltage output and the meter does not turn off. What I will probably do next is use a 36 volt, 40 amp power supply and an electronic load (set to give me 12 volts at 25 amps) to simulate the testing conditions, because I can switch the load off and on without tripping the output. I should have done that from the beginning, but since my test is in process I figured I would complete it. When I am ready to do some practical testing on an actual battery, I will refer back to these posts and see if I can do it right. At least through this testing I have done some good validation testing on the two new electronic loads; I have never used them on batteries, just power supplies. Batteries are a serious challenge to work with and learn about, but eventually I will figure it all out and be able to have a more coherent conversation with you solar/battery wizards.
 
Here is some of the equipment I am using to learn how all this battery stuff works. As suspected, the hall effect meter does what it is supposed to do: I gave it 12 volts at 25 amps (300 watts) for one hour, forced it to calculate the readings at the one hour mark, and as I suspected it read 25.4 amp hours and 304 watt hours.

[Attachments: 20210820_121700.jpg, 20210820_115104.jpg, 20210820_105132.jpg, 20210820_104821.jpg]
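Incidentally, those readings put the meter within a couple of percent of the expected 25 Ah / 300 Wh, which is easy to check:

```python
expected = {"Ah": 25.0, "Wh": 300.0}
measured = {"Ah": 25.4, "Wh": 304.0}

for unit in expected:
    err_pct = 100.0 * (measured[unit] - expected[unit]) / expected[unit]
    print(f"{unit}: {err_pct:+.1f}% error")   # Ah: +1.6%, Wh: +1.3%
```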
 
For your purposes (education) you could discharge at C/10 (36/10 = 3.6 amps) at room temperature. …
Hey ya, kenryan, I am still sorting through the helpful information that you and others have provided. Much of my learning occurs not on the bench but in a virtual testing environment (I think about it while driving or drinking booze). And it did occur to me that if you have a voltage start point and you time how long it takes to reach a voltage stop point (for example, how long it takes to go from 12.8 to 12.3 volts under a reasonable load, like the 3.6 amps you suggested), then you can estimate the amp hour rating of a battery. In short, a 36 amp hour battery will reach the stop voltage much sooner than a 100 amp hour battery.

I still tend to shy away from float voltage measurements, because those only seem useful for figuring out that your battery is full and just can't hold any more electrons, or for quickly figuring out that your battery might be going bad because (for example) the float voltage never exceeds 12.7 volts (bad juju). Even if I did not specifically understand the math, when I buy a new battery I can bring it into work, put it on my electronic load, time how long it takes to reach the stop voltage point, and call that my benchmark for that battery. If in the future I start seeing the battery's performance degrade, I can bring it back into work and see if the benchmark repeats. I have not yet checked that resource link, but I will. Thanks.
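That benchmark boils down to a one-line estimate: at a fixed current, the amp hours delivered down to the stop voltage scale with time, and if you treat the rested stop voltage as a fixed state of charge you can extrapolate full capacity. A sketch, with a made-up 5-hour timing:

```python
LOAD_A = 3.6
STOP_DOD = 0.50   # treating 12.3 V (rested) as roughly 50% discharged -- an assumption

def estimated_capacity_ah(hours_to_stop: float) -> float:
    """Extrapolate full capacity from the Ah delivered down to the stop voltage."""
    return (LOAD_A * hours_to_stop) / STOP_DOD

print(estimated_capacity_ah(5.0))   # a hypothetical 5 h to cutoff -> 36.0 Ah estimate
```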
 