Rant about Ah and Wh...

upnorthandpersonal
Administrator · Staff member · Moderator
Ok, this has been grinding my gears to the point where I'm annoyed with it, so I just wanted to write it down here.

I see statements like this all over the net when it comes to cell capacity testing:

- "you need to divide the Wh reading by the nominal voltage of 3.2V to get the real Ah rating".
- "AH without a nominal voltage means nothing, otherwise we have an infinite AH battery for you at 0 volts. "

Usually related to some capacity measurement not matching a label.

I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.

During a discharge (pick any of the discharge curves Andy recorded as an example), the voltage stays above 3.2V for the majority of the time (over 60%, iirc). If the voltage averaged 3.2V over the entire discharge you could do this - but because it sits above 3.2V for most of the time, the "divide by 3.2V" trick skews the result in favor of a higher Ah number.

The reason the results are skewed lies in the definition of nominal voltage: the nominal voltage is measured at the midpoint between fully charged and fully discharged during a 0.2C discharge. So unless you discharge, say, a 304Ah cell at roughly 60A, you won't see a 3.2V average over the curve, and you can't just divide your results by the nominal voltage. The difference in this voltage is small for LiFePO4, but it's large enough to make the Ah rating look better than it is.

In fact, if you take the Wh reading from a test and divide it by the Ah measurement from the same test, you'll see an average voltage above 3.2V at the 40A or lower discharge currents that most ZKE-style testers and similar devices support. Suppose we measure 1000Wh from a cell and divide by 3.2: we get 312.5Ah. If the actual average voltage were just 0.02V higher, at 3.22V, we get 'only' about 310.6Ah. The whole point of rating a battery in Ah is that it takes voltage and losses out of the equation: any bad connection, wire resistance, etc. will impact the Wh reading (especially on devices without separate voltage sense wires), but the Ah reading stays accurate because the current is the same everywhere in the circuit - demonstrated here.
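As a minimal sketch of that skew (the test current, duration, and average voltage below are made-up illustrative numbers, not from a real log), compare the coulomb-counted Ah against the "divide by 3.2V" rule applied to the Wh figure:

```python
# Hypothetical constant-current discharge test (numbers are illustrative only).
current_a = 40.0        # test current in amps
duration_h = 7.6        # hours until the cutoff voltage was reached
avg_voltage = 3.22      # average cell voltage over the discharge, above the 3.2V nominal

ah_measured = current_a * duration_h                  # coulomb counting: 304.0 Ah
wh_measured = current_a * avg_voltage * duration_h    # ~978.9 Wh

ah_from_rule = wh_measured / 3.2                      # the "divide by 3.2V" rule: ~305.9 Ah

print(f"Measured:  {ah_measured:.1f} Ah")
print(f"Wh / 3.2V: {ah_from_rule:.1f} Ah (inflated by {ah_from_rule - ah_measured:.1f} Ah)")
```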

Stating that Ah readings without a nominal voltage are meaningless is wrong. Stating that to get the actual Ah rating of a cell you have to divide the Wh by the nominal voltage is wrong. There are specific circumstances where the latter works, but it cannot be applied as a blanket statement. If you do a capacity test, measure the Ah rating and believe the number the measurement gives you - not the label, and not the silly "divide by 3.2V" rule.
 
I will say that when I capacity tested my 32 cells, I noticed the meter would show similar Ah readings (i.e. 270-271Ah) but the Wh readings would vary quite a bit. I didn't understand why at the time; I just brushed it off.
 
Agree 100%. And while an exact(-ish) Ah metric is a useful guide for designing a battery bank, it's only a guide. The real-world charge/discharge cycle and actual usable power are a little fuzzier. There's nothing like overall charge/discharge operational experience with a system as a whole, likely operating at less than 100% DOD, over a range of contexts, to see exactly what a battery will deliver :)
 
I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.
I gotta disagree a bit on this one.

Yes, if you don't do the math down to 1/1000 of a volt every second for multiple hours of testing, then you don't get "accurate" ratings...

However...

If my battery's nominal voltage is 12V, the label says 100Ah, and I get over 1200Wh, then I Don't Care. 99.99% of people playing with solar, I would guess, just don't care. It's a rounding error. It's no different than ranting about a 100W panel not producing 100W all the time.

It's rough math. There are going to be variances and inefficiencies and losses all over the place. To my mind, and I'm sure many others, if the napkin math says it should be X and it's X-ish then it's fine.

In addition, my 0V batteries are rated at 1,000,000,000,000 hours and you can't stop me, so PTHBTHBTHBTHBTHB!!
 
Let me toss this wrench into your thinking…

Ah ratings, as accurate as they are, make it a challenge for the average user to decide on a bank capacity.

Heck, the average user has difficulty understanding why their 10 amp refrigerator can’t last 10 hours on a 100Ah battery…

Using Wh as a real world capacity factor evens the playing field and makes it a simple matter to build a battery bank.

A 2000W load running 24 hours needs a 48kWh bank… minimum.

What would that be in Ah? 1000? 2000? 4000? It depends on the voltage.
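For illustration, here's that 48kWh example worked out at the common LiFePO4 bank voltage classes (12.8V, 25.6V and 51.2V assumed as the nominal values):

```python
# Converting the 48 kWh sizing example into Ah at common nominal bank voltages.
bank_wh = 2000 * 24   # 2000 W load for 24 hours = 48,000 Wh

for nominal_v in (12.8, 25.6, 51.2):
    print(f"{nominal_v:>4.1f} V bank: {bank_wh / nominal_v:,.1f} Ah")
# 12.8 V -> 3,750.0 Ah, 25.6 V -> 1,875.0 Ah, 51.2 V -> 937.5 Ah
```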
 
Ah is appropriate when troubleshooting cells, but Wh is more appropriate for system-level discussions. I know of no practical constant current applications for a battery relevant to a DIY solar forum.
 
I'd sure like to see an answer from battery manufacturers as to how they perform the test and how they come up with the Ah result.
 
I'd sure like to see an answer from battery manufacturers as to how they perform the test and how they come up with the Ah result.
Well, as I understand it, there is the "Industry Standard" that is a 0.2C draw until the battery hits 10V, the "WallyWorld Standard" which is a 1A draw until the battery is dead, and the "20A Standard" which ignores capacity and just drains the battery at 20A until it just can't do 20A anymore.

The wonderful thing about standards is that there are so many to choose from.
 
Well, as I understand it, there is the "Industry Standard" that is a 0.2C draw until the battery hits 10V, the "WallyWorld Standard" which is a 1A draw until the battery is dead, and the "20A Standard" which ignores capacity and just drains the battery at 20A until it just can't do 20A anymore.

The wonderful thing about standards is that there are so many to choose from.
I know about the discharge rate, but I'd really like to know whether they actually measure the Wh and then divide it by the 3.2V nominal voltage to get the Ah rating, instead of using the Ah reading from the tester itself.
 
I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.
All models are wrong, some however are useful.

This rant is, in essence, about
i. ignoring the integration step when equating energy (Wh) and charge (Ah), where

Energy = ∫ I·V dt

between the start and end times using the instantaneous values for current (I) and voltage (V) at each moment in time.
This is a necessary step to derive energy since both current and voltage vary over time.

and
ii. instead saying

Energy (Wh) = Ah x V
where V is some fixed "nominal" value.


While the formula Wh = Ah x V is not a correct means to equate the two, I see no big deal provided its use is limited to calculating a reasonable ball park estimate of the amount of work one can expect a fully charged battery to be able to perform, based on its stated charge capacity rating (Ah) and voltage class (nominal voltage).
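As a sketch of what the integration step means in practice (the sample log below is invented; a real tester records far more points), applying the trapezoid rule to logged (time, voltage, current) samples gives both the Wh and the Ah, and shows that the Ah × 3.2V shortcut only matches when the average voltage really is 3.2V:

```python
# Assumed log of (time_s, volts, amps) samples from a constant-current discharge.
samples = [
    (0,    3.33, 40.0),
    (1800, 3.30, 40.0),
    (3600, 3.28, 40.0),
    (5400, 3.25, 40.0),
    (7200, 2.98, 40.0),   # voltage drops off quickly near the cutoff
]

wh = ah = 0.0
for (t0, v0, i0), (t1, v1, i1) in zip(samples, samples[1:]):
    dt_h = (t1 - t0) / 3600.0
    wh += (v0 * i0 + v1 * i1) / 2.0 * dt_h   # energy: trapezoid rule on I*V
    ah += (i0 + i1) / 2.0 * dt_h             # charge: trapezoid rule on I

print(f"Integrated: {wh:.1f} Wh, {ah:.1f} Ah, average {wh / ah:.3f} V")
print(f"Shortcut Ah x 3.2 V: {ah * 3.2:.1f} Wh")  # matches only if the average is exactly 3.2 V
```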
 
Energy = ∫ I·V dt

Exactly this. I was not implying that Wh is a bad unit; in fact, I think it's a much better unit than Ah for expressing the energy in a cell/battery.
My rant was only about using the measured Wh reading to arrive at the Ah rating of the cell by dividing by 3.2, instead of taking the Ah reading measured in the same test.
 
So starting with a fully charged 12.8V nominal 100Ah battery, it will measure 12.8V after being discharged 50Ah at .2C?

No, that's exactly the thing: there are voltage differences over the entire discharge curve. However, an Ah reading is established by drawing a constant current from the cell and seeing how long that takes. To measure a Wh rating at the same time, you also have to take the voltage into account at each measurement - in essence the integral below the curve.

If you don't have to take voltage into account, this becomes a rectangle with current on the y axis and time on the x axis: amps multiplied by hours. If you do have to take voltage into account, it is no longer a simple multiplication; you have to sample at short intervals, or fit a curve and calculate the actual integral.

And again, I'm not saying Wh is bad. It's very good - but it's being used to make the Ah rating seem higher. The nominal voltage of a cell is obtained at a relatively high 0.2C discharge, which makes that value slightly lower; your Wh measurement, done at a lower C rate, will be slightly higher. Divide the one by the other and you get a result skewed in favor of a higher Ah reading.
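To put the "rectangle versus integral" point in code (all numbers invented for illustration): the coulomb-counted Ah never depends on where the voltage sits, while the Wh figure - and therefore the Wh/3.2 number - does:

```python
# Constant-current test: the Ah result is just current x time (a rectangle).
current_a = 40.0
hours = 7.0
ah = current_a * hours    # 280.0 Ah, regardless of the voltage curve

# The Wh figure shifts with the average voltage, and so does the "Wh / 3.2" number.
for avg_v in (3.20, 3.23, 3.26):
    wh = current_a * avg_v * hours
    print(f"average {avg_v:.2f} V -> {wh:.0f} Wh, Wh/3.2 = {wh / 3.2:.1f} Ah (measured: {ah:.1f} Ah)")
```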
 
While the formula Wh = Ah x V is not a correct means to equate the two, I see no big deal provided its use is limited to calculating a reasonable ball park estimate of the amount of work one can expect a fully charged battery to be able to perform, based on its stated charge capacity rating (Ah) and voltage class (nominal voltage).

I fully agree with that. What I don't like is that these calculations are used (including by some vendors) to convince people that their cell has a higher Ah capacity than the one they measured.
 
Ah is appropriate when troubleshooting cells, but Wh is more appropriate for system-level discussions. I know of no practical constant current applications for a battery relevant to a DIY solar forum.

And @Supervstech, @Lt.Dan, @Rednecktek and probably others:

I agree with you - the issue is that certain people (vendors) are using these calculations to tell users their measurements are wrong when they do a capacity test.
 
Watts is like asking "how much horsepower does your car have?"
Wildly fluctuating answers ^^ LOL.

I like Amps.
 
And @Supervstech, @Lt.Dan, @Rednecktek and probably others:

I agree with you - the issue is that certain people (vendors) are using these calculations to tell users their measurements are wrong when they do a capacity test.
Exactly this.
It's right up there with, "you weren't holding your tongue right during the test".
 