Rant about Ah and Wh...

I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.

I have asked some folks at Texas A&M about this, and the equation Watts = Amps * Volts is a very reliable means of calculating power: if you have two of the values, you can get the third. The same applies to Watt-hours = Amp-hours * average voltage.

https://en.wikipedia.org/wiki/Volt-ampere#:~:text=Volt-amperes%20are%20usually%20used,⋅A%20%3D%201%20W).

So if you have any two of the values, you can get the third.
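To put that in concrete terms, here is a minimal sketch in Python (the numbers are made up for illustration, not measurements):

```python
# P = I * V, and Wh = Ah * average voltage over the discharge.
# Any two values give you the third. All values below are illustrative.

def watts(amps, volts):
    return amps * volts

def watt_hours(amp_hours, avg_volts):
    return amp_hours * avg_volts

print(watts(10, 3.2))        # 32.0 W
print(watt_hours(280, 3.2))  # 896.0 Wh -- valid only if 3.2 V really was the average
```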
 
It becomes an issue when someone (a vendor) claims 320Ah and the test then shows 310Ah. If everyone just advertised these at 304Ah, it would not be an issue. Vendors dismissing measurements as false/wrong and 'correcting' them with the divide-by-3.2V trick is my main issue.
The problem is having a testing standard that everyone adheres to. Take, for instance, a vehicle's MPG rating (miles per gallon, or kilometers per liter if you are metric). Without the EPA in the US, you could have rather divergent claims for what any vehicle achieved. This can be a big deal when people shop for a car.

When shopping for a battery, people place a lot of emphasis on claimed Ah or Wh ratings. However, many have only a hazy understanding of what either term means as it relates to a chemical device. I think the battery merchants mostly do a fair job of rating capacity.
 
I have asked some folks at Texas A&M about this, and the equation Watts = Amps * Volts is a very reliable means of calculating power: if you have two of the values, you can get the third. The same applies to Watt-hours = Amp-hours * average voltage.

https://en.wikipedia.org/wiki/Volt-ampere#:~:text=Volt-amperes%20are%20usually%20used,⋅A%20%3D%201%20W).

So if you have any two of the values, you can get the third.

That's only true if the voltage is constant over the period of the measurement, or if you actually measure the average voltage. Just taking the nominal voltage is not correct, since the nominal voltage only equals the average voltage at 0.2C.
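A quick numerical sketch of that point (the discharge curve below is invented, not real data): Ah only needs the current integrated over time, while Wh needs the current times the actual voltage at each instant. Dividing the Wh result by the 3.2V nominal only reproduces the Ah figure if 3.2V happened to be the true average.

```python
# Hypothetical 1 A constant-current discharge, sampled once per hour.
voltage = [3.32, 3.30, 3.28, 3.25, 3.20, 3.10, 2.90, 2.50]  # V, invented LFP-ish curve
current = 1.0  # A, held constant by the load

ah = current * len(voltage)             # current integrated over time
wh = sum(current * v for v in voltage)  # current * actual voltage, summed

print(f"{ah} Ah, {wh:.2f} Wh, {wh / 3.2:.2f} 'Ah' via /3.2")
# 8.0 Ah, 24.85 Wh, 7.77 'Ah' -- not equal, because the real
# average voltage here (~3.11 V) isn't the 3.2 V nominal.
```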

The problem is having a testing standard that everyone adheres to. Take, for instance, a vehicle's MPG rating (miles per gallon, or kilometers per liter if you are metric). Without the EPA in the US, you could have rather divergent claims for what any vehicle achieved. This can be a big deal when people shop for a car.

When shopping for a battery, people place a lot of emphasis on claimed Ah or Wh ratings. However, many have only a hazy understanding of what either term means as it relates to a chemical device. I think the battery merchants mostly do a fair job of rating capacity.

That's the reason batteries have mostly been rated in amp-hours, and because you can check this easily with relatively basic equipment. If you, for example, have one of these:

[image: maxresdefault.jpg (a basic inline battery meter)]

Your Wh test will be way off, but your Ah test will be pretty spot on.

Again, my main issue is that if you test with the above meter and arrive at an Ah rating, the vendor then tells you that your measurement is wrong.
 
Every car battery I have seen is rated in kWh, and the same goes for large battery banks and utility bills; I don't think anyone uses Ah. For some reason it's all over the place with 12V batteries.

For me, I found Wh to be way more useful, and yes, only by using the Wh/3.2V method did my Luyuan 280K cells meet the listed capacity: they tested at 278Ah and 900Wh (900/3.2 ≈ 281Ah).

So I look at both, but to me Wh is more important; it tends to be more accurate from what I have seen. I keep going back to this thread:


Two cells both showing 274Ah, but cell 1 makes 880Wh and cell 7 makes 866Wh.

I know enough that an inverter needs watts to run, so if you had a 3V inverter, cell 1 would run it longer than cell 7.

So for me, I don't look at Ah as much as Wh. I'm not 100% on the Wh/3.2 thing yet, but it is starting to make sense a bit to me.
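The runtime point is easy to put in numbers. A sketch with a hypothetical 10W constant-power load on the two cells quoted above:

```python
# Runtime at a constant-power load depends on Wh, not Ah.
load_watts = 10.0  # hypothetical constant-power load
cell1_wh = 880.0   # both cells tested at 274 Ah
cell7_wh = 866.0

print(cell1_wh / load_watts)  # 88.0 hours
print(cell7_wh / load_watts)  # 86.6 hours -- same Ah, noticeably less runtime
```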
 
That's only true if the voltage is constant over the period of the measurement, or if you actually measure the average voltage. Just taking the nominal voltage is not correct, since the nominal voltage only equals the average voltage at 0.2C.

That's the reason batteries have mostly been rated in amp-hours, and because you can check this easily with relatively basic equipment. If you, for example, have one of these:

[image: maxresdefault.jpg (a basic inline battery meter)]

Your Wh test will be way off, but your Ah test will be pretty spot on.

Again, my main issue is that if you test with the above meter and arrive at an Ah rating, the vendor then tells you that your measurement is wrong.

Sorry, but I trust my ZKE-40 way more than that little thing...
 
How much of a difference are we talking about anyway? 2-4Ah? 5? I doubt it's going to make a difference in the real world.
 
Two cells both showing 274Ah, but cell 1 makes 880Wh and cell 7 makes 866Wh.

Yes, that's what I'm getting at: is this now a 275Ah (880/3.2) or a 270Ah (866/3.2) cell 'for real'?
My point is that it does not matter, but also that you can't use the Wh reading to get a 'higher' Ah number to satisfy your label.
 
A battery manufacturer sent me a "110 Ah" battery to test. I ran a 0.2C load on it through my smart shunt, and it came up at 96 Ah. The manufacturer claimed 96 is within the load range of a 110 Ah rating, and that is why they are claiming 110 Ah. I explained that a brand-new LFP battery should ALWAYS test higher than the listed rating, as there is always a roughly 5% drop in capacity after a few dozen cycles. I would not recommend the battery; if they want to relabel the battery as 90 Ah, I would be happy to recommend it, but I could not in ANY way recommend a battery that tests below its listed rating.
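For reference, the kind of test described above is just coulomb counting at 0.2C. A minimal sketch (the sample stream from the shunt is invented here):

```python
# 0.2C of a claimed 110 Ah battery is 22 A. Log current once per second
# until the low-voltage cutoff; sum amp-seconds and convert to Ah.
dt = 1.0                             # seconds between samples
samples = [22.0] * int(4.36 * 3600)  # hypothetical: battery quit after ~4.36 h

ah = sum(samples) * dt / 3600
print(f"{ah:.0f} Ah")  # ~96 Ah, well short of the 110 Ah label
```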
 
You're right. I assume you are complaining about inaccurate capacity claims; no doubt, if someone measures the energy capacity of a battery and then uses simple arithmetic to find an Ah capacity, it's likely to be slightly inaccurate. But as far as I can see, storage batteries are specified by Ah; I don't see Wh in manufacturers' data sheets.

Car batteries aren't specced by energy capacity either; the most important spec is CCA (cold cranking amps).
 
Yes, that's what I'm getting at: is this now a 275Ah (880/3.2) or a 270Ah (866/3.2) cell 'for real'?
My point is that it does not matter, but also that you can't use the Wh reading to get a 'higher' Ah number to satisfy your label.

I get it & agree with you.

Energy storage tanks: batteries, propane tanks, gasoline tanks, diesel tanks... all have this in common. How much energy "capacity" is stored in these tanks? The answer is... it depends.

I am fairly new to 12VDC cabin and travel-van electrical systems (since roughly 2016/2018), and I have noticed a human phenomenon with batteries as an energy storage tank: an obsession with their health. I get it, because the things are not cheap, and they are depreciating assets that do not get healthier over time, nor can they do as much as they once could. They are very similar to us.

The battery label is just that, and the same goes for the current battery health. If I say I have a 250Ah battery bank and you ask "do you?", my response is actually "no, but that is what the label says". The capacity is what it is.
 
I agree with the angels-on-a-pin comment. It's all a great theoretical discussion, but most consumers aren't going to care whether we calculate battery capacity down to the minute or even the hour... what they do care about is: will this get me through a given time period? Most have no idea how much energy their TV or sound bar or fridge even takes to run. Right or wrong, manufacturers have settled on Ah as the "standard" parameter. Work to 12V and you're going to land "close enough" to real-world application. Will has a starter 400Ah kit recommendation, and it simplifies the discussion and component shipping.
 
Anyway, it seems I have a problem getting my point across. It's not important. Never mind, I had my rant.

Interesting and informative thread, and some good points raised by many.

When I started my first DIY CNC build 13 years ago and was reading some posts... to me, from my understanding, the terminology was like learning Chinese through Russian when you speak English...

But here, thanks to you guys, the battery stuff is easier, and it's starting to make proper sense to me.
 
I get your point, but in the context of home consumers it's just way too complicated.

I actually posted this after reading some comments (again) from a certain vendor regarding individual LiFePO4 cells, and that was the context within which I wanted to frame this. I probably should have been clearer about that. It wasn't supposed to be within the context of random ready-made batteries people buy.
 
I actually posted this after reading some comments (again) from a certain vendor regarding individual LiFePO4 cells, and that was the context within which I wanted to frame this. I probably should have been clearer about that. It wasn't supposed to be within the context of random ready-made batteries people buy.
I made the exact same points after Sun Fun Kits was busy using V x Ah = Wh math to back into whatever answer they wanted. I got blowback from several people and decided they just didn't understand my point.

For cells, it is best to stay with Ah, because the results will be repeatable. If you try to back into Ah by dividing Wh by ANYTHING, the answer is meaningless. If you try to measure Wh, you have to account for the power losses in the test wiring and connections. You don't have to worry about the wire or connections when measuring Ah, because the same amps flow through the whole circuit no matter what the losses in the wire and connections are.
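A sketch of why the wiring skews Wh but not Ah (the resistance and current values are invented): the same current flows through the cell, leads, and meter, so a coulomb count reads the same anywhere in the loop, but the voltage the meter sees is lower than the voltage at the cell terminals.

```python
i = 40.0         # A of discharge current (hypothetical)
r_leads = 0.002  # ohm of test leads and connections (hypothetical)
v_cell = 3.25    # V at the cell terminals at this instant

v_at_meter = v_cell - i * r_leads  # 3.17 V -- 80 mV dropped in the leads
print(f"{i * v_cell:.1f} W at the cell, {i * v_at_meter:.1f} W at the meter")
# 130.0 W vs 126.8 W. An Ah count uses only i, which is
# identical at the cell and at the meter.
```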

Anyway, thank you @upnorthandpersonal for trying again.
 
Nothing wrong with coulomb counting; it is extremely accurate. I don't get it: if the cell is rated in Ah, why on earth would you measure Wh and then try to convert it to Ah? A Wh measurement is far less accurate. Not only do you have to measure amps over time, but also voltage over time, so your measurement error is more than doubled.

Unless you account for the inherent IR drop, the voltage measurement is not accurate. Charging current will give a voltage reading that's too high, and discharge current will give a reading that's too low. While cell resistance may be low, measuring voltage at the bus bar might add 0.5 mΩ of resistance, so total resistance could be 0.75 mΩ. Charging a 280Ah cell at 0.2C (56 A) will read a voltage that's 42 mV too high. Discharging at 160 A (my RV microwave) will give a reading that's 120 mV too low!

Not only do you have an IR error when measuring Wh, but also an I^2R error: at 160 A, 19.2 W is lost to heat per cell. Neither error is present when measuring Ah.
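Those numbers check out under the stated 0.75 mΩ assumption:

```python
r = 0.00075  # ohm, the total sense-path resistance assumed above

print(f"{56 * r * 1000:.0f} mV")   # 42 mV high while charging at 56 A
print(f"{160 * r * 1000:.0f} mV")  # 120 mV low while discharging at 160 A
print(f"{160**2 * r:.1f} W")       # 19.2 W of I^2*R heat at 160 A
```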

There's a reason battery capacity is measured in Ah and not Wh. Why go against industry standards?
 
I made the exact same points after Sun Fun Kits
This is the culprit/issue. They are convincing people that their cells are better by telling them to divide by the 3.2V nominal, which, as has been stated in this thread, will always make it look like you have more Ah than you really do, and that therefore you should pay them more for their "automotive grade" cells, when in fact they aren't different from the "regular" cells you can get cheaper from others.
 
why on earth would you measure Wh
Because how much work (i.e., energy) a system can actually perform is usually what most people are interested in knowing. It's the end result of discharging the battery, and it tells us what is required to charge it back up.

Work (Wh) is the integral of all the factors which affect this, be it voltage variability, system losses, internal impedance, power factors, etc.
 
Because how much work (i.e., energy) a system can actually perform is usually what most people are interested in knowing. It's the end result of discharging the battery, and it tells us what is required to charge it back up.

Work (Wh) is the integral of all the factors which affect this, be it voltage variability, system losses, internal impedance, power factors, etc.
First off: I think most of us would agree that watts and watt-hours are the correct things to design and judge your system by. In that regard, you SHOULD use Wh when sizing the battery for your system.

HOWEVER: This entire thread is about measuring, comparing, and discussing the capacity of cells. For the reasons several of us have already discussed, Ah is the only reasonable unit for cells. Brokers should stand by Ah, and we consumers, if we care, should be measuring Ah to decide if we got what we paid for. We should also be using the Ah of the cells to try to assemble batteries of somewhat balanced capacity. Once you've got a battery, the granularity can go down a bit; you can use 12.8V per 4 cells to estimate the Wh capacity of your battery for your system.
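As a sketch of that last step (the per-cell capacity here is an example value, not a spec):

```python
cell_ah = 278.0     # measured capacity of the matched cells (example value)
nominal_12v = 12.8  # 4 cells x 3.2 V nominal

print(f"{cell_ah * nominal_12v:.0f} Wh")  # ~3558 Wh, a sizing estimate, not a measurement
```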
 
HOWEVER: This entire thread is about measuring, comparing, and discussing the capacity of cells.
The OP said this was about both cells and batteries.

From the very first post on the thread:
I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.
 
The OP said this was about both cells and batteries.

From the very first post on the thread:
Context matters. What he said is absolutely correct: trying to get Ah by dividing the Wh of a cell or battery by some "nominal" voltage is foolhardy, and will give you an arbitrary and wrong answer. However, nearly all of the rest of his post was about people trying to determine the capacity of a cell.
 
I guess I don't understand.
If, at 0.2C, a 1280Wh battery outputs 1280Wh, is that not a valid test of the Ah of the battery?

If it outputs 1320Wh, would that not be more than a 1280Wh battery, and thus more than 100Ah?
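That question is exactly where the ambiguity lives: a Wh result alone can't be turned into an Ah result without knowing the actual average voltage of the test. For example (numbers illustrative):

```python
wh_measured = 1320.0

for avg_v in (12.8, 13.2):
    print(f"{wh_measured / avg_v:.1f} Ah if the average voltage was {avg_v} V")
# 103.1 Ah if the average voltage was 12.8 V
# 100.0 Ah if the average voltage was 13.2 V -- same Wh, different Ah
```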
 
