Rant about Ah and Wh...

I get your point, but in the context of home consumers it's just way too complicated.

I actually posted this after reading some comments (again) from a certain vendor regarding individual LiFePO4 cells - and that was the context within which I wanted to frame this. I probably should have been more clear about that. It wasn't supposed to be within the context of random ready-made batteries people buy.
 
I actually posted this after reading some comments (again) from a certain vendor regarding individual LiFePO4 cells - and that was the context within which I wanted to frame this. I probably should have been more clear about that. It wasn't supposed to be within the context of random ready-made batteries people buy.
I made the exact same points after Sun Fun Kits was busy using V x Ah = Wh math to back into whatever answer they wanted. I got blow-back from several people, and decided they just didn't understand my point.

For cells, it is best to stay with Ah because the results will be repeatable. If you try to back into Ah by dividing Wh by ANYTHING, the answer is meaningless. If you try to measure Wh, you have to account for the power losses in the test wiring and connections. You don't have to worry about the wire or connections when measuring Ah, because the same amps flow through the whole circuit no matter what the losses in the wire and connections are.

Anyway, thank you @upnorthandpersonal for trying again.
 
Nothing wrong with coulomb counting. It is extremely accurate. I don't get it. If the cell is rated in Ah, why on earth would you measure Wh and then try to convert it to Ah? A Wh measurement is far less accurate. Not only do you have to measure amps over time, but also voltage over time. Your measurement error more than doubles.

Unless you account for the inherent IR drop, voltage measurement is not accurate. Charging current will give a voltage reading that's too high, and discharge current will give a reading that's too low. While cell resistance may be low, measuring voltage at the bus bar might add 0.5 mohm of resistance, for a total of perhaps 0.75 mohm. Charging a 280 Ah cell at 0.2C (56 A) will then read a voltage that's 42 mV too high. Discharging at 160 A (my RV microwave) will give a reading that's 120 mV too low!

Not only do you have an IR error when measuring Wh, but also an I^2R error. At 160 A, 19.2 W is lost to heat per cell. Neither error is present when measuring Ah.
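
To put the arithmetic in one place, here's a back-of-the-envelope sketch in Python. The 0.75 mohm path resistance is the illustrative figure from above, not a measured value:

```python
# Back-of-the-envelope check of the IR-drop numbers above.
# Assumed: ~0.25 mohm cell resistance + ~0.5 mohm to the bus bar.
R_PATH = 0.00075  # ohms, total resistance in the measurement path

def voltage_error_mv(current_a: float) -> float:
    """IR drop in mV: reads high while charging, low while discharging."""
    return current_a * R_PATH * 1000

def heat_loss_w(current_a: float) -> float:
    """I^2R power lost in the path; it never shows up in an Ah count."""
    return current_a ** 2 * R_PATH

print(f"56 A charge:     reads {voltage_error_mv(56):.0f} mV high")
print(f"160 A discharge: reads {voltage_error_mv(160):.0f} mV low")
print(f"160 A discharge: {heat_loss_w(160):.1f} W lost as heat")
# -> 42 mV, 120 mV, 19.2 W, matching the figures above
```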

There's a reason battery capacity is measured in Ah and not Wh. Why go against industry standards?
 
I made the exact same points after Sun Fun Kits
This is the culprit/issue. They convince people that their cells are better by telling them to divide by the 3.2 V nominal, which, as has been stated in this thread, will always make it look like you have more Ah than you really do. That way you pay them more for their "automotive grade" cells, when in fact they aren't any different from the "regular" cells you can get cheaper from others.
 
why on earth would you measure Wh
Because how much Work (i.e. energy) a system can actually perform is usually what most people are interested in knowing. It's the end result of discharging the battery, and tells us what is required to charge it back up.

Work (Wh) is the integral of all the factors that affect this, be it voltage variability, system losses, internal impedance, power factor, etc.
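
For what it's worth, that integral is also what a logging meter approximates: sample volts and amps together and sum V x I over time. A minimal sketch with made-up samples:

```python
# Wh as the integral of power: sum V(t) * I(t) over the test.
# The (voltage, current) samples below are made up for illustration;
# a real logger would record them once per interval.
samples = [(3.35, 56.0), (3.30, 56.0), (3.28, 56.0), (3.25, 56.0)]
DT_H = 1 / 60  # one-minute sampling interval, in hours

wh = sum(v * i * DT_H for v, i in samples)  # energy delivered
ah = sum(i * DT_H for _, i in samples)      # charge delivered
print(f"{wh:.1f} Wh, {ah:.2f} Ah")
```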
 
Because how much Work (i.e. energy) a system can actually perform is usually what most people are interested in knowing. It's the end result of discharging the battery, and tells us what is required to charge it back up.

Work (Wh) is the integral of all the factors that affect this, be it voltage variability, system losses, internal impedance, power factor, etc.
First off: I think most of us would agree that Watts and Watt-hours are the correct things to design and judge your system by. In that regard, you SHOULD use Wh when sizing the battery for your system.

HOWEVER: This entire thread is about measuring, comparing, and discussing the capacity of cells. For the reasons several of us have already discussed, Ah are the only reasonable units to use for cells. Brokers should stand by Ah, and we consumers - if we care - should be measuring Ah to decide if we got what we paid for. We should also be using the Ah of the cells to assemble batteries of somewhat balanced capacity. Once you've got a battery, the granularity can go down a bit: you can use 12.8 V per 4 cells to estimate the Wh capacity of your battery for your system.
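
As a quick illustration of that estimate (the cell capacity here is hypothetical):

```python
# Rough pack-level energy estimate once cells are matched by Ah.
# Numbers are illustrative: 280 Ah cells, 4 in series per 12.8 V.
cell_ah = 280
nominal_v = 12.8  # per 4 LiFePO4 cells in series
pack_wh = cell_ah * nominal_v
print(f"~{pack_wh:.0f} Wh nominal")  # ~3584 Wh; plenty close for sizing
```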
 
HOWEVER: This entire thread is about measuring, comparing, and discussing the capacity of cells.
The OP said this was about both cells and batteries.

From the very first post on the thread:
I'm just going to say this: taking the Wh reading of your measurement and dividing it by the nominal voltage of the cell or battery is just wrong.
 
The OP said this was about both cells and batteries.

From the very first post on the thread:
Context matters. What he said is absolutely correct. Trying to get Ah by dividing the Wh of a cell or battery by some "nominal" voltage is foolhardy, and will give you an arbitrary and wrong answer. However, nearly all of the rest of his post was about when people are trying to determine the capacity of a cell.
 
I guess I don’t understand.
If, at 0.2C, a 1280 Wh battery outputs 1280 Wh, is that not a valid test of the Ah of the battery?

If it outputs 1320 Wh, would that not be more than a 1280 Wh battery, and thus more than 100 Ah?
 
I guess I don’t understand.
If, at 0.2C, a 1280 Wh battery outputs 1280 Wh, is that not a valid test of the Ah of the battery?

If it outputs 1320 Wh, would that not be more than a 1280 Wh battery, and thus more than 100 Ah?
Well, there are two problems.

First, to get from Wh to Ah you have to divide by some voltage. What voltage do you use? If you use 3.2V per cell, just know that the answer may be "good enough" for whatever you are doing, but not really accurate since the voltage was changing some throughout the test from 100% to 0% SoC. Like I said, if you are just designing a system, stay with Wh and it will be good enough.

Second, a measurement of Wh is measuring the energy that is going through an entire circuit. So that will include the load you are using for the measurement, but also the cables, terminals, bus bars, etc. These all have some resistance, so they consume watts that really do come out of the battery but are over and above what you measure at the load.

If you are measuring amps the losses in the circuit don't matter. All of the amps go through the whole circuit, no matter how small the wires or how bad the connections are. So Ah is more repeatable, and can be both more accurate and precise than any measurement of Wh.

You can do just that, as long as you either (1) don't care much about the precision of the measurement, (2) know exactly what the watt losses are in your measuring system and include them in your Wh calculation, or (3) have watt losses low enough that they don't matter. Fact is, option #3 is the same as #1, since you will always have some losses.
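
Here's a small sketch of that first problem, with a made-up discharge curve so the true Ah is known by construction:

```python
# Synthetic 0.2C discharge where the true Ah is known by construction.
# Voltages are made-up LiFePO4-ish values sagging toward cutoff.
CURRENT_A = 20.0   # constant-current discharge
DT_H = 0.5         # half-hour steps
voltages = [3.33, 3.31, 3.29, 3.27, 3.24, 3.20, 3.13, 3.00, 2.80, 2.55]

true_ah = CURRENT_A * DT_H * len(voltages)        # coulomb counting
wh = sum(v * CURRENT_A * DT_H for v in voltages)  # energy at the cell
backed_ah = wh / 3.2                              # the dubious shortcut

print(f"coulomb count: {true_ah:.1f} Ah")   # 100.0 Ah
print(f"measured:      {wh:.1f} Wh")        # 311.2 Wh
print(f"Wh / 3.2 V:    {backed_ah:.1f} Ah") # ~97 Ah, not 100
# The voltage was never exactly 3.2 V, so the division misses - and any
# wiring losses would skew the Wh figure further.
```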

If it outputs 1320 Wh, would that not be more than a 1280 Wh battery, and thus more than 100 Ah?
Maybe I'm being dense, but I'm not sure what you are trying to say here. If you are only saying that 1320 is more than 1280, then obviously I agree. I don't think you really know what the Ah rating is on either of those two Wh measurements, so I don't think the number 100Ah fits.
 
Well, there are two problems.

First, to get from Wh to Ah you have to divide by some voltage. What voltage do you use? If you use 3.2V per cell, just know that the answer may be "good enough" for whatever you are doing, but not really accurate since the voltage was changing some throughout the test from 100% to 0% SoC. Like I said, if you are just designing a system, stay with Wh and it will be good enough.

Second, a measurement of Wh is measuring the energy that is going through an entire circuit. So that will include the load you are using for the measurement, but also the cables, terminals, bus bars, etc. These all have some resistance, so they consume watts that really do come out of the battery but are over and above what you measure at the load.

If you are measuring amps the losses in the circuit don't matter. All of the amps go through the whole circuit, no matter how small the wires or how bad the connections are. So Ah is more repeatable, and can be both more accurate and precise than any measurement of Wh.

You can do just that, as long as you either (1) don't care much about the precision of the measurement, (2) know exactly what the watt losses are in your measuring system and include them in your Wh calculation, or (3) have watt losses low enough that they don't matter. Fact is, option #3 is the same as #1, since you will always have some losses.


Maybe I'm being dense, but I'm not sure what you are trying to say here. If you are only saying that 1320 is more than 1280, then obviously I agree. I don't think you really know what the Ah rating is on either of those two Wh measurements, so I don't think the number 100Ah fits.
I see now. Ah is a test measurement that tracks the amperage output of the battery, whereas a watt-hour reading measures the watts delivered through the whole measuring circuit.
If the tools are both directly connected to the terminals of the battery, I can measure the Wh of the system, but that will NOT be an accurate Ah measurement of the battery.
I’ve never seen an Ah measuring tool…
 
I agree with the original post. Since I think in terms of watts and watt-hours I rarely bother to even do the conversion. The only time I care about amps is when I am sizing wire, fuses, or breakers. When buying batteries or cells, the metric I use is price per watt-hour. As long as I am consistent with the voltage used, that metric is useful for me.
 
I’ve never seen an Ah measuring tool…
They certainly exist. I have a couple of Turnigy power meters; they display instantaneous volts, amps, and watts, as well as amp hours, watt hours, and elapsed time.
https://www.amazon.com.au/Turnigy-180A-Meter-Power-Analyzer/dp/B0778J8V8R.

Amp hours are very easy to measure.
All you need is a current shunt, a voltage-controlled oscillator, and a pulse counter.
The oscillator runs faster at higher current, and the longer the time, the higher the total pulse count.

A crude example: if 1 amp = 1 Hz, then after a minute at one amp you get a count of 60.
At 2 amps you get a count of 120 per minute, so the counter accumulates amp-minutes (60 counts each).
Divide the amp-minutes by 60 and you have amp-hours.

Of course the oscillator is proportional: 3.015 amps = 3.015 Hz.
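
A tiny simulation of that counting scheme (the current profile is made up):

```python
# Simulating the shunt + VCO + counter scheme: pulse rate tracks
# current at 1 Hz per amp, so the running count integrates amps
# over seconds. Profile is illustrative, sampled once per second.
profile_a = [3.015] * 600 + [10.0] * 300  # 10 min at 3.015 A, 5 min at 10 A

pulses = sum(profile_a)   # 1 Hz/A x 1 s per sample = amp-seconds
amp_minutes = pulses / 60
amp_hours = amp_minutes / 60
print(f"{pulses:.0f} pulses -> {amp_hours:.3f} Ah")  # 4809 -> 1.336 Ah
# Voltage never enters the count, so wiring losses don't skew it.
```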
 
Oh, I wasn’t doubting their existence, just stating I haven’t seen one.
I’m a fairly new electrician; batteries are mostly automotive with me.
 
I’ve never seen an Ah measuring tool…
Most of the capacity measuring devices measure Ah. The relatively cheap little boards with a big heat sink and fan do, and the more expensive ZKE EBC-A40L does too. They both may also show an estimate of Wh, but I've never even looked at that because I wouldn't trust it for the reasons I've explained.

When buying batteries or cells, the metric I use is price per watt-hour. As long as I am consistent with the voltage used, that metric is useful for me.
That's the key. Wh is fine for you to use, and it is what you should use when designing / sizing your system, as long as you judge them all consistently (using the same voltage, as you described). I really don't like it when cell brokers try to use Wh to back into an Ah rating that makes their cells look better than they are.
 
Context matters. What he said is absolutely correct. Trying to get Ah by dividing the Wh of a cell or battery by some "nominal" voltage is foolhardy, and will give you an arbitrary and wrong answer. However, nearly all of the rest of his post was about when people are trying to determine the capacity of a cell.
I refer you to my initial post on this thread:

and the OP's response:

I think I get the context just fine.
 