What voltage to use when estimating kWh for LiFePO4 capacity?

krby

Or... "How many watt hours are in a single LiFePO4 cell?"

Before I bought my first big LFP battery, I had a spreadsheet of various options, and in order to know whether a given battery was "enough" for my use case AND to compare price per kilowatt-hour, I estimated the capacity of each battery in kWh. To do this I used:
amp hours x estimated_voltage / 1000 = kWh

I derated this value by some DoD percentage (80%, 85%, 90%) to give me usable kWh, depending on how hard I wanted to be on the batteries.
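Here's that same math as a quick Python sketch, since that's easier than describing my spreadsheet (the function and argument names are just my own, not from any vendor spec):

def usable_kwh(amp_hours, series_cells, volts_per_cell=3.2, dod=0.80):
    # nominal kWh = Ah x pack voltage / 1000, then derate by depth of discharge
    nominal_kwh = amp_hours * series_cells * volts_per_cell / 1000
    return nominal_kwh * dod

print(usable_kwh(200, 8))        # 4.096 kWh usable at 3.2 V/cell, 80% DoD
print(usable_kwh(200, 8, 3.3))   # 4.224 kWh usable at 3.3 V/cell, 80% DoD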

My question is about that estimated_voltage value above. Originally I used 3.2V per cell as the "estimated voltage", because many people that sell these batteries list that as the nominal voltage (12.8V/25.6V/51.2V for 4S/8S/16S, respectively). Is that the right value? This page shows a chart where LFP spends most of its time at 3.3V. I know it's going to be complicated by the discharge current (for me this is going to be between 0.25C and 0.5C in the worst case), but I'm just looking for an everyday number to plug in.

What got me thinking about this is that over the weekend I did a couple of discharge tests (ended up at about 0.3C) on a new 24V 200Ah LFP, and I got about 4.45kWh of usable AC from a Victron MultiPlus. Given the inverter overhead I saw during the discharge, that's about 4.9-5.1kWh of DC from the battery, which is more than I expected. I also had the charge voltage lower and the discharge cut-off higher than the manufacturer recommended.
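For the curious, the DC estimate is just the AC number divided by an assumed inverter efficiency (the 88-91% range below is my rough guess, not a measured Victron figure):

ac_kwh = 4.45
for eff in (0.91, 0.88):   # assumed inverter efficiency range
    print(f"DC at {eff:.0%} efficiency: {ac_kwh / eff:.2f} kWh")
# prints ~4.89 and ~5.06 kWh, which is where my 4.9-5.1 estimate comes from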

So, is 3.2V too conservative? Should I estimate with 3.25 or 3.3V per cell?
 
Go by nominal voltage: 12.8V for a 12V battery, 25.6V for a 24V battery.

Usable battery watt-hours is what I think you're asking?
Ah x DoD x V = Wh
100 x .80 x 12.8 = 1,024Wh

Your battery... usable @ 80% DoD
200 x .80 x 25.6 = 4,096Wh
 
Ya, I know how to calculate Wh given Ah and V and DoD. I was really asking specifically about using 3.2V in that calculation.

So you're suggesting 3.2V per cell, and that'll definitely be a safe estimate. But the battery spends quite a bit of time delivering power above 3.2V, so if I want something more accurate, what should I use?
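For scale, here's how much that choice moves the estimate on my 8S 200Ah pack (a quick Python sketch; the numbers are just the arithmetic from my first post):

for v_cell in (3.2, 3.25, 3.3):
    kwh = 200 * 8 * v_cell / 1000   # 8S 200Ah pack
    print(f"{v_cell} V/cell -> {kwh:.2f} kWh")
# 3.2 -> 5.12, 3.25 -> 5.20, 3.3 -> 5.28: about a 3% spread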
 
If you want the exact measurement, I think you would need to figure out how much energy is produced at each 1/10-volt increment and figure Wh from that.

[Image: table of Wh produced per 0.1V increment from a single-cell discharge test]
So I just did this with data from one cell I tested. Notice that the actual Wh produced is less than the listed nominal voltage × Ah.
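In Python terms the method is just a weighted sum; the bin numbers below are made up for illustration, not my actual test data:

# each entry: (bin voltage, Ah delivered while the cell was in that 0.1V bin)
bins = [(3.3, 20), (3.2, 120), (3.1, 40), (3.0, 15), (2.9, 5)]
wh = sum(v * ah for v, ah in bins)
ah_total = sum(ah for _, ah in bins)
print(f"{wh:.1f} Wh from {ah_total} Ah")   # 633.5 Wh vs 640 Wh at 3.2V nominal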
 
Nice test! What battery chemistry is it for? Is 2.3V nominal for an LTO?

The overall point isn't lost on me, I really should count on 3.2V when estimating total capacity and derate for DoD from that.
 
I just did an ~80% capacity test by voltage on a ‘dumb’ Daly BMS that I don’t fully trust yet. Adjusting for the rated 90% efficiency of the inverter, I got approximately 14.4V × 280Ah.

These are more likely grade A cells from a recent Amy/Luyuang delivery. I did not top balance, instead relying on the BMS. I did trip the BMS on the initial charge using the default Victron settings.

I’ll now recharge for a more (?) accurate measure of total watt hours.
 
I would stick with 3.2V per cell to compute watt-hours.

Most commercial batteries are rated that way, like my GYLL and others I have seen.
 