diy solar

What's the formula to calculate Coulombic efficiency for an LFP cell / battery pack, both for charge and discharge?
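For reference, the usual textbook definition is the ratio of charge out to charge in over a full cycle, the same formula whether applied to a cell or a pack; a minimal sketch (example numbers are made up):

```python
def coulombic_efficiency(ah_discharged, ah_charged):
    """CE = Ah removed during discharge / Ah put in during charge."""
    return ah_discharged / ah_charged

# Example: 100 Ah charged in, 99 Ah recovered on discharge -> CE = 0.99 (99 %)
ce = coulombic_efficiency(99.0, 100.0)
```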

so can I then say that for an LFP battery the temperature effect on Coulombic efficiency is non-existent or can be neglected?
Yes, for LFP you can ignore temperature's effect on efficiency.

Many also ignore efficiency and Peukert, although that will cause a measurable (but small) inaccuracy with the meter. Most meters will resync to 100% after a full charge. So if it reaches full charge every day, and gets off by <5% during the day, most people don't notice or care.
 
Peukert has no effect on AH counting. AH = counting electrons in and electrons out of battery over time.

Peukert is used to correctly calculate battery capacity. It contains a POWER component.

Big difference!
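The amp-hour counting described above can be sketched in a few lines of pure bookkeeping, with no Peukert term anywhere:

```python
def count_ah(soc_ah, current_a, dt_s):
    """Integrate current over time: electrons in minus electrons out.
    current_a > 0 means charging, < 0 means discharging."""
    return soc_ah + current_a * dt_s / 3600.0

soc = 50.0                        # start with 50 Ah in the pack
soc = count_ah(soc, 10.0, 3600)   # charge at 10 A for 1 h  -> 60 Ah
soc = count_ah(soc, -5.0, 7200)   # discharge at 5 A for 2 h -> back to 50 Ah
```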
 
Peukert has no effect on AH counting. AH = counting electrons in and electrons out of battery over time.

Peukert is used to correctly calculate battery capacity. It contains a POWER component.

Big difference!
just to be clear, I am after SOC calculation and want to know what effect CE and Peukert have on it. the more effects are taken into consideration, the more accurate I can be in the estimation
 
The largest error you could be seeing in a SOC calculation is meter accuracy and granularity.

Second largest error could be in the Peukert calculation. The largest component in the Peukert constant is battery resistance. LiFePO4 batteries have extremely low resistance, hence the low constant. Cell inter-connect resistance has not been accounted for, so the Peukert constant will be larger than what’s listed in the cell data sheet.
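To illustrate the power component: Peukert's law scales runtime as t = H · (C / (I·H))^k, where H is the rated discharge time, C the rated capacity, I the actual current, and k the Peukert constant (close to 1 for LiFePO4). A sketch with made-up numbers:

```python
def peukert_runtime_h(capacity_ah, rated_h, current_a, k):
    """Peukert's law: t = H * (C / (I * H)) ** k"""
    return rated_h * (capacity_ah / (current_a * rated_h)) ** k

# 100 Ah cell rated at the 20 h rate, discharged at 50 A:
ideal = peukert_runtime_h(100.0, 20.0, 50.0, 1.00)   # k = 1 -> exactly 2.0 h
lfp   = peukert_runtime_h(100.0, 20.0, 50.0, 1.05)   # typical LFP k -> under 2 h
```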
 
I have found the formulas for both stages in a research paper and they are quite different from each other.
Care to share details? You just state that they are formulas for both stages but not that they are formulas for CE. I can think of a lot of charging and discharging formulas. I continue to believe that there is no such thing as separate CE formulas for charging and discharging.
 
Sure, your questions were all slightly different - probably could have all been covered in the one thread.

I’m in a position where my cells often go for many months without getting fully charged, so i’ve been using coulomb counting to determine pack capacity for close to a decade.

An individual cell will not always have the same efficiency. The main difference is charge rate, but temperature and level of charge are also factors.

To be useful, your calculation error will have to be less than 0.1%.

I don’t think you will be able to implement an SOC-only algorithm that will remain accurate over a 12-month period on a LiFePO4 cell that is never taken into either voltage knee.

Very happy to be proven wrong - i will buy one of your SOC monitors :)
12 months is a long time. For that you would also need to account for self discharge, which is non-zero even for lithium, and will vary from cell to cell, and vary over time.

Accurate is subjective. I would consider an SOC meter accurate if it will go a week and still be within 3%. Many (most?) will be 3% off after a single day.
 
Accurate is subjective. I would consider an SOC meter accurate if it will go a week and still be within 3%. Many (most?) will be 3% off after a single day.
I agree. I have Coulomb counters in my inverter and BMS and see those variations. I have not tweaked the settings in each to try to get them to correlate more closely. The simple solution is they reset each day if there is enough solar.
The bottom line in the context of the title of this thread is that Coulombic Efficiency is not a practical issue in my day to day use.
 
12 months is a long time. For that you would also need to account for self discharge, which is non-zero even for lithium, and will vary from cell to cell, and vary over time.

Accurate is subjective. I would consider an SOC meter accurate if it will go a week and still be within 3%. Many (most?) will be 3% off after a single day.
usually I have read that for LFP self-discharge over a month is 3-5%, so in that case how can one account for this in our meter readings?
SOC reading - (3%/30 days = 0.1% per day), is that correct?
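Note the arithmetic: 3 % per month spread over 30 days is 0.1 % per day. A sketch of the per-day compensation (the 3 %/month figure itself is a rough assumption and varies cell to cell and over time, as noted above):

```python
MONTHLY_SELF_DISCHARGE = 0.03            # ~3 % per month, rough LFP figure
DAILY = MONTHLY_SELF_DISCHARGE / 30.0    # = 0.001, i.e. 0.1 % per day

soc_pct = 80.0
for _ in range(30):                      # 30 idle days with no charge/discharge
    soc_pct *= (1.0 - DAILY)             # shave 0.1 % of remaining charge daily
# soc_pct is now roughly 77.6 %
```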
 
Care to share details? You just state that they are formulas for both stages but not that they are formulas for CE. I can think of a lot of charging and discharging formulas. I continue to believe that there is no such thing as separate CE formulas for charging and discharging.
watch this at 14:13 ;)
 
hi, this thread is really cool and i like the focus on learning!

tossing this video out there. part way through they talk a bit about the precise monitoring equipment they experimented with. microcalorimetry and stuff.

a low-drift LFP SOC meter is something i too wish to use and/or engineer myself. still studying how to do it.

sensing pack temperature accurately enough can give input to power loss. if you know the specific heat capacity of the entire cell, casing included, and insulate the battery, it’s possible to directly infer the inefficiency thermally.
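The thermal inference above is just Q = m·c·ΔT; a sketch (the specific heat value is a rough literature figure, an assumption here, not a measurement):

```python
def heat_joules(mass_g, specific_heat_j_per_g_k, delta_t_k):
    """Heat absorbed by an insulated cell: Q = m * c * dT."""
    return mass_g * specific_heat_j_per_g_k * delta_t_k

# ~5.4 kg prismatic cell, c ~ 1.0 J/(g*K) (assumption), warming 0.5 degC:
q = heat_joules(5400.0, 1.0, 0.5)   # joules dissipated as heat inside the cell
```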

planning on adding a small monitor microcontroller to at least one pack build. one thermometer per cell.

tallying up seconds that each cell has spent at what temperature and voltage. if one second elapses and cell 3 is at 3.452V and 20.3degC then one second would be added to a histogram type data structure. this would give me a general cell health indicator.
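That histogram-type structure could be as simple as a dict keyed on binned (voltage, temperature) per cell; a sketch with assumed bin widths of 10 mV and 1 degC (the bin sizes are my assumption, not from the post):

```python
from collections import defaultdict

tally = defaultdict(int)   # (cell, voltage_bin, temp_bin) -> seconds spent there

def log_second(cell, volts, temp_c):
    v_bin = round(volts, 2)   # 3.452 V  -> the 3.45 V bin
    t_bin = round(temp_c)     # 20.3 degC -> the 20 degC bin
    tally[(cell, v_bin, t_bin)] += 1

log_second(3, 3.452, 20.3)    # the exact example from the post: one second
```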

one part of the general idea of tallying the time spent at what conditions per cell is that efficiency will decline with SEI growth and such, which is strongly affected by temperature and voltage of cell from what i gather. so five years down the line if the cell has spent 80% of its time at 3.64V and 45degC then that will be a very strong signal. hope this helps in some way.

the shunt would tell how many amps are going through all cells. important to tally up lifetime kWh discharged and kWh charged per cell.

all this kind of dances around exactly getting coulombic efficiency but there’s a lot to monitor so my post reflects that

i really want an SOC meter that takes all these factors into account and updates the SOC very precisely.

like i mean if the pack drops from 20degC to 0degC then i want the SOC meter to lower the remaining % figure proportional to the discharge capacity loss at that temperature. instantly infer the percentage loss and correct in real time sort of thing.
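That instant correction could be a lookup of usable-capacity derating versus temperature; a sketch with illustrative numbers (not from any datasheet; a real meter would interpolate its own cell's measured curve):

```python
# Fraction of 25 degC capacity usable at each temperature (made-up figures).
DERATE = {25: 1.00, 20: 0.98, 10: 0.92, 0: 0.80}

def corrected_remaining_pct(soc_pct_at_25c, temp_c):
    # nearest-point lookup for brevity; interpolation would be smoother
    nearest = min(DERATE, key=lambda t: abs(t - temp_c))
    return soc_pct_at_25c * DERATE[nearest]

# pack drops from 20 degC to 0 degC, as in the post:
before = corrected_remaining_pct(50.0, 20.0)
after = corrected_remaining_pct(50.0, 0.0)
```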

good luck with your studies!
 
He just stated the same formula for discharge and charge. I did not conclude that he calculated a different CE for charge or discharge. He did use the same CE loss of about 1% for charging and discharging.
he said for charging we do calculate using the formula, which is nothing different, but for discharging we don't bother using the formula and instead assume it to be 100%. as for the formula part, if you are really curious, here is the research paper that I am following. you can have a look at it :)
 

Attachments

  • 5.pdf
he said for charging we do calculate using the formula, which is nothing different, but for discharging we don't bother using the formula and instead assume it to be 100%
Yes. But he used an assumption of CE for charging and no assumption of any CE for discharging. That clearly confirms my opinion that there is no way to calculate CE separately for charging or discharging. He could just as easily have assumed half of the CE for each. As discussed throughout this thread, for lithium it is insignificant.
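The convention described (all of the round-trip loss booked on the charge leg, discharge treated as 100 %) makes the coulomb counter look like this; the 99 % figure follows the ~1 % loss discussed above and is an assumption here:

```python
CE_CHARGE = 0.99   # ~1 % loss applied to charging; discharge counted as-is

def count_coulombs(soc_ah, current_a, dt_s):
    dq = current_a * dt_s / 3600.0
    if dq > 0:
        return soc_ah + dq * CE_CHARGE   # charging: only 99 % of the Ah "stick"
    return soc_ah + dq                   # discharging: assume 100 % efficient

charged = count_coulombs(50.0, 10.0, 3600)    # +10 Ah in  -> +9.9 Ah counted
drained = count_coulombs(50.0, -10.0, 3600)   # -10 Ah out -> -10 Ah counted
```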
 
Good extra information - but by the time you build an enclosure that can control the climate accurately enough to be able to precisely monitor LiFePO4 cell temperature, i suspect you will have spent more money than your entire future power cost.

And you still won’t have an SOC meter that doesn’t need periodic reset!
 
i hope to distinguish in this case between a dedicated metrology setup and an operating install

on the sensor side, please be sure to check out the MCP9808 I2C thermometer as it costs ~$5 and has typical accuracy of ±0.25°C over the -40°C to +125°C range and a resolution of 0.0625°C. it’s factory calibrated and easy to mount to the cell.
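Decoding the MCP9808's ambient-temperature register is straightforward: per the datasheet register layout, the lower 13 bits are a two's-complement value in 1/16 °C steps. A sketch of the byte conversion (the raw bytes below are example values, not from hardware):

```python
def mcp9808_to_celsius(upper_byte, lower_byte):
    """Convert the two raw bytes of the MCP9808 T_A register (0x05) to degC.
    The top three bits of the upper byte are alert flags and are masked off."""
    raw = ((upper_byte & 0x1F) << 8) | lower_byte
    if raw & 0x1000:          # sign bit of the 13-bit two's-complement value
        raw -= 1 << 13
    return raw * 0.0625       # 1 LSB = 0.0625 degC

t = mcp9808_to_celsius(0x01, 0x90)   # 0x190 = 400 counts -> 25.0 degC
```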

So a 16S pack would incur an increase in the BOM of 16 × $5 = $80, which is not nothing.

Not trying to argue against you, just saying the costs are not necessarily huge. It’s not trivial and not cheap but it’s what I’m calling for in my personal project, and happy if others do something different :) then I can learn from their build
 
Sorry that I didn’t mention, I expect to figure out in a year at soonest whether my approach has merit. How exciting to collect data on the precious cells that capture sky nuke waves!

To me it’s all about the pursuit of having a time-to-go meter that is exceptionally punctual, consistent and accurate. Maybe I’ll fail and share my inept stumbling through the dark :D

To make it even more jumbled, I want it to factor my daily usage patterns into another time-to-go value. That would be very valuable. Dunno where to buy one, so I’ll hack some junk together…
 
Sure, your questions were all slightly different - probably could have all been covered in the one thread.

I’m in a position where my cells often go for many months without getting fully charged, so i’ve been using coulomb counting to determine pack capacity for close to a decade.

An individual cell will not always have the same efficiency. The main difference is charge rate, but temperature and level of charge are also factors.

To be useful, your calculation error will have to be less than 0.1%.

I don’t think you will be able to implement an SOC-only algorithm that will remain accurate over a 12-month period on a LiFePO4 cell that is never taken into either voltage knee.

Very happy to be proven wrong - i will buy one of your SOC monitors :)
Obviously, the solution to this problem is AI. :)
 