BMSes that can handle extended operations in a partial state of charge?

jdege
I've been thinking about using LiFePO4 on my boat.

That has me reading Nigel Calder's Boatowner's Mechanical and Electrical Manual, 4th edition.

And in it, I read:
Battery management and cell balancing. As noted, if a lithium-ion battery cell is overcharged, it enters a self-heating (exothermic) state in which its temperature rises rapidly and potentially dangerously. At the other end of the spectrum, if a cell is repeatedly overdischarged, changes take place that can once again cause it to initiate a potentially dangerous self-heating process during normal recharges. In severe cases of overdischarge, cell reversal can occur, initiating self-heating. To prevent overcharge and over-discharge, most lithium-ion batteries have cell monitoring and cell balancing at the individual cell level that is then integrated into an overall battery management system (BMS).
Cell balancing (Figure 2-13) can be active or passive. Crudely speaking, in the former case, charging current is siphoned off of higher-charged cells to lower-charged cells until all are in balance, and in the latter case, as cells come to charge, charging current from the higher-charged cells is dissipated through a resistor until the lower-charged cells catch up. Both approaches can work. However, there is a problem in PSoC operation.
Cell balancing is normally triggered by voltage differences between the cells. One beneficial characteristic of lithium-ion batteries is that cell voltages on discharge and recharge remain almost constant until a full recharge or discharge is reached. This is especially the case with the LiFePO4 chemistry. Other than at the very top and bottom ends of the state-of-charge spectrum, cells can be in a significantly different state of charge but still have only barely measurable differences in voltage. If a battery is operated for extended periods of time in a PSoC condition, the cells can become progressively more unbalanced (cell drift) without this being detected by many cell-balancing systems.
If the battery is now fully recharged, the more fully charged cells may approach their upper voltage threshold at a time when charging currents are still relatively high. At this point, the voltage on the fully charged cell(s) can climb rapidly to dangerous levels. The cell-balancing circuit will kick in to prevent this, but the necessary balancing currents (either active or passive) may be more than this circuit can handle, in which case the cell may still overheat, the balancing circuit may burn up, or the BMS may disconnect the battery from the boat’s circuits in order to protect the battery. In the latter case, if charging devices are still operating and are open-circuited when the battery disconnects itself, high voltage spikes can occur on the boat’s wiring, which can blow out diodes in charging equipment and damage other equipment on the boat.
At the discharge end of the spectrum, if cells get sufficiently unbalanced and the BMS loses track of this, on a deep discharge it is possible for one cell to get overdischarged, becoming unsafe.
In order to avoid these problems and keep cells in close balance, many lithium-ion BMSs periodically require an extended “conditioning” charge similar to a lead-acid conditioning charge—i.e., it requires up to several hours of charging, and if cells are substantially unbalanced as many as 24–48 hours at low current levels. If the energy source is an engine and there are no other loads on the engine, the engine will be operating extremely inefficiently.
Regardless of the cell-balancing approach, the key to optimized operation in the kind of buffering environment envisaged in this chapter is a BMS that can achieve cell balancing in a PSoC condition and that as such does not need some kind of an extended low-power conditioning cycle. At the time of writing, few lithium-ion BMSs can do this.

This seems like a significant concern, but I've seen no discussion of it in my readings online.

Is this something I need to manage? If I DIY a battery from raw cells and want a BMS that manages extended periods in a PSoC, what do I look for?

Or is this something that all BMSes manage, these days?
 
The article states:
"As noted, if a lithium-ion battery cell is overcharged, it enters a self-heating (exothermic) state in which its temperature rises rapidly and potentially dangerously."

Unfortunately, this falls into the "what is a lithium-ion battery" problem. LiFePO4 is a lithium ion battery, but not all lithium ion batteries are LiFePO4. To make this even more confusing, people often use the term lithium-ion to reference one of the various chemistries that are typically used in automobiles.

In the above statement, I am pretty sure the author is referring to the typical chemistries used in automobiles, and they have very different thermal characteristics than LiFePO4.

LiFePO4 does not suffer from thermal runaway and combustion. However, they can pop the pressure seal if they are overcharged, and if the escaping gases are exposed to a flame source they can burn. Furthermore, if a LiFePO4 cell is punctured and the escaping electrolyte or gases are exposed to a flame source, they can burn.

Bottom line: LiFePO4 is far safer than the EV chemistries, but care should still be taken to minimize the possibility of the cells being punctured or the pressure seals popping. Generally speaking, if LiFePO4 cells are reasonably protected physically and have a BMS that will cut charge/discharge before they get extreme, they are fairly safe. If you also have high voltage cut-off in your chargers and low-voltage cut-off in your loads, the system is even safer.

Note: Some BMSs have temp probes that will shut off the battery if the temperature gets too high. Logically this is an added level of protection, but I have never seen data on how effective it is.
 
According to Calder, all lithium-ion batteries, including LiFePO4, use organic electrolytes, and they're all flammable.

What's unique about LiFePO4 is that the temperatures reached in a thermal runaway aren't high enough to cause spontaneous combustion. They can still run away and rupture.

As for this problem he's identified, with cells becoming sufficiently unbalanced after long periods of partial charging, he specifically identifies LiFePO4 as being particularly prone to it, because it has the flattest voltage curve across states of charge. Though I expect it's also because they're more likely to be used in boats that can go long between full charges.

SV Wisdom, on the YouTube channel The Rigging Doctor, is just finishing an Atlantic crossing. They've been having problems with their regen - 42 days at sea, and I'd not be surprised if they hadn't reached full charge in all that time.

Ideally a BMS would recognize when cells were out of balance, and keep the currents at a safe level during recharge - but do they?
 
Hi,

I see that book is from 2015, so maybe the BMSs available now weren't around then.

One of these?:


Active balancing, many balancing settings to adjust to suit your needs. There are probably many more suitable ones available.

I can't see how any BMS could know the actual condition of a particular cell, apart from voltage. One that has a meter to detect current flow in and out of each cell would be the only option.

dRdoS7
 
I can't see how any BMS could know the actual condition of a particular cell, apart from voltage. One that has a meter to detect current flow in and out of each cell would be the only option.
The cells are in series; the current flowing through them must be equal. If you measure the current flowing through the string, you know the current flowing through each cell (assuming they are single cells in series rather than pairs/sets of cells bonded in parallel and then in series)
 
Under the title "BMSes that can handle extended operations in a partial state of charge":

This usually means active balancers. Simple resistive dump balancers do not usually balance until a cell gets to 3.4 V or higher. This means little or no balancing unless the pack is brought near full charge periodically. The resistor dump will balance eventually even if the pack is not fully charged periodically, but usually the cells are very imbalanced before that happens, and that will cause a high-cell-voltage BMS shutdown if a full charge is attempted.

Active balancers can have some problems if cells are not well matched. Poorly matched cells can have differing amounts of terminal voltage drop under load current. This can cause an active balancer to balance in the wrong direction, as a result of load-current-induced terminal voltage differences between cells. Usually a weaker cell (not state-of-charge weaker) will have greater terminal voltage drop during discharge and greater terminal voltage rise during charging. This means an active balancer will do the right thing during discharge and the wrong thing during charging.
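To put rough numbers on that effect, here is a toy calculation (all values are assumptions I picked for illustration, not measurements from any particular cell):

# Toy numbers (assumptions, not measurements) showing how a weaker cell can
# *look* like the higher cell while charging, even though its resting voltage
# and state of charge are lower.

ocv_good, r_good = 3.330, 0.0005   # open-circuit volts, internal ohms (assumed)
ocv_weak, r_weak = 3.320, 0.0020   # weaker cell: lower OCV, higher resistance

charge_current = 50.0  # amps into the series string (same in every cell)

v_good = ocv_good + charge_current * r_good   # 3.355 V at the terminals
v_weak = ocv_weak + charge_current * r_weak   # 3.420 V at the terminals

# A balancer that only compares terminal voltages now sees the weak cell as
# the "high" cell and moves charge away from it - the wrong direction.
print(v_good, v_weak)

During discharge the signs flip, so the same comparison happens to move charge the right way.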
 
Last edited:
So, what we would want in a LiFePO4 bank that would spend significant time in PSoC is a BMS with active balancers and well-matched cells?

How likely would I get that if I were to buy quality commercial batteries?

If I were to DIY with the cheapest components I could find on AliExpress?
 
Hi,

The cells are in series; the current flowing through them must be equal. If you measure the current flowing through the string, you know the current flowing through each cell (assuming they are single cells in series rather than pairs/sets of cells bonded in parallel and then in series)

The OP contains a reference which states that cells kept at a continual partial SOC may become unbalanced (3rd quote), and the next quote (#4) explains what may occur when charging. Apart from possible internal losses, the only way to know what goes in/out of an individual cell is to measure it; you can't rely on it being equal. I don't have a boat, and mostly my cells are fully recharged every day (3 or 4 days in winter would have been the worst in a PSoC condition), so I don't know if it really is a problem. There's a 10 A version of the Heltec, purely a balancer, so combined with the one posted, he'd have 12 A total balance current.

dRdoS7
 
Correct, but in this case I believe the cells are becoming more and more unbalanced over time not because of a difference in the charging or discharging current flowing through the string, but because of internal differences that cause inefficiency or losses inside the cell itself.

In other words, you could measure the current going through each busbar from cell to cell in a battery as it either drove a load or was charged, and that current would be equal at each point (sort of a Kirchhoff's law kind of thing, if I'm not mistaken). The problem is that it's not being utilized at exactly the same level of internal efficiency to recharge each cell. This isn't trivial to measure from the BMS's perspective.

With a cell chemistry that has as flat a discharge curve as LiFePO4, state of charge is normally measured or calculated by measuring the cell voltage and by performing coulomb counting. When the cells are in a high or low state of charge, the voltage becomes a much more meaningful measurement of their charge, allowing the BMS to set a benchmark from which to begin coulomb counting of the current going through the cell. Over time, measurement inaccuracies compound and accumulate during prolonged use at a partial state of charge.

I could be way off base here (and please correct me if so! I'm still very much learning), but I think those Heltec balancers mostly trigger based on measuring an unacceptably large voltage difference between cells rather than trying to measure current differences between each cell over time.
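For what it's worth, here's a minimal sketch of that voltage-anchored coulomb-counting idea. The thresholds, efficiency, and names are made up for illustration; a real BMS will differ in the details:

# Illustrative sketch: coulomb counting with voltage re-anchoring near the
# "knees" of the LFP curve. Thresholds and efficiency are assumed values.

ANCHOR_FULL_V = 3.45   # per-cell voltage treated as ~100% SOC (assumption)
ANCHOR_EMPTY_V = 3.00  # per-cell voltage treated as ~0% SOC (assumption)

class SocEstimator:
    def __init__(self, capacity_ah, soc=0.5, coulombic_eff=0.99):
        self.capacity_ah = capacity_ah
        self.soc = soc                      # 0.0 .. 1.0
        self.coulombic_eff = coulombic_eff  # charge accepted vs. charge delivered

    def update(self, current_a, dt_s, cell_voltage):
        """current_a > 0 means charging; dt_s is the sample interval in seconds."""
        ah = current_a * dt_s / 3600.0
        if current_a > 0:
            ah *= self.coulombic_eff
        self.soc += ah / self.capacity_ah

        # In the flat middle of the curve, voltage tells us very little, so
        # drift accumulates. Only at the knees can the estimate be re-anchored.
        if cell_voltage >= ANCHOR_FULL_V:
            self.soc = 1.0
        elif cell_voltage <= ANCHOR_EMPTY_V:
            self.soc = 0.0

        self.soc = min(max(self.soc, 0.0), 1.0)
        return self.soc

If the pack never reaches either knee (extended PSoC operation), the estimate never gets re-anchored and the measurement errors just keep compounding, which is exactly the drift Calder describes.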
 
Balancers just look at voltage. Active balancers must be more accurate, but in the case of the leapfrog charge transfer used by many, they just need a simple comparator to tell which of an adjacent pair of cells has the greater voltage. The comparison does need to be synchronous across all cells, and balancing currents must temporarily cease while the voltage comparison is done, so as not to influence the comparison. Common inverter load or charging current is okay as long as all cell voltage comparisons are done synchronously at the same point in time, so the common current is the same in all cells (and the cells are relatively well matched).

You are sort of correct on aging, although it is their terminal voltage drop for a given load current, or rise for a given charge current, that varies between cells. For series-connected cells, the current is the same, but older, more aged cells have greater terminal voltage drop for the same load current. This means they have greater internal losses, which accumulate over time and push the cells further out of balance.

Cell life is often quoted as degradation to 80% of rated capacity. It is not just the AH loss that hurts. It is also the amount of terminal voltage drop for a given load current. Their ability to create and move lithium ions degrades with age, effectively giving them higher impedance. This reduces their peak current-delivering capability. An 80% AH capacity translates to an even lower watt-hour capability, depending on current demand. This is more tolerable if the battery is used at lower discharge current. During high-current discharge, the greater terminal voltage drop coupled with high cell current creates more internal heating, which degrades a weaker cell even faster. This is what you really need to think about when buying used cells. If your peak current demands are low, you are more likely to be happy with them.
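A rough sketch of that adjacent-pair comparison, as I understand it (the function name and deadband here are my own illustration, not any specific balancer's firmware):

# Sketch of the adjacent-pair comparison: pause balancing, sample every cell
# at (nearly) the same instant, then decide a transfer direction per pair.

def plan_transfers(cell_voltages, deadband_v=0.005):
    """Return (from_index, to_index) charge transfers for adjacent cells whose
    synchronous voltage readings differ by more than the deadband."""
    transfers = []
    for i in range(len(cell_voltages) - 1):
        delta = cell_voltages[i] - cell_voltages[i + 1]
        if abs(delta) < deadband_v:
            continue  # within the deadband: leave this pair alone
        if delta > 0:
            transfers.append((i, i + 1))   # cell i reads higher: push charge down the string
        else:
            transfers.append((i + 1, i))
    return transfers

# Example with one synchronous snapshot (balancing currents assumed off):
print(plan_transfers([3.341, 3.338, 3.352, 3.339]))

Note that the comparison is only as good as the voltage readings themselves, which is why load-induced terminal voltage differences can steer it the wrong way during charging.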
 
RCinFLA, thanks, that makes a lot of sense.
I've read that with lead-acid batteries, limiting the depth of discharge on each cycle will extend the lifetime of the battery enormously, but that lithium chemistries are able to handle deep or even full cycles fairly well. Do you happen to know whether it's reasonable to expect a much longer lifespan from LiFePO4 cells when they are discharged and charged at lower C rates?
In other words, if they are cycled to the same depth of discharge, but at lower C rates, do the batteries live to see more cycles before their performance degrades to 80%?
I wish I could find some charts or other data about these patterns; I'm sure they must be out there somewhere.

With my DIY LiFePO4 solar storage battery, I'm deliberately oversizing it somewhat, hoping that by keeping my charge and discharge rates relatively low I might see a longer battery lifetime.
 
I can't see how any BMS could know the actual condition of a particular cell, apart from voltage. One that has a meter to detect current flow in and out of each cell would be the only option.

dRdoS7

In addition to a normal MOSFET BMS to disconnect when any single cell is over or under its safe voltage window,

I plan on logging the amount of time each cell spends at a given temperature.

One 16-bit thermometer per cell. FRAM non-volatile memory for logging. Each cell will have, say, 100 "bins", each representing a temperature range, say 25C-30C. All day and night the logging will add time chunks to each bin based on what temperature the cell is at that moment.

If Cell 1 were at 28C all day then the 25-30C bin would have 24 hours added to it in one day.

Representing the time as seconds should be good enough…

I want my LiFePO4 batteries to last 20 years assuming they aren’t replaced.

There are 630 million seconds in 20 years, so each bin's time counter only needs to count to 630,000,000 seconds max. That's 30 bits. Round up to 32 bits per bin. Each bin maxes out at 136 years. Good enough.

Splitting the temperature range into single-degree-Celsius bins, let's just say -50C to 100C, to hopefully cover every climate humans live in. That's 150 different temperature bins.

32 bits per bin * 150 temperature bins = 4800 bits per cell or 600 bytes per cell to store literally their entire lifetime of temperature experience with single celsius and single second resolution.

A 48V pack is 16S or 16 cells to monitor the lifetime of. Wouldn’t it be cool to quantitatively make statements about the differential calendar aging of each cell? What about the data storage?

600 bytes per cell * 4 cells = 2,400 bytes
600 bytes per cell * 8 cells = 4,800 bytes
600 bytes per cell * 16 cells = 9,600 bytes

so let’s call it 10 kiloBytes per pack.

This 32 kiloByte FRAM memory chip is 10 USD: https://www.adafruit.com/product/1895

It uses 0.001W while operating.
Each byte can be rewritten 10^12 times before breaking.
100 years data retention at 25C. 10 years at 85C.

This is how you can speak about individual cell health.

Not many people are doing this, from what I've seen.

The theory is sound. Hope this helps.

A cell that's logged 5 years at 40C is probably more worn out than a cell that's logged 5 years at 25C.

TL;DR log the temperature of individual cells over lifetime of pack. it can take up as little as 10 kilobytes per 16S pack to store the entire temperature lifetime exposure.

Can do the same thing with voltage as well as temperature for each cell and the model will be even more useful. Store histogram of each cell voltage exposure.

Can do it for amps too. It will take up even less data because only need one entry for entire pack.

A pack that’s logged many hours at high temps high amps high volts can be quantitatively distinguished with cell life history.
 
From my own experience, not all BMSs can handle LFP, largely due to their inability to properly calculate SOC. The worst (dumbest) candidate is the ANT BMS: it requires you to enter voltage thresholds for each SOC percentage, an approach that can work for other chemistries where the voltage/SOC relationship is more or less linear, but not for LFP. Thus any BMS which can do coulomb counting of charge/discharge energy can manage extended periods in a PSoC. A Jiabaida BMS plus a Heltec 5.5 A active balancer did a very good job handling a crappy DIY LFP battery, which I use as a test bench.
And regarding flammability: I have found a very good paper which says that good ventilation is a key component of safety :) Anyone who can find similar research, but with findings on gas volumes per cathode chemistry, is welcome to share it.
 

Attachments

  • lithium battery gas composition.pdf
    11.2 MB · Views: 7
High temperature accelerates LFP electrolyte degradation, so it should be controlled.

'Blue' cells are designed with thick anode and cathode electrodes (120-160 um thick) to give them the highest AH for size and weight. Cell AH capacity is based on total amount of LiFePO4 cathode material and graphite anode material, in appropriate ratio, which accounts for most of the cell weight.

The downside of thick electrodes is poorer peak current performance. For a given electrode thickness there is a peak-cell-current knee where the onset of electrode ion starvation begins to occur. When this happens, the overpotential voltage required to move the necessary quantity of ions to support the demanded current must get greater. This shows up as increased terminal voltage drop at high discharge current. This higher terminal voltage drop increases cell losses and creates greater internal cell heating.

For 'blue' cells' electrode thickness, this starvation knee is at about 0.5 CA.

For a 280 AH blue cell:
At 0.2 CA internal heating power loss is about 2 watts.
At 0.5 CA it is about 10 watts.
At 1.0 CA it is over 30 watts of internal heating.

The internal (electrolyte) temperature will be greater than temp felt at the outside surface of the cell case. Bundling cells tightly together reduces their ability to dissipate heat.

As cells age, their internal impedance gets greater, increasing their internal losses and reducing their peak current capability.

By comparison, a cell designed for a high CA discharge rate has electrode thicknesses in the 20-50 um range.
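If it helps to put that in code, here is a rough interpolation of the figures quoted above (approximate numbers for a 280 AH blue cell taken from this post, not datasheet values):

# Rough interpolation of the quoted internal heating figures for a 280 AH
# "blue" cell. The points are approximations from the post, not a datasheet.

LOSS_POINTS = [(0.2, 2.0), (0.5, 10.0), (1.0, 30.0)]  # (C-rate, watts)

def heating_watts(c_rate):
    """Linearly interpolate internal heating power for a given C-rate."""
    pts = LOSS_POINTS
    if c_rate <= pts[0][0]:
        return pts[0][1]
    for (c1, p1), (c2, p2) in zip(pts, pts[1:]):
        if c_rate <= c2:
            return p1 + (p2 - p1) * (c_rate - c1) / (c2 - c1)
    return pts[-1][1]  # beyond 1 C there is no quoted data, so just clamp

# A 280 AH cell at 100 A is about 0.36 C, roughly 6 W of internal heating:
print(round(heating_watts(100 / 280), 1))

Remember that this is heat generated inside the cell; tightly bundled cells will run warmer inside than the case temperature suggests.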
 
Do you happen to know whether it's reasonable to expect a much longer lifespan of lifepo4 cells when they are discharged and charged at lower C rates?
In other words, if they are cycled to the same depth of discharge, but at lower C rates, do the batteries live to see more cycles before their performance degrades to 80%?
The two dominant degradation mechanisms in LFP cells are electrolyte decomposition and growth of the SEI protective layer.
Maintaining full state of charge, running high current rates, and high cell temperature accelerate these degradations.

Overdischarge causes a different problem: lithium metal dendrite growth, which can short out a cell. A cell has to be held below 1.5 V for some time before dendrite growth happens. The reason for the manufacturer's spec of a 2.5 V minimum is to allow some time margin. At 2.5 V it will not take too many days for the self-leakage rate to drop the cell below 1 V.
 
Is this something I need to manage? If I DIY a battery from raw cells, and want a BMS that manages extended periods in a PSoC, what do I look for?
I would not worry too much about what was written relating to issues with lithium-ion batteries. I would expect your DIY battery to be built using LiFePO4 cells, which are safer and more forgiving than other lithium-ion chemistries. Good quality LiFePO4 cells will stay in balance.

The BMS you use with the cells should be regarded as a battery protection system. It will be programmed with the voltage and current limits that the battery is intended to operate within. Working within the operational envelope should be determined by the settings on the chargers and the loads used. Should there be factors that cause the battery/cells to exceed the limits, then the BMS will shut down the charge path, the discharge path, or both.

If you wish to work within SOCs that are not extremes, say 20% to 90% state of charge, then you will need additional equipment to monitor and control this; the BMS is not normally involved. It's easy to determine the lower limit, as the battery voltage shows a significant change at low states of charge. The high state of charge is more difficult to determine by voltage and would need something like a BMV-712 battery monitor that measures state of charge.

In practice the easiest technique is to use modest charge settings on any chargers, 13.8 volts for a 12 V system with a float voltage of 13.4 volts, to look after the high end and not overstress the battery.
The low-voltage disable settings on inverters, together with low-voltage protection devices like the Victron Battery Protect, can be used both to control loads and to set alarms when the battery capacity is nearing a low state.
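As a rough illustration only, setpoints along those lines for a 12 V (4S) bank might look like the sketch below; the two cutoff values are assumptions on my part, so adjust to suit your own chargers, inverter, and loads:

# Illustrative setpoints for a 12 V (4S) LFP bank; a starting point, not a recommendation.
CHARGE_SETTINGS = {
    "absorption_v": 13.8,        # modest absorption, well below BMS cell limits
    "float_v": 13.4,             # float low enough that the bank is not held full
    "inverter_cutoff_v": 12.0,   # assumed low-voltage disable on the inverter
    "battery_protect_v": 11.8,   # assumed Victron Battery Protect threshold
}

# Per-cell equivalents, for sanity-checking against BMS protection limits:
for name, volts in CHARGE_SETTINGS.items():
    print(f"{name}: {volts:.1f} V pack = {volts / 4:.2f} V per cell")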

Mike
 
Hi,

I could be way off base here (and please correct me if so! I'm still very much learning) but I think those heltec balancers mostly trigger based on measuring an unacceptably large voltage difference between cells rather than trying to measure current differences between each cell over time.

The Heltec (JK BMS) I have will accept a 0.002 V (IIRC) cell differential. I use 0.006 V. There is also a start/stop voltage, but I can't remember what it's set at. Probably ~3.35 V.

The OP's quote #4 was concerned that the BMS shutting off charging may cause failures. I haven't seen this happen. Yet. Touch wood. I had my system running with LFP from Oct. 2020 to June 2021, when we left home to travel.

dRdoS7
 