Testing: Lead and lithium (LFP) battery banks in parallel

A few questions: if I understand your conclusions correctly from my thoroughly non-specialized POV, you are saying that this is a good way to both improve the performance and extend the life of existing lead installations?

Well, I'm not making any such claims. In principle this appears true, but the devil is always in the details, and these devils tend to appear more frequently as system complexity increases - and this system is trending towards increased complexity, especially with a mix of fundamentally different battery technologies.

However, it seems like more and more people are becoming interested in this arrangement of LFP paralleled with LEAD, so I thought I'd contribute some test data 'for informational purposes'.

There are a number of pros and cons, or 'concerns', that could be further discussed, especially with respect to safety considerations and failure-mode analysis.

If this is economically interesting given the installed base, are you saying it remains difficult to do with existing battery/management technology?

It's potentially interesting for sure, depending on one's circumstances, and this test just demonstrated that it wasn't difficult to do at all - i.e. it worked in the test environment with commodity 12V LFP batteries :) However, as seen above, there are ongoing concerns about how to best manage the LFP end-of-charge termination. Also I'd be concerned about what happens if one of a number of the lead cells became shorted - especially if it's an old battery - as this could result in the lithium battery discharging into the lead bank and so on. Long story short, batteries store significant energy and safety always needs to be a priority - which usually means following codes, using equipment for its intended purpose and going easy on the 'experimentation' for permanent installations.

So what are you proposing?

Nothing really, it's simply an exploration to gather knowledge / experience and to better understand how these 12V LFP modules or blocks function in practice.

One interesting thing about the parallel lead bank is that it's completely insensitive to over-voltage events, and it can serve as a bit of a capacitor to snub high-voltage switching transients when the LFP 'protections' are activated or otherwise. For example, if I were on a sailboat in the middle of the ocean I would not want to be fully dependent on lithium batteries, and I'd probably experiment with a parallel system with a LEAD, LFP or BOTH switch - as one of the posters has suggested above.

It also gives one pause for thought about what an ideal 12V LFP building block would look like - i.e. what attributes or features it should have. Not for paralleling with lead in particular but rather just as a flexible, safe and robust 12V building block that is appropriate for a broad range of applications.

Also, more prosaically, how did you collect the data?

Yeah, the data comes from a number of precision zero-drift battery data loggers I personally built a few years ago. The white one on the wall is doing all kinds of 'AI'-like estimations of the battery health, ongoing internal impedance assessment, high-resolution data logging and so on. That big red battery has more than 6 years of 30-second logged data available. You can't design a good battery performance monitor without lots and lots and lots of field data to design around. I added another logger to measure just the LFP bank.
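
For the curious, here's a toy sketch of one way internal resistance can be estimated from logged voltage/current pairs. Purely illustrative - the actual logger's algorithm isn't described here:

```python
# Toy illustration only: estimate internal resistance from step changes in
# logged (voltage, current) samples, using a charge-positive current convention.
# The real logger's method is not described in this thread.
def estimate_internal_resistance(samples, min_di=5.0):
    """samples: time-ordered list of (volts, amps) tuples.
    Averages R = dV/dI over events where current steps by >= min_di amps."""
    estimates = []
    for (v0, i0), (v1, i1) in zip(samples, samples[1:]):
        di = i1 - i0
        if abs(di) >= min_di:                  # only use meaningful current steps
            estimates.append((v1 - v0) / di)   # terminal V rises with charge current
    return sum(estimates) / len(estimates) if estimates else None

# e.g. a 10 A load step sagging a 12V battery by 50 mV -> ~5 milliohms
print(estimate_internal_resistance([(12.80, 0.0), (12.75, -10.0)]))  # 0.005
```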
 
Lots of interesting things going on here.
Thanks for posting, it's really interesting and it's taking a while for me to absorb the data from your charts and try to understand the nuances.

you are saying that this is a good way to both improve the performance and extend the life of existing lead installations?
I'm not the OP, but yes, the primary expected use case of this hybridisation is to extend the life of existing (good) lead–acid storage without spending nearly as much on a full-capacity replacement with LiFePO4. It might be 1/4 or 1/2 as much LiFePO4 as you would need ordinarily.

The other is management of less stable (perhaps not the right word) charge sources, e.g. an alternator, for which having only LFP may be problematic and where having some lead acid in the system can help provide a buffer.

The main management issue I see is what absorption voltage is required for the lead acid battery bank you have and how long it stays there.

2nd Charge Observations
[image: CHG2.png]

If it's on the higher side e.g. ~58V like above then you probably want to have the extra battery management capability of self-disconnecting the LiFePO4 when it is fully charged so it doesn't experience a long period at that absorption voltage.

However if the bulk/absorption voltage of your LA bank is lower (e.g. 56.4V) then it's not as much of a threat to the LiFePO4 bank.

With 16 individual LFP cells in series and a 16s BMS with decent in-built cell top-balancing, in this case I'd see the extra requirement for auto self-disconnecting the LFP battery as unnecessary (other than for normal battery protection managed by the BMS).

For example, the specs for my bank of Enersys SLA batteries are a bulk/absorption voltage of 56.4V and float voltage of 54.0V.

For a 16s LiFePO4 battery that's a bulk/absorption charge at 3.525V/cell, and a float voltage at 3.375V/cell. I don't see these voltages posing much of a threat to a LiFePO4 battery. Have the LFP BMS set for over voltage protection cut off at ~3.6-3.65V/cell and it shouldn't really ever be activated under normal operation, especially if the BMS has decent in-built top balancing capability.
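
A minimal sketch of that per-cell arithmetic (just the 16s division, with a typical 3.65V/cell OVP assumed):

```python
# Pack-to-cell voltage arithmetic for a 16s LFP bank, using the voltages
# discussed in this thread. 3.65 V is the commonly quoted per-cell OVP setting.
CELLS = 16

for label, pack_v in [("absorption", 56.4), ("float", 54.0), ("high absorption", 58.0)]:
    print(f"{label}: {pack_v:.1f}V pack = {pack_v / CELLS:.3f}V/cell")
# absorption: 56.4V pack = 3.525V/cell
# float: 54.0V pack = 3.375V/cell
# high absorption: 58.0V pack = 3.625V/cell
# Note: even 58.0V averages under the 3.65V OVP - but that's an average;
# one imbalanced cell can still run up and trip the BMS, as the imbalance
# discussion later in the thread shows.
```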

from hour 7 to 13 there is a net discharge from the LFP bank into the LEAD bank. It's not much but it's 'wasted' LFP capacity as it's not available to the load. Given enough time the LFP may significantly discharge into the lead bank at a low float-like current.
This to me is a feature, not a bug, and is what will help sustain the LA bank for longer life. The LFP is well suited to this more frequent cycling, while keeping the LA at float voltage is good for longevity. But if the LA battery's self-discharge rate is too fast and too much of the LFP's capacity is being used simply to float the LA, then I guess the LA bank is approaching end of usable life, and so that's when a full replacement with LFP is probably warranted.

Also I'd be concerned about what happens if one of a number of the lead cells became shorted - especially if it's an old battery - this could result in the lithium battery discharging into the lead bank and so on.
Would having each bank individually fused allay that concern?

Thanks again for posting, it's really interesting.
 
The main management issue I see is what absorption voltage is required for the lead acid battery bank you have and how long it stays there.

If it's on the higher side e.g. ~58V like above then you probably want to have the extra battery management capability of self-disconnecting the LiFePO4 when it is fully charged so it doesn't experience a long period at that absorption voltage.

However if the bulk/absorption voltage of your LA bank is lower (e.g. 56.4V) then it's not as much of a threat to the LiFePO4 bank.

Yes, but you can see in the data when the LFP bank effectively removes itself from the CV charge source when one of the internal charge FETs opens, which effectively happens when the LFP battery is full (see the AH data). At that point the LFP bank is no longer exposed to the CV voltage and experiences no charging current. It's like a single-stage LFP charge that terminates when any one cell reaches max voltage. It's not necessarily a bad approach from my limited eye - maybe someone has deeper insights?
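
In sketch form, the termination behaviour described above amounts to something like this (the 3.65V cutoff is assumed as a typical OVP value, not a measured setting):

```python
# Sketch of 'single stage' charge termination: the charge FET opens as soon
# as any one cell hits the cutoff, regardless of overall pack voltage.
# 3.65 V/cell is an assumed typical OVP setting, not a measured value.
CHARGE_CUTOFF = 3.65

def charge_fet_closed(cell_voltages):
    """Charging is allowed only while every cell stays below the cutoff."""
    return max(cell_voltages) < CHARGE_CUTOFF

print(charge_fet_closed([3.50] * 4))                # True: 12V block keeps charging
print(charge_fet_closed([3.50, 3.50, 3.50, 3.66]))  # False: one high cell ends the charge
```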

With 16 individual LFP cells in series and a 16s BMS with decent in-built cell top-balancing, in this case I'd see the extra requirement for auto self-disconnecting the LFP battery as unnecessary (other than for normal battery protection managed by the BMS).

The LFP auto disconnect is fundamentally important to allow the higher or appropriate CV voltage to be present to fully charge the lead. Undercharging lead is not a good road to go down. And we are talking about simple independent 12V module building blocks :)

For a 16s LiFePO4 battery that's a bulk/absorption charge at 3.525V/cell, and a float voltage at 3.375V/cell. I don't see these voltages posing much of a threat to a LiFePO4 battery. Have the LFP BMS set for over voltage protection cut off at ~3.6-3.65V/cell and it shouldn't really ever be activated under normal operation, especially if the BMS has decent in-built top balancing capability.

You'd have to measure the AH of the lead battery to ensure it's getting its 10% AH overcharge on a regular basis.
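
Something like this, using logged AH counts (a sketch with made-up numbers; the 10% figure is the overcharge target mentioned above):

```python
# Sketch: verify the lead bank's ~10% AH overcharge from logged AH counts.
# The numbers are illustrative, not from the test data in this thread.
def charge_factor(ah_returned, ah_removed):
    """AH put back into the lead bank divided by AH taken out."""
    return ah_returned / ah_removed

cf = charge_factor(ah_returned=110.0, ah_removed=100.0)
print(f"charge factor: {cf:.2f}")   # 1.10 -> the ~10% overcharge target
if cf < 1.05:                       # well short of target
    print("lead bank may not be getting its full reconciliation charge")
```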

Why are you seemingly set on providing the LFP with CV charging? Lead needs CV charging so much more than LFP. In fact the LFP in my test was fully charged with no CV charging whatsoever.


This to me is a feature, not a bug, and is what will help sustain the LA bank for longer life. The LFP is well suited to this more frequent cycling, while keeping the LA at float voltage is good for longevity. But if the LA battery's self-discharge rate is too fast and too much of the LFP's capacity is being used simply to float the LA, then I guess the LA bank is approaching end of usable life, and so that's when a full replacement with LFP is probably warranted.

I tend to agree although I don't know when one realizes that too much of the LFP capacity is being used up.... see, devil's in the details :)

Thanks again for posting, it's really interesting.

My pleasure. It was a fun and interesting exercise.
 
Most appreciated. I always like these types of tests to prove/disprove general assumptions, and I realise the considerable effort involved.
Any chance of running the same test with the BMS removed, just to differentiate between the influence of the BMS and the chemistry in the results?
 
I run either one or the other but not both simultaneously.
Just a thought.
Could you not run both systems in series... A (FLA) B (Li) C... each non-grounded, common B, and each with their own CC... 3 inverters = A-B (12V), B-C (12V) and A-C (24V)...??
 
Yes, but you can see in the data when the LFP bank effectively removes itself from the CV charge source when one of the internal charge FETs opens, which effectively happens when the LFP battery is full (see the AH data). At that point the LFP bank is no longer exposed to the CV voltage and experiences no charging current. It's like a single-stage LFP charge that terminates when any one cell reaches max voltage. It's not necessarily a bad approach from my limited eye - maybe someone has deeper insights?
First off I'll repeat that the right way to use a BMS means that a disconnect for any reason (high voltage, low voltage, over-current, temperature, etc.) should be because something went wrong. I really don't think the BMS disconnect should be routinely used as just part of your daily life. I've said it before, and I've gotten the impression that most people don't agree with me, so....

The other problem I see is that I believe most MOSFET-based BMS's have a hysteresis voltage and delay time for reconnect. My JBD's (I have five of them now) seem to have a HVD at 3.65V, a release delay of 2 seconds, and a release voltage of 3.5V. So once the BMS does a disconnect due to a high cell voltage, it will stay disconnected for 2 seconds, and at any point after that the voltage gets down to 3.5V, it will reconnect. A cell will always drift down to 3.5V from 3.65V fairly quickly. So this means that - at least for a JBD BMS - the BMS will disconnect but will go through cycles of disconnect, pause for a few seconds, then reconnect, then a cell gets back up to 3.65V causing a disconnect, etc. The loop will continue, so as the AGM is held at its absorption voltage for a fairly long time, you are effectively holding the LFP cell(s) at max voltage for that whole time. I don't think that is good for LFP cycle-life.
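
To make the loop concrete, here's a rough simulation using those numbers (the rise and relaxation rates are made up for illustration):

```python
# Rough simulation of the JBD-style hysteresis loop described above
# (3.65 V HVD, 2 s release delay, 3.5 V release voltage). The 20 mV/s rise
# and 50 mV/s relaxation rates are made-up illustrative values.
HVD, RELEASE_V, DELAY_S = 3.65, 3.50, 2

def simulate(steps=20):
    cell, connected, timer = 3.60, True, 0
    for t in range(steps):                      # 1 s per step
        if connected:
            cell = min(cell + 0.02, HVD)        # charge source pushes the cell up
            if cell >= HVD:
                connected, timer = False, DELAY_S
                print(f"t={t}s: disconnect at {cell:.2f}V")
        else:
            cell = max(cell - 0.05, RELEASE_V)  # cell relaxes back toward 3.5 V
            timer = max(timer - 1, 0)
            if timer == 0 and cell <= RELEASE_V:
                connected = True
                print(f"t={t}s: reconnect at {cell:.2f}V")

simulate()  # prints a repeating disconnect/reconnect cycle while held at absorption
```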

I believe you also said earlier that there was a not-insignificant amount of current going between the AGM and the LFP, and I think you correctly pointed out that was wasted energy. Did you actually quantify that current over time?

As others have said, I really appreciate you doing this data gathering and posting. It gets us all away from just speculating. Good on you!
 
Yes, but you can see in the data when the LFP bank effectively removes itself from the CV charge source when one of the internal charge FETs opens, which effectively happens when the LFP battery is full (see the AH data).
Yes, I see that. But that's because the charge voltage at that point is 58V (3.625V / LFP cell), and it makes perfect sense to stop exposing the LFP to that voltage for too long.

The LFP auto disconnect is fundamentally important to allow the higher or appropriate CV voltage to be present to fully charge the lead. Undercharging lead is not a good road to go down. And we are talking about simple independent 12V module building blocks :)
Yes, however I was referring to the charging specifications for the LA batteries I have which don't require absorption at 58V, but at 56.4V (3.525V / LFP cell). This is the manufacturer's specification on absorption voltage and then float at 54.0V (3.375V / LFP cell). It's a function of the type of batteries (data centre backup) that I have.

Why are you seemingly set on providing the LFP with CV charging?
I'm not. I'm just thinking through whether a simpler solution is feasible in my case with the battery bank I have.

I figure if the chemistry itself can take care of things that's better than relying on an electro-mechanical device.

I tend to agree although I don't know when one realizes that too much of the LFP capacity is being used up.... see, devil's in the details :)
Yep.
 
First off I'll repeat that the right way to use a BMS means that a disconnect for any reason (high voltage, low voltage, over-current, temperature, etc.) should be because something went wrong. I really don't think the BMS disconnect should be routinely used as just part of your daily life. I've said it before, and I've gotten the impression that most people don't agree with me, so....

My question is still why do you think this. You could well be right - but it would be nice to know why.

The other problem I see is that I believe most MOSFET-based BMS's have a hysteresis voltage and delay time for reconnect. My JBD's (I have five of them now) seem to have a HVD at 3.65V, a release delay of 2 seconds, and a release voltage of 3.5V. So once the BMS does a disconnect due to a high cell voltage, it will stay disconnected for 2 seconds, and at any point after that the voltage gets down to 3.5V, it will reconnect. A cell will always drift down to 3.5V from 3.65V fairly quickly.

So this means that - at least for a JBD BMS - the BMS will disconnect but will go through cycles of disconnect, pause for a few seconds, then reconnect, then a cell gets back up to 3.65V causing a disconnect, etc. The loop will continue, so as the AGM is held at its absorption voltage for a fairly long time, you are effectively holding the LFP cell(s) at max voltage for that whole time. I don't think that is good for LFP cycle-life.

This is a good point. However, this clearly didn't happen in the test - although I thought it might. Also, is 3.65V really that high? It seems to be a common recommended max CV voltage for these 12V LFP batteries. The absorption charge also lasts a finite amount of time. My (limited) understanding is that 4.0V is the absolute maximum voltage for LFP cells.

I believe you also said earlier that there was a not-insignificant amount of current going between the AGM and the LFP, and I think you correctly pointed out that was wasted energy. Did you actually quantify that current over time?

Well you can see it in the AH counts. It seemed to consume or use up a fair bit of the LFP capacity before servicing any loads! Maybe that's ok. I don't know. I don't think there is much real waste as most of it is probably stored in the lead battery.
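
For what it's worth, quantifying it from logged samples is just an integration (a sketch assuming the 30-second logging interval mentioned earlier; the current value is made up):

```python
# Sketch: integrate logged current samples into AH to quantify the
# LFP -> lead float transfer. Assumes the 30 s logging interval mentioned
# earlier in the thread; the 0.5 A figure is illustrative, not measured.
SAMPLE_S = 30

def ah_transferred(currents_a):
    """Sum of current samples (amps) times the sample period, in amp-hours."""
    return sum(currents_a) * SAMPLE_S / 3600.0

samples = [0.5] * (6 * 3600 // SAMPLE_S)    # a steady 0.5 A for 6 hours
print(f"{ah_transferred(samples):.1f} AH")  # 3.0 AH of LFP capacity spent floating the lead
```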

As others have said, I really appreciate you doing this data gathering and posting. It gets us all away from just speculating. Good on you!

Thx, it's all very real and not something that's easily available!
 
My question is still why do you think this. You could well be right - but it would be nice to know why.
I guess it's philosophical. I believe the BMS's only job is to protect the LFP cells. So if the BMS does something, it means that something outside the BMS and cells did something that could harm the cells. Like I said, I accept that many others don't see it that way. My JBD BMS's have a count of how many faults have caused a disconnect. My goal is to keep those at zero. If I check them and find that one is not zero, it means something happened. Since your model would have intentional disconnects by the BMS, a disconnect logged by the BMS may be something bad, or it may just be something you intended. You wouldn't really know.

This is a good point. However, this clearly didn't happen in the test - although I thought it might. Also, is 3.65V really that high? It seems to be a common recommended max CV voltage for these 12V LFP batteries. The absorption charge also lasts a finite amount of time. My (limited) understanding is that 4.0V is the absolute maximum voltage for LFP cells.
I don't talk about it much here because I don't want people to get sloppy, but... There is at least one scientific paper that showed repeated and long term charging to 4.0V does not seem to harm LFP cells, as long as the cell isn't held at that voltage after reaching it. So yeah, you may be right. However, every single manufacturer of these cells says 3.65V is the max charge voltage. Why would they all say that if it really doesn't matter? If I go over 3.65V and something happens to my cells (for that reason or any other reason) I probably would have to just swallow hard and buy new cells, as I couldn't blame anyone / anything else.

Well you can see it in the AH counts. It seemed to consume or use up a fair bit of the LFP capacity before servicing any loads! Maybe that's ok. I don't know. I don't think there is much real waste as most of it is probably stored in the lead battery.
On the other hand, lead acid batteries are not really efficient, so lots of energy that goes into them is actually wasted.
 
And, frankly, won't the lead also provide lower-cost storage per kWh?
Not based on my assumptions. There is ongoing debate about the long-term cost of operation. All of the grid-support battery packs that I know of are lithium chemistry of some sort. Also, as @Horsefly mentioned, lithium chemistry is much more efficient. I still hear of occasional Pb telecom backups. Those installations are just backups and routinely changed out.
 
I'm not. I'm just thinking through whether a simpler solution is feasible in my case with the battery bank I have.

I figure if the chemistry itself can take care of things that's better than relying on an electro-mechanical device.

Well yes but the devils are always in the details.

If I used a conservative CV voltage of, say, 56V in the test, the LFP bank still would have disconnected on a high cell voltage somewhere. Even the test data shows that the LFP bank disconnects at a relatively low voltage - like 56V - and before the 58V 'absorption' CV stage.
 
Not based on my assumptions. There is ongoing debate about the long-term cost of operation. All of the grid-support battery packs that I know of are lithium chemistry of some sort. Also, as @Horsefly mentioned, lithium chemistry is much more efficient. I still hear of occasional Pb telecom backups. Those installations are just backups and routinely changed out.

Ahh, some good points that we can discuss here. Lead is actually pretty darn efficient if you are not gassing the cells, i.e. in the lower SOC region. Alas, you lose somewhere near 20% by energy when you are doing regular charge reconciliation with the appropriate amount of surplus AH overcharge, i.e. 10% or so. You have to do the full charges to maintain life so... yeah. But for solar off-grid this extra energy is often available from the PV so it can be a bit of a wash. Note that any time you operate in CV mode for 'absorption' or 'float' you are curtailing PV energy by definition by moving off the max power point.
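
Back-of-envelope on that ~20% figure (illustrative numbers only - a 10% AH overcharge compounded with lead's charge/discharge voltage gap):

```python
# Back-of-envelope for the ~20% energy loss mentioned above. The voltages
# are assumed typical 12V lead values, not measurements from this test.
ah_out, ah_in = 100.0, 110.0        # 10% AH (coulombic) overcharge
v_discharge, v_charge = 12.1, 13.8  # rough average discharge/charge voltages

efficiency = (ah_out * v_discharge) / (ah_in * v_charge)
print(f"round-trip energy efficiency ~ {efficiency:.0%}")  # ~80%, i.e. near 20% lost
```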

For daily cycling and fast charging associated with grid support applications lithium will totally yield superior economics and performance.

However, for off-grid - when, really, most of these systems deliver about 50-100 'full capacity throughputs', i.e. full cycles, per year, and there is, by some definition, surplus or curtailed PV energy available - it's actually quite hard to make an economic case for lithium, especially when lead has so many other benefits. People seem to forget that for off-grid applications with a large battery with a few days of autonomy, the battery bank does not work that hard or throughput that much energy. Ergo a 1000-1500 cycle rated lead battery bank can be a very economical 10-15 year battery that also contains zero electronics to fail. The main problem with lead for off-grid, in my opinion, is that they are often so poorly managed (by people and poorly engineered equipment) that they get a worse-than-deserved reputation for performance.
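
The cycle arithmetic behind that claim (a trivial sketch; calendar ageing and depth-of-discharge effects ignored):

```python
# Cycle-life arithmetic behind the 10-15 year claim above. Calendar ageing
# and depth-of-discharge effects are ignored in this trivial sketch.
cycles_per_year = 100   # top of the 50-100 full-cycles-per-year range quoted
for rated in (1000, 1500):
    print(f"{rated} rated cycles / {cycles_per_year} per year = {rated / cycles_per_year:.0f} years")
# 1000 -> 10 years, 1500 -> 15 years
```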
 
My JBD BMS's have a count of how many faults have caused a disconnect. My goal is to keep those at zero. If I check them and find that one is not zero, it means something happened. Since your model would have intentional disconnects by the BMS, a disconnect logged by the BMS may be something bad, or it may just be something you intended. You wouldn't really know.

This is an interesting contribution - thanks. I can now see where you are coming from. It is a bit philosophical. I maintain, however, that the 'single stage charging' where the end of charge is determined by the first cell to get to, say, 3.65V is not obviously a flawed design approach when attempting to parallel lithium with lead. At least not that I can see.
 
Here's something people may not realize. This is C/2 charge data with a short final stage of CV charging at 28.2V (56.4V equivalent). Point being that it doesn't take much cell imbalance to have one take off and trigger an end of charge. (credit to JT for producing the graphs)

[image: 1636920168869.png - per-cell voltages during the C/2 charge with 28.2V CV finish]

Given the 12V modules discussed don't self balance with each other we can expect some amount of imbalance. And my test showed a high cell voltage disconnect with my 'balanced modules' at around 56V thus confirming this.

So you can lower the CV voltage to, say, 27.2V (54.4V equivalent) to prevent high cell voltages but then you ultimately slow down the charging time. This second plot is the same battery.
[image: 1636920421518.png - same battery charged with the CV lowered to 27.2V]

We did some work to explore how precisely lowering CV voltages would impact charge time while mitigating any high cell voltage events due to reasonable amounts of cell imbalance. At the end of the day chargers seem to generally suck, so expecting or depending on a precise CV voltage in the field appears to be a fool's errand anyway. So the 12V battery modules kinda need to work with a wide range of CV voltages to be practical.
 
Note that any time you operate in CV mode for 'absorption' or 'float' you are curtailing PV energy by definition by moving off the max power point.
Exactly. This is my PV output power (and load) curve for a typical day of my off-grid system. Pool pump mostly (and in this case also a pool cleaning robot for a couple of hours). There's so much unused capacity that efficiency losses in storage are not really a factor. Obviously the size of that Unrealised Capacity area varies with season and weather, even so, it is well above demand on all but the worst days.

The main exception is when the system is occasionally performing its primary task of household backup during grid outage.

[image: Screen Shot 2021-11-15 at 8.09.50 am.png - daily PV output and load curve]

The reason I'm interested in exploring the potential of adding some LFP is I could possibly run our general household circuits from this off-grid system and free up my grid-tied system (a bit) for other loads I plan to move onto it, hot water to begin with and later an EV.

But I do not want to cycle my lead acid storage much, hence having some LFP to look after most of the daily cycling, leaving the lead to be ballast and for grid outage backup.

Most of the household circuit separation is already done by the way I have our transfer switch wired to exclude high power draw circuits which I don't want to operate from backup supply (oven, induction stove, ducted aircon etc). They are only ever going to be grid connected. But I have one or two trickier circuits to solve before I can consider this option.
 
Given the 12V modules discussed don't self balance with each other we can expect some amount of imbalance. And my test showed a high cell voltage disconnect with my 'balanced modules' at around 56V thus confirming this.
Yep, and why, if I did do something like this it would be with 16 individual LFP cells with a BMS with an in-built top balancer (with reasonable balance current capacity) to keep any single cell from running away. There are options for this.
 
Thanks for doing this research!

A few thoughts:
1. Lead AGM is good (not great) for extreme environments. It works well at high and even at quite low temperatures.
2. Lead is good as backup (boats, marine) where a damaged BMS can leave you stranded. A lead battery will still start the engine.
3. Depending on where you live, lead is still cheap. When you calculate that you need 3 days of batteries in 99% of cases (and 5 days in the remaining few), it's sometimes not wise to buy LFP for those few days. In stationary applications you get 10-12 years out of lead, and the recycling is good.
4. Lead has insane discharge rates; if you need to start engines, or compressors etc., per capacity they are much better than BMS-limited LFP.
 
4. Lead has insane discharge rates; if you need to start engines, or compressors etc., per capacity they are much better than BMS-limited LFP.
That's true, but a lead battery designed for starting engines has quite a different internal structure to an AGM battery used for general energy storage/cycling.

A starter battery has many more but much thinner lead plates, providing a far greater surface area to enable a much higher discharge current. For occasional starting with a reliable engine, AGM is fine. For more frequent engine starting I'd have a separate starter battery designed for that purpose.
 
That's true, but a lead battery designed for starting engines has quite a different internal structure to an AGM battery used for general energy storage/cycling.

A starter battery has many more but much thinner lead plates, providing a far greater surface area to enable a much higher discharge current. For occasional starting with a reliable engine, AGM is fine. For more frequent engine starting I'd have a separate starter battery designed for that purpose.

AGM is the wrong comparison as they can often provide huge starting currents. GEL or deep cycle flooded might be a better comparison in this respect.

AGM starting battery:

[image: optima-image-4.png]

Expensive and high quality for starting aircraft:
[image: GPL-3100T_R.jpg]
 
Yes, and there are AGM batteries designed for both starting and deep cycling, but inevitably these are a compromise and sacrifice one performance element, e.g. deep-cycling longevity.
 