BMS or no BMS for LiFePO4?

Ponderosa Pete (New Member, joined Feb 1, 2021, 6 messages)
Somewhere on the forum I believe Will Prowse said, "Considering the likelihood of over voltage situation from most high quality mppt, and the chance of matched LiFePO4 cells going out of balance is rare (and BMS will correct for cell drift over time), and that LiFePO4 can be over charged to 4.2v per cell before electrolyte degradation... I would say its safe to connect mppt directly to the battery bank, and bypass the BMS entirely. We have been doing it this way for years, but people still want to use a BMS." (Sorry, I can't find original thread.)
So what's the consensus? BMS or no BMS?
 
How can the BMS correct for drift over time ..... if you don't have a BMS?

Just do a search on here for bloated or overcharged cells and you will see that charging without cell level charge protection can have a disastrous result ... in some cases, even when people were monitoring their pack closely while charging.
 
matched LiFePO4 cells
Does this mean that the cells are identical in every way, including capacity? Or how does one define "matched" in this case?

Even if the answer was yes, I'd still think that the chances of "LiFePO4 cells going out of balance" are higher than rare.

But, to support your thoughts, I suspect that if you have a battery that is "sufficiently matched", it could be managed with a balancer that handles a significant current (more than the milliamp balance currents in most BMSs). And this would also be aided by a conservative charge strategy (much less than the 4.2V per cell you alluded to being acceptable).
 
Seat belt or no seat belt in the car? For me the answer is the same.
 
I would want to see the original quote from Will. I doubt your take on it is quite what he said or meant.

Not using a BMS is just plain crazy.
 
Somewhere on the forum I believe Will Prowse said, "Considering the likelihood of over voltage situation from most high quality mppt, and the chance of matched LiFePO4 cells going out of balance is rare (and BMS will correct for cell drift over time), and that LiFePO4 can be over charged to 4.2v per cell before electrolyte degradation... I would say its safe to connect mppt directly to the battery bank, and bypass the BMS entirely. We have been doing it this way for years, but people still want to use a BMS." (Sorry, I can't find original thread.)
So what's the consensus? BMS or no BMS?

Here is his response to your statement:

No, use a bms. In context on that thread, and in practice for some, you can go without a bms (I still do). But everyone should use a bms.

But here is a relevant quote:

And please do not tell people that I do not recommend using a BMS because I know how to run a pack without it. My recent videos are more beginner-friendly and state "always use a BMS no matter what". I figured someone using a chargery who doesn't wish to use relay/SSR control, or current limitations of FET based BMS options, can figure this all out on their own.

It is simple stuff overall, but it is easy to screw it up if you do not understand the purpose of these BMS features and go without one.

So please, if you are a beginner, PLEASE use a BMS with every safety feature.

If you experiment a lot and understand the chemistry charge/discharge characteristics and you want to run without a BMS, go for it. There are electric vehicles using LiFePO4 for nearly a decade without bms. Same with marine crowd. But like I said, if you do not understand use specific applications for that, do not run without a BMS.

In short, the consensus, were such a thing to exist, would be "If you have to ask, use a BMS. If you plan to run it without BMS then you should be able to explain why and how it works in your application."
 
Take a 12V LFP battery. Voltage range is 10.0V (2.50V per cell) to 14.6V (3.65V per cell MAX).
You are charging the assembled battery pack at 14.0V (theoretically 3.5V per cell).
Internal resistance & impedance will cause each cell to take as much voltage as it can, BUT they will vary a bit.
As the cells reach the normal settled "full state" around 3.425V per cell, inevitably one will "RUN" ahead of the others and reach the BMS safety overvolt cutoff (usually 3.65V) unless programmed otherwise. THIS IS AT CELL LEVEL!
This will protect the entire pack based on any single cell deviating from normal.

IF there was no BMS, the runner would continue to increase its voltage while the others are left behind.
You are still pushing 14.0V, BUT now you have one cell @ 4.0V while one cell may be at 3.2, another at 3.3 and the third at 3.4.
That cell at 4.0 is BEING DAMAGED NOW... The whole "battery pack" does not know anything and will not stop till it is at 14.0V (if the charger / SCC is programmed to stop there).
END-AMP readings would be of NO HELP because the IR/impedance is skewed thanks to the runner and whichever cell is the weakling ("Lazy").
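
To put numbers on the runner scenario, here is a minimal sketch (illustrative voltages only, not a real charging model) of why a pack-level limit never trips while a per-cell BMS cutoff would:

```python
# Minimal illustration (made-up voltages) of pack-level vs cell-level monitoring.
CELL_OVERVOLT_CUTOFF = 3.65   # V, typical per-cell protection threshold
PACK_OVERVOLT_CUTOFF = 14.6   # V, 4 cells x 3.65 V

# A drifted 4S pack: the total still looks "safe", but one runner cell is at 4.0 V.
cell_voltages = [4.0, 3.2, 3.3, 3.4]
pack_voltage = sum(cell_voltages)   # 13.9 V

pack_trip = pack_voltage > PACK_OVERVOLT_CUTOFF                   # False: charger keeps pushing
cell_trip = any(v > CELL_OVERVOLT_CUTOFF for v in cell_voltages)  # True: a BMS would disconnect

print(f"Pack voltage: {pack_voltage:.1f} V -> pack-level cutoff trips: {pack_trip}")
print(f"Cell voltages: {cell_voltages} -> cell-level cutoff trips: {cell_trip}")
```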


Where the confusion comes from:
It all goes back to Lead Acid. A standard lead battery such as a Rolls S-550, which is a 6V deep cycle monster, has three 2V cells in one unit, internally connected. If properly watered & maintained, each cell within the battery will self-balance and stay level. That one battery will behave properly, and the amps taken are consistent and will decrease as they are supposed to as the battery reaches 0% DOD (full).

With multiple batteries in series / parallel, the lead batteries will self-balance (more or less) IF they are properly watered & maintained. Over time the end batteries (where the POS & NEG are connected for output) will "tire" and start to lower the overall bank capacity. These should be manually rotated through the bank every 3 months, but few do it. The lowest capacity cell will bring the entire bank down to its level; regardless, the weakling rules the roost. With Lead, the end-amps value is taken from the entire battery bank as a collective measure. This cannot & does not apply to Lithium in the same way.

Far too many people do NOT understand that many of the "Lead-isms" do not apply to Lithium. These are NOT grocery store AA batteries!

Maybe this clears things up, maybe not.
Hope it helps in any case.
BTW: I run 956AH of Heavy Rolls Lead and within two weeks my final 280AH goes into the LFP bank for a total of 1,190AH.
Been at this for a few days, so just a wee bit of experience. /s
 
A high voltage situation may be unlikely with a quality SCC. There are an awful lot of cheap ones out there. But what about under voltage? High discharge rate? Too cold?

I think it is Caig that has a tag line of "BMS or be a mess".
 
The quality of the charging source has literally nothing to do with whether a cell in the battery will be destroyed because it's out of balance with the others.
 
You also need to take into account the expected lifespan of the LiFePO4 cells. Even if they are perfectly in balance when they start, they are likely to start showing differences after many years of service. So it is again an insurance policy to maximize lifespan of your battery bank.
 
I think Pete may have just been stirring the pot .... but .... If anyone wants to know why use a BMS we can just send them here .... LOL
 
I would want to see the original quote from Will. I doubt your take on it is quite what he said or meant.

Not using a BMS is just plain crazy.
When you read through the context it becomes clear. He has never said to not use a BMS...ever.

 
And then Will clarified in this thread:

 
Somewhere on the forum I believe Will Prowse said, "Considering the likelihood of over voltage situation from most high quality mppt, and the chance of matched LiFePO4 cells going out of balance is rare (and BMS will correct for cell drift over time), and that LiFePO4 can be over charged to 4.2v per cell before electrolyte degradation... I would say its safe to connect mppt directly to the battery bank, and bypass the BMS entirely. We have been doing it this way for years, but people still want to use a BMS." (Sorry, I can't find original thread.)
So what's the consensus? BMS or no BMS?
Will has said things along those lines in the past (but he clarified it was not a suggestion to forego a BMS), and Will recommends a BMS. Regardless of what Will does or does not recommend, most people here strongly advise against omitting a BMS (particularly for beginner and intermediate builders, or anyone who doesn't trust themselves to be, or doesn't want to be, the human BMS).

I know myself well enough to know that human error, human distractibility, and human overconfidence/obliviousness make me an unreliable protection layer. Delegating that responsibility (partially) to an automated protection layer is a no-brainer for me.

I think there are two separate questions that often get confused: (1) CAN you go without a BMS (yes), and (2) SHOULD you go without a BMS (no).

The way I see it, a BMS is cheap peace of mind, like a seatbelt or insurance: you may never need it, but if you do, you'll be grateful it's there.

Anecdotally, the vast majority of avoidable problems I observe on the forum happen from charging or discharging without a BMS (more often charging, I think).
 
What is a BMS? It is a cell balancer with some voltage monitors that can stop the flow of electrons. If you have a large system, I think it is more cost effective, safer and more efficient to NOT have a BMS. Pushing (large) loads through FETs so one can balance and switch is not ideal. It would be better to have a high-capacity cell balancer with a few SSRs to control loads, charging, heating and cooling based on voltage and temp. If you are a beginner, buy a reputable battery and some Victrons, or build a small-load system with a BMS. The larger my system grows, the less I see a BMS as a good solution. I see videos of youtubers building 16S banks of 280Ah cells with BMSs. How are they ever going to use the capacity of those systems with a DIY BMS? Why would anyone choose to waste so much power on FET heat?
In the near future, I can see people spending more for their BMS than they do for their batteries.
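
As a rough back-of-the-envelope on the FET-heat point (all numbers are assumptions, not specs from any particular BMS):

```python
# Conduction loss in a FET-based BMS current path: P = I^2 * R.
# All values below are assumed example figures, not a specific product's specs.
current_a = 200.0        # assumed continuous current through the BMS
r_ds_on_ohm = 0.0015     # assumed on-resistance of one FET (1.5 mOhm)
fets_in_parallel = 8     # assumed number of FETs sharing the path

r_path_ohm = r_ds_on_ohm / fets_in_parallel   # parallel FETs lower the path resistance
p_loss_w = current_a ** 2 * r_path_ohm        # heat dissipated in the FETs

print(f"Path resistance: {r_path_ohm * 1000:.2f} mOhm")
print(f"FET heat at {current_a:.0f} A: {p_loss_w:.1f} W")
# A contactor/relay in the main path trades this continuous FET dissipation
# for coil power plus a separate balancer, which is the trade-off argued above.
```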

I am not an expert, learning every day.
 
Just to say that SSRs and FETs are essentially the same thing, since the first is constructed from the second.
 
SSRs are only used to turn devices off and on, like the inverter. They are not in the path of the main circuit.
 
You are right in that case. I like to interrupt the main path in case something goes wrong with the rest of the system. I like to have a standalone system that protects the battery regardless of what is connected to it.

From a philosophical point of view, I do not really like relying on the device that is generating the problem to also act as the protection. For example, if the MPPT does not stop charging because it has a fault, telling it to stop may not solve the problem. The same goes if the inverter breaks.

The problem is probably me. Too many risk assessments at work, where the mitigating agent can never be the same thing that is responsible for the problem.
 
You are right in that case. I like to interrupt the main path in case something goes wrong with the rest of the system. I like to have a standalone system that protects the battery regardless of what is connected to it.

From a philosophical point of view, I do not really like relying on the device that is generating the problem to also act as the protection. For example, if the MPPT does not stop charging because it has a fault, telling it to stop may not solve the problem. The same goes if the inverter breaks.
Just to be clear, in your BMS design thread I was advocating for the same thing as @larfme.
I've since modified my position slightly.
I want current-carrying FETs for the low-current stuff and the option to use SSRs (dry switches) for the high-amperage stuff that supports it.
I understand we have different design objectives, just wanted to put it out there.
 
Just to be clear, in your BMS design thread I was advocating for the same thing as @larfme.
I've since modified my position slightly.
I want current-carrying FETs for the low-current stuff and the option to use SSRs (dry switches) for the high-amperage stuff that supports it.
I understand we have different design objectives, just wanted to put it out there.
As you have said, it is related to mindset. Whenever you develop a product you need to make a risk assessment. A risk assessment lists the potential risks, evaluates the chance of each one happening and its severity, evaluates the mitigation measures, and finally the severity after mitigation.

An example would be:

Problem: Charger does not stop charging. (It could be due to a short to the panels, something burnt inside, the MCU getting stuck... this is just an example.)
Chance: Extremely low
Severity: Extremely high (battery destruction, inverter destruction due to high voltage in the input, fire)
Mitigation plan: Switch cuts the current path
Severity after mitigation: Low (the system temporarily loses power)

Every time I have done a risk assessment like this for a product, using the root of the problem as the mitigation plan is not allowed. If the charger is generating the problem due to a malfunction, using serial commands or something like that is not going to mitigate it.
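
A minimal sketch of that "independent mitigation" idea (hypothetical function names and thresholds; the actual measurement and switching hardware is up to the builder): a standalone monitor that opens a contactor in the main current path whenever any cell leaves its safe window, regardless of what the charger or inverter is doing.

```python
# Standalone battery protection loop, independent of the charger and inverter.
# Function bodies are placeholders; a real build would read an ADC chain and
# drive a contactor/SSR coil.
import time

CELL_OVERVOLT = 3.65   # V, typical LiFePO4 per-cell ceiling
CELL_UNDERVOLT = 2.50  # V, typical LiFePO4 per-cell floor

def read_cell_voltages():
    """Placeholder for a real per-cell voltage measurement."""
    return [3.34, 3.35, 3.33, 3.36]

def open_main_contactor():
    """Placeholder for de-energizing the contactor that breaks the main path."""
    print("Main contactor opened: battery isolated from charger and loads.")

while True:
    cells = read_cell_voltages()
    if any(v > CELL_OVERVOLT or v < CELL_UNDERVOLT for v in cells):
        open_main_contactor()   # mitigation acts regardless of the faulty device
        break
    time.sleep(1.0)
```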

I am not saying my approach is the right one. Just my opinion.
 