Last fire.. :-(

Luthj

Photon Sorcerer
Every source I have seen thus far indicates that lithium plating at any significant level requires either very low temperatures or very high currents (1C in many cases). My understanding of the mechanism is that lithium deposition in LFP requires the ion mobility threshold of the electrolyte and anode to be exceeded. For this to occur near the top of the SOC range would require high charge rates, which saturate the surface of the anode while there are still plenty of ions in the electrolyte and cathode. A low charge rate by its nature gives the anode lots of time to incorporate ions into its structure.

If instead you are referring to SEI (Solid Electrolyte Interface) disruption, that would be a different mechanism and discussion altogether.

As always I am interested in new research on the subject.
 

fafrd

Photon Sorcerer
Every source I have seen thus far indicates that lithium plating at any significant level requires either very low temperatures or very high currents (1C in many cases). My understanding of the mechanism is that lithium deposition in LFP requires the ion mobility threshold of the electrolyte and anode to be exceeded. For this to occur near the top of the SOC range would require high charge rates, which saturate the surface of the anode while there are still plenty of ions in the electrolyte and cathode. A low charge rate by its nature gives the anode lots of time to incorporate ions into its structure.

If instead you are referring to SEI (Solid Electrolyte Interface) disruption, that would be a different mechanism and discussion altogether.

As always I am interested in new research on the subject.
Well, when I was topping off, we’re talking about a charge rate of less than 0.02C, so it sounds as though lithium plating is at least one thing I shouldn’t need to worry about too much…
 

Luthj

Photon Sorcerer
Well, when I was topping off, we’re talking about a charge rate of less than 0.02C, so it sounds as though lithium plating is at least one thing I shouldn’t need to worry about too much…

That is my understanding. At rates that low, you should deplete the free ions before you saturate the anode enough to cause plating. Obviously you are operating within the envelope described by the manufacturer, and it's hard to believe any reputable cell maker would have such a serious flaw as plating at low charge rates without any of us knowing about it. That being said, a badly made cell could have too much lithium-ion material and thus be capable of plating near full charge. That's a manufacturing flaw, though, and beyond the scope of this discussion.

I can't find the reference, but from memory there is a continuous curve which describes the conditions required to plate the anode. It drops to zero current near freezing and ramps up rapidly as the cell warms. At some point the required current implies voltages well in excess of 3.6V for most cells. If I recall correctly, the heating from charging at that rate near room temperature would cause most cells to overtemp before the plating threshold is reached.
 

toms

Solar Addict
Every source I have seen thus far indicates that lithium plating at any significant level requires either very low temperatures or very high currents (1C in many cases). My understanding of the mechanism is that lithium deposition in LFP requires the ion mobility threshold of the electrolyte and anode to be exceeded. For this to occur near the top of the SOC range would require high charge rates, which saturate the surface of the anode while there are still plenty of ions in the electrolyte and cathode. A low charge rate by its nature gives the anode lots of time to incorporate ions into its structure.

If instead you are referring to SEI (Solid Electrolyte Interface) disruption, that would be a different mechanism and discussion altogether.

As always I am interested in new research on the subject.

We have different understandings of lithium plating (dendrite growth) and the SEI (solid electrolyte interface/interphase).

While most online sources I have read state the SEI is formed from electrolyte decomposition, there are also solid lithium deposits within the SEI. The initial formation process is vital to ensure that there are no “seeds” of solid lithium that will grow into dendrites.

Most university undergrad studies (i.e. almost every online paper) use high-current charge/discharge and correctly observe lithium plating. They need to do this in order to have a paper to submit. If they did very slow charging at cell saturation they would also observe lithium plating. My cell manufacturer has tests in this area that have been ongoing for over a decade.

In another decade the evidence (one way or another) will be conclusive.

It’s interesting that some of the leading battery chemists cannot come to an agreement on exactly what reaction is occurring at the anode of a lithium cell - we mere mortals have no chance.

I’m not sure how you are separating SEI formation from lithium plating; the two are closely related.

Like you, I’m always interested to get as much information and as many different views on these batteries as possible.
 

toms

Solar Addict
I’m not clear on what you mean by ‘tapers current to match balancing ability.’

The BMS (when it is balancing and a cell reaches its upper voltage) commands the charge controller to reduce current to match the balancing current. That gives the BMS maximum time to balance cells.
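If it helps, here is the gist of that hand-off in throwaway Python. The names, thresholds and currents are placeholders I made up for illustration, not any particular BMS/charge-controller protocol:

```python
# Rough sketch of the taper-to-balance behaviour described above.
# All names, thresholds and currents are illustrative placeholders,
# not any specific BMS/charge-controller protocol.

CELL_UPPER_LIMIT_V = 3.55   # hypothetical per-cell "full" threshold
BALANCE_CURRENT_A = 2.0     # hypothetical balancer bleed capability
NORMAL_CHARGE_A = 40.0      # hypothetical bulk charge current

def requested_charge_current(cell_voltages, balancing_active):
    """Current the BMS would ask the charge controller for."""
    if balancing_active and max(cell_voltages) >= CELL_UPPER_LIMIT_V:
        # A cell is at its upper limit while balancing: drop charge
        # current to what the balancer can bleed off, so the high cell
        # holds steady while the low cells keep catching up.
        return BALANCE_CURRENT_A
    return NORMAL_CHARGE_A

# Example: one cell has hit the limit and the balancer is running.
print(requested_charge_current([3.42, 3.45, 3.55, 3.43], True))  # -> 2.0
```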
 

fafrd

Photon Sorcerer
The BMS (when it is balancing and a cell reaches its upper voltage) commands the charge controller to reduce current to match the balancing current. That gives the BMS maximum time to balance cells.
So that is a fancy feature of advanced BMSs and charge controllers that requires a communication protocol between them, right?

The SCC floats down to the point that the high cell or cells are being bled while the others are not, at which point the SCC tells the BMS what constant current it should supply, so that the high cells are held in place while the others slowly charge up to match them.

Sounds fantastic, but awfully complicated (and expensive), and I’m honestly not sure it’s worth much of anything to those of us with pedestrian solar applications.

I want to charge up my battery every day (never getting close to full). I want to drain it every night. I want to avoid my BMS going into brutal and sudden LVD so I want my cells bottom-balanced rather than top balanced.

After shutdown for the night (with an empty battery), I want my active balancer to even up all of the depleted cells as best it can so that I can avoid LVD the next day.

While I can see that the smart tech you are describing would be useful for applications that typically charge to 100% (or whatever) and maintain a full battery (for backup or whatever), a smart SCC can’t really help with bottom balancing after the sun has gone down…
 

Hedges

I See Electromagnetic Fields!
Sounds fantastic, but awfully complicated (and expensive), and I’m honestly not sure it’s worth much of anything to those of us with pedestrian solar applications.

I think it is very simple. BMS just says what voltage it wants applied by SCC.

My Sunny Island + Sunny Island Charger (Midnight Classic supports same) have a similar function for lead-acid.
If Sunny Island is off, the SCC determines charging voltage (based on its settings and temperature sensor).
If Sunny Island is on, it communicates by CAN bus to SCC what voltage to drive.

That's it.
(Although it can also read parameters from SCC)

With BMS requesting a low enough voltage that current accepted by full cells doesn't drive their voltage higher (equal current bled off by balancing process), the full cells remain at constant voltage while low cells continue to charge.

You do similar at bottom, but with active balancer you put current into lowest cells without adding current to the others.
"At night" - I suppose you draw that from grid? Or do you draw off 48V pack, draining cells, while charging the lowest ones.
 

toms

Solar Addict
For sure, every application is different.

I think active balancing approaching your minimum SOC will work well for you.

I’ve only ever set up off-grid systems. It’s difficult to not have your batteries fully charged a lot of the time.
 

fafrd

Photon Sorcerer
For sure, every application is different.

I think active balancing approaching your minimum SOC will work well for you.

I’ve only ever set up off-grid systems. It’s difficult to not have your batteries fully charged a lot of the time.
Yes, makes sense. For an off-grid system where PV is typically overpowered compared to average daily consumption and you want to maintain batteries at max SOC on high-production days, using an extended (and smart) float phase to level out all the cells as perfectly as possible would be the way to go (and if the protocol has been standardized, shouldn’t cost that much…).
 

fafrd

Photon Sorcerer
I think it is very simple. BMS just says what voltage it wants applied by SCC.
By definition, any time an SCC and a BMS need to communicate, it’s less ‘simple’ (which typically translates to higher-end which typically translates to more $$$).
My Sunny Island + Sunny Island Charger (Midnight Classic supports same) have a similar function for lead-acid.
If Sunny Island is off, the SCC determines charging voltage (based on its settings and temperature sensor).
If Sunny Island is on, it communicates by CAN bus to SCC what voltage to drive.
I spent under $100 for my 300A Heltec (dumb) BMS and $250 for my 60A (dumb) Epever SCC. They don’t communicate anything to each other (except that when the BMS disconnects external P- ground from the internal B- ground of the battery, the SCC listens ;)). My guess is that your CAN-capable Sunny Island set up cost you a pretty penny more…
That's it.
(Although it can also read parameters from SCC)

With BMS requesting a low enough voltage that current accepted by full cells doesn't drive their voltage higher (equal current bled off by balancing process), the full cells remain at constant voltage while low cells continue to charge.
Yeah, understand the concept and it’s nice capability (if the price is right or you need it, such as for an off-grid rig).
You do similar at bottom, but with active balancer you put current into lowest cells without adding current to the others.
"At night" - I suppose you draw that from grid? Or do you draw off 48V pack, draining cells, while charging the lowest ones.
I’ve just started playing with my new system and at the moment, I’m connecting the active balancer manually while I figure out how often it’s needed.

If I only need to use it once a month or less, I’ll probably stick to that simple ‘maintenance mode’, but it supports a switch, so if I need to use it nightly, I may eventually hook it up to some sort of controller (ie: battery voltage below ‘X’ or maximum cell voltage below ‘Y’).
 

Hedges

I See Electromagnetic Fields!
By definition, any time an SCC and a BMS need to communicate, it’s less ‘simple’ (which typically translates to higher-end which typically translates to more $$$).

Agreed. I avoid communication as much as possible.
Analog would be simpler. If the BMS delivered a voltage proportional to what it wanted, and the SCC accepted a voltage input, regulating to a multiple of that, just a resistor divider could scale between them (provided the BMS signal is an equal or higher voltage).
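The scaling itself is trivial. A toy calculation, assuming a 0-10 V BMS signal and a 0-5 V SCC input (both made-up ranges) and arbitrary resistor values:

```python
# Toy arithmetic for the analog idea above: scale a hypothetical
# BMS "request" signal down to a hypothetical SCC analog input with a
# plain resistor divider. Signal ranges and component values are
# illustrative assumptions only.

def divider_ratio(r_top_ohm, r_bottom_ohm):
    """Vout/Vin of an unloaded two-resistor divider."""
    return r_bottom_ohm / (r_top_ohm + r_bottom_ohm)

# Say the BMS speaks 0-10 V and the SCC input wants 0-5 V (assumed):
# a 2:1 divider, e.g. 10k over 10k, does the scaling.
ratio = divider_ratio(10_000, 10_000)
print(ratio)         # 0.5
print(10.0 * ratio)  # a 10 V request arrives as 5 V at the SCC input
```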

A Mass Spectrometer company I worked for implemented a digital communication protocol. I said they should just drive a few channels of DAC according to the molecular concentrations they measured, plus a couple handshake logic ports ("Start", "Valid", etc.) A control systems guy might know "Ladder Logic" and could tell a PLC to control equipment in order to regulate an analog input. (e.g. adjust burner parameters so NOx measured in exhaust is minimized). But speaking a foreign digital protocol is a lot to ask.

I spent under $100 for my 300A Heltec (dumb) BMS and $250 for my 60A (dumb) Epever SCC. They don’t communicate anything to each other (except that when the BMS disconnects external P- ground from the internal B- ground of the battery, the SCC listens ;)). My guess is that your CAN-capable Sunny Island set up cost you a pretty penny more…

Couple of zeros ...

Just an interface card to convert protocols costs what one of your devices does. (Chips are less, but low-volume assembled & tested product always more.)

The SCC has a temperature sensor. That ought to be configured for target voltage adjustment. It is for lead-acid, not for lithium; maybe with dummy lead-acid settings targeting lithium voltages it would work. Most SCCs have that. Probably no BMS does - it would require a uP talking to it and driving a DAC.
 

Luthj

Photon Sorcerer
I think that for a budget build you can get similar reliability using "dumb" components this way.

  • Build 2 packs with their own BMS/disconnect.
  • Use conservative charge/discharge limits
  • Perform a weekly check of the voltage/balance during peak charging.
It's never going to be set-and-forget like a high-end system, but the cost difference can be worth it for DIY, depending on your risk tolerance and ability. For my next build (48VDC off-grid home), I am leaning towards a higher-end BMS and inverters/chargers with digital communication. It adds $3-4k to my final total, but our risk tolerance is a bit lower without any grid backup.
 

fafrd

Photon Sorcerer
I think that for a budget build you can get similar reliability using "dumb" components this way.

  • Build 2 packs with their own BMS/disconnect.
  • Use conservative charge/discharge limits
  • Perform a weekly check of the voltage/balance during peak charging.
It's never going to be set-and-forget like a high-end system, but the cost difference can be worth it for DIY, depending on your risk tolerance and ability. For my next build (48VDC off-grid home), I am leaning towards a higher-end BMS and inverters/chargers with digital communication. It adds $3-4k to my final total, but our risk tolerance is a bit lower without any grid backup.
Yeah, my little budget rig is just for time-shifting what we typically consume during the peak window (as well as to make up for the ~17% reduction in solar credit they are imposing on my 5-year-old NEM system).

My Epever SCC has two voltage-programmable relays, so that provides a very crude level of ‘communication’.

I could use one to activate my active balancer once the battery is empty (SCC’s LVD) but I’m using both relays for other applications currently and am hoping to find an easy alternative.

Worst-case, I make a little circuit to close a relay whenever battery voltage is below a programmable threshold (though I would like some hysteresis, so two thresholds would be better).
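In logic terms, all that little circuit would have to do is something like this - 25.25V is my SCC's LVD, while the release threshold is just a placeholder until I've characterized things:

```python
# Back-of-the-envelope sketch of the two-threshold (hysteresis) idea:
# close the relay when pack voltage drops below a low threshold, open it
# again only once it rises back above a higher one.

CLOSE_BELOW_V = 25.25   # my SCC LVD point
OPEN_ABOVE_V = 26.00    # hypothetical release threshold (placeholder)

def update_relay(pack_voltage, relay_closed):
    """Return the new relay state given the current pack voltage."""
    if not relay_closed and pack_voltage <= CLOSE_BELOW_V:
        return True          # battery is down: enable the balancer
    if relay_closed and pack_voltage >= OPEN_ABOVE_V:
        return False         # charging has resumed: drop out
    return relay_closed      # inside the hysteresis band: hold state

state = False
for v in (26.5, 25.2, 25.6, 26.1):
    state = update_relay(v, state)
    print(v, state)
```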

But I’m still in the process of characterizing how often I actually need to connect the balancer to keep things humming along.

First time I’ve been in a position to actually assess the quality of the 16 280Ah Eve cells I bought…
 

Hedges

I See Electromagnetic Fields!
I've put hysteresis in op-amp circuits with a diode that puts current through an additional feedback resistor.
Capacitors to delay response. This was an overload detector that shut off an RF amp if it failed to reach intended amplitude in circuit (e.g. due to mismatch.)

I also learned that there are "comparator" ICs which, while similar to op-amps, behave a bit differently.
If an op-amp is allowed to rail its output, some latch up or take an indeterminate amount of time to recover. They also transfer power supply noise, even to other amps in the same package (PSRR is reduced).
Op-amps can have better input-offset stability.
Comparators can be faster recovering from railed condition (which is their normal use)


After learning this, I modified my circuit so op-amp was kept out of saturation rather than railing (only one polarity needed to be clean, because the other disabled operation.)
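For what it's worth, the trip points of the plain positive-feedback comparator arrangement (not the diode trick above) are easy to work out; the rails and resistor values below are arbitrary examples:

```python
# Sketch: hysteresis thresholds for a push-pull-output comparator with
# positive feedback. R1 goes from Vcc to the reference node, R2 from the
# node to ground, Rfb from the node to the comparator output. Values are
# arbitrary examples, not a recommendation.

def ref_node_voltage(vcc, vout, r1, r2, rfb):
    """Voltage at the reference node, solved by summing conductances."""
    g1, g2, gfb = 1 / r1, 1 / r2, 1 / rfb
    return (vcc * g1 + 0.0 * g2 + vout * gfb) / (g1 + g2 + gfb)

VCC = 5.0
R1 = R2 = 10_000      # nominal 2.5 V reference
RFB = 100_000         # feedback resistor sets the hysteresis width

upper = ref_node_voltage(VCC, VCC, R1, R2, RFB)   # output sitting high
lower = ref_node_voltage(VCC, 0.0, R1, R2, RFB)   # output sitting low
print(round(upper, 3), round(lower, 3), "band:", round(upper - lower, 3))
```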
 

fafrd

Photon Sorcerer
I've put hysteresis in op-amp circuits with a diode that puts current through an additional feedback resistor.
Capacitors to delay response. This was an overload detector that shut off an RF amp if it failed to reach intended amplitude in circuit (e.g. due to mismatch.)

I also learned that there are "comparator" ICs which, while similar to op-amps, behave a bit differently.
If an op-amp is allowed to rail its output, some latch up or take an indeterminate amount of time to recover. They also transfer power supply noise, even to other amps in the same package (PSRR is reduced).
Op-amps can have better input-offset stability.
Comparators can be faster recovering from railed condition (which is their normal use)


After learning this, I modified my circuit so op-amp was kept out of saturation rather than railing (only one polarity needed to be clean, because the other disabled operation.)
Appreciate the recommendation and the link. I know the problem is easily solvable (probably in about a million different ways).

For now, I’m just going to focus on how necessary it’s going to be (ie: truly well-matched LiFePO4 cells should rarely need rebalancing), and only once my manual maintenance becomes more trouble than yet another DIY project will I start planning an actual ‘automatic’ solution.

I’ve already got to adjust various timing and voltage parameters to make my time-shift system keep up with the changing seasons and PG&E’s seasonal peak hour changes on a ~3 month basis, so if I can easily keep my cells bottom-balanced by manually attaching my active balancer for one overnight session on that same schedule, that’s easier than hooking in yet another thing to keep an eye on (YATTKAEO ;)).
 

Hedges

I See Electromagnetic Fields!
Maybe you should implement a partial BMS function, able to detect any cell low voltage and cell voltage divergence so you can shut off inverting before BMS pulls the plug. If BMS gave a warning before disconnect, that would take care of it.

Would expect cells to diverge eventually, and faster as they get older. Does your active balancer at bottom keep working after BMS disconnect?
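In logic terms that warning function is only a few lines; something like this sketch, where the thresholds are placeholders to be picked for the actual pack:

```python
# Sketch of the "partial BMS" warning idea: flag a low cell or excess
# cell divergence early, so the inverter can be shut off before the real
# BMS pulls the plug. Thresholds are illustrative assumptions only.

CELL_LOW_WARN_V = 2.90      # hypothetical per-cell early-warning floor
MAX_SPREAD_WARN_V = 0.10    # hypothetical allowed max-min divergence

def check_pack(cell_voltages):
    """Return a list of warning strings (empty list means all clear)."""
    warnings = []
    low = min(cell_voltages)
    spread = max(cell_voltages) - low
    if low < CELL_LOW_WARN_V:
        warnings.append(f"cell at {low:.3f} V below warn floor")
    if spread > MAX_SPREAD_WARN_V:
        warnings.append(f"cell spread {spread * 1000:.0f} mV too wide")
    return warnings

print(check_pack([3.156, 3.151, 3.148, 3.155] * 2))  # all clear: []
print(check_pack([3.156, 3.151, 2.850, 3.155] * 2))  # low cell + wide spread
```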

I wish my AGM had all cells electrically accessible. They're sealed 6V or 12V batteries; only the higher Ah 2V cells offer that.

I've thought of pressing a DAS (relay matrix) into service for balancing, scanning to read voltages and connecting a power supply.
Expensive equipment for the application, but general purpose off-the-shelf equipment so can be moved from lab to this use and back.
 

fafrd

Photon Sorcerer
Maybe you should implement a partial BMS function, able to detect any cell low voltage and cell voltage divergence so you can shut off inverting before BMS pulls the plug. If BMS gave a warning before disconnect, that would take care of it.
I’ve got battery voltage detection in my SCC that can control a relay and shut down my inverters whenever I want, so the shutdown function is pretty well covered without coming close to the BMS’s LVD (I’m currently shutting down Load at 25.25V, and all 8 cells are within 10mV of 3.156V - that’s the mismatch / ‘stray’ I’m currently characterizing).
Would expect cells to diverge eventually, and faster as they get older. Does your active balancer at bottom keep working after BMS disconnect?
The Active Balancer is independent of the BMS (it has its own direct harness to all 8 cells) and functions as long as no cell is lower than 2.9V (still seeking clarity from the manufacturer about exactly what happens once a cell is below 2.9V, but it appears that the balancer just ignores that one cell and continues balancing the remainder).

I only want the balancer connected when no charge or load currents are flowing and only when the battery has triggered the LVD of my SCC @ 25.25V.
I wish my AGM had all cells electrically accessible. They're sealed 6V or 12V batteries; only the higher Ah 2V cells offer that.

I've thought of pressing a DAS (relay matrix) into service for balancing, scanning to read voltages and connecting a power supply.
Expensive equipment for the application, but general purpose off-the-shelf equipment so can be moved from lab to this use and back.
Again, I could use my SCC’s Load Relay to turn on the Active Balancer only at night (timer) and only once the LVD of 25.25V has been reached (ie: a perfect solution), but I’m currently using that relay to turn off the inverters.

I’m toying with the idea of finding an 8-channel or 16-channel programmable voltage monitor to rig up as a safety feature (early warning if any cell starts behaving strangely), and controlling the Active Balancer should be trivial from a platform like that.

I’m just so relieved to have finally gotten my system built, and pretty much functioning as intended, that I’m gonna’ take a breather before diving into my next project ;).
 

Hedges

I See Electromagnetic Fields!
I’m toying with the idea of finding an 8-channel or 16-channel programmable voltage monitor to rig up as a safety feature (early warning if any cell starts behaving strangely), and controlling the Active Balancer should be trivial from a platform like that.

One idea I have is resistor dividers to attenuate differential taps from each cell down to the range zero ... 5V. Feed that into a mux front-end of an ADC to be read. For 8 channels, you need either 8 differential or 16 single-ended inputs. Might get that in a single microcontroller with mixed-signal functions. Everything I was doing was I2C or SPI.
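To show the arithmetic of the single-ended version (every ratio and reading below is invented for illustration): each tap is attenuated to stay inside the ADC range, and the per-cell voltages fall out as differences of adjacent un-scaled taps.

```python
# Rough sketch of the divider + mux idea for an 8-cell string read
# single-ended: tap n is the voltage from cell n's top to pack negative,
# attenuated into the ADC range. Per-cell voltages are differences of
# adjacent un-scaled taps. All values here are invented examples.

def cell_voltages(tap_readings_v, divider_ratios):
    """tap_readings_v[i] is the attenuated voltage of tap i+1;
    divider_ratios[i] is that tap's Vout/Vin."""
    cumulative = [v / r for v, r in zip(tap_readings_v, divider_ratios)]
    cells = [cumulative[0]]                       # first cell: tap 1 to 0 V
    cells += [hi - lo for lo, hi in zip(cumulative, cumulative[1:])]
    return cells

# Self-consistency demo with made-up cell voltages:
ratios = [1 / n for n in range(1, 9)]        # tap n attenuated by 1/n
true_cells = [3.31, 3.32, 3.30, 3.33, 3.31, 3.32, 3.30, 3.31]
cumulative = [sum(true_cells[: n + 1]) for n in range(8)]
taps = [v * r for v, r in zip(cumulative, ratios)]   # what the ADC sees
print([round(c, 3) for c in cell_voltages(taps, ratios)])
```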

Same scheme could feed comparators. If I build a BMS I might do analog monitoring/shutdown and digitally scanned & controlled balancing.

Another way to go is electronics per cell, with digital isolators.
 

Ellcon123

New Member
I have 2 packs with active balancers. I have a simple voltage circuit controlling a relay on the RUN jumper. When pack voltage hits 27V the balancers turn on. Turn off is around 26.8V. Only been using a week but seems to be working well.
 

fafrd

Photon Sorcerer
I have 2 packs with active balancers. I have a simple voltage circuit controlling a relay on the RUN jumper. When pack voltage hits 27V the balancers turn on. Turn off is around 26.8V. Only been using a week but seems to be working well.
I need the opposite (I’m using a bottom-balanced battery) but would appreciate seeing your circuit diagram if you’ve got something you can share…
 