diy solar

Charge termination "drinking bird"-style based on SoC

FyKnight

New Member
Joined
Feb 1, 2021
Messages
39
My DIY off-grid 48V LiFePO4 system will have a variety of chargers connected to it — at least 5, due to different arrays. Some of them will be smarter than others. How can they know when to stop their CV charging stage? I don't think I can use tail current / end amps because no one charger will have an idea of the total current going into the battery... even leaving out the other chargers, they don't know what the loads are doing. I don't want to leave the cells pinned to 100% for the whole day. So I think I'd better have a separate device to direct the end-of-charge process. A "charge director", between the BMS and (solar, mostly) charge controllers.

And for restarting charging, I understand the usual approach is absorption and float voltage set points, but I can't see how that could be reliable with the LFP chemistry voltage being so variable under different load conditions, and elapsed settling time. Using coulomb-counted SoC would be a much more ... functional... way of determining when to restart / re-bulk.

So yeah I'd like a device that can prevent charging (but still allow discharging) when a certain high SoC is reached, and then allow it again when the SoC has dropped to a given level. Ideally it could allow charging to an even higher SoC every once in a while... to allow for (more?) cell top balancing time, and to recalibrate its idea of what 100% SoC is.

Technically, the BMS has all this information and capability. But logically, it ought to be reserved for protection only, and not trigger as part of the regular charging process. (In fact I'm planning to set up my BMS to trip circuit breakers when it has an issue, so I'm forced to figure out what went wrong and fix it)

Does this "charge director" device already exist, or does any BMS implement this feature already? (I currently have the JiKong BMS, relay version)

If not, I guess I'll build one. It'll sit between the BMS and the loads + chargers. I'll use some SoC board (system on chip!) to talk to the BMS over RS-485 and get its idea of SoC (state of charge!), current and time. When the SoC has increased past a limit, then regardless of voltage or current, it will disable charging. It can disable charging but allow discharging the same way that FET-based BMSes do it, by bypassing a diode when the FETs are on (allows charge), and leaving it in-circuit when off (disallows charge). Although I'll probably use a relay and discrete diodes instead of FETs, since I'm all-in on relays now :) That will allow the full available solar power to go to loads, with any extra requirement still provided by the battery. OTOH if solar > loads, then the voltage of the capacitance in the system will quickly rise above the absorption level and the SCCs will stop. When the BMS reports that the battery SoC has dropped to the low set-point (or when the sun will soon set) it can allow charging again by closing the relay.
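Roughly the hysteresis logic I have in mind for the charge director, sketched in Python. The thresholds and names are placeholders, and actually fetching SoC from the BMS over RS-485 isn't shown:

```python
# Sketch of the "charge director" hysteresis, with a periodic
# full-charge cycle for top balancing / SoC recalibration.
# All thresholds are illustrative placeholders.

SOC_STOP = 90.0         # disable charging above this SoC (%)
SOC_REBULK = 75.0       # re-enable charging below this SoC (%)
SOC_FULL = 100.0        # periodic target for balancing/recalibration
FULL_CHARGE_EVERY = 14  # days between full-charge cycles

class ChargeDirector:
    def __init__(self):
        self.charging_allowed = True
        self.days_since_full = 0

    def target_soc(self):
        # Every couple of weeks, let the pack go all the way up
        # for balancing time and SoC recalibration.
        if self.days_since_full >= FULL_CHARGE_EVERY:
            return SOC_FULL
        return SOC_STOP

    def update(self, soc):
        """Called periodically with the BMS-reported SoC (%).
        Returns True if the charge relay should be closed."""
        if self.charging_allowed and soc >= self.target_soc():
            self.charging_allowed = False
            if soc >= SOC_FULL:
                self.days_since_full = 0
        elif not self.charging_allowed and soc <= SOC_REBULK:
            self.charging_allowed = True
        return self.charging_allowed
```

The point of the two thresholds is exactly the re-bulk hysteresis: once charging stops at the high mark, it stays off until SoC sags to the low mark, so a load-induced voltage dip can't retrigger a charge cycle on its own.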

It'd be a fun project but I'd still prefer to buy this if it already exists. Thanks in advance for any responses — this is the best forum! In fact I'm sure some members have created something like this for themselves.
 
When I set up systems with multiple chargers that don't communicate, I set all bar the one that gets the last sunlight to lower setpoints for absorb and float.

This works well as the balancing phase requires less current in any case.
 
The most common method is to set all the chargers to use an absorb timer. When the absorb setpoint is hit, they all start counting down.

You will need to verify that all their reported voltages agree (not all units are calibrated well). Also ensure a reasonably low voltage drop between the controllers and the pack.

The ideal setup is charge controllers which all communicate, but that isn't a requirement.

It is moderately accurate to limit your pack SOC by setting the absorb voltage. I don't see any requirement for full digital communication between modules for that.
 
Solar chargers don't like it when you disconnect them from the batteries while charging.

You risk blowing up the MPPTs.

Why do you care about having your batteries sitting at 100% SOC? They'll age out on you before they die from sitting at full charge.
 
Solar chargers don't like it when you disconnect them from the batteries while charging.

You risk blowing up the MPPTs.

Why do you care about having your batteries sitting at 100% SOC? They'll age out on you before they die from sitting at full charge.
1: You can disconnect the PV from the charger.

2: No, they don't die of age before they die of dendrite failure, which is accelerated by holding at a voltage above 3.45 V.
 
using multiple solar charge controllers with one battery and managing each in a decent orchestrated way is a goal for me too.

for now, i am approaching it by using victron bluesolar mppt because of the wired data port to read and importantly write data, including charge settings.

main microcontroller to communicate with BMS and/or Dedicated Shunt to get SOC and ampere flow, then send commands to each mppt to do what is wanted.

victron mppt can be enabled/disabled over the ve.direct port afaik, which is just a bidirectional, four-wire UART interface
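for illustration, a rough sketch in python of parsing the ve.direct text frames (tab-separated LABEL/VALUE lines); pyserial is assumed for the actual serial port, and the helper names are my own placeholders:

```python
# rough sketch: parse ve.direct text-protocol lines, which arrive as
# LABEL<TAB>VALUE pairs (e.g. b"V\t53200\r\n" for battery millivolts).
# helper names are placeholders; no checksum verification here.

def parse_vedirect_line(raw: bytes):
    """Parse one text-protocol line into (label, value), or None if malformed."""
    try:
        label, value = raw.decode("ascii", errors="ignore").strip().split("\t", 1)
    except ValueError:
        return None
    return label, value

def frame_to_dict(lines):
    """Collect a batch of raw lines into a dict of field labels to values."""
    fields = {}
    for raw in lines:
        parsed = parse_vedirect_line(raw)
        if parsed:
            fields[parsed[0]] = parsed[1]
    return fields
```

with pyserial this would be fed from something like serial.Serial("/dev/ttyUSB0", 19200, timeout=1) and port.readline(). afaik writing settings (and remote enable/disable) goes through victron's separate HEX protocol, which this sketch doesn't cover.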

interested to learn more!
 
Why not use voltage like most do?
voltage should work, as long as each mppt is adjusted to the same real voltages. checking each of the mppt output terminals with a multimeter can help find the settings for each device that result in the same real observed voltage.

dividing absorb end amps by number of chargers should provide a workable first estimation of new absorb end amps parameter?

arrays facing towards the latest sun of the day might benefit from having a higher end amps setting than the "earlier" arrays
 
Solar chargers don't like it when you disconnect them from the batteries while charging.

You risk blowing up the MPPTs.
is this true with all MPPT? i have seen a couple of threads mentioning unplugging the battery while charging being related to a failure. however, i would be upset if my mppt let out the magic smoke only due to disconnecting the battery. any links to examples would be welcome. i assume there would be a voltage spike on the mppt output side upon disconnect, so the circuit would need to respond quickly maybe? sorry if this makes no sense

mainly, if the battery hits overvoltage, which is likely to occur during charging, then of course it would be disconnected.

i expect manufacturers to support this situation due to the high probability of it occurring.

however, also accept that not all mppt will have the same protections
 
When I set up systems with multiple chargers that don't communicate, I set all bar the one that gets the last sunlight to lower setpoints for absorb and float.

This works well as the balancing phase requires less current in any case.
That is ... a very logical and practical approach. The kind that seems so obvious in retrospect. Thanks for this advice*

I'll do this until I make an SOC-based charge director device. I think it'll still be better given variable loads and insolation.

* and your many other succinct posts I've read that are usually well ahead of the curve : )
 
The most common method is to set all the chargers to use an absorb timer. When the absorb setpoint is hit, they all start counting down.

You will need to verify that all their reported voltages agree (not all units are calibrated well). Also ensure a reasonably low voltage drop between the controllers and the pack.

The ideal setup is charge controllers which all communicate, but that isn't a requirement.

It is moderately accurate to limit your pack SOC by setting the absorb voltage. I don't see any requirement for full digital communication between modules for that.
Thanks. I think some of my chargers don't have this absorb / CV timer but I could always upgrade them. Definitely will be keeping the resistance between chargers and pack as low as possible.

I understood from reading other threads that I could get the pack to 100% SoC (albeit more slowly) at any absorb voltage over about 3.45 V/cell... but much lower than that and you're probably leaving a lot of solar on the table.
 
Solar chargers don't like it when you disconnect them from the batteries while charging.

You risk blowing up the MPPTs.
I've read that in the manual for one of my SCCs. However, I theorise that with sufficient capacitance on its battery side (even when cut off), the SCC couldn't know that it isn't just charging a tiny battery. I mean, it pretty much is. If the inverter input capacitors aren't enough, then I'll add my own.

I thought about cutting off on the PV side, and that is an option, but I'd like to continue using available PV for loads even while preventing the battery from receiving it. The device I've outlined would allow that. I'll still cut off on the PV side if the BMS triggers though (for overcharging or low temp charging)
 
main microcontroller to communicate with BMS and/or Dedicated Shunt to get SOC and ampere flow, then send commands to each mppt to do what is wanted.
Excellent, I'm on a similar path, except I know I can't communicate with some of my MPPTs. So I'll just cut them off instead... (i.e., trick them into thinking the battery has reached full charge)
 
voltage should work, as long as each mppt is adjusted to the same real voltages. checking each of the mppt output terminals with a multimeter can help find the settings for each device that result in the same real observed voltage.

dividing absorb end amps by number of chargers should provide a workable first estimation of new absorb end amps parameter?
I think this approach does make sense in a 'steady state' situation. But given real-life load and solar charge variability, I don't see how any fixed voltage setting can achieve the aim of first allowing the cells to drop to a lower SoC.

Surely when you put on a load (or if you have a constant load and a cloud goes over) then that'll drop the pack voltage without lowering the SoC much. Which would undesirably trigger another charge cycle. This could repeat quite a lot, keeping your cells at high SoC... this is just the first example that comes to mind
 
is this true with all MPPT? i have seen a couple of threads mentioning unplugging the battery while charging being related to a failure. however, i would be upset if my mppt let out the magic smoke only due to disconnecting the battery. any links to examples would be welcome. i assume there would be a voltage spike on the mppt output side upon disconnect, so the circuit would need to respond quickly maybe? sorry if this makes no sense

mainly, if the battery hits overvoltage, which is likely to occur during charging, then of course it would be disconnected.
Every MPPT manual I've seen warned about that, and I personally destroyed at least one PWM controller (which had the same warning) by disconnecting the battery before the PV.

Usually you don't charge an LFP battery until overvoltage occurs. You set the maximum charge voltage about 0.2V under the maximum charge voltage the manufacturer specifies for the battery.

Disclaimer: all following numbers are randomly picked, for illustration purposes ONLY. A 48V battery might have a max spec voltage of 57.6V, so you set your charger for 56.4V. The BMS for that battery might only trigger overvoltage at 58.8V, which is far higher, and only then would the MPPT risk being damaged.

You don't charge a battery until a BMS triggers high voltage cutoff - that's only for YouTube videos.
I've read that in the manual for one of my SCCs. However, I theorise that with sufficient capacitance on its battery side (even when cut off), the SCC couldn't know that it isn't just charging a tiny battery. I mean, it pretty much is. If the inverter input capacitors aren't enough, then I'll add my own.
Assume it's the middle of the day, your SCC is pushing 40 A, and suddenly the battery disappears (relay opens). Where does all that energy go?
A battery can accept a lot of current compared with capacitors. It's like closing a rushing river with an instant dam. The voltage in the system will shoot up and, depending on how fast the software in the SCC reacts to limit it, will arc between components.
 
@FyKnight, why so difficult?

If you set all your chargers to an absorption voltage of 3.45 V per cell, you’ll never overcharge and the SOC will be max somewhere between 95%-99%. It’s all about staying away from the steep curves in the charging profile.

And with a float voltage of 3.35 V per cell the solar will take the load before draining the battery.
 
Surely when you put on a load (or if you have a constant load and a cloud goes over) then that'll drop the pack voltage without lowering the SoC much. Which would undesirably trigger another charge cycle. This could repeat quite a lot, keeping your cells at high SoC..
If you are concerned about the lifespan of the battery - just get more and never charge them full.

Just stop at 80-90% SOC and call it a day. My electric car is set from the factory to charge to 86% of max and never go below 10%.
If you set all your chargers to an absorption voltage of 3.45 V per cell, you’ll never overcharge and the SOC will be max somewhere between 95%-99%. It’s all about staying away from the steep curves in the charging profile.

And with a float voltage of 3.35 V per cell the solar will take the load before draining the battery.
That's what I do - never charge full.
 
If you are concerned about the lifespan of the battery - just get more and never charge them full.

Just stop at 80-90% SOC and call it a day. My electric car is set from the factory to charge to 86% of max and never go below 10%.

That's what I do - never charge full.
Firstly, don't conflate EV procedures. My 30 kWh EV removed all that 80% nonsense and has charged to 100% every working day for 5 years. EV batteries are subject to very high C-rate charge and discharge, not the same as solar.

The key to long Li life:

Charge to 100%, then remove the charge voltage; don't float.

Limit depth of discharge, so recharge whenever possible.

Charge at between 0.2C and 0.5C to minimise SEI layer growth.

Sense net battery current to terminate the CV phase.
 
My DIY off-grid 48V LiFePO4 system will have a variety of chargers connected to it — at least 5, due to different arrays. Some of them will be smarter than others. How can they know when to stop their CV charging stage? I don't think I can use tail current / end amps because no one charger will have an idea of the total current going into the battery... even leaving out the other chargers, they don't know what the loads are doing. I don't want to leave the cells pinned to 100% for the whole day. So I think I'd better have a separate device to direct the end-of-charge process. A "charge director", between the BMS and (solar, mostly) charge controllers.
Smart MPPT units like Victron can share a common battery current to determine net battery end amps. Each controller should then fully stop charging and effectively disconnect from the bank.

If your solar controllers can use battery current sensing, then all you need is to distribute that current information, buffering it to each charge controller.
And for restarting charging, I understand the usual approach is absorption and float voltage set points, but I can't see how that could be reliable with the LFP chemistry voltage being so variable under different load conditions, and elapsed settling time. Using coulomb-counted SoC would be a much more ... functional... way of determining when to restart / re-bulk.
Voltage re-bulk does work, as the Li V-I curve isn't completely flat. It works in my system.

So yeah I'd like a device that can prevent charging (but still allow discharging) when a certain high SoC is reached, and then allow it again when the SoC has dropped to a given level. Ideally it could allow charging to an even higher SoC every once in a while... to allow for (more?) cell top balancing time, and to recalibrate its idea of what 100% SoC is

Many BMSes carry out this function.
Technically, the BMS has all this information and capability. But logically, it ought to be reserved for protection only, and not trigger as part of the regular charging process. (In fact I'm planning to set up my BMS to trip circuit breakers when it has an issue, so I'm forced to figure out what went wrong and fix it)
Does this "charge director" device already exist, or does any BMS implement this feature already? (I currently have the JiKong BMS, relay version)

If not, I guess I'll build one. It'll sit between the BMS and the loads + chargers. I'll use some SoC board (system on chip!) to talk to the BMS over RS-485 and get its idea of SoC (state of charge!), current and time. When the SoC has increased past a limit, then regardless of voltage or current, it will disable charging. It can disable charging but allow discharging the same way that FET-based BMSes do it, by bypassing a diode when the FETs are on (allows charge), and leaving it in-circuit when off (disallows charge). Although I'll probably use a relay and discrete diodes instead of FETs, since I'm all-in on relays now :) That will allow the full available solar power to go to loads, with any extra requirement still provided by the battery. OTOH if solar > loads, then the voltage of the capacitance in the system will quickly rise above the absorption level and the SCCs will stop. When the BMS reports that the battery SoC has dropped to the low set-point (or when the sun will soon set) it can allow charging again by closing the relay.
Yes you could, but accurate coulomb counting is not easy, and really you need a remote on/off SCC feature (most must have that these days).
It'd be a fun project but I'd still prefer to buy this if it already exists. Thanks in advance for any responses — this is the best forum! In fact I'm sure some members have created something like this for themselves.
I think it could be easily done by buying a smart MPPT controller family.

Secondly, it could be done by distributing a battery current measurement to every solar charger.

Or you could build a BMS.
 
Every MPPT manual I've seen warned about that, and I personally destroyed at least one PWM controller (which had the same warning) by disconnecting the battery before the PV.

Usually you don't charge an LFP battery until overvoltage occurs. You set the maximum charge voltage about 0.2V under the maximum charge voltage the manufacturer specifies for the battery.

Disclaimer: all following numbers are randomly picked, for illustration purposes ONLY. A 48V battery might have a max spec voltage of 57.6V, so you set your charger for 56.4V. The BMS for that battery might only trigger overvoltage at 58.8V, which is far higher, and only then would the MPPT risk being damaged.

You don't charge a battery until a BMS triggers high voltage cutoff - that's only for YouTube videos.

Assume it's the middle of the day, your SCC is pushing 40 A, and suddenly the battery disappears (relay opens). Where does all that energy go?
A battery can accept a lot of current compared with capacitors. It's like closing a rushing river with an instant dam. The voltage in the system will shoot up and, depending on how fast the software in the SCC reacts to limit it, will arc between components.
No it's not. Water has mass; electricity does not. The analogy has limits.

Hence if you open a load the current just stops (assuming non-inductive systems).

Victron will survive battery disconnects for example.

The main issue is not to disconnect the battery ground from the MPPT.

Also many mppt controllers have remote disconnect features anyway.
 
Victron will survive battery disconnects for example.

The main issue is not to disconnect the battery ground from the MPPT.
You can always design a system to deal with anything. Victron is good about that - I wouldn't trust others.

You don't know how the SCC MPPT is designed internally. Does the computer chip take its power from the battery side or the PV side? Both?

Hence if you open a load the current just stops (assuming non-inductive systems).
Many MPPTs have large coils inside - so you've got inductive loads.

Also many mppt controllers have remote disconnect features anyway.
That should work. I'm just advising the thread author not to disconnect the battery from the MPPT without knowing what they are doing - there are risks.
 
Depending on the design, some PV controllers can be damaged by repeated disconnects at higher loads. There are myriad different designs for PV controller schematics. Check the manual for details. Some are totally fine with this, while others will experience an inductive/back-EMF kick, which can damage them.

My Outback flexmax unit can spike to 50% above its typical output voltage if I disconnect in full sun charging. This isn't an issue, as I am running well under its maximum voltage ratings. However, if I was running near its voltage limits, this could cause damage to components.

Unless you live in a cool/cold area (or the cells are climate controlled to 50-65F), many cell makes will experience ~1% capacity loss per year from aging alone, plus cycle losses. In hot locations it can be even more. Assuming an active system with minor loads, terminate your charge at a sane voltage (<14.1V in most cases) and the system will settle under 3.35 VPC pretty quickly. Depending on your current rates, that may mean a pack becomes difficult to balance due to IR changes after around 8-12 years, but there are a lot of variables. A very hot location cycled daily could see 5 years, and a cold location (or a tightly climate-controlled one) with rare cycling could see double that.

The most critical thing is to avoid floating at 100%; set the float voltage to allow the pack to discharge a little within the first hour after completing the absorb/balance phase. With multiple controllers, you can select one based on its PV orientation or size to float at a higher voltage (about 0.1V higher) and supply most of the base loads, so you get the SOC down a bit without wasting PV harvest. Though a PV dump into a hot water tank, or some other way to harness excess PV without a high float, isn't a bad idea. I did see one system that triggered a surplus dump into a hot water tank via the main AC bus, using SOC-triggered relay logic. Seemed pretty slick, and it pulled the system down from 100% to 98% after a balance/absorb, if excess PV was available.

As far as absorb timers, determine them experimentally by manually monitoring return current. Unless you have a very large PV array, 2-5% C return current at your chosen absorb voltage is fine.
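To put numbers on that, a quick back-of-envelope in Python (the 280 Ah pack and five chargers are placeholders; this is just the naive even split suggested earlier in the thread):

```python
# worked example: split a pack-level tail-current (end amps) target
# across several non-communicating chargers. numbers are placeholders.

pack_capacity_ah = 280      # e.g. a 280 Ah 48 V pack
n_chargers = 5

# 2-5% of C as the pack-level return (end amps) current
low = 0.02 * pack_capacity_ah    # about 5.6 A
high = 0.05 * pack_capacity_ah   # about 14.0 A

# naive first estimate: divide evenly per charger
per_charger_low = low / n_chargers    # about 1.1 A
per_charger_high = high / n_chargers  # about 2.8 A
print(per_charger_low, per_charger_high)
```

In practice you'd bias the split toward the array that sees the last sun, per the earlier advice, and then tune experimentally by watching the actual return current.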
 
Assume it's the middle of the day, your SCC is pushing 40 A, and suddenly the battery disappears (relay opens). Where does all that energy go?
A battery can accept a lot of current compared with capacitors. It's like closing a rushing river with an instant dam. The voltage in the system will shoot up and, depending on how fast the software in the SCC reacts to limit it, will arc between components.
As others have mentioned, the dam metaphor isn't entirely relevant, but the other thing is this is for charge termination. It'll be happening when the SCCs are well into CV mode and only pushing a few amps. Let's say it's a combined 10 A at 50 V, so 500 W. If the SCC's control loop checks the target voltage every millisecond, that's 0.5 joules that the capacitors have to absorb. If we add half a joule to a, say, 20 mF capacitor at 50 V, then from ΔE = ½C(V² − 50²) = 0.5 J we get V = √(50² + 2·0.5/0.02) = √2550 ≈ 50.5 V. Seems fine then? I can see an 80 V, 22 mF capacitor on Aliexpress for $6.50.
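Sketching that arithmetic out in Python, as a check (same assumed numbers: 500 W combined output, a 1 ms control loop, 20 mF of bus capacitance at 50 V):

```python
import math

# energy dumped into the caps during one control-loop interval
power_w = 500.0   # combined SCC output while in CV mode (assumed)
loop_s = 1e-3     # assumed SCC control-loop interval
c_farads = 20e-3  # assumed battery-side bus capacitance
v0 = 50.0         # bus voltage at the moment of disconnect

delta_e = power_w * loop_s                     # 0.5 J added
e0 = 0.5 * c_farads * v0**2                    # 25 J already stored
v1 = math.sqrt(2 * (e0 + delta_e) / c_farads)  # new bus voltage

print(round(v1, 2))  # prints 50.5
```

So the voltage overshoot per loop interval is about half a volt under these assumptions; a slower control loop or smaller capacitance scales it up accordingly.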

If you are concerned about the lifespan of the battery - just get more and never charge them full.
That's a good fallback option. But I'd like to take it close to full so I can do balancing and get a better idea of SoC / where full is. Also... why not do it better if I can?
 
@FyKnight, why so difficult?
It's a character flaw, I admit it.
If you set all your chargers to an absorption voltage of 3.45 V per cell, you’ll never overcharge and the SOC will be max somewhere between 95%-99%. It’s all about staying away from the steep curves in the charging profile.
Sure but won't it then take a long time to get there? Maybe more time than I have daylight?
 
The key to long Li life:

Charge to 100%, then remove the charge voltage; don't float.

Limit depth of discharge, so recharge whenever possible.

Charge at between 0.2C and 0.5C to minimise SEI layer growth.

Sense net battery current to terminate the CV phase.
That is all good except for the last point, it assumes like a lot of "charging procedures" that unlimited power is available. It's not. We've barely had a bright sunny day since Covid where I am. It's been really frustrating just trying to test second-hand panels to see what their max output is! And loads come and go, e.g. fridge or aircon cycling on and off. That would all affect net current. So that's why I was thinking of looking at SoC (best guess by BMS) and net current to terminate the CV phase, instead of just net current.

Smart MPPT units like Victron can share a common battery current to determine net battery end amps. Each controller should then fully stop charging and effectively disconnect from the bank.
It'd be nice if my budget could stretch to Victron gear. Maybe for my next build.
Voltage re-bulk does work, as the Li V-I curve isn't completely flat. It works in my system.
OK, thanks for this info. I was worried it'd be affected by loads too..
I think it could be easily done by buying a smart MPPT controller family.

Secondly, it could be done by distributing a battery current measurement to every solar charger.

Or you could build a BMS.
Option 1 is the go-to, I understand, and I kinda expected it to be suggested in the first reply. However I'd like to use the MPPTs in the much-less-capable all-in-one inverters I can currently afford.

Option 2 is also problematic with less expensive MPPTs, e.g. the MakeSkyBlue units that I have. They don't have any comms at all.

Option 3 is probably what I'll do then. I already have a BMS for insuring the battery against faults in other equipment. This "charge director" is somewhere between a BMS and a SCC in concept. (And directly between them physically, hah!). It'll only cost about $20 in hardware to implement, plus a relay of some kind — definitely under $100 anyway. And it might be better and cleaner overall too.

Thanks very much for your comprehensive reply! Excellent overview of my options.
 
