Questions on BMS use with alternator

Thanks for the generous feedback. I am definitely in the "BMS should do more than just be an emergency off switch" camp. I am not aware of any DC/DC chargers that are able to turn themselves off automatically at a pre-determined voltage setting, so from a practical standpoint the BMS has to be the one handling the charge/discharge end-points. I definitely think there's room for a second layer of protective devices (a Victron Battery Protect, for example) to provide some redundancy.

I guess the real question I am trying to wrap my brain around is this: Let's say my batteries are fully charged and I set off on a long road trip. Presumably the DC/DC charger is still pumping out 40A at ~14.4V, since it's constant current? My understanding is that this would generate heat and degrade the battery?
 
CC/CV doesn't work that way.
Let me explain.
When the delta between the battery voltage and the target voltage is sufficiently large, the charger lowers the charge voltage to maintain 40 amps. Restated... the charger limits the voltage differential to maintain 40 amps. That is what constant current really means.

When the differential between the battery voltage and the target voltage is low enough, the charger does not have to lower the voltage to maintain 40 amps.
This is the start of the constant voltage phase. Restated... the charger no longer has to lower the voltage to keep the current flow at 40 amps.

This is also the point when your charger starts its absorption timer.
From this point, the current flow will diminish as the battery voltage and charger voltage converge.
When there is no voltage differential, there will be no current flow, regardless of the timer.
When the timer reaches 0, the charger is disconnected.

tl;dr: for current to flow, there must be a voltage differential.
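
To make the CC/CV behaviour above a bit more concrete, here is a minimal Python sketch. It assumes a crude battery model (open-circuit voltage plus a fixed internal resistance I picked purely for illustration) and reuses the 40A / 14.4V numbers from this thread; it is a sketch, not any charger's actual firmware.

TARGET_V = 14.4         # charger's constant-voltage set point (absorption voltage)
CURRENT_LIMIT_A = 40.0  # charger's constant-current limit
R_INTERNAL_OHM = 0.01   # assumed pack internal resistance, illustration only

def charger_output(pack_ocv):
    """Return (applied_volts, amps) for one instant of the charge."""
    # Current that would flow if the charger simply pushed the full 14.4 V:
    unlimited_amps = (TARGET_V - pack_ocv) / R_INTERNAL_OHM
    if unlimited_amps > CURRENT_LIMIT_A:
        # CC phase: the charger lowers its output voltage so only 40 A flows.
        return pack_ocv + CURRENT_LIMIT_A * R_INTERNAL_OHM, CURRENT_LIMIT_A
    # CV phase: the charger holds 14.4 V and the current tapers on its own.
    return TARGET_V, max(unlimited_amps, 0.0)

for ocv in (12.8, 13.2, 13.6, 14.0, 14.2, 14.4):
    volts, amps = charger_output(ocv)
    print(f"pack OCV {ocv:5.2f} V -> charger applies {volts:5.2f} V at {amps:4.1f} A")

Running it shows the current pinned at 40A while the pack voltage is low, then tapering toward zero as the pack voltage converges on 14.4V, which is the point being made above.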
 
Helpful explanation, thanks! My DC/DC charger doesn't have any sort of timer in the LiFePO4 setting; it's essentially a power supply at a preset voltage. So if my charger is pumping out 14.4V all day, it's not a problem since the current is so low?
 
The current will be 0 if the batteries reach 14.4V.
But I showed you the manual entry that says the charger has an absorption timer, so what gives?
 
lead acid profile
s1=on and s2=on = 14.4 volts constant voltage
s3=off and s4=on = 13.5 volts float voltage
s5=on=lead acid

lithium profile
s1=off and s2=on = 14.4 volts constant voltage
s3=on and s4=on = no float voltage
s5=off=lifepo4

Please double-check my work; my clerical skills are not good.
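
If it helps with the double-checking, here is the switch list above restated as data in a short Python snippet. The switch positions and voltages are copied straight from the list, not verified against the charger's manual.

# Switch settings as listed in the post above; double-check against the manual.
PROFILES = {
    "lead acid": {
        "switches": {"s1": "on", "s2": "on", "s3": "off", "s4": "on", "s5": "on"},
        "absorption_v": 14.4,
        "float_v": 13.5,
    },
    "lifepo4": {
        "switches": {"s1": "off", "s2": "on", "s3": "on", "s4": "on", "s5": "off"},
        "absorption_v": 14.4,
        "float_v": None,  # no float stage in the lithium profile
    },
}

for name, profile in PROFILES.items():
    switches = " ".join(f"{k}={v}" for k, v in profile["switches"].items())
    float_txt = f"{profile['float_v']} V float" if profile["float_v"] else "no float"
    print(f"{name}: {switches} -> {profile['absorption_v']} V absorption, {float_txt}")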
 
I guess the real question I am trying to wrap my brain around is this: Let's say my batteries are fully charged and I set off on a long road trip. Presumably the DC/DC charger is still pumping out 40A at ~14.4V, since it's constant current? My understanding is that this would generate heat and degrade the battery?
To add to what smoothJoey said... the battery's internal resistance rises as it reaches full charge, and it will taper the current it takes during the absorption phase. That said, you would still overcharge and damage an LFP battery if you left it at 14.4V for long periods without terminating the charge in some way. Charger algorithms usually terminate absorption on one of these conditions:
1) The charger terminates absorption when the current has tapered to a cutoff such as C/30. I think this is the best approach for charger-controlled termination.
2) The charger terminates absorption after a set time. Yours seems to terminate absorption after 3 hours, which is longer than I would want.

Or you can have your BMS control it, and then it will terminate based on actual cell voltages. You can add a small relay in the D+ ignition control line to let the BMS do the charge enable/disable.
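
For illustration, here is a rough Python sketch of the three termination styles described above: tail current (e.g. C/30), a fixed absorption timer, and BMS cell-voltage control. The capacity, cutoff, and cell limit are example numbers, not settings taken from any particular charger or BMS.

CAPACITY_AH = 280.0                # example pack capacity
TAIL_CUTOFF_A = CAPACITY_AH / 30   # ~9.3 A, the "C/30" style cutoff
ABSORB_LIMIT_S = 3 * 3600          # the 3-hour timer mentioned above
CELL_LIMIT_V = 3.5                 # example BMS per-cell charge cutoff

def charge_termination(charge_amps, absorb_seconds, cell_volts):
    """Return the first termination reason that applies, else None (keep charging)."""
    if max(cell_volts) >= CELL_LIMIT_V:
        return "BMS: a cell reached its charge limit"
    if charge_amps <= TAIL_CUTOFF_A:
        return "charger: tail current fell below C/30"
    if absorb_seconds >= ABSORB_LIMIT_S:
        return "charger: absorption timer expired"
    return None

print(charge_termination(35.0, 600, [3.38, 3.40, 3.39, 3.41]))  # None: keep charging
print(charge_termination(8.0, 1800, [3.42, 3.44, 3.43, 3.45]))  # tail-current termination
print(charge_termination(20.0, 900, [3.41, 3.52, 3.40, 3.42]))  # BMS cell-voltage termination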
 
A 3-hour absorption timer is not optimal, but personally I would not lose 60 seconds of sleep over it.
 
Or you can have your BMS control it, and then it will terminate based on actual cell voltages. You can add a small relay in the D+ ignition control line to let the BMS do the charge enable/disable.

That is assuming the BMS used supports this feature.
The vast majority of BMS do not have this feature.
 
Yes, of course you would need a BMS that can control external switches if you want to control it via control lines.

But a separate port BMS with internal FET switches could also be allowed to control charge and discharge as normal operation, rather than just as a protection layer with wider limits. It's just a matter of settings in the BMS vs settings in the charger. Whichever one is set tighter will be the one that has control. It can function safely either way. I just prefer cell monitoring for charge control.

The OP asked about BMS control of a DC-DC charger, so that's what I'm responding to.
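
One quick way to picture "whichever one is set tighter has control" is to convert the BMS per-cell charge limit to an equivalent pack voltage and compare it with the charger's set point. The Python sketch below assumes a 4S pack and uses illustrative numbers; with unbalanced cells the BMS can of course step in even earlier than this simple comparison suggests.

def tighter_limit(charger_setpoint_v, bms_cell_limit_v, cells_in_series=4):
    """Report which layer reaches its limit first for a perfectly balanced pack."""
    bms_equivalent_v = bms_cell_limit_v * cells_in_series
    if bms_equivalent_v < charger_setpoint_v:
        return f"BMS terminates first ({bms_equivalent_v:.1f} V < charger {charger_setpoint_v:.1f} V)"
    return f"charger tapers first (charger {charger_setpoint_v:.1f} V <= BMS-equivalent {bms_equivalent_v:.1f} V)"

print(tighter_limit(14.2, 3.50))  # BMS terminates first (14.0 V < charger 14.2 V)
print(tighter_limit(14.4, 3.65))  # charger tapers first (charger 14.4 V <= BMS-equivalent 14.6 V)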
 
The vast majority of BMS, separate port or common port, are not configurable.
 
Thanks for the generous feedback. I am definitely in the "BMS should do more than just be an emergency off switch" camp.

In this case, I think you should heed @Airtime's advice and further research your choice of BMS. A simple FET based BMS may not be the best choice for your goals.

I am not aware of any DC/DC chargers that are able to turn themselves off automatically at a pre-determined voltage setting, so from a practical standpoint the BMS has to be the one handling the charge/discharge end-points.

Hopefully someone more knowledgeable than me can clarify, but I think you are misunderstanding the nature of these devices.

There are three DC-DC chargers that I am aware of (Victron, Sterling, Renogy). All three limit absorption time (defaults: 2 hrs, 0.5 hrs, and 3 hrs respectively), and Victron and Sterling have user-configurable profiles.

I don't know much about the Renogy or Sterling, but the Victron allows you to set bulk, re-bulk, float, and absorption voltage and duration, as well as a configurable low-voltage disconnect if the source-side (starter) battery voltage drops too low, and a voltage-based engine-shutdown detection feature.
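
For quick reference, here are the figures quoted above restated as data. These are just the defaults mentioned in this thread, not values checked against current manuals, and the Renogy entry reflects only that it wasn't described as user-configurable here.

DCDC_CHARGERS = {
    "Victron":  {"default_absorption_hrs": 2.0, "user_configurable_profile": True},
    "Sterling": {"default_absorption_hrs": 0.5, "user_configurable_profile": True},
    "Renogy":   {"default_absorption_hrs": 3.0, "user_configurable_profile": False},
}

for name, info in DCDC_CHARGERS.items():
    cfg = "user-configurable profile" if info["user_configurable_profile"] else "preset profiles only"
    print(f"{name}: {info['default_absorption_hrs']} h absorption by default, {cfg}")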
 
On page 22 of the manual it says that absorption is limited to 3 hours.
So the product does do charge termination, albeit crudely.

I actually AM looking at the DCC50S for this same use case as the OP ($290 for a B2B plus MPPT had me sold), and I just bought two 120A FET-based BMS off AliExpress today before finding this thread... (2P4S 190Ah for 380Ah total)

I was looking for the alternator charging info in the DCC50S manual, but I don't see the same mention of a 3-hour absorption as in the other product's manual. It does say it cuts off at 15.5V and keeps boost charge at 14.4V.

I can deal with the BMS being used as a voltage-controlled off switch (instead of the charger), since this will only occur on long drives or with an alternator issue anyway; really the voltage should just sit at 14.4V the whole time and not reach these limits. I might just wire a manual off switch into the charger ignition line and check the BMS voltages when I stop for gas. It's my first build, so I'm trying to simplify the number of components, although I may transition to a dummy-proof BMS-controlled system like @Airtime suggests in the future.

Great info in this thread!
 
So hypothetically speaking, if I set the BMS overcharge protection parameter to 3.5V and recovery to 3.35V, and the charger is set to put out 14.2V: when the battery hits 14V, the BMS will disconnect the charger, right? Any issue with that workflow?
 
Yes, that would most likely have the BMS in control, and if the cells were perfectly balanced then the battery voltage at the battery terminals would be 14V when the BMS shut off charging. If one cell were out of balance and started peaking early, then the actual battery voltage would be lower. For example, if one cell hit 3.5V while the others were still at 3.4V, the battery voltage would be 13.7V when the BMS disabled charging. And once an out-of-balance cell gets full it will rise fast, much faster than the others. That's one of the advantages of having the BMS control charging: you never have a cell go too high, even if it does get out of balance.

I said the BMS would be "most likely" in control because it's possible that voltage drops in the full charge path (including the positive supply and negative return) could be more than 0.2V, depending on wire size and quality of connections. Most chargers don't have remote sense, so they can't see the actual battery voltage, just the voltage at the charger terminals. If you have a 0.2V drop then you have a race condition, as both the charger and the BMS would be hitting charge termination around the same time. And if you had, say, 10 milliohms of total resistance (not that hard to do), you would have a 0.5V drop at 50A charge current. So when the battery reached 13.7V, the charger could be seeing 14.2V. Charge current will prematurely start tapering just due to the voltage drop, not because the battery is full, and you'll get slower charging.

I'd set the charger voltage higher; in this model it is just a second layer of protection, never hit in normal operation unless, say, the BMS or remote switching failed. So set it to 14.4V or even 14.6V. You'll charge faster, and the BMS will terminate charging when the highest cell hits your set limit, regardless of any cell imbalance, cable voltage drops, etc.
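
Here is the cable-drop arithmetic from this post as a small worked example in Python; the 10 milliohms and 50A are the example values mentioned above, not measurements from any actual install.

WIRE_RESISTANCE_OHM = 0.010   # ~10 milliohms total loop resistance (example)
CHARGE_CURRENT_A = 50.0       # example charge current
CHARGER_SETPOINT_V = 14.2     # voltage as measured at the charger's own terminals

drop_v = WIRE_RESISTANCE_OHM * CHARGE_CURRENT_A   # 0.5 V lost in the cabling
battery_v = CHARGER_SETPOINT_V - drop_v           # 13.7 V actually at the battery terminals

print(f"cable drop at {CHARGE_CURRENT_A:.0f} A: {drop_v:.2f} V")
print(f"charger sees {CHARGER_SETPOINT_V:.1f} V while the battery is only at {battery_v:.1f} V")

So the charger starts tapering as soon as its own terminals reach 14.2V, even though the battery is still at 13.7V, which is why the suggestion is to set the charger higher and let the BMS (which sees the real cell voltages) terminate the charge.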
 
