I'm upgrading my AGM battery to a Battle Born LiFePO4, so I also have to upgrade my RV's charging system (the old one can't reach 14.4 V). What I'm trying to understand is why even the most expensive lithium-capable chargers (Victron Multi, Magnum) do NOT have an external shunt that tells the charger how many amps are actually going into the battery, so you can set a "Full Amps" setpoint (sometimes called "Tail Current") and have the absorption stage last only as long as the battery actually needs to reach full charge.

Instead, I'm told a timing algorithm decides when the absorption voltage (14.4 V) gets dialed down to the float voltage (13.6 V). That's fine if the battery's SOC is nearly zero, but if the SOC only drops to 80% and you plug back into shore power, unless I'm mistaken, the full absorption timer runs again, and this time the battery can sit at 14.4 V far longer than it needs to.

Am I correct in assuming that, unlike an SLA battery, this won't hurt a LiFePO4? And if so, why can't it stay at 14.4 V indefinitely? (The charger needs to output something for my 12 V loads.) In other words, why does there have to be a float mode at all... why not just stay in constant-voltage mode at 14.4 V?
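To make clear what I'm asking for, here's a rough sketch of the two termination strategies as I understand them. All the numbers and function names are just illustrative, not from any real charger's firmware:

```python
# Two ways a charger could decide when the absorption (CV) stage is done.
# Values below are illustrative examples only.

ABSORB_V = 14.4       # absorption voltage (V)
FLOAT_V = 13.6        # float voltage (V)
TAIL_A = 2.0          # "Full Amps" / tail-current setpoint (A)
ABSORB_TIME_S = 3600  # fixed absorption timer (seconds)

def absorption_done_tail(battery_current_a: float) -> bool:
    """Shunt-based: drop to float once charge current tapers below the setpoint."""
    return battery_current_a <= TAIL_A

def absorption_done_timer(seconds_in_absorb: float) -> bool:
    """Timer-based: hold absorption voltage for a fixed window, regardless
    of how much current the battery is actually accepting."""
    return seconds_in_absorb >= ABSORB_TIME_S

# A battery that was only drawn down to ~80% SOC tapers quickly.
# Hypothetical current samples (amps into battery), one per minute:
current_taper = [20.0, 8.0, 3.0, 1.5]

for minute, amps in enumerate(current_taper):
    if absorption_done_tail(amps):
        print(f"tail method: switch to {FLOAT_V} V at minute {minute}")
        break
# The timer method would hold 14.4 V for the full hour either way.
```

With the tail-current method the charger drops to float a few minutes after the current tapers off; with the timer method, the same nearly-full battery sits at 14.4 V for the whole fixed window. That gap is exactly what I'm asking about.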
Any insights would be greatly appreciated, so I don't stay pissed that a $1,200 charger lacks the external shunt and "Full Amps" setting that my 12-year-old $400 SLA charger had.