Don't know if this will help or confuse, but understanding what charging looks like is useful. When a voltage higher than the battery voltage is applied to the battery, current will flow. Assuming the battery can indeed take a charge, it will charge at a rate (current flow) that depends on the voltage difference and the total series resistance of the power source, the wiring, and the battery's internal resistance. Different battery chemistries have different charging curves, but the battery voltage (I mean the internal cell voltage) will rise as the battery tries to reach voltage equilibrium with the power source. Take LiFePO4 for example: if the battery has a low state of charge (SOC) and you apply 3.2 V per cell, it will charge some, but at some point it will come up to 3.2 V and current will no longer flow. Change the input voltage to 3.3 V and, if you hold it long enough, the battery will (someday) reach a pretty high SOC. Because we can't wait days to charge our battery, we will often apply higher voltages to get the current to flow faster.

At the time of starting this thread I had got myself so confused.. I was first concerned whether or not I should in fact be using a BMS... there seemed to be so many similar settings (to me) in the Epever that I feared there could be a damaging conflict..
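The series-circuit relationship from the charging explanation above (current set by the voltage difference divided by the total series resistance) can be sketched roughly in Python. The resistance and voltage numbers here are made up purely for illustration, not taken from any datasheet:

```python
# Toy illustration of the series-circuit view of charging.
# All component values below are invented examples.

def charge_current(source_v, battery_v, series_r):
    """Current into the battery: (V_source - V_battery) / R_series.

    series_r lumps together the supply, wiring, and internal cell
    resistance. Returns 0 when the battery has reached the source voltage.
    """
    return max(source_v - battery_v, 0.0) / series_r

# One LiFePO4 cell, with an assumed 0.05 ohm total series resistance:
print(charge_current(3.30, 3.20, 0.05))  # ~2 A flows while the voltages differ
print(charge_current(3.20, 3.20, 0.05))  # 0.0 A at equilibrium -> charging stops
```

This is why the post says a 3.2 V supply on a low cell charges "some but not much": as the cell voltage climbs toward the supply voltage, the numerator shrinks and the current dies off.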
I convinced myself the BMS should be used.. and made King (looking after charge and discharge).. with the Epever looking on as slave.. simply making controlled solar power available, and only taking over as King if there were no BMS..
I now feel I understand so much more.. my confusion with the Epever controller and its settings is much less, and my respect for it has improved..
If I've got it closer to right.. then going forward I will look on my Epever and its settings as King of solar-to-cell charging.. and my BMS will generally look after my discharge limit and balance the cells, with its over-voltage settings as an added cell protection..
Thanks guys..
A smart charger can raise the voltage enough to reach some preset current limit and let the battery take charge faster. But to protect the battery, the measured cell voltage will be held to a value set as the "boost" voltage. The real key to understanding is the series circuit: the current limit causes a lot of the power-supply voltage to be dropped across the current-limiting element, which lets the battery act more like a short (in the series circuit) and pull the voltage at its own terminals down, so the voltage at the battery stays at or below boost. Once the battery can no longer hold its voltage below boost, the charger starts backing off so as not to push the battery over boost, and the current tapers off. Without a smart charger, the power-supply output voltage can't/shouldn't be higher than the boost voltage. From the battery's point of view, smart or dumb charger, it never sees more than boost; but with a dumb charger, getting to boost will take longer.
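The two limits described above (constant current until the cell reaches boost, then constant voltage while the current tapers) can be sketched as a small Python model. The cell is treated as an internal voltage (emf) behind an internal resistance; the resistance and current figures are invented for illustration, not from any real charger:

```python
# Minimal CC/CV sketch: the charger obeys whichever limit is lower,
# the amp limit (CC) or the boost terminal-voltage limit (CV).
# All numbers here are invented examples.

def charger_current(emf, boost_v, max_amps, r_int=0.05):
    """Current a CC/CV charger delivers for a cell at internal voltage emf.

    Terminal voltage = emf + I * r_int. The charger never lets the
    terminal exceed boost_v (CV limit) nor the current exceed max_amps
    (CC limit); whichever allows less current wins.
    """
    i_cv = (boost_v - emf) / r_int  # current that puts the terminal exactly at boost
    return max(min(i_cv, max_amps), 0.0)

# Low SOC: the CV limit would allow 7 A, so the 5 A current limit governs:
print(charger_current(3.20, 3.55, 5))   # CC phase: held at the 5 A limit
# Near full: emf is close to boost, so the current tapers:
print(charger_current(3.54, 3.55, 5))   # CV phase: only ~0.2 A flows
```

The crossover between the two print statements is the "backing off" the post describes: nothing is switched, the CV limit simply becomes the smaller of the two as the cell fills.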
Also, when you elect to move boost down to 3.55, 3.50, or even 3.45 V per cell, you make the time needed to get to "fully charged" (or very near full charge) much longer. So your boost hold time (for a given chemistry) should take into account your boost voltage, your available charge power, battery size, and other factors. And yes, different people will have different solutions.
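To put a rough shape on that trade-off, here is a toy simulation: a hypothetical 100 Ah cell whose internal voltage rises linearly from 3.20 V empty to 3.45 V full, behind an assumed 0.005 ohm internal resistance. Every number is invented; the point is only that a lower boost voltage stretches the taper:

```python
# Toy time-to-full simulation: lower boost voltage => longer taper.
# The linear-emf cell model and all the numbers are invented purely
# to show the shape of the trade-off, not any real battery.

def hours_to_full(boost_v, max_amps=50, capacity_ah=100,
                  r_int=0.005, v_empty=3.20, v_full=3.45, dt=0.01):
    """Hours of CC/CV charging to reach 99% of capacity (Euler steps)."""
    ah, hours = 0.0, 0.0
    while ah < 0.99 * capacity_ah:
        emf = v_empty + (v_full - v_empty) * ah / capacity_ah
        amps = max(min((boost_v - emf) / r_int, max_amps), 0.0)
        if amps <= 0.0:
            return float("inf")  # this boost voltage can never finish the charge
        ah += amps * dt
        hours += dt
    return hours

for boost in (3.55, 3.50, 3.45):
    print(f"boost {boost:.2f} V -> about {hours_to_full(boost):.1f} h to ~full")
```

In this toy model the 3.45 V run takes several times longer than the 3.55 V run, which is why the boost hold time has to be chosen together with the boost voltage, charge power, and battery size.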