If I understand CC / CV (and I'm not sure that I do beyond a basic practical level), constant current limits the output to a certain maximum current, and constant voltage sets the voltage to a certain target. When the difference in voltage is high, the 'load' maxes out the CC rating and the charger operates in CC mode. As the battery is charged, its voltage rises, and at some point the difference in voltage will be small enough that the current (I = ΔV/R) drops below the CC limit; at that point the charger is operating in CV mode, current tapers, and voltage flattens at its set point. Is this roughly correct in your understanding? Or am I fundamentally misunderstanding things?
That is a fairly good explanation. But don’t think of it as the voltage difference that maxes out the charger. It is better to think of it as the battery internal impedance being low because of a low state of charge. What happens in the battery is that, as it is charged, internal battery impedance rises, which means more resistance to current flow.
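To make the CC-to-CV handoff concrete, here is a toy simulation. The battery is modeled as an ideal voltage source (an EMF that rises with state of charge) in series with an internal resistance; all the numbers (set point, current limit, resistance, how fast the EMF climbs) are made up for illustration, not taken from any datasheet:

```python
# Toy CC/CV charge simulation. Battery = ideal EMF in series with an
# internal resistance. Every constant below is illustrative only.

V_SET = 14.4      # charger's CV set point (volts)
I_MAX = 20.0      # charger's CC limit (amps)
R_INT = 0.05      # assumed battery internal resistance (ohms)

def charge_current(emf):
    """Current the charger delivers for a given battery EMF."""
    # Current that would flow if the charger held V_SET at the terminals:
    i_cv = (V_SET - emf) / R_INT
    # The charger caps that at its CC limit:
    return min(i_cv, I_MAX)

emf = 12.0                 # starting "resting" voltage
dt = 60.0                  # one-minute time steps
gain = 0.0005              # made-up EMF rise per amp-second of charge
log = []
for step in range(240):
    i = charge_current(emf)
    log.append((round(emf, 3), round(i, 3)))
    if i < 0.5:            # tail current small -> call it done
        break
    emf += i * dt * gain
```

Early in the run the uncapped current would be huge, so the charger sits pinned at its 20 A limit (CC mode); once the EMF climbs close to the set point, the computed current falls below the cap and tapers away (CV mode), which is exactly the transition described above.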
Going back to the solar panel example, an ~18V solar panel hooked up to a ~12V battery works without a DC-DC converter (MPPT) to convert the voltage and even without a PWM controller if you are not concerned with 3-stage charging so long as there is a means to disconnect the PV array from the battery before the voltage of the battery gets too high. Does this sound right to you?
Yes, but maximum battery life requires that you adhere to the manufacturers recommended charging voltage. The battery will accept a charge at higher voltage, but you are conducting an experiment with your battery. You are messing with the rate of important chemical reactions that take time to complete efficiently.
Likewise with a benchtop power supply, we would see something somewhat similar, right? Say you have a fully discharged lithium battery at ~10V, you set your power supply to 15V (I know, too high) and your current limit to some value. Immediately upon connecting the power supply you would see current rise to the max, but voltage will be pulled down to the voltage of the battery bank, right?
I have noted you make this statement a couple of times (thread title, after all), but it is not useful to think that the battery will “pull down” the voltage of a properly designed charger, or a regulated power supply. What you will often see (unless perhaps the battery is deeply discharged), with the charger connected, is that the charger/supply voltage appears at the battery terminals, as long as the current draw does not exceed charger capacity. But that won’t be the battery voltage. You will only be able to read battery voltage after the charger is disconnected. A battery behaves differently in this way than a simple resistive load. The charger voltage will sometimes drop when the battery is connected, but it isn’t useful to think of that as an intended part of the process.
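The point about only reading true battery voltage after disconnect can be shown with the same simple series model (EMF plus internal resistance). With charge current flowing into the battery, the terminal voltage sits above the resting EMF by the drop across the internal resistance; the numbers here are illustrative, not measurements:

```python
# While charging, the meter at the terminals reads EMF plus the I*R
# drop across the internal resistance -- i.e. the charger's voltage,
# not the battery's resting voltage. Illustrative numbers only.

EMF = 13.0     # resting battery voltage (volts)
R_INT = 0.05   # assumed internal resistance (ohms)

def terminal_voltage(i_charge):
    """Terminal voltage at i_charge amps of charge (0 = disconnected)."""
    return EMF + i_charge * R_INT

charging = terminal_voltage(20.0)   # reads 14.0 V at the terminals
resting = terminal_voltage(0.0)     # reads 13.0 V, the true battery voltage
```

So a meter across the pack mid-charge tells you what the charger is doing; only with the current at zero does it tell you what the battery is doing.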
You are quite correct, though, that a 3-stage charger is not needed. BULK stage is only used to speed the charging process. That can be skipped. And we don’t use FLOAT stage with lithium batteries at all. So, that just leaves ABSORPTION stage. You can accomplish all of the charging at the manufacturer’s recommended absorption voltage (or lower if you don’t want to charge to 100% capacity - and it isn’t even quite that simple, because there is a time element that also determines state of charge, not just charging voltage).
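That "time element" is why full charge is often judged by tail current rather than by voltage alone: a pack can sit at absorption voltage while still absorbing meaningful current. A minimal sketch of that rule of thumb, using a common C/20 convention and a made-up 100 Ah pack (neither number comes from this thread):

```python
# Tail-current test for "battery is full" while held at absorption
# voltage. C/20 is a common rule of thumb, not a universal spec.

CAPACITY_AH = 100.0      # hypothetical pack capacity
TAIL_FRACTION = 0.05     # stop when current tapers to C/20

def charge_is_complete(measured_current_a):
    """True once current at absorption voltage has tapered enough."""
    return measured_current_a <= CAPACITY_AH * TAIL_FRACTION
```

For this hypothetical pack, 12 A at absorption voltage means "keep charging," while 4 A means the chemistry has mostly caught up and the charge can be terminated.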
Your BMS is set to disconnect at 14V and does so once your battery bank reaches that voltage.
No, most BMS do not have a function to terminate a charge based upon pack voltage. (Some may. I’m not familiar with all of them). The BMS is usually monitoring individual cell voltages.
The term BMS is not precise. The BMS we buy for $50 or $100 is way less sophisticated than the BMS in a $50,000 EV.
In the way a BMS is used in conjunction with our home DIY solar systems, the BMS is used only to protect against things going wrong . . . Things that might damage your expensive lithium cells, such as a single cell drifting too high, or too low, in voltage, or ambient temperature being too low. It is not used to discontinue a normal charging process, and I discourage you from thinking of it that way.
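The distinction between "protection" and "charge control" can be sketched in a few lines. A basic DIY-grade BMS watches individual cell voltages and temperature and only trips a disconnect when something is out of bounds; the thresholds below are illustrative LiFePO4-ish values, not taken from any particular BMS:

```python
# Toy model of a DIY-grade BMS: a fault detector, not a charge
# controller. Thresholds are illustrative only.

CELL_MAX_V = 3.65   # per-cell overvoltage cutoff
CELL_MIN_V = 2.50   # per-cell undervoltage cutoff
TEMP_MIN_C = 0.0    # no charging below freezing

def bms_allows_charging(cell_voltages, temp_c):
    """True means 'no fault detected', not 'battery needs charge'."""
    if temp_c < TEMP_MIN_C:
        return False
    return all(CELL_MIN_V <= v <= CELL_MAX_V for v in cell_voltages)
```

Note that a 4-cell pack reading 13.3 V total can pass while a pack at a nearly identical total voltage fails, because one cell has drifted above the per-cell limit - which is why the BMS watches cells, not the pack.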
You can use some kinds of battery monitors to terminate the charge, but don’t rely on the BMS that we typically buy.
In both of these examples, I believe, the source voltage can be some amount higher than the battery bank voltage, and no harm will occur to the cells or the charger/PV panel so long as the cells are disconnected before the battery bank voltage exceeds its limit.
No, because the average solar-lithium BMS is not going to “alarm” on full pack voltage. Definitely not recommended practice, at least in the way that we solar DIYers use a BMS.
So now I'm trying to understand the above in the context of the aforementioned hypothetical 19V laptop power supply, with nothing between it and the battery apart from a switch that disconnects automatically at 3.6V per cell. What separates this situation from the above? It still seems to me like it has to be either an inability to limit current, or maybe just the higher difference in voltage?
Use a power supply that charges at the optimum voltage recommended by the battery manufacturer. Otherwise, you are experimenting, which is fine if you are just messing around to educate yourself, or if using cheap salvaged batteries. But don’t do it with a valuable battery.
Lots of things will work, but most of them will give a result that is less than the best.