Coincidentally, this (switching power supplies) is what I have been googling for the last half hour or so. Haven't found a good conceptual level explainer yet.
But I wonder whether it's precisely targeted at the point I'm trying to understand.
I believe this is a switching power supply as well, and that it can be used the way I would like (though maybe not with such a large voltage difference?).
If I understand CC/CV (and I'm not sure I do beyond a basic practical level): constant current limits the output to a certain maximum current, and constant voltage holds the output at a certain target voltage. When the voltage difference is large, the load maxes out the CC rating and the charger operates in CC mode. As the battery charges, its voltage rises, and at some point the difference becomes small enough that the current (I = ΔV/R) drops below the CC limit; from then on the charger operates in CV mode, the current tapers, and the voltage flattens at its set point. Is this roughly correct in your understanding, or am I fundamentally misunderstanding things?
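Here's a toy sketch of that mental model, just to check I have the logic right (all numbers invented, and the battery reduced to an ideal cell behind an internal resistance):

```python
def charger_output(v_batt, v_set=14.6, i_max=10.0, r_int=0.05):
    """Return (mode, current) for a given battery open-circuit voltage.

    v_set and i_max are the charger's CV and CC knobs; r_int is a made-up
    internal resistance. Purely illustrative, not any real charger's spec.
    """
    # Current the battery would draw if the charger held its full set voltage:
    i_demand = (v_set - v_batt) / r_int
    if i_demand > i_max:
        return "CC", i_max          # big voltage gap -> current limit governs
    return "CV", max(i_demand, 0)   # gap has shrunk -> voltage limit governs

# As v_batt rises, demanded current falls below i_max and CC crosses into CV:
for v in (12.0, 13.0, 14.0, 14.4, 14.6):
    mode, i = charger_output(v)
    print(f"v_batt={v:5.1f}V  mode={mode}  current={i:5.2f}A")
```

Running it shows exactly the taper I described: CC at 12-14V, then CV with shrinking current as the battery approaches the set point.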
Going back to the solar panel example: an ~18V solar panel hooked up to a ~12V battery works without a DC-DC converter (MPPT) to convert the voltage, and even without a PWM controller if you aren't concerned with 3-stage charging, so long as there is a means to disconnect the PV array from the battery before the battery voltage gets too high. Does this sound right to you?
Likewise with a benchtop power supply we would see something similar, right? Say you have a fully discharged lithium battery bank at ~10V; you set your supply to 15V and your current limit to some value. Immediately upon connecting the supply, you would see its voltage drop to that of the battery bank and the current rise to the limit. Your BMS is set to disconnect at 14V and does so once the battery bank reaches that voltage.
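To make sure I'm describing that scenario consistently, here's a crude simulation of it (the capacity model is a made-up "volts per amp-second" fudge, and every number is hypothetical):

```python
# Benchtop-supply scenario: 15V set voltage, current limit, BMS opens at 14V.
V_SET, I_LIMIT = 15.0, 5.0      # supply knobs (invented values)
V_BMS_CUTOFF = 14.0             # BMS disconnects here
R_INT = 0.05                    # battery internal resistance, ohms (invented)
VOLTS_PER_AMP_SEC = 0.001       # toy capacity: how fast v_batt rises
v_batt, dt, t = 10.0, 1.0, 0.0  # start "fully discharged"; 1 s steps

while v_batt < V_BMS_CUTOFF:
    # The supply delivers whichever is smaller: its current limit, or the
    # current the voltage gap would push through the internal resistance.
    i = min(I_LIMIT, (V_SET - v_batt) / R_INT)
    v_batt += i * VOLTS_PER_AMP_SEC * dt  # battery voltage creeps up
    t += dt

print(f"BMS opened at t={t:.0f}s with v_batt={v_batt:.2f}V")
```

With these numbers the voltage gap is so large that the supply sits at its 5A current limit the whole time, and the BMS opens the circuit at 14V before the supply ever reaches its 15V set point, which is the behavior I was trying to describe.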
In both of these examples, I believe, the source voltage can be some amount higher than the battery bank voltage, and no harm will occur to the cells or the charger/PV panel so long as the cells are disconnected before the battery bank voltage exceeds its limit.
So now I'm trying to understand the above in the context of the aforementioned hypothetical 19V laptop power supply, with nothing between it and the battery apart from a switch that disconnects automatically at 3.6V per cell. What separates this situation from the examples above? It still seems to me like it has to be either an inability to limit current, or maybe just the larger voltage difference?
I really want to understand what I'm missing here.