Yes, but we just discussed that.
If your charger only measures what it sends, you could send 3.65V but have only 3.4V at the battery terminals (whether or not it passes through a BMS).
So the correct way to set your charger is to set a voltage, then measure at the battery terminals what you really get.
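To make the terminal-voltage point concrete, here is a rough sketch (the resistance and current values are assumptions, not measurements): the cell sees the charger's setpoint minus the I*R drop across cabling, connectors and the BMS.

```python
def terminal_voltage(charger_setpoint_v, charge_current_a, loop_resistance_ohm):
    """Voltage at the battery terminals: the charger's output minus the
    I*R drop across wiring, connectors and BMS FETs (lumped into one
    resistance). Illustrative model only."""
    return charger_setpoint_v - charge_current_a * loop_resistance_ohm

# Example: 3.65V set at the charger, 10A of charge current, 25 mOhm of
# wiring + BMS resistance -> only about 3.40V actually reaches the cell.
v_cell = terminal_voltage(3.65, 10.0, 0.025)
print(round(v_cell, 2))  # 3.4
```

This is exactly the 3.65V-set / 3.4V-seen situation described above: the drop disappears as the current tapers, so you have to measure at the battery, under load, to know what the cell really gets.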
De facto an LFP cell can be charged as high as 4.2V, but I see no reason to recommend that: the increase in capacity is negligible while the extra wear is almost certain.
If you charge an LFP to 4.2V, it will not stay at 4.2V (like a li-ion cell would); it will quickly drop back to about 3.7V.
Lithium batteries have this particularity: if you charge at the nominal charge voltage (3.65V for LFP), you will reach that voltage at only about 80% of full charge.
So most chargers push the charge a bit longer even if the battery looks charged, and they sense full charge with a thermal measurement, because the cell starts to heat up faster once it is fully charged.
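The thermal end-of-charge detection could be sketched like this (the sampling interval and dT/dt threshold are my assumptions, real chargers tune these per chemistry): watch the rate of temperature rise, and cut off when it accelerates.

```python
def thermal_cutoff(temps_c, interval_s=60.0, max_dt_dt=0.02):
    """Return True when cell temperature rises faster than max_dt_dt
    (deg C per second) between the last two samples, taken interval_s
    seconds apart. A full cell turns more charge power into heat, so
    a sudden rise in dT/dt signals end of charge. Illustrative only."""
    if len(temps_c) < 2:
        return False
    rate = (temps_c[-1] - temps_c[-2]) / interval_s
    return rate > max_dt_dt

# Slow rise while still absorbing charge, sharp rise near full charge.
print(thermal_cutoff([25.0, 25.2]))  # False (0.2 degC/min: still charging)
print(thermal_cutoff([30.0, 32.0]))  # True  (2 degC/min: cut off)
```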
Another way is to measure the current absorbed, since the purpose of charging a battery is not to reach a voltage but to push amps into it. The voltage is only what allows you to transfer the amps from the charger to the battery.
Using voltage to control the charge is just the lazy way to do it, because measuring volts is a lot easier/cheaper than measuring current.
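The current-based approach mentioned above could look like this (the C/20 cutoff is a common convention but an assumption here, not something from the post): while the charger holds its voltage limit, charging is considered done once the absorbed current has tapered far enough.

```python
def charge_complete(current_a, capacity_ah, cutoff_c_rate=0.05):
    """End-of-charge by current taper: during the constant-voltage phase,
    consider the battery full once the absorbed current drops below a
    fraction of capacity (here ~C/20, an assumed convention)."""
    return current_a <= capacity_ah * cutoff_c_rate

# 100 Ah pack: charging is considered done once current tapers to 5A or less.
print(charge_complete(12.0, 100))  # False, still absorbing current
print(charge_complete(4.0, 100))   # True, current has tapered off
```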
So, to make current flow from the charger to the battery, you need to apply a pressure (voltage) that pushes the current from the charger into the battery. For technical reasons you have to limit that voltage to less than 4.2V, or else you could damage the battery chemistry.
The expected behavior is: if the battery is starved, it will absorb current fast and pull the voltage down. When the battery is full, the voltage will rise.
So measuring voltage can reflect the SOC of the battery (and we have seen that this is not the case for a lead-acid battery, for example).
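The starved-vs-full behavior above can be shown with a toy CC/CV model (every number here is illustrative, not real LFP chemistry): the cell is an open-circuit voltage that rises with state of charge, behind an internal resistance.

```python
def ocv(soc):
    """Very rough open-circuit voltage vs state of charge for an
    LFP-like cell: 3.2V empty -> 3.65V full (illustrative curve)."""
    return 3.2 + 0.45 * soc

def charge_step(soc, v_limit=3.65, i_limit=10.0, r_internal=0.01):
    """One step of CC/CV charging: the current is whichever is smaller,
    the charger's current limit (CC phase) or the current that the
    voltage limit can push through the internal resistance (CV phase)."""
    i = (v_limit - ocv(soc)) / r_internal
    return max(min(i_limit, i), 0.0)

# A starved cell (low SOC) pulls the full current limit; a nearly full
# cell only trickles -- the tapering behavior described above.
print(charge_step(0.1))            # 10.0 (CC phase: charger limits current)
print(round(charge_step(0.99), 2)) # 0.45 (CV phase: current has tapered)
```

The same model shows why voltage tracks SOC during charging: the harder the cell pulls current, the bigger the I*R sag, so the terminal voltage only approaches the limit as the cell fills up.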