Be aware of the voltage difference between the battery pack terminals (which the BMS acts upon) and the Inverter & Solar Charge Controller, which will each see differing voltages thanks to wire runs and connection resistance. Unlike brute-force lead-acid, all lithium is far more 'sensitive' due to its narrow, flat voltage curve. The full LFP range is 2.500-3.650; the general working range is 3.000-3.400 with 3.200 nominal. Matched/Batched/Binned A+ cells can easily work between 2.900-3.450 - all in volts per cell. The spans from 2.500-2.900 & 3.450-3.650 only represent roughly 3-5% of gross capacity, NOT NET, and are the "hockey stick" ends of the curve. LFP can be charged to 3.650V per cell and saturated until it takes <2A, and within 1 hour of settling that will drop to 3.500V per cell +/- a bit. This is perfectly normal.
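Those per-cell bands can be sketched as a quick check function (the numbers come straight from the ranges above; the function name and labels are just for illustration):

```python
# LFP per-cell voltage bands, in volts, as described above.
FULL_MIN, FULL_MAX = 2.500, 3.650   # absolute LFP range
WORK_MIN, WORK_MAX = 3.000, 3.400   # general working range
NOMINAL = 3.200

def classify_cell_voltage(v):
    """Classify a single LFP cell voltage against the bands above."""
    if not (FULL_MIN <= v <= FULL_MAX):
        return "outside LFP range"
    if WORK_MIN <= v <= WORK_MAX:
        return "working range"
    # The slivers below 3.000 and above 3.400 head into the steep
    # "hockey stick" ends, which hold only a few percent of capacity.
    return "low knee" if v < WORK_MIN else "high knee"

print(classify_cell_voltage(3.200))  # working range
print(classify_cell_voltage(3.600))  # high knee
```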
BTW, End Amps / Tail Current for LFP is 0.05C (0.05 x 100Ah = 5A for a 100Ah pack).
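That end-amps figure is simply 5% of the pack's Ah rating; a one-liner sketch (function name is mine):

```python
def end_amps(capacity_ah, c_rate=0.05):
    """Tail-current / end-amps charge termination: 0.05C per the text."""
    return capacity_ah * c_rate

# 100 Ah pack -> 5 A tail current; 280 Ah pack -> 14 A
print(end_amps(100))  # 5.0
print(end_amps(280))  # 14.0
```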
SCCs will typically read a higher voltage during charge and a lower voltage when no solar input is available.
Inverter/Chargers will likewise typically read a lower voltage during discharge and a higher voltage in charge mode.
Therefore a Voltage Correction needs to be applied to the SCC during charge and to the inverter during discharge, especially if an Inverter/Charger's AGS is used to auto-start a genset.
Check the voltage at your SCC during charge and again when there is no sun, to get a good picture of what it sees at the "device" terminals compared to the battery pack terminals (NOT the busbar, if there is one).
Check the voltage at the Inverter and at the battery during inversion only, and again during charge only.
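The readings from the two checks above can be turned into a correction offset to program into each device. A hedged sketch, with variable names of my own choosing (not from any particular SCC or inverter firmware):

```python
def correction_offset(device_reading, battery_terminal_v):
    """Offset to add to a device's voltage reading so it matches the pack
    terminals.

    A positive result means the device reads low (typical for an inverter
    under discharge); a negative result means it reads high (typical for
    an SCC during charge).
    """
    return battery_terminal_v - device_reading

# Example: SCC shows 27.6 V while the pack terminals measure 27.3 V
# during charge -> apply a -0.3 V correction at the SCC.
print(round(correction_offset(27.6, 27.3), 2))  # -0.3

# Example: inverter shows 25.8 V while the pack measures 26.0 V
# under discharge -> apply a +0.2 V correction at the inverter.
print(round(correction_offset(25.8, 26.0), 2))  # 0.2
```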
Also, if possible, verify the cell voltages when the battery packs are at 3.000 and at 3.400+ volts per cell and observe the cell deviations. The higher OR lower the cell voltage, the greater the likelihood of increased deviation between cells; these deviations tend to be more pronounced outside of the "working voltage" range. Avoid the levels where LVD or HVD cutoffs occur. This is also when weak connections between cells, or other issues, may appear.
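The deviation check above is just max minus min cell voltage; a minimal sketch (the sample readings are made up for illustration):

```python
def cell_deviation(cell_voltages):
    """Spread between the highest and lowest cell voltage, in volts."""
    return max(cell_voltages) - min(cell_voltages)

# A tight pack near nominal vs. the same pack pushed toward the top knee,
# where deviations typically widen.
print(round(cell_deviation([3.198, 3.201, 3.200, 3.199]), 3))  # 0.003
print(round(cell_deviation([3.400, 3.450, 3.410, 3.480]), 3))  # 0.08
```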
Hope it Helps, Good Luck