All battery monitors work by 'dead reckoning', like navigation: they keep a running amp-hour tally measured from the last reference point taken.
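As a rough illustration of what that running tally looks like, here is a minimal sketch, not any particular monitor's firmware; the sample rate, sign convention, and names are my own assumptions:

```python
# Simplified coulomb counting: integrate shunt current over time.
# Sign convention assumed here: positive amps = charging, negative = discharging.
ah_from_full = 0.0          # amp-hours consumed since the last 100% reference point
SAMPLE_HOURS = 1.0 / 3600   # one shunt reading per second

def update_tally(current_amps):
    """Add one current sample to the running amp-hour tally."""
    global ah_from_full
    ah_from_full -= current_amps * SAMPLE_HOURS   # discharging increases Ah consumed
```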
There are three settings you must program into the monitor's setup. The first is usually a charge efficiency factor, which derates charging current to account for inefficiencies in battery charging. For LFP it should be set to about 92 to 95%. You can tweak this number with use experience to get better accuracy on the dead-reckoning tally. Many monitors default to 85%, which is typical for lead-acid batteries, so you need to change it for LFP.
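In terms of the tally, the efficiency factor just scales down the amp-hours credited back while charging. A hedged sketch, assuming a 94% setting (the value and function name are illustrative, not from any manual):

```python
CHARGE_EFFICIENCY = 0.94   # ~92-95% for LFP; many monitors default to 0.85 (lead-acid)

def ah_credited(current_amps, dt_hours):
    """Amp-hours counted back into the tally for one interval."""
    if current_amps > 0:                        # charging: credit is derated
        return current_amps * dt_hours * CHARGE_EFFICIENCY
    return current_amps * dt_hours              # discharging: counted at full value
```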
The monitor takes its reference point when the battery voltage during charging reaches the level you defined in setup as the 100% state of charge voltage. At that point it resets its amp-hour usage tally to zero, regardless of the existing tally count. If you never charge the battery fully to this point, the 100% reference is never reset and you accumulate more error over time, since you are relying on a cumulative charging and discharging tally (dead reckoning). Most battery monitors also let you manually reset the 100% SOC point.
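The reset itself amounts to a simple check during charging. A sketch under the assumption that the monitor only looks at voltage (real monitors may also require a tail-current condition before syncing):

```python
FULL_CHARGE_VOLTAGE = 13.8   # the voltage you define in setup as 100% SOC (example value)

def maybe_sync_to_full(battery_volts, charging, ah_from_full):
    """Return the amp-hour tally, reset to zero if the 100% reference is reached."""
    if charging and battery_volts >= FULL_CHARGE_VOLTAGE:
        return 0.0           # new reference point: battery treated as 100% full
    return ah_from_full
```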
The last item you need to set up is the battery Ah capacity. This is used to calculate the percent state of charge. It can be whatever you want it to be. Some folks enter 50% to 80% of their battery's actual Ah rating to build a visual safety margin into the state-of-charge readout.
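The capacity entry only matters for the percentage math, which is why an understated value acts as a safety margin. A quick worked sketch with made-up numbers:

```python
CONFIGURED_CAPACITY_AH = 200   # entered in setup, e.g. 80% of a 250 Ah battery

def soc_percent(ah_from_full):
    """Displayed state of charge based on the configured capacity."""
    return 100.0 * (1.0 - ah_from_full / CONFIGURED_CAPACITY_AH)

# 50 Ah consumed displays as 75%, but the real 250 Ah battery is still at 80%.
print(soc_percent(50))   # 75.0
```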
Some battery monitors use a fourth factor, a discharge factor that deducts state of charge faster at larger discharge currents to account for the battery's reduced efficiency at higher discharge rates. Monitors that do this usually require a setup entry for battery type (lead-acid, LFP) and work out their own derating factor from the battery type and the Ah capacity you entered.
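How that derating is computed varies by monitor and usually isn't published. The sketch below uses a Peukert-style exponent purely to illustrate the idea; the exponent value and the rated-current choice are assumptions, not any vendor's formula:

```python
PEUKERT_EXPONENT = 1.05      # near 1.0 for LFP; roughly 1.2-1.3 is typical for lead-acid
RATED_DISCHARGE_A = 10.0     # e.g. the 20-hour rate for a 200 Ah bank

def ah_counted(discharge_amps, dt_hours):
    """Amp-hours charged against the tally, inflated at high discharge currents."""
    if discharge_amps <= RATED_DISCHARGE_A:
        return discharge_amps * dt_hours
    factor = (discharge_amps / RATED_DISCHARGE_A) ** (PEUKERT_EXPONENT - 1)
    return discharge_amps * dt_hours * factor
```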
The two most common mistakes are setting the full-charge voltage trip point the same as the charger's absorption voltage, and not setting the charge efficiency factor correctly.