A controller's current rating refers to its output, i.e., a 40A controller will deliver at most 40A to the batteries. With an MPPT controller, the input voltage and current don't matter provided they're within the controller's input limits, so 600W / 12V = 50A is the relevant number. Your controller can't handle the existing panel power at peak.
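That sizing check can be sketched in a few lines; the 600W array, 12V battery, and 40A controller are the figures from this thread, so swap in your own numbers:

```python
# Sketch: check an MPPT controller's output rating against potential panel power.
# 600W array / 12V bank / 40A controller are the figures from this thread.

def max_charge_current(panel_watts: float, battery_volts: float) -> float:
    """Worst-case (peak-sun) output current an MPPT controller must deliver."""
    return panel_watts / battery_volts

amps = max_charge_current(600, 12)   # 50.0 A
controller_rating = 40               # A

print(f"Peak charge current: {amps:.1f} A")
print("Controller OK" if amps <= controller_rating else "Controller undersized")
```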
Unfortunately, I can't see any details on the battery. Hopefully it lists the float and bulk/absorption voltages, and you have your controller properly set to match.
The chart is typical. What's unique, in my experience, is that it lists both charge and discharge parameters. If the discharge figures are for a C20 rate and the charge figures for C10, it's a reasonable resource. The charge data is questionable, as it peaks at 13.8V when most FLA need to reach 14.X to achieve full charge. 13.8V is typical of AGM UPS-type installations where the batteries are held in a state of high float for very long periods and discharged only infrequently.
The resting voltage column is consistent with other resources. The kicker is that resting means RESTING: no current in or out for many hours, 10 at a minimum and 24 for best results.
FLA/AGM charging works as follows:
0.1C (in your case 67.5A) input to 14.4-14.8V (bulk phase)
Hold 14.4-14.8V by decreasing current as the battery fills (absorption phase)
Once current drops to 2% of C (0.02 × 675Ah = 13.5A), the battery is full.
Terminate charging and supply only enough current to maintain the float voltage, 13.2-13.8V typical.
Once the controller can't maintain float, the battery voltage will drop into the 12.6-12.9 range when left to sit.
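The bulk/absorption/float sequence above can be sketched as a simple stage classifier. The thresholds below use the 675Ah bank from this thread and mid-range setpoints from the windows given above; they're illustrative, not a controller's actual firmware:

```python
# Sketch of the FLA/AGM three-stage charge logic described above.
# Thresholds use the 675Ah battery from this thread; adjust to your bank.

CAPACITY_AH = 675
ABSORB_V = 14.6               # within the 14.4-14.8V bulk/absorption window
FLOAT_V = 13.5                # within the 13.2-13.8V float range
TAIL_A = 0.02 * CAPACITY_AH   # 2% of C: charge is complete below this

def charge_stage(battery_volts: float, charge_amps: float) -> str:
    """Classify which stage the charger should be in (simplified)."""
    if battery_volts < ABSORB_V:
        return "bulk"        # constant current (about 0.1C) until absorb voltage
    if charge_amps > TAIL_A:
        return "absorption"  # hold absorb voltage while current tapers
    return "float"           # battery full; hold float voltage

print(charge_stage(13.0, 67.5))  # bulk
print(charge_stage(14.6, 30.0))  # absorption
print(charge_stage(14.6, 10.0))  # float
```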
Surface charge with FLA/AGM isn't a bad thing. It further discourages sulfation of the plates. Bleeding off 1% may be useful in estimating state of charge, but there are better ways.
Will lists battery monitors on his website:
www.mobile-solarpower.com
You program these with your battery's parameters (not all units include all items):
capacity
fully charged voltage
tail current (the 2% termination current above)
Peukert effect (how the change in current affects the change in a battery's capacity)
charge efficiency
The monitor resets itself with every full charge. It then COUNTS the Ah used and compares that to the programmed capacity, 675Ah in your case, reporting a % state of charge (SoC).
Any of those units should give a good approximation of SoC. However, the more expensive Victron BMV units include all of the parameters I listed and provide a very accurate SoC calculation.
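The counting a monitor does can be sketched roughly as below. This is assumed logic, not any vendor's firmware: it coulomb-counts with a flat charge-efficiency factor and ignores the Peukert effect, which the better monitors model. Capacity is the 675Ah bank from this thread:

```python
# Sketch of the Ah counting a battery monitor performs (assumed logic,
# not any vendor's firmware). Peukert compensation is omitted for brevity.

class SocCounter:
    def __init__(self, capacity_ah: float, charge_efficiency: float = 0.9):
        self.capacity_ah = capacity_ah
        self.eff = charge_efficiency   # FLA charging is not 100% efficient
        self.used_ah = 0.0

    def full_charge_reset(self):
        """Called when the tail-current condition says the battery is full."""
        self.used_ah = 0.0

    def sample(self, amps: float, hours: float):
        """amps > 0 = charging, amps < 0 = discharging."""
        if amps >= 0:
            self.used_ah -= amps * hours * self.eff
        else:
            self.used_ah += -amps * hours
        self.used_ah = max(self.used_ah, 0.0)

    @property
    def soc_percent(self) -> float:
        return 100.0 * (1 - self.used_ah / self.capacity_ah)

mon = SocCounter(675)
mon.sample(-67.5, 2)           # draw 67.5A for 2h = 135Ah used
print(round(mon.soc_percent))  # 80
```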
The key in selecting any battery monitor is to ensure the shunt or hall effect sensor meets your current requirements according to the following formula:
peak load (watts) / system voltage / 0.66
example:
2000W / 12V / 0.66 = 252A
A 2000W inverter on a 12V system would require a 252A minimum shunt. The 0.66 is a safety factor to ensure you're not running the shunt at its limit. The 0.66 does not apply to a hall effect sensor where you have a sensor around the cable to measure current.
In all cases a little bigger doesn't hurt. A 350A or 500A shunt would be fine for this example; 1000A would be a little much, as the current sensing becomes less accurate.
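The sizing formula above, as a quick calculation (values are the 2000W/12V example from this thread):

```python
# Sketch of the shunt-sizing formula above. The 0.66 safety factor keeps
# the shunt comfortably below its rated limit (not needed for hall effect sensors).

def min_shunt_amps(peak_load_watts: float, system_volts: float,
                   safety_factor: float = 0.66) -> float:
    return peak_load_watts / system_volts / safety_factor

amps = min_shunt_amps(2000, 12)
print(f"Minimum shunt: {amps:.1f} A")   # about 252 A; a 350A or 500A shunt fits
```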
If you're looking to throw a bit of money at the problem and get a reasonably accurate SoC, then a battery monitor is for you.
Lastly, FLA/AGM batteries require temperature-compensated charging unless they're kept in a 77°F (25°C) controlled environment. Hopefully your charge controller has a provision for a temperature sensor.
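For reference, a controller's compensation works roughly like this. The -3 mV/°C-per-cell slope below is a common lead-acid figure I'm assuming for illustration; check your battery's datasheet for its actual coefficient:

```python
# Sketch of charge-voltage temperature compensation. The -3 mV/°C-per-cell
# slope is an assumed, typical lead-acid figure; use your battery's datasheet value.

CELLS = 6                 # 12V battery = 6 cells
MV_PER_C_PER_CELL = -3.0  # assumed compensation slope
REF_TEMP_C = 25.0         # 77°F reference temperature

def compensated_voltage(setpoint_v: float, battery_temp_c: float) -> float:
    delta = (battery_temp_c - REF_TEMP_C) * MV_PER_C_PER_CELL * CELLS / 1000
    return setpoint_v + delta

print(round(compensated_voltage(14.6, 25), 2))  # 14.6  (no correction at reference)
print(round(compensated_voltage(14.6, 0), 2))   # 15.05 (cold battery needs more voltage)
```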