I have a Sol Ark 12K (off-grid only) connected to six 48 V Trophy 120 Ah batteries (listed as 110 Ah, but actually 120 Ah). I have the Sol Ark set to estimate battery SOC from voltage, since I don't have closed-loop communication working with my batteries. I can monitor the batteries using Trophy's battery-monitoring software, so I can get deep real-time data on SOC, SOH, cell voltages, cell temps, etc.
My question is that the Sol Ark is always very pessimistic in its estimate of my battery bank's total SOC. Since the Sol Ark knows the total capacity of my bank (720 Ah), and it measures PV in and loads out, it should be able to estimate the bank's current SOC fairly accurately. It's not clear whether the Sol Ark is using voltage cutoffs alone to estimate % SOC, or whether it actually tracks the amps/volts flowing in and out.
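To illustrate the difference between the two methods: here's a minimal sketch contrasting a pure voltage-to-SOC lookup with amp-hour (coulomb) counting. All numbers are illustrative assumptions on my part, not Sol Ark or Trophy specs: the OCV table, the resting voltage, and the voltage sag under load are made up to show the effect, namely that a voltage-based estimate under-reads SOC whenever the pack is under load, and the error grows as the discharge curve flattens.

```python
# Hedged sketch: why voltage-based SOC reads low under load, vs. coulomb counting.
# All constants below are illustrative assumptions, not Sol Ark or Trophy specs.

# Assumed 48 V LiFePO4 resting-voltage -> SOC lookup table: (volts, SOC %).
OCV_TABLE = [(48.0, 0), (51.2, 20), (52.0, 40), (52.8, 60), (53.2, 80), (54.4, 100)]

def soc_from_voltage(volts):
    """Linearly interpolate SOC from the (assumed) open-circuit-voltage table."""
    if volts <= OCV_TABLE[0][0]:
        return 0.0
    if volts >= OCV_TABLE[-1][0]:
        return 100.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= volts <= v1:
            return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)

def soc_coulomb_counting(start_soc, capacity_ah, net_amps, hours):
    """Integrate net current (charge positive, discharge negative) over time."""
    soc = start_soc + 100.0 * net_amps * hours / capacity_ah
    return max(0.0, min(100.0, soc))

# A 720 Ah bank truly near 86% SOC, discharging at 60 A. Under load the
# terminal voltage sags (assumed 0.8 V here), so the voltage lookup sees
# a lower voltage and reports a pessimistic SOC.
resting_v = 53.3   # assumed resting voltage near 86% SOC
sag_v = 0.8        # assumed sag under a 60 A load
print(round(soc_from_voltage(resting_v - sag_v), 1))          # pessimistic estimate
print(round(soc_coulomb_counting(86.0, 720, -60, 1.0), 1))    # after 1 h at 60 A out
```

If the inverter really were counting amp-hours against the configured 720 Ah, its estimate should stay close to the BMS figure; a growing gap at lower SOC looks much more like a voltage lookup being dragged down by sag under load.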
For example, my Sol Ark is currently reporting 65% SOC while the monitoring software shows an 86% average SOC across all batteries, and the discrepancy grows larger and larger as the SOC gets lower.
I'm sure some of you understand how this works and could guide me on this?