diy solar

Cumulative discharge displayed

Braddo · New Member · Joined Sep 21, 2019 · Messages: 22
Kia Ora. I have a Victron BMV-700 monitoring my batteries. Ever since it was installed, its state-of-charge estimate has steadily fallen, despite the batteries getting a full top-up every day and the voltage confirming they are charged. Right now my voltage is 13.53 V with the Epever BT-50 showing only 1 A being pushed, but the BMV-700 estimates 23.9%. I thought the setup was meant to be pretty idiot-proof, but maybe I've brought a bigger idiot game than they engineered for. The only connections to the shunt are: in from the main battery negative, out to the distribution bus bar, the accessory positive powering the unit, and the cable to the monitor.
It's bound to be something correctable, but I'm at a loss.
Any suggestions or similar experiences?
 
Screen cap and post your settings screen.

What is your absorption voltage?

Is there anything between the battery and the shunt?
 
Thanks for your comments.

I think I've figured it out. I have the charge controller going to the opposite side of the battery bank, but it should only go to the 'load and charger' side of the shunt. The load side and everything else was set up correctly.

What a dummy.
 
I kept things simple. All my load/charge side components are connected to a common bus bar. Then the bus bar connects to the shunt. So one connection to the shunt on that side.

Everything -> common bus bar -> shunt -> BMS -> battery
 
I have a lovely brass bar that I’ve put aside for that exact purpose. It looks like I have a weekend project.
 
All battery monitors are 'dead reckoning' navigators, meaning they keep a running tally based on the last reference point taken.

There are three settings you must program into the monitor's setup. Usually there is a charge efficiency factor that derates charging current to account for inefficiencies in battery charging. For LFP it should be set to about 92 to 95%. You can tweak this number with use experience to get better accuracy on the dead-reckoning tally. Many monitors default to 85%, which is typical for lead-acid batteries, so you need to change it for LFP.

The monitor takes its reference point when the battery reaches a voltage during charging that you have defined as the 100% state-of-charge voltage. At that point it resets its Ah-usage tally to zero, regardless of the existing count. If you never fully charge the battery to this point, the 100% reference will never be reset, and you will accumulate more error over time, since you are relying on the cumulative charge/discharge tally (dead reckoning). Most battery monitors also allow you to manually reset the 100% SOC point.

The last item you need to set up is the battery Ah capacity. This is used to calculate the percent state of charge. It can be whatever you want to set it to: some folks enter 50 to 80% of their battery's actual Ah rating to give themselves a visual safety margin on the SOC readout.

Some battery monitors have a fourth factor, a discharge factor, which draws down the SOC percentage faster at larger discharge currents to account for battery efficiency degradation at greater discharge rates. If the monitor does this, it usually requires a setup entry for battery type (lead-acid, LFP), and it figures its own derating factor from the battery type and the Ah capacity you set.

The two most common mistakes are setting the full-charge voltage trip equal to the charging absorb voltage, and not setting the charge efficiency factor correctly.
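To make the dead-reckoning idea concrete, here is a minimal coulomb-counting sketch of how a monitor keeps its Ah tally. All names, defaults, and the 95% efficiency figure are illustrative, not Victron's actual firmware; it just shows why the tally drifts low over charge/discharge cycles until a full-charge reset happens.

```python
# Minimal coulomb-counting ("dead reckoning") SOC tally sketch.
# Illustrative only; not any monitor's actual firmware.

def update_soc(soc_ah, current_a, dt_h, charge_efficiency=0.95):
    """Advance the Ah tally by one sample.
    current_a > 0 means charging; charging current is derated by the
    charge efficiency factor, while discharge is counted in full."""
    if current_a > 0:
        soc_ah += current_a * dt_h * charge_efficiency
    else:
        soc_ah += current_a * dt_h  # discharge counts 1:1
    return soc_ah

CAPACITY_AH = 100.0   # the Ah capacity you program into the monitor
soc_ah = CAPACITY_AH  # assume we start from a full-charge reset

# Discharge 10 A for 2 h, then put the same 20 Ah back in:
for _ in range(2):
    soc_ah = update_soc(soc_ah, -10.0, 1.0)
for _ in range(2):
    soc_ah = update_soc(soc_ah, +10.0, 1.0)

print(round(soc_ah, 1))                      # 99.0 Ah
print(round(100 * soc_ah / CAPACITY_AH, 1))  # 99.0 % SOC
```

Note the tally ends at 99% even though the battery got every Ah back: the efficiency derating guarantees the estimate creeps downward each cycle, which is exactly why the 100% voltage reset is essential.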
 
Thanks for such an excellent reply, and apologies for not acknowledging it earlier. I've gone through the setup and adjusted the charge efficiency factor. Still no progress on the original problem unless I manually reset the SOC reference. Perhaps I need to play with it more; otherwise I've bought a very expensive voltmeter.
 
I noticed in the Victron manual that, besides reaching the full-charge reset voltage, it also requires the charge current to drop below 4% of the Ah rating (in amps), by default, for three minutes.

Their 'quick start' defaults may be what's getting you. It looks like quick start only requires you to enter your battery's Ah capacity; it then picks the reset voltage based on the measured battery voltage, which is likely assuming a lead-acid battery type.

I don't like that, and you might not be dropping the charge current to 4% of the Ah value entered in the monitor during charging, so it never resets to full. The minimum charge current to terminate the absorb phase depends on your charger setup: if the charger exits absorb at a higher current than the monitor's setting, the monitor will never reset.

It appears you can enter your own custom reset voltage, reset charge current, and the time both must be satisfied.

I assume that if you set the entered full-charge current to 100% of Ah, the monitor would only require the full voltage for the specified time to reset to full.
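The reset condition described above (voltage at or above the charged threshold AND tail current below a percentage of capacity, both held continuously for a detection time) can be sketched as follows. The thresholds and the function name are illustrative assumptions; the 4%/3-minute defaults are from the discussion above, and you should check your own monitor's manual.

```python
# Sketch of a "synchronise to 100%" check: both conditions must hold
# continuously for the detection time. Names/values are illustrative.

CAPACITY_AH = 100.0      # Ah entered in the monitor
CHARGED_VOLTAGE = 13.8   # custom full-charge reset voltage
TAIL_CURRENT_PCT = 4.0   # default: 4% of Ah, in amps
DETECTION_TIME_S = 180   # default: 3 minutes

def should_reset(samples):
    """samples: list of (voltage, current_a, dt_s) readings.
    Returns True once voltage >= threshold and charge current is
    between 0 and the tail current for the full detection time."""
    held = 0.0
    tail_a = CAPACITY_AH * TAIL_CURRENT_PCT / 100.0  # 4.0 A here
    for volts, amps, dt_s in samples:
        if volts >= CHARGED_VOLTAGE and 0 <= amps <= tail_a:
            held += dt_s
            if held >= DETECTION_TIME_S:
                return True
        else:
            held = 0.0  # conditions must hold continuously
    return False

# Absorb held at 13.9 V while current tapers from 10 A down to 2 A,
# one sample per minute:
tapering = [(13.9, a, 60) for a in (10, 8, 6, 5, 4, 3, 2, 2, 2)]
print(should_reset(tapering))  # True
```

This also shows the failure mode described above: if the charger leaves absorb while current is still above the tail threshold, `held` never accumulates and the monitor never resynchronises to 100%.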
 