diy solar

Victron smart solar inaccurate voltage in app

dragonzzr · New Member · Joined Dec 1, 2020 · Messages: 31
After reading and searching both here and in the Victron Community forums for an answer, I am still clueless.
I have seen many users reporting deviations between what the app shows and the actual readings at the battery and SCC terminals; the deviation seems to be 0.1 V to 0.3 V, which in the case of lithium can be significant.
These reports go back to 2017 while others are still valid today, so I believe the firmware part is (almost) irrelevant.

My setup is dead simple: an Amperetime 12 V 50 Ah battery + a Victron SmartSolar 100/20.
Nothing else is connected/attached to this system so far. The cable connecting the battery and SCC is only 20 cm of 6 mm².
Three different voltmeters show 13.36 V at the battery terminals, and the same value at the SCC battery terminals.
But the app shows 13.29 V.

A 100 W solar panel then connects to the SCC and charging starts, with custom settings of 13.8 V bulk/termination voltage.
When it reaches the set point, the battery and SCC terminals show 14.1-14.2 V while the app shows 13.8 V.

Is this behavior normal for Victron controllers? Does anyone have a suggestion/solution?
 
When not charging, MPPT should report battery voltage and the voltage measured at the MPPT battery terminals.

When charging, MPPT will read higher than battery voltage, but should read the same as MPPT battery terminals.

0.07 V deviation on non-calibrated hardware is noise, BUT

The fact that your MPPT is reporting LOWER than battery voltage during charging suggests the unit is defective.

My suggestion is to contact your vendor for replacement.
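The decision logic above can be written out as a minimal sketch. The 0.1 V tolerance is an assumed combined meter/controller error band, not a Victron spec, and the readings are hypothetical:

```python
# Rough sanity check for MPPT-vs-battery voltage readings.
# TOLERANCE is an assumed combined meter/controller error band, not a spec.
TOLERANCE = 0.1  # volts

def check_readings(app_v, mppt_terminal_v, battery_v, charging):
    """Classify a set of readings as plausible or suspect."""
    # The app should always agree with the MPPT's own battery terminals.
    if abs(app_v - mppt_terminal_v) > TOLERANCE:
        return "suspect: app disagrees with MPPT terminals"
    if charging:
        # While charging, the MPPT terminals may sit above battery voltage
        # (cable/connection drop), but never meaningfully below it.
        if mppt_terminal_v < battery_v - TOLERANCE:
            return "suspect: MPPT reads below battery while charging"
    else:
        # At rest, everything should match.
        if abs(mppt_terminal_v - battery_v) > TOLERANCE:
            return "suspect: resting voltages disagree"
    return "plausible"

# The OP's charging case: app 13.8 V, terminals 14.1 V -> app disagrees.
print(check_readings(13.8, 14.1, 14.1, charging=True))
```

Run against the OP's numbers, the charging case trips the first check, which is exactly the "reporting LOWER than its own terminals" symptom.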
 
I agree with sunshine_eggo. Those are some odd readings. Normally, if there are connection issues, we see a higher voltage reported by the solar charge controller than what is observed at the battery.
 
I am seeing the same thing. I chased down every place I found a voltage drop and fixed the big issue.

Today I noticed the drop you mentioned. I checked the voltage at every point between the MPPT and the battery and found that the voltage from the MPPT, the voltage on both sides of the fuse, the voltage at the bus bar, and the voltage at the battery are all the same. The difference seems to be that the MPPT is simply reporting a slightly higher voltage than the voltmeter reads. It is a very small amount.
 
As I described, there are no other connections between the battery and charge controller, except that small 20 cm cable of 6 mm².
I just purchased the SCC and am still testing it before I put it into production.
There is an almost stable difference of 0.1 V to 0.12 V between the real battery voltage and the BT app.
I just came from the garage where I performed another charging test from 13.36 V (real battery voltage) to 14.00 V (real voltage), and the controller was reporting 13.29 to 13.9 V (user-defined bulk end).
At least this time it stayed at a 0.07 to 0.1 V deviation. Should I be happy?
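For what it's worth, a back-of-envelope Ohm's-law estimate (assuming standard copper resistivity and the 100/20's 20 A maximum; both are assumptions, not measured values) shows that a 20 cm run of 6 mm² cable can only account for a couple of hundredths of a volt:

```python
# Worst-case voltage drop across the 20 cm, 6 mm^2 battery cable,
# to show the cable itself cannot explain a 0.1 V discrepancy.
RHO_CU = 1.68e-8      # ohm*m, copper resistivity at ~20 C
length = 2 * 0.20     # m, round trip (positive + negative conductor)
area = 6e-6           # m^2 (6 mm^2 cross-section)
current = 20.0        # A, SmartSolar 100/20 maximum output

resistance = RHO_CU * length / area   # ~1.1 milliohm round trip
drop = current * resistance           # ~0.022 V at full current
print(f"Worst-case cable drop: {drop * 1000:.1f} mV")
```

So even at the controller's full 20 A, the cable drop is roughly 22 mV, far below the 0.07-0.12 V being reported.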
 
No, you should not be happy. The solar charge controller could charge the battery more than you want. Granted, 0.07 to 0.1 V isn't a huge discrepancy, but I wouldn't be too pleased about it.

If you remove the cables from the solar charge controller, what voltage reading do you get at the end of the wires? I want to eliminate the solar charge controller from the readings.
 
The values are as I said: 13.36 V at the battery terminals, with or without the controller, and also 13.36 V at the SCC battery terminals. The app reports 12.27-12.29 V.
The difference is only 0.07 V, but at the higher end of charging it ramps up to 0.12 V. The only safeguard I have is to set the bulk voltage to 13.9 V with only 5 minutes of absorption. Also, when monitoring the trends screen, there are (very small) spikes shown.
 
Where on the solar charge controller side are you getting your voltage measurement? On the terminal screws on top/front of the solar charge controller? Or are you checking at the cable itself? Rule out the solar charge controller by removing the cables from the solar charge controller and check the voltage at the solar charge controller end of the cables.
 
A 100 W solar panel then connects to the SCC and charging starts, with custom settings of 13.8 V bulk/termination voltage.
When it reaches the set point, the battery and SCC terminals show 14.1-14.2 V while the app shows 13.8 V.

This is 0.3-0.4V

There is an almost stable difference of 0.1 V to 0.12 V between the real battery voltage and the BT app.
I just came from the garage where I performed another charging test from 13.36 V (real battery voltage) to 14.00 V (real voltage), and the controller was reporting 13.29 to 13.9 V (user-defined bulk end).
At least this time it stayed at a 0.07 to 0.1 V deviation. Should I be happy?

This is potentially noise, but I'm concerned about the inconsistency. Is it the equipment or the tester?

The values are as I said: 13.36 V at the battery terminals, with or without the controller, and also 13.36 V at the SCC battery terminals. The app reports 12.27-12.29 V.
The difference is only 0.07 V, but at the higher end of charging it ramps up to 0.12 V. The only safeguard I have is to set the bulk voltage to 13.9 V with only 5 minutes of absorption. Also, when monitoring the trends screen, there are (very small) spikes shown.

I assume you mean 13.27-13.29.

From my perspective your story keeps changing.

If it's 0.3-0.4 V, that's a horror.

I recommend you push the bulk up to 14.4 V and evaluate.
 
Yes, I am sorry, my typing mistake; it was 13.27, not 12.27.
The rest is as described, though. I also checked the battery voltage with three different multimeters (all reported within a 0.03-0.05 V difference), so I can be sure that the differences between the controller values and the real ones exist within those ranges.
You mentioned noise... from what can this occur?
Having lived for the past 6 years with a reputable controller (Steca PR2020) and tested it under all conditions, I have never observed such issues.
OK, maybe PWM controllers and VRLA batteries are much more "forgiving" than very tight MPPT and lithium setups, but still, I am starting to believe that it's just not normal for a controller with Victron's reputation to "see" one voltage at its terminals and report a different one in the app. At least I am not alone; I have seen others in forums having the opposite problem, where the SCC reports HIGHER than the real voltage.
 
Is the firmware in the solar charge controller up to date? The VictronConnect app will prompt you to apply the firmware update if it detects a difference. The firmware code is embedded within the VictronConnect app. As long as the app has been updated recently from whatever source you got it from, it will have the latest firmware.

Otherwise, I think your issue is beyond the DIY community and you need to get your retailer involved to perform further troubleshooting/diagnostics.
 
Yes, latest firmware. Thank you all for your valuable information; I think I should turn to the vendor for further investigation.
 
Yes, I am sorry, my typing mistake; it was 13.27, not 12.27.
The rest is as described, though. I also checked the battery voltage with three different multimeters (all reported within a 0.03-0.05 V difference), so I can be sure that the differences between the controller values and the real ones exist within those ranges.
You mentioned noise... from what can this occur?

"noise" meaning you don't have sufficient standards to confirm accuracy. If one of your meters is a Fluke, then I'll capitulate.

Having lived for the past 6 years with a reputable controller (Steca PR2020) and tested it under all conditions, I have never observed such issues.
OK, maybe PWM controllers and VRLA batteries are much more "forgiving" than very tight MPPT and lithium setups,

This is not a thing.

but still, I am starting to believe that it's just not normal for a controller with Victron's reputation to "see" one voltage at its terminals and report a different one in the app. At least I am not alone; I have seen others in forums having the opposite problem, where the SCC reports HIGHER than the real voltage.

And that is normal comparing battery voltage (lower) to controller (higher), but controller should report the same voltage measured at the MPPT battery terminals.

Yes, latest firmware. Thank you all for your valuable information; I think I should turn to the vendor for further investigation.

Agree.
 
"noise" meaning you don't have sufficient standards to confirm accuracy. If one of your meters is a Fluke, then I'll capitulate.
It's not a Fluke, it's a Brymen. Accurate enough for this purpose, I guess.

And that is normal comparing battery voltage (lower) to controller (higher), but controller should report the same voltage measured at the MPPT battery terminals.
With a setup like this, how can it be normal to have different values at different points of the circuit?
Mind you, we are talking about a 20 cm cable distance and no other stuff connected, crimped, soldered, attached... whatever.
[Attachment: Problem.png]

From what I have already seen here and there, people even get different measurements between their Victron battery monitor and their Victron controller. So, besides the fact that I should probably check with the vendor about what is happening, I get the feeling that Victrons may work "perfectly" IF they are a correctly calibrated unit from the factory, which is probably not always the case.
 
Accuracy is a bitch, and calibration is something few consumers have in their toolbox.

I looked up the accuracy of the Brymen. It's 0.2% + 3 counts. That means a 14.00 V value can be ±0.06 V, a total error band of 0.12 V. Hopefully, you understand why a 0.07 V difference is "noise."
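That arithmetic, as a minimal sketch (the 0.01 V display resolution is an assumption for a typical DMM on this range; check the meter's manual for the actual figure):

```python
# Worked example of a "0.2% of reading + 3 counts" DMM accuracy spec.
# Assumes a 0.01 V display resolution on a 14 V reading (typical, but
# an assumption -- the real resolution depends on the meter's range).
def meter_uncertainty(reading, pct=0.2, counts=3, resolution=0.01):
    """Return the +/- error band for a spec of pct% of reading + counts."""
    return reading * pct / 100 + counts * resolution

err = meter_uncertainty(14.00)
print(f"14.00 V is really 14.00 +/- {err:.3f} V")  # +/- 0.058 V, ~0.12 V total span
```

With that band, two meters reading 0.07 V apart can both be within spec.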

I discovered the significance of meter accuracy when trying to measure the AC idle draw of a newly installed 2-ton A/C unit. Per the AC clamp meter, it was showing nearly 80 W. I knew there could be a 40 W draw due to the compressor heater, but that alone didn't make sense, and the OAT wasn't below 50°F, so the heater should have been off. Once I dug into the accuracy of the meter, I discovered that at this low current, the meter's error margin was at least 4X greater than the value I was trying to measure. Checking the load on the DC side, with a much smaller error, that 80 W was actually 7 W, which is likely just the standby draw of the unit.

I can't find an accuracy published for the MPPT voltage or current values. For their shunts, it's ±0.01V accuracy. I would not expect the MPPT to be quite that good, but it shouldn't be horrible.

The only thing working for you is the number of meters in use and the divergent behavior. You have no foundational claims of accuracy. You can just identify the Victron as an outlier. That may be real. It may not be. This is further complicated by your inconsistency in results. Sometimes, it's 0.3-0.4V error, but at other times, you identify the error as 0.12V max.

If you had a calibrated voltage reference, and your meters correlated to that reference, I would agree the Victron is out of spec. Right now, the Victron is just an outlier, and there's a chance it's the only accurate meter you have. Not likely, but possible. Still, given the correlation of the other meters, the probability is that the Victron is inaccurate, so I would pursue resolution there.

FYI, if I saw a consistent 0.07V difference between any measurement devices in my system, I wouldn't think twice about it. I also have the luxury of a voltage reference and a Fluke calibrated to a standard.
 
It was posts like the one sunshine_eggo just made that prompted me to purchase a voltage reference device and a better meter (Fluke). All my meters, even the dirt cheap ones, were within the margin of error and pretty much right on the money for most purposes. But it was nice to know that for sure.
 
If my life depends on the tool (or to keep my sanity in check), I will buy a good-quality tool. In this case, I need a dependable tool such as a Fluke meter with a calibration certification sticker (if $$ permits). That's me; others may have different thoughts.
 
Well, at least I now know what noise is in this context!
To tell the truth, I was aware of these minor deviations as far as the multimeter specs are concerned, but what made me really sceptical was the repeatability of the difference between what the controller app reports and what the multimeters report.
It's also strange that all of them were on the + side of the accuracy specs compared to the VictronConnect app, and none on the - side.
Anyway, it's good to see such knowledgeable people sharing their experiences; every day we all learn something.
 
You just solved the question I posed online tonight. You nailed it with the terms noise and calibration... it's a real thing in several other hobbies I'm into.
It can exist in commodity trading charts, in the interaction of the large-venue PA systems and stereo equipment I deal with (THD), in weather radar, and in many other things.
It's also called static by some, meaning the variable amount of useless information that is usually mixed in with the good information about the measurement of a "thing." If things are not calibrated in sync, you will get a spread in the good info.

Yes, my new Victron charger's Bluetooth display does not show what all my meters read under load, both at the battery terminals and at the output terminals of the IP22 12 V charger. The BT display says one voltage, and the two terminal points at the battery and at the charger outputs differ by 0.015 V from the BT reading. Everything else works great with the unit, and with the 24 V charger too.
I have already found a way to adapt, but it's annoying... but hey, whatever.
This thread shows I'm not alone in this quandary.
It's cool... no problem... tomorrow only brings new promise.

Each day a new problem is solved... soon I will know what the hell I'm doing.

J.
 