diy solar

Calibrating the Voltage of your system to ensure Optimal Operations: SCC, Inverter/Charger, voltage matching.

Steve_S
Offgrid Cabineer, N.E. Ontario, Canada
Joined Oct 29, 2019 · 7,758 messages · Rural NE Ontario Canada
Good Day everyone,

This is a topic that seems to be generally neglected, rarely mentioned or even considered in many posts. This is unfortunate, because components such as Solar Charge Controllers and Inverter/Chargers need to "know" the precise voltages being dealt with. With Lead Acid / AGM batteries there is a bit of elbow room, such as it is, but with Lithium-based batteries accurate voltage sensing is essential.

This is not a difficult process, but as equipment varies a great deal in how it is configured and what options it has, I cannot address individual components here; you will have to refer to the manuals for your particular equipment.

! You will require an accurate DVOM (Digital Volt Ohm Meter) or DMM (Digital Multi-Meter) to accomplish this task.

Simple Steps: Do this when there is no charging from the SCC, best just after sundown, so there is no solar activity.
  1. Ensure your batteries are charged and "at rest" meaning no loads or charging for 1 hour.
  2. The SCC and Inverter/Charger connected and ON; if you have a Buck Converter / Step-Down converter, have that on as well, BUT WITH NO LOAD.
  3. Take a voltage reading at the Battery Terminals (if only one pack) or at the BUS terminals if multiple packs are in parallel. Test "after" the BMS, as the BMS is on the battery side. NOTE the voltage as ##.## volts (e.g. 28.92 VDC or 14.86 VDC).
  4. Measure the voltage at the Inverter/Charger DC input terminals and again note it.
  5. Next, measure the voltage at the SCC "Battery Terminals" (not the solar input terminals). NB: the SCC should not be getting any sun, i.e. no PV input. NOTE the voltage seen.
You will now see a difference in readings between the Batteries, the SCC & the Inverter/Charger. This is the result of voltage drop: the wire and every single connector in between adds a bit of loss through the whole circuit, and this must be accounted for.

! ALERT ! If the discrepancy is more than 1 Volt you may have other problems, such as a loose connection, poor crimps or damaged wire / components. That must be addressed first; once done, redo the above readings.

The BATTERY reading (be it a single pack or a bank of packs) is the one that RULES, and the remaining equipment must be matched up to it to be effective.
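The arithmetic behind the steps above is simple enough to sketch. The readings below are the same hypothetical numbers used in the example that follows; nothing here is tied to any particular SCC or inverter.

```python
# Hypothetical at-rest readings (volts), taken per the steps above.
battery_v = 24.00   # DMM at the battery / bus terminals -- this is the reference
scc_v = 23.75       # DMM at the SCC battery terminals
inverter_v = 23.60  # DMM at the inverter/charger DC input terminals

# Each device's offset relative to the battery; positive means it reads low.
scc_offset = round(battery_v - scc_v, 2)
inverter_offset = round(battery_v - inverter_v, 2)
print(f"SCC offset: {scc_offset} V, inverter offset: {inverter_offset} V")

# Per the alert above: a discrepancy over 1 V points at a wiring fault,
# not something to calibrate around.
for name, offset in (("SCC", scc_offset), ("Inverter", inverter_offset)):
    if abs(offset) > 1.0:
        print(f"WARNING: {name} is off by {offset} V -- check connections first")
```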

Example using basic numbers to Keep It Simple:
Assume the Battery reads 24.0 VDC, the SCC reads 23.75 VDC and the Inverter/Charger sees 23.60 VDC.
IF the desired CHARGING cutoff is 24.0 VDC at the battery, then the SCC has to be corrected for its 0.25 V shortfall in readings. While charging, current flows from the SCC toward the battery, so the SCC's terminals sit above the battery voltage by that loss; programming the SCC to cut off @ 24.25 VDC means the battery itself reaches 24.0 VDC. The Inverter/Charger's "charge cutoff" would likewise be corrected to 24.40 VDC to compensate for its 0.40 VDC difference.

Same for LOW Volt Disconnect !
The Inverter will have its own LVD setting and this is really important. While 0.40 V is not a big difference, it can be if you want to keep to a very specific range, and with Lithium 0.40 V at the bottom edge CAN BE SIGNIFICANT ! Note that the direction flips on discharge: current now flows from the battery toward the Inverter, so the Inverter sees LESS than the battery under load. So IF you want the LVD to kick in when the cells reach 2.75 VDC each (22.0 VDC for a 24 V pack/bank), the Inverter's LVD setting has to be lowered by its 0.40 V offset, to 21.60 VDC. This way, when the Inverter/Charger sees 21.60 Volts it cuts off, and the actual batteries are at 22.0 VDC. (For reference, 21.60 VDC across 8 cells would be 2.70 V per cell if it were the true battery voltage.) * REMEMBER, below 2.80 V per cell the voltage drops very fast, as you are in the "bottom 20%" of cell capacity.
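Putting both corrections together as a sketch, using the example's numbers and an assumed 8-cell "24V class" LiFePO4 bank. One note on signs: while charging, current flows toward the battery and the devices sit above the battery voltage, so charge setpoints go up by the offset; under discharge the drop reverses and the inverter sees less than the battery, so in this sketch the LVD setpoint goes down by the offset.

```python
CELLS = 8
scc_offset = 0.25       # battery reading minus SCC reading, from calibration
inverter_offset = 0.40  # battery reading minus inverter reading

target_charge_cutoff = 24.0   # desired cutoff AT the battery terminals
target_lvd = 2.75 * CELLS     # 22.0 V at the battery (2.75 V per cell)

# Charging: device terminals sit ABOVE battery voltage -> raise setpoints.
scc_charge_setpoint = target_charge_cutoff + scc_offset
inverter_charge_setpoint = target_charge_cutoff + inverter_offset

# Discharging: inverter terminals sit BELOW battery voltage -> lower the LVD.
inverter_lvd_setpoint = target_lvd - inverter_offset

print(f"SCC charge cutoff:      {scc_charge_setpoint:.2f} V")      # 24.25
print(f"Inverter charge cutoff: {inverter_charge_setpoint:.2f} V") # 24.40
print(f"Inverter LVD:           {inverter_lvd_setpoint:.2f} V")    # 21.60
```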

Don't make the BMS do the work it shouldn't do...

The BMS will of course cut off for High Volt, Low Volt etc., but this is not its job. Those are "safety" features to protect your batteries, more or less the fail-safe mechanism, and as such they should not be doing that work as a matter of normal operations. Managing voltage limits on an ongoing basis is really the job of the SCC & Inverter/Charger. Continually making the BMS do this lifting can actually affect the BMS negatively and even cause burn-outs on FET-based systems as that repeated "abuse" is shifted to the BMS; it is not what they are designed to do.

I expect that some will want to dig into the minutiae and details... This is just a basic overview, and I am writing it while still working on Coffee #1, so I'm sure some typos and minor details slipped through. It is also quite GENERIC, because different equipment handles settings & configurations in its own way, which makes it pretty difficult (read: impossible) to address all the variables.

I hope this helps
Steve
 
Does this need to be checked periodically?
Would differences in ambient temperature skew the results in any significant way?
What about oxidation?
 
Once done properly it can stay, recheck every 6 months to make sure nothing changed.
As long as everything is at the same "normal" temp, the temperature variables are minimal. Obviously it's best to do this at normal temps between 18-25°C (64-77°F), not when temps are in the extremes.
Oxidation can introduce resistance and a bit of loss. To that end, I use a little bit of Ox-Guard (same as noalox) wherever I have connections but I am static on land. I believe it's certainly more important in a mobile type setup and an absolute must for Marine applications, especially on sea going vessels ! Salty Air is nasty !
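To put numbers on why oxidation matters: contact resistance that would be negligible in a small-signal circuit produces a real voltage error at inverter currents. A quick Ohm's-law sketch with illustrative values (neither the resistance nor the current is a measurement from this thread):

```python
# Illustrative: a few milliohms of oxidized connections at high current
# is enough to produce offsets the size of those discussed in this thread.
contact_resistance_ohm = 0.005  # 5 milliohms total across lugs/connectors
load_current_a = 50.0           # a modest inverter load on a 24 V system

drop_v = load_current_a * contact_resistance_ohm
print(f"Voltage drop across connections: {drop_v:.2f} V")  # 0.25 V
```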
 
Actually it would be really interesting to see how things change over time. I think I'm gonna put a log book in my solar closet and write these numbers down periodically. Great post!!
That is a good idea to log it and track it as it can also provide you with a longer view of the charge rate & performance.

BTW... this kind of calibration is not a new idea at all... it's actually discussed a bit in the Midnite Solar controller docs, as making sure everything is matched is an important detail. I believe it also comes up in the Samlex documentation, which is almost "too detailed", but that's good because they really cover a lot of minute details and tweaks.

It is unfortunate that @Will Prowse hasn't mentioned this particular issue, at least not that I have seen, and I believe I have downloaded & viewed pretty much all of his videos off YT. I do not recall any postings about it from Will either. This is one of those "little points" that can make a big difference. I have seen some threads where I believe this was the issue, but neither the poster nor the responders knew the solution to the problem lurking in the shadows. There are just so many "little details" that it's easy to lose track of some.
 
Do you know of a device to check voltage/amps (calibration device?)
 
Maybe the wrong answer to your question but is that not why we use DVOM's ?
That is what I use... a goof ProPoint with all the extras including AC/DC Clamp attachment.
What is a "goof ProPoint with all the extras including AC/DC Clamp attachment." Predictive type?
 
Pro Point is a brand for my DVOM.
I have tested it against other meters and devices and it is very accurate, though there is a bit of a learning curve. It is NOT a Fluke, but damned close TBH. Only pet peeve: it eats batteries fast, so you have to keep a spare 9V and 3 AAAs in the pouch to replace them. The price was too good, I took the chance, and I ended up happy with it.
 
I agree with creating layers of protection, with the BMS fail-safe switch being the last layer. My impression is that variable current is the enemy of accurate voltage reading on autonomous charging and load-consuming devices with LVD/HVD. If we can purchase devices with a dedicated voltage-sense input, and use the voltage-sense wire correctly, we do not need to consider voltage drop for a given load, since a dedicated sense wire carries no load current. The Victron Smart Battery Sense does this wirelessly for Victron MPPTs. Here is a video of my moment of discovery where I learned this.
 
This video is actually about a separate subject from calibration. It should be in a separate thread.
I guess I was confused about why you would need to calibrate voltages at the SCC and Inverters IF you had a separate voltage-sense wire feeding them, which I believe is the best option. It is not necessary to calibrate voltages on the Victron MPPT controllers IF you use the Smart Battery Sense module. The calibration in this article would only apply to a specific wiring run with a specific current (or lack of current). Am I wrong?
 
Don't make the BMS do the work it shouldn't do...
The BMS will of course cut off for High Volt, Low Volt etc., but this is not its job. Those are "safety" features to protect your batteries, more or less the fail-safe mechanism, and as such they should not be doing that work as a matter of normal operations. Managing voltage limits on an ongoing basis is really the job of the SCC & Inverter/Charger. Continually making the BMS do this lifting can actually affect the BMS negatively and even cause burn-outs on FET-based systems as that repeated "abuse" is shifted to the BMS; it is not what they are designed to do.
Hi Steve.
Ok I am confused [I know, what's new LOL o_O:p ].
If we don't use the BMS as the protector for the voltage, then what are we supposed to use to automatically turn everything off if the voltage gets too low, or stop the charging if it gets too high?
I know the charger should turn off automatically, but what if I don't want it at 100% but at 90% instead, to preserve the batteries and give them a longer life?

Thanks Steve. :)
 
You can program the SCC & Inverter to cut off at the lower voltage.
The BMS can be programmed similarly and it could force cut off at a lower "full" voltage and a higher Low Volt Cutoff, but that should also be reflected in the SCC & Inverter. Unfortunately this is easy to say but with so many types of SCC's & Inverters it really depends on your equipment and how they work and what capabilities they have.

As for "turning everything off" that can mean anything. Obviously, the Inverter should be disconnected if Batt Volts are too low to prevent damage but the SCC will be needed to charge the batteries when the sun comes up. The SCC should go to 'low volt float' once the battery bank reaches the desired voltage. Again the trick is with the different SCC's how they can be programmed and how they read the battery status.
 
Hi Steve.
Ok I am confused [I know, what's new LOL o_O:p ].
If we don't use the BMS as the protector for the voltage, then what are we supposed to use to automatically turn everything off if the voltage gets too low, or stop the charging if it gets too high?
I know the charger should turn off automatically, but what if I don't want it at 100% but at 90% instead, to preserve the batteries and give them a longer life?

Thanks Steve. :)

Many (maybe most?) people consider the BMS to be the 'backstop' protection (i.e. last layer of protection) that shouldn't come into play in normal operation.

So your understanding of the function of a BMS is correct; it's just that many / most people (with some notable exceptions) consider it the second layer of protection.

As for stopping charging at 90%: in my opinion it should be your charger (or your BMS controlling the charger, in some setups) that is configured to cut charging at 90%.
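Since LiFePO4 voltage is very flat through the middle of its range, a charger (or a BMS driving a charger) that targets ~90% typically has to count amp-hours rather than watch voltage. A minimal, hypothetical coulomb-counting sketch; the capacity, current, and step size are all illustrative, not from any specific charger:

```python
CAPACITY_AH = 280.0  # illustrative pack capacity
TARGET_SOC = 0.90    # stop charging at ~90% state of charge

soc_ah = 0.80 * CAPACITY_AH  # assume the charge starts at 80%
current_a = 32.0             # steady charge current (illustrative)
dt_h = 1 / 60                # integrate in one-minute steps

# Coulomb counting: amp-hours in = current x time, accumulated each step.
while soc_ah < TARGET_SOC * CAPACITY_AH:
    soc_ah += current_a * dt_h

# Lands slightly past 90% because of the step size.
print(f"Charger stops at {soc_ah / CAPACITY_AH:.1%} state of charge")
```

The point of the sketch is that the *charger* ends the charge at the target, so the BMS's high-voltage cutoff stays what it should be: an untouched fail-safe.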
 
You can program the SCC & Inverter to cut off at the lower voltage.
The BMS can be programmed similarly and it could force cut off at a lower "full" voltage and a higher Low Volt Cutoff, but that should also be reflected in the SCC & Inverter. Unfortunately this is easy to say but with so many types of SCC's & Inverters it really depends on your equipment and how they work and what capabilities they have.

As for "turning everything off" that can mean anything. Obviously, the Inverter should be disconnected if Batt Volts are too low to prevent damage but the SCC will be needed to charge the batteries when the sun comes up. The SCC should go to 'low volt float' once the battery bank reaches the desired voltage. Again the trick is with the different SCC's how they can be programmed and how they read the battery status.
Ok this is embarrassing. I will be using it on my mobility scooter. So no inverter and nothing solar either. I suppose I should have mentioned that LOL.
Ok, so a scooter: 8 batteries at 280 Ah.
Motor, charger, BMS for 8 batteries. It will have 2 relays as well.
I am getting the Chargery if that helps.
 
Many (maybe most?) people consider the BMS to be the 'backstop' protection (i.e. last layer of protection) that shouldn't come into play in normal operation.

So your understanding of the function of a BMS is correct; it's just that many / most people (with some notable exceptions) consider it the second layer of protection.

As for stopping charging at 90%: in my opinion it should be your charger (or your BMS controlling the charger, in some setups) that is configured to cut charging at 90%.
Thank you. That was my thought as well. To charge to 90% and to not let it go below 10% of charge. That 80% in the middle is more than I will probably ever need.
 
Is your charger programmable? Can the end of charge voltage be customized?
 