Good Day everyone,
This is a topic that seems to be generally neglected, rarely mentioned or even considered in many posts. This is unfortunate, because components such as Solar Charge Controllers and Inverter/Chargers need to "know" the precise voltages they are dealing with. With Lead Acid / AGM batteries there is a bit of elbow room, such as it is, but with lithium-based batteries accurate voltage sensing is essential.
This is not a difficult process, but because equipment varies a great deal in how it is configured and what options it offers, I cannot address individual components here; you will have to refer to the manuals for your particular equipment.
! You will require an accurate DVOM (Digital Volt Ohm Meter) or DMM (Digital Multi-Meter) to accomplish this task.
Simple Steps: Do this when there is no charging from the SCC, best just after sundown, so there is no solar activity.
- Ensure your batteries are charged and "at rest," meaning no loads or charging for 1 hour.
- Have the SCC and Inverter/Charger connected and ON; if you have a Buck Converter / Step-Down converter, have that on as well, BUT NO LOAD.
- Take a voltage reading at the Battery Terminals (if only one pack) or at the BUS terminals if multiple packs are in parallel. Test "after" the BMS, as the BMS is on the "batt side." NOTE the voltage as ##.## volts (IE 28.92 VDC or 14.86 VDC).
- Measure the voltage at the Inverter/Charger DC input terminals and again note it.
- Next measure the voltage at the SCC "Battery Terminals" (not the solar input terminals). NB: The SCC should not be getting any sun, no input. NOTE the voltage seen.
You will now see a difference in readings between the Batteries, the SCC & the Inverter/Charger. This is the result of "deration": essentially, the wire and every single connector in between add a bit of loss through the whole circuit, and this must be addressed. ! ALERT ! If the discrepancy is more than 1 Volt you may have other problems, such as a loose connection, poor crimps, or damaged wire / components. Those must be fixed first; once done, redo the above readings. The BATTERY reading (be it a single pack or a bank of packs) is the one that RULES, and the remaining equipment must "match up" to it to be effective.
Example using basic numbers to Keep It Simple:
Assume the Battery reads 24.0 VDC, the SCC reads 23.75 VDC and the Inverter/Charger sees 23.60 VDC.
IF the desired CHARGING cutoff is 24.0 VDC, then the SCC would have to be corrected for the 0.25 V shortfall in its reading, so it would be programmed to cut off @ 24.25 VDC. The Inverter/Charger "charge cutoff" would likewise have to be corrected to 24.40 VDC to compensate for its 0.40 VDC difference.
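The correction rule the example applies can be sketched in a few lines of Python. This is just an illustration of the arithmetic in this post, not any device's firmware; the function name and the 1 V wiring-fault threshold from the ALERT above are my own framing:

```python
# Sketch of the correction rule in the example above:
# offset = (reading at the battery) - (reading at the device), and the
# device's setpoint is shifted by that offset so the cutoff happens at
# the desired voltage "at the battery end".

def corrected_setpoint(target_v, battery_v, device_v):
    """Shift a device setpoint by the measured battery-vs-device offset."""
    offset = battery_v - device_v
    if offset > 1.0:
        # Per the ALERT above: more than 1 V points to loose connections,
        # poor crimps, or damaged wire -- fix those before correcting.
        raise ValueError("Offset over 1 V: check wiring first")
    return round(target_v + offset, 2)

# Numbers from the example: battery 24.0 V, SCC 23.75 V, inverter 23.60 V
scc_cutoff = corrected_setpoint(24.0, 24.0, 23.75)       # -> 24.25
inverter_cutoff = corrected_setpoint(24.0, 24.0, 23.60)  # -> 24.4
print(scc_cutoff, inverter_cutoff)
```

The same function works for any of the setpoints discussed here, as long as the offset was measured per the steps above.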
Same for LOW Volt Disconnect !
The Inverter will have its own LVD setting, and this is really important. While 0.40 V is not a big difference, it can be if you want to keep to a very specific range, and with Lithium, 0.40 V at the bottom edge CAN BE SIGNIFICANT ! So you would have to correct the voltage the Inverter/Charger sees so that it cuts off exactly at the voltage specified "at the battery terminal end". So IF you want the LVD to kick in when the cells reach 2.75 VDC each / 22.0 VDC for the 24V pack/bank, the LVD setting will have to be adjusted to 22.40. This way, when the Inverter/Charger sees 22.40 Volts it cuts off as the actual batteries are at 22.0 VDC. 21.60 VDC = 2.70 V per cell (uncorrected). * REMEMBER that below 2.80 V per cell the voltage drops very fast, as you are in the "bottom 20%" of cell capacity.
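The per-cell-to-pack arithmetic above can be spelled out the same way. This is a sketch of this post's numbers only (8 cells in series for a 24 V nominal bank, and the 0.40 V inverter offset from the example); your cell count and measured offset will differ:

```python
# Per-cell to pack-level LVD math for an 8-cell (24 V nominal) bank,
# using the same offset correction as the charging example. The cell
# count and 0.40 V offset are this thread's example values.

CELLS_IN_SERIES = 8

def pack_voltage(per_cell_v):
    """Pack voltage for a given per-cell voltage."""
    return per_cell_v * CELLS_IN_SERIES

# 2.75 V/cell -> 22.0 V pack target for the LVD
lvd_target = pack_voltage(2.75)

# Apply the 0.40 V the inverter reads away from the battery
inverter_offset = 0.40
lvd_setting = round(lvd_target + inverter_offset, 2)

print(lvd_target, lvd_setting)   # 22.0 and 22.4
print(pack_voltage(2.70))        # the "uncorrected" 21.6 V figure quoted above
```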
Don't make the BMS do the work it shouldn't do...
The BMS of course will cut off for High Volt, Low Volt, etc., but this is not its job; those are "safety" features to protect your batteries and are more or less the "fail-safe mechanism". As such, they should not be doing that work as a matter of normal operation. This is really the job of the SCC & Inverter/Charger to manage on an ongoing basis. Continually using the BMS to do this lifting can actually affect the BMS negatively and even cause burnouts on FET-based systems as repeated "abuse" is shifted onto the BMS; it is not what they are designed to do.
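One way to act on this advice is a quick sanity check that your corrected SCC/inverter setpoints sit comfortably inside the BMS protection window, so the BMS never trips in normal operation. A small sketch; the BMS limits and 0.5 V margin here are illustrative values, not from any particular BMS:

```python
# Illustrative check: operating setpoints should sit inside the BMS
# protection limits with some margin, so the BMS "fail-safe" never does
# the day-to-day work. All numbers here are example values.

BMS_HIGH_CUTOFF = 29.2   # example BMS over-voltage trip (24 V bank)
BMS_LOW_CUTOFF = 21.0    # example BMS under-voltage trip

def check_setpoints(charge_cutoff, lvd, margin=0.5):
    """Warn if SCC/inverter setpoints crowd the BMS protection limits."""
    problems = []
    if charge_cutoff > BMS_HIGH_CUTOFF - margin:
        problems.append("charge cutoff too close to BMS high-volt trip")
    if lvd < BMS_LOW_CUTOFF + margin:
        problems.append("LVD too close to BMS low-volt trip")
    return problems

# Corrected setpoints from the examples above: healthy margins, empty list
print(check_setpoints(charge_cutoff=24.40, lvd=22.40))
```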
I expect that some will want to dig into minutiae and details... This is just a basic overview, and I am writing this while still working on Coffee #1, so I'm sure typos and some minor details may be left out. This is also quite GENERIC, because different equipment handles settings & configurations in its own way, and that makes it pretty difficult (read: impossible) to address all the variables.
I hope this helps
I'm a bit late coming to this party, but I use an EVO in my motorhome, recently upgraded to 2 x 100Ah 12V batteries, and a search of the site led me here.
Reading this post has me a bit confused, but I think I know what you are trying to say.
-> I think you want to use the DVM to measure the voltage at the battery, then compare that to the voltage on the display of your SCC, inverter and charger. You want to see what each device's internal voltage reading is in comparison to each other so you can correct for each device's internal measurement error.
What I read in your instructions is to use the DVM to read the voltage at the battery terminals of each device. Under your instructions of "no load" (i.e. no current flowing between the battery and any of the devices), all the readings should be the same: with no current flow there are no voltage drops (IR losses; zero current times any resistance = zero voltage drop), and you are using the same voltage measurement device at each terminal.
When current is flowing (charging or consuming), any resistance in the circuit will cause a voltage drop. In my small motorhome there is no way to get my EVO1212F closer to the batteries without major renovations to the cabinetry and loss of storage space. When Bulk charging at 30Amps, I have a drop of 0.45V between the EVO and the batteries. This voltage drop then decreases during Absorption as the current decreases. This means I spend less time in Bulk and more time in Absorption.
There is about 8 ft of #1 cable and a couple of short #4 jumpers; the negative side runs through the aluminum body framework. As someone else mentioned, remote voltage sensing would correct this issue. I contacted Samlex about it and they acknowledged it would be a good idea, but of course refused to help me implement it on my unit. Correcting this will be a project for this winter.
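The figures in this reply can be sanity-checked with Ohm's law. A rough sketch; the ~0.12 mOhm/ft resistance for #1 AWG copper is an approximate published figure, and the jumper, connection, and chassis-return resistances are unknowns:

```python
# Rough Ohm's-law check of the figures above: 0.45 V drop at 30 A Bulk.

measured_drop_v = 0.45   # reported drop between the EVO and the batteries
bulk_current_a = 30.0    # reported Bulk charge current

# Total implied circuit resistance: 0.45 / 30 = 0.015 ohm (15 mOhm)
implied_r_ohm = measured_drop_v / bulk_current_a
print(f"implied resistance: {implied_r_ohm * 1000:.1f} mOhm")

# 8 ft of #1 AWG copper alone (~0.12 mOhm/ft) is only about 1 mOhm, so
# most of the drop lives in jumpers, connections, and the chassis return.
cable_r_ohm = 8 * 0.12e-3
print(f"cable alone: {cable_r_ohm * 1000:.2f} mOhm")

# As Absorption current tapers, the drop shrinks proportionally,
# which is why the error is worst during Bulk:
for amps in (30, 20, 10, 5):
    print(f"{amps:>2} A -> {amps * implied_r_ohm:.3f} V drop")
```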