diy solar

Calibrating the voltage of your system to ensure optimal operation: SCC, Inverter/Charger, voltage matching.

Good Day everyone,

This is a topic that seems to be generally neglected; it is rarely mentioned or even considered in many posts. This is unfortunate, because components such as Solar Charge Controllers and Inverter/Chargers need to "know" the precise voltages being dealt with. With Lead Acid / AGM batteries there is a bit of elbow room, such as it is, but with lithium-based batteries accurate voltage sensing is essential.

This is not a difficult process to do, but as equipment varies a great deal in how it is configured and what options it has, I cannot address individual components as such; you will have to refer to the manuals for your particular equipment.

! You will require an accurate DVOM (Digital Volt Ohm Meter) or DMM (Digital Multi-Meter) to accomplish this task.

Simple Steps: Do this when there is no charging from the SCC; just after sundown is best, so there is no solar activity.
  1. Ensure your batteries are charged and "at rest," meaning no loads or charging for 1 hour.
  2. Have the SCC and Inverter/Charger connected and ON; if you have a Buck (Step-Down) Converter, have that on as well, BUT NO LOAD.
  3. Take a voltage reading at the battery terminals (if only one pack) or at the bus terminals if multiple packs are in parallel. Test "after" the BMS, as the BMS is on the "batt side". Note the voltage as ##.## volts (e.g. 28.92 VDC or 14.86 VDC).
  4. Measure the voltage at the Inverter/Charger DC input terminals and again note it.
  5. Next measure the voltage at the SCC "Battery Terminals" (not the solar input terminals). NB: the SCC should not be getting any sun, i.e. no input. Note the voltage seen.
You will now see a difference in readings between the batteries, the SCC, and the Inverter/Charger. This is the result of what I call "deration": essentially, the wire and every single connector in between adds a bit of loss through the whole circuit, and this must be addressed.

! ALERT ! If the discrepancy is more than 1 volt you may have other problems, such as a loose connection, poor crimps, or damaged wire/components. That must be addressed first; once done, redo the readings above. The BATTERY reading (be it a single pack or a bank of packs) is the one that RULES, and the remaining equipment must "match up" to it to be effective.

Example using basic numbers to Keep It Simple:
Assume the Battery reads 24.0 VDC, the SCC reads 23.75 VDC and the Inverter/Charger sees 23.60 VDC.
IF the desired CHARGING cutoff is 24.0 VDC, then the SCC would have to be corrected for the 0.25 V shortfall in its reading, so it would be programmed to cut off @ 24.25 VDC. The Inverter/Charger's charge cutoff would then also have to be corrected, to 24.40 VDC, to compensate for its 0.40 VDC difference.

Same for the LOW Volt Disconnect!
The inverter will have its own LVD setting, and this is really important. While 0.40 V is not a big difference, it can be if you want to keep to a very specific range, and with lithium, 0.40 V at the bottom edge CAN BE SIGNIFICANT! So you would have to correct the voltage the Inverter/Charger sees so that it cuts off exactly at the voltage specified "at the battery terminal end". So IF you want the LVD to kick in when the cells reach 2.75 VDC each (22.0 VDC for the 24 V pack/bank), the LVD setting will have to be adjusted to 22.40 VDC. This way, when the Inverter/Charger sees 22.40 volts it cuts off, as the actual batteries are at 22.0 VDC. (Uncorrected, it would let the batteries fall to 21.60 VDC = 2.70 V per cell.) * REMEMBER that below 2.80 V per cell the voltage drops very fast, as you are in the "bottom 20%" of cell capacity.
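For anyone who wants to script the arithmetic, here is a minimal Python sketch of the correction described above. The function name and readings are illustrative (they are just the example numbers from this post, not any real equipment API):

```python
# Minimal sketch of the setpoint correction described above.
# The readings are the illustrative numbers from the example, not real data.

def corrected_setpoint(target_at_battery, battery_reading, device_reading):
    """Shift a device's setpoint by the offset between what it reads
    and what an accurate meter reads at the battery terminals."""
    offset = battery_reading - device_reading   # how far the device reads low
    return round(target_at_battery + offset, 2)

battery = 24.00   # DVOM reading at the battery/bus terminals -- this one rules
scc     = 23.75   # what the SCC's display shows at the same moment
inv     = 23.60   # what the inverter/charger's display shows

# Charge cutoff of 24.0 V at the battery:
print(corrected_setpoint(24.0, battery, scc))   # 24.25, programmed into the SCC
print(corrected_setpoint(24.0, battery, inv))   # 24.4, inverter/charger cutoff

# Low-voltage disconnect of 22.0 V at the battery (2.75 V/cell, 8S):
print(corrected_setpoint(22.0, battery, inv))   # 22.4, inverter LVD setting
```

Note this assumes each device's offset stays roughly constant; re-check the readings after any wiring or connector change.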

Don't make the BMS do work it shouldn't do...
The BMS of course will cut off on high voltage, low voltage, etc., but this is not its job; those are "safety" features to protect your batteries, more or less the fail-safe mechanism, and they should not be doing that work as a matter of normal operation. Managing this on an ongoing basis is really the job of the SCC and Inverter/Charger. Continually using the BMS to do this lifting can actually affect the BMS negatively and even cause burnouts on FET-based systems as repeated "abuse" is shifted to the BMS; it is not what they are designed to do.

I expect that some will want to dig into minutiae and details... This is just a basic overview, and I am writing it while still working on coffee #1, so I'm sure typos and some minor details may be left out. This is also quite GENERIC, because different equipment handles settings & configurations in its own way, and that makes it pretty difficult (read: impossible) to address all the variables.

I hope this helps
Steve
Steve,
I'm a bit late coming to this party, but I use an EVO in my motorhome and recently upgraded to 2 × 100 Ah 12 V batteries, and a search of the site led me here.
Reading this post has me a bit confused, but I think I know what you are trying to say.
-> I think you want to use the DVM to measure the voltage at the battery, then compare that to the voltage on the display of your SCC, inverter and charger. You want to see what each device's internal voltage reading is in comparison to each other so you can correct for each device's internal measurement error.

What I read in your instructions is to use the DVM to read the voltage at the battery terminals of each device. Under your instructions of "no load" (i.e. no current flowing between the battery and any of the devices), all the readings should be the same: if no current flows, there are no voltage drops (IR losses; zero current times any resistance equals zero voltage drop), and you are using the same voltage measurement device at each terminal.

When current is flowing (charging or consuming), any resistance in the circuit will cause a voltage drop. In my small motorhome there is no way to get my EVO1212F closer to the batteries without major renovations to the cabinetry and loss of storage space. When Bulk charging at 30Amps, I have a drop of 0.45V between the EVO and the batteries. This voltage drop then decreases during Absorption as the current decreases. This means I spend less time in Bulk and more time in Absorption.
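As a sanity check on those numbers, Ohm's law ties the 0.45 V drop at 30 A to a fixed cable-run resistance, which predicts how the drop shrinks as the absorption current tapers. A rough sketch, ignoring temperature and connector effects:

```python
# Back-of-envelope: 0.45 V lost at 30 A implies the round-trip resistance.
resistance = 0.45 / 30.0           # R = V / I = 0.015 ohm (15 milliohm)

# The drop scales with current, so it shrinks as absorption tapers off:
for amps in (30, 20, 10, 5):
    drop = amps * resistance       # V = I * R
    print(f"{amps:2d} A -> {drop:.3f} V drop")
```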

There is about 8ft of #1 cable and a couple of short #4 jumpers, the negative side is through the aluminum body framework. As someone else mentioned, remote voltage sensing would correct this issue. I contacted Samlex about it and they acknowledged it would be a good idea, but of course refused to help me implement it on my unit. Correcting this will be a project for this winter.
 
Warning, operating on Mugga Joe #1, 3rd sip, so not quite "full on" yet.

I say "under no load," but that isn't strictly true, because the equipment is ON; just no appliances/devices drawing juice otherwise. The inverter, SCC, etc. all take a bit (it varies with their size), but that is as neutral as anyone can get. In my case I have a 0.5 V difference (and I'm running fine-strand 4/0, but a 14' battery cable run), so not entirely unexpected. That can be bad given how much millivolt accuracy counts with LFP (any Li-ion, actually).

The gotcha is that some equipment, like Midnite SCCs for instance, has a compensation/correction feature, but the Samlex inverters don't, so you have to play with the voltage settings to compensate. Every product is different.

Readings will be different under load and will vary with the amount of load being pulled. Charging will similarly be different as well. Further complicating the charge push is the concurrent diversion to ongoing loads, servicing whatever is drawing power, which will vary in any "in use" solar system.

The thicker the wires and the finer the strands, the less loss over distance. You mention #1 & short #4; I am assuming that is #1 AWG and not 1/0 AWG, a huge difference. See the Royal Excelene table below. TBH, I would have used at least 1/0, very fine strand copper like Excelene (some of the best wire available), and made ALL battery wires, including your jumper wires as you call them, the same gauge & length with identical lugs, etc. Even tinned vs. bare copper can make a difference; I did not believe it till I saw it myself. Little lessons learned, all of which carried $$ for the education.
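To put rough numbers on the gauge question, here is a quick comparison using standard published copper-resistance figures (ohms per 1000 ft at 20 °C; treat them as approximate). Note this counts wire resistance only, not lugs or crimps, which add more:

```python
# Approximate copper resistance from standard AWG tables, ohms per 1000 ft at 20 C.
OHMS_PER_1000FT = {"#4": 0.2485, "#1": 0.1239, "1/0": 0.0983, "4/0": 0.0490}

def cable_drop(gauge, feet, amps):
    """Voltage lost in `feet` of cable at `amps` (wire resistance only,
    ignoring connectors and crimps)."""
    r = OHMS_PER_1000FT[gauge] * feet / 1000.0   # ohms for this run
    return amps * r                              # V = I * R

# Compare an 8 ft run at 30 A bulk charge current:
for g in ("#1", "1/0", "4/0"):
    print(f"{g}: {cable_drop(g, 8, 30) * 1000:.1f} mV over 8 ft at 30 A")
```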

NB: In your case, due to mobile use and fishing wire through potentially nasty holes, I would highly suggest UL Listed Super Excelene with the CPE jacket (really tough stuff) to prevent any abrasion issues. Is it possible to use the existing fished wires to "pull" new wire through?

Link to Super Excelene Wire: http://industrial.southwire.com/en/tile/10/spec/70300/?country=CA

Regular Excelene Table Non-UL:
[attached image: Excelene ampacity table]

Royal Excelene UL Listed CPE Sheathed Cable Table:
[attached image: Royal Excelene UL CPE ampacity table]
 
Pro Point is a brand for my DVOM.
I have tested this against other meters and devices, and it is very accurate, though there is a bit of a learning curve. It is NOT a Fluke, but damned close, TBH. Only pet peeve: it eats batteries fast, so you have to keep a spare 9V and 3 AAAs in the pouch to replace them. The price was too good, I took the chance, and I ended up happy with it.
I use a Fluke 88 Automotive Kit at home and also a Fluke 80i-410 AC/DC clamp meter. Most solar measurements can be made with that setup. But the digital multimeter I use in my calibration lab at work is the Bugatti of DMMs: a Fluke 8508A with 8.5 digits of resolution and a basic DC accuracy of 3 PPM, and it cost my lab $14,000.00. I keep my test equipment at home calibrated using a Fluke 5522A Multiproduct Calibrator, also expensive at $24,000.00.

But do you know what having all the nice test equipment has done for my knowledge of solar systems? Not a damned thing. I can make measurements like there is no tomorrow, but when it comes to solar and battery technology I am a damned knuckle-dragging nincompoop. I was drawn to this article because you used the term calibration, so I was like, boy oh boy, something I do sort of understand.

You are right about one thing: generally speaking, DMMs these days are cheap and moderately accurate given the resolution they have. Most of the monitoring voltmeters that come with an ATS or charge controller, etc. are usually around 100 mV resolution, so I am always biting my tongue when using them. I was interested in that one meter with a clamp you linked to, but the web page is expired. Come to think of it, I didn't check the age of this thread. Whoopsie.
 
I too have access to equipment that is way over any price point that it should be at ... BUT my grab and go meter is a https://www.amazon.com/gp/product/B07Z398YWF/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1
Wait a minute. If I read the information right, that one will measure both AC and DC current? I have a similar meter at home, but it only measures AC current; I think it is an Amprobe ACD-21SW. I know what my next purchase will be. I like my Fluke, but it is an older model: the zero is always having to be adjusted, and since the zero and full-scale adjustments are not labeled, I have accidentally adjusted the full-scale adjustment and had to bring the unit into work to be re-calibrated. I think next time I will put a void label over the full-scale. Thanks for linking that meter. I will either buy one or have my oldest son buy me one. I like the all-in-one convenience of that type of meter.

Edit: If you have a decent current source you can use loops to calibrate the high end of that meter. For example, 10 loops at 10 amps will make the clamp read 100 amps; I use 25 loops at 20 amps to give me 500 amps on the clamp. Plenty high enough for our needs. Fluke makes a nice 5500A/COIL, but we are back to crazy prices for what is effectively a 50-turn coil of wire.
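The loop trick works because the clamp responds to the total current enclosed by its jaw, so N passes of the same conductor multiply the reading by N. A trivial sketch of the arithmetic:

```python
# N turns through the clamp jaw -> the clamp sees N times the true current.
def clamp_reading(true_amps, turns):
    """What the clamp displays when the conductor passes through `turns` times."""
    return true_amps * turns

def actual_current(displayed_amps, turns):
    """Recover the real current from a multi-turn reading."""
    return displayed_amps / turns

print(clamp_reading(10, 10))      # 100 -- 10 A through 10 loops
print(clamp_reading(20, 25))      # 500 -- 20 A through 25 loops
print(actual_current(500, 25))    # 20.0 -- back to the true current
```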
 
I use a Fluke 88 Automotive Kit at home and also a Fluke 80i-410 AC/DC clamp meter. Most solar measurements can be made with that setup. But the digital multimeter I use in my calibration lab at work is the Bugatti of DMMs: a Fluke 8508A with 8.5 digits of resolution and a basic DC accuracy of 3 PPM, and it cost my lab $14,000.00. I keep my test equipment at home calibrated using a Fluke 5522A Multiproduct Calibrator, also expensive at $24,000.00.

But do you know what having all the nice test equipment has done for my knowledge of solar systems? Not a damned thing. I can make measurements like there is no tomorrow, but when it comes to solar and battery technology I am a damned knuckle-dragging nincompoop. I was drawn to this article because you used the term calibration, so I was like, boy oh boy, something I do sort of understand.

You are right about one thing: generally speaking, DMMs these days are cheap and moderately accurate given the resolution they have. Most of the monitoring voltmeters that come with an ATS or charge controller, etc. are usually around 100 mV resolution, so I am always biting my tongue when using them. I was interested in that one meter with a clamp you linked to, but the web page is expired. Come to think of it, I didn't check the age of this thread. Whoopsie.
Back in the day when I had my electronics shop it was fully Fluked out, but those days are long past. Today most DMMs/DVOMs are pretty darn good. Not "lab-grade milspec," but hey, do we need to be that anal? Seriously? NO, obviously. But it is the individual's level of ExLax consumption they have to decide on, LOL...

The thing with solar systems is that they are BUGGERS! To put it simply, the whole deal plays weird depending on whether you are charging or not, as two sets of rules apply, and you have to find happiness between the strands of "razor wire" (insert image of straddling military razor wire). There IS ouch potential!
Lead is "brute force" and can actually take abuse to a fair limit.
Lithium, on the other hand, is NOT brute force and is quite mean if abused; it'll burn your keister for revenge!
LiFePO4, thank goodness, is a POLITE lithium chemistry, hence why it is best for ESS and most especially DIY, but it still mustn't be abused.
While millivolts always count, they are ever more important with lithium chemistries; when equipment reads voltages differently and acts on those readings (cutting off charge or discharge to protect the system and the batteries), it has to be right.

BTW: Several meters with amp clamps can do AC or DC without issue. Finding one with decimal-place accuracy (i.e. 000.000) gets trickier (it's a bitch, really).

Good Luck, Have Fun.
 
Back in the day when I had my electronics shop it was fully Fluked out, but those days are long past. Today most DMMs/DVOMs are pretty darn good. Not "lab-grade milspec," but hey, do we need to be that anal? Seriously? NO, obviously. But it is the individual's level of ExLax consumption they have to decide on, LOL...

The thing with solar systems is that they are BUGGERS! To put it simply, the whole deal plays weird depending on whether you are charging or not, as two sets of rules apply, and you have to find happiness between the strands of "razor wire" (insert image of straddling military razor wire). There IS ouch potential!
Lead is "brute force" and can actually take abuse to a fair limit.
Lithium, on the other hand, is NOT brute force and is quite mean if abused; it'll burn your keister for revenge!
LiFePO4, thank goodness, is a POLITE lithium chemistry, hence why it is best for ESS and most especially DIY, but it still mustn't be abused.
While millivolts always count, they are ever more important with lithium chemistries; when equipment reads voltages differently and acts on those readings (cutting off charge or discharge to protect the system and the batteries), it has to be right.

BTW: Several meters with amp clamps can do AC or DC without issue. Finding one with decimal-place accuracy (i.e. 000.000) gets trickier (it's a bitch, really).

Good Luck, Have Fun.
Many of the measurements I make at home rarely care about accuracy, but rather the relative presence of whatever I am measuring. Yesterday I was more interested in how the total current being produced by my three panels was distributed between each panel. Each panel was pushing 4 amps and the array was pushing 12 amps into the house, so I was happy, not really caring about the exact amount of current. Yes, another member linked me to an AC/DC all-in-one clamp. I paid 70 dollars for a used Fluke 80i-410 when I could have paid 50 bucks for the one linked, and it's easier to use. Seems even an old-dog calibrator can still learn about TMDE. Thanks, man, and it is a load of fun when everything is working. When it's not, my Mr. Fix-It gene kicks in and shit goes sideways when I am out of my element.
 
I have a 48 volt lithium iron phosphate battery pack. What do I want my high and low voltages to be set at? 45 and 51.3?
 
Do you know of a device to check voltage/amps (calibration device?)
If you are talking about a device that would calibrate the meter you are using to check the voltage and amps I would suggest this:


If you want a good meter that will give the most accurate readings then I suggest this:


Disclaimer: I am just joking around. Basically those are the Lamborghini of meters and meter calibrators. Don't listen to me when it comes to measurement science. The guys here are far more realistic with their advice.
 
This article and this forum resource are great places to start your learning, and give you the knowledge and confidence to make those determinations on your own.
I looked at both of your references and am confused as to why my battery mfg told me 100 percent was 51.3 and that I could run it down as low as 45 or 46. Your references are a lot different, and I'm not sure which to believe now. I have been reading as much as I can on here and still haven't found the info I'm looking for. Thanks for being the librarian.
 
I looked at both of your references and am confused as to why my battery mfg told me 100 percent was 51.3 and that I could run it down as low as 45 or 46. Your references are a lot different, and I'm not sure which to believe now. I have been reading as much as I can on here and still haven't found the info I'm looking for. Thanks for being the librarian.
I am curious: what is the make and model of your battery set? Obviously you have a 48-volt setup going. I also looked at the charts, and I agree the numbers are significantly different from what your supplier gave you; to be honest, even between websites and application sites the values differ greatly. Too many minds trying to do too many different things, and all of that has an influence on what your best practices are for using and maintaining your batteries: some people want short-term power, others want to be in that happy place that gets the longest life out of the battery. I am just a calibration tech, but I work at a postgraduate school, so I bet I can find an engineer who can answer that question. Something tells me it all comes down to the performance of a single cell, and everything else is just put into series or parallel with the numbers adjusted accordingly. Or not.
 
I am curious: what is the make and model of your battery set? Obviously you have a 48-volt setup going. I also looked at the charts, and I agree the numbers are significantly different from what your supplier gave you; to be honest, even between websites and application sites the values differ greatly. Too many minds trying to do too many different things, and all of that has an influence on what your best practices are for using and maintaining your batteries: some people want short-term power, others want to be in that happy place that gets the longest life out of the battery. I am just a calibration tech, but I work at a postgraduate school, so I bet I can find an engineer who can answer that question. Something tells me it all comes down to the performance of a single cell, and everything else is just put into series or parallel with the numbers adjusted accordingly. Or not.
Thank you. I have 9 × 200 Ah Rosen powerwalls bought from China. I want everything I can get out of them for the longest I can have it; it would be nice to get 20 years out of them before they need to be replaced. That said, I'm using them in an off-grid setup north of MT, and it would be nice to know how much capacity they actually have before I might need to start my generator in a cold winter like we've just been through.
 
I looked at both of your references and am confused as to why my battery mfg told me 100 percent was 51.3 and that i could run it down as low as 45 or 46.
It may have something to do with the chemistry and how many cells or cell groups in series. Do you know what the configuration actually is?
51.3 ÷15 = 3.42 per cell.
 
I looked at both of your references and am confused as to why my battery mfg told me 100 percent was 51.3 and that i could run it down as low as 45 or 46.
my guess would be either:
  1. Something was lost in translation, or possibly a misunderstanding by you, them, or both.
  2. Your battery 'manufacturer', or the specific sales or CS rep you spoke with, doesn't have the technical depth of understanding to give correct technical information (bear in mind most brands, even those that imply or claim to be manufacturers, don't actually design or manufacture what they are selling).
  3. Maybe these are 15S packs ("48V" for LiFePO4 typically = 16S, but it's possible they've gone with a 15S config, which would make the numbers make slightly more sense).
Do you have a link to what you bought? Or to the documentation or datasheet?
 
Your references are a lot different and Im not sure which to believe now.
Pretty much all reputable resources and companies are roughly in line with one another as to the working range and limits for LiFePO4 cells, with some slight variations. There are a couple of unique outliers, like Winston or Battleborn, which have their own quirks due to chemistry or some other factor, but that is the exception to the rule.

Your 'manufacturer' is giving numbers that are extremely out of sync with the rest of the industry (specifically the 51.3V = 100% claim) assuming they are LiFePO4 cells in a typical "48V" 16S configuration. You should ask them for documentation, ideally from the cell manufacturer that confirms the numbers they are telling you.

What @fblevins1 tells you is correct: it all comes down to single-cell voltages, and everything else is just an extrapolation of those numbers by multiplying by the number of cells in series (×4 for "12V" (12.8 V), ×8 for "24V" (25.6 V), ×16 for "48V" (51.2 V)). Because individual cell specifications are what matter most here, the most authoritative document with respect to proper voltage limits (in my eyes) is the spec sheet from the cell manufacturer. If you buy prebuilt batteries it may not be possible to get this, but it is worth asking for, or at least asking the CS rep you are speaking with to confirm it themselves on the spec sheet or with an engineer.
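The series-extrapolation point can be checked with trivial arithmetic. This sketch assumes the standard LiFePO4 nominal figure of 3.2 V per cell and simply multiplies by the series count, then reads the disputed 51.3 V claim against possible cell counts:

```python
# Pack voltage is just per-cell voltage times the number of cells in series.
NOMINAL_CELL = 3.2   # LiFePO4 nominal volts per cell

def pack_voltage(cell_volts, cells_in_series):
    return round(cell_volts * cells_in_series, 2)

for s in (4, 8, 16):
    print(f"{s}S nominal: {pack_voltage(NOMINAL_CELL, s)} V")

# Reading the 51.3 V = 100% claim against cell counts:
print(round(51.3 / 16, 3))   # 3.206 V/cell if 16S -- far below a full LiFePO4 cell
print(round(51.3 / 15, 3))   # 3.42 V/cell if 15S -- much closer to a full cell
```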
 
It may have something to do with the chemistry and how many cells or cell groups in series. Do you know what the configuration actually is?
51.3 ÷15 = 3.42 per cell.
I do not, but I now know what question to ask the mfg. Thank you. Even if there were 15 cells, isn't 3.42 per cell low?
 
my guess would be either:
  1. Something was lost in translation, or possibly a misunderstanding by you, them, or both.
  2. Your battery 'manufacturer', or the specific sales or CS rep you spoke with, doesn't have the technical depth of understanding to give correct technical information (bear in mind most brands, even those that imply or claim to be manufacturers, don't actually design or manufacture what they are selling).
  3. Maybe these are 15S packs ("48V" for LiFePO4 typically = 16S, but it's possible they've gone with a 15S config, which would make the numbers make slightly more sense).
Do you have a link to what you bought? Or to the documentation or datasheet?
Let me see what the vague 3-page owner's manual says...
 
Pretty much all reputable resources and companies are roughly in line with one another as to the working range and limits for LiFePO4 cells, with some slight variations. There are a couple of unique outliers, like Winston or Battleborn, which have their own quirks due to chemistry or some other factor, but that is the exception to the rule.

Your 'manufacturer' is giving numbers that are extremely out of sync with the rest of the industry (specifically the 51.3V = 100% claim) assuming they are LiFePO4 cells in a typical "48V" 16S configuration. You should ask them for documentation, ideally from the cell manufacturer that confirms the numbers they are telling you.

What @fblevins1 tells you is correct: it all comes down to single-cell voltages, and everything else is just an extrapolation of those numbers by multiplying by the number of cells in series (×4 for "12V" (12.8 V), ×8 for "24V" (25.6 V), ×16 for "48V" (51.2 V)). Because individual cell specifications are what matter most here, the most authoritative document with respect to proper voltage limits (in my eyes) is the spec sheet from the cell manufacturer. If you buy prebuilt batteries it may not be possible to get this, but it is worth asking for, or at least asking the CS rep you are speaking with to confirm it themselves on the spec sheet or with an engineer.
Just a guess, but I think BYD might be the manufacturer of the LiFePO4 batteries used in the Rosen powerwalls, as they show BYD as a partner. If that is the case, then the specifications for the BYD standard 3.2-volt cell exactly match the original references given for a LiFePO4 battery, not the values given to him by Rosen Solar. But since his is a powerwall (a nice one, too), who knows; maybe they set some kind of limit (BMS setpoints? do I even know what I am talking about?). I get confused because I never know whether the voltage values given are loaded or unloaded, or whether a floating value means unloaded, and if you do load it... how much of a load do you need to give a reasonable state of charge? Ugggg.
 