Your Charge Controller "Charges" the line between your Battery.

Oh yea, man. I totally agree with you; that sweet spot of 3.45 for both charge and disconnect voltage would be great! But I recently have been running in parallel a second, super cheap 200Ah Chinese battery with no BMS app or settings, and so, I picked 3.55v - a balance between the batteries. I think the cheapo battery disconnects at 3.6 because they do disconnect at about the same time, but I have not babysat them to see the exact voltage the cheapo turns on/off. They are close. But if I could, I'm all about that sweet spot of 3.45.

And so you know - and I did state this in my crazy controversial post about how I configured my charge controller - when I run my system it is never full, and when I don't use the system I charge it up, then disconnect and turn everything off...

I also did say very clearly at the top of the post that this system isn't perfect, but it not only works, I ACTUALLY AM able to get a lot more power out of the system, albeit at the cost of abandoning what I consider to be overly cautious "industry norm" parameter settings...

I also go through the whole shebang of qualifying exactly what components are in the system, clearly stating it's for small LFP systems that do not have data sharing, etc. We are getting there - resolve the delta discrepancy: is it the wire, is it only my system, is it still a problem with a larger wire connected directly to the battery from the controller... etc.

Thanks again for the help and information; it's been really great working with such a knowledgeable community.
Don't worry, it's the wire; if it's not, you're on the way to inventing an interplanetary FTL drive.
 
You are using this comment and drawing incorrect conclusions. A battery under charge has an equivalent resistance. The resulting battery terminal voltage is defined by Ohm's law, given that resistance and the small but finite resistance of the wire. The SCC voltage is usually marginally higher; this higher voltage is SOLELY down to the resistance of the wire, and any voltage difference is lost as heat in the wire.

What is absolute technical nonsense is the idea that the charger voltage is held higher than the battery terminal voltage to push more current. That's technically IMPOSSIBLE. Yes, given a powerful charger you can raise the charger voltage and yes, more current will flow into the battery (usually, if it's not fully charged), but the battery terminal voltage, i.e. at the wire end, is the same as the charger's, less any resistance drop in the wire.

Your suggestion to allow the charger to rise to 15V when your BMS cutoff is set lower is USELESS with a proper low-resistance cable. The BMS will disconnect the battery when the charger voltage exceeds its setting.

Note that HVC should not be used to terminate charging anyway.
Hi, yes, others have pointed out that Browneyes' statement didn't reflect the large 1 volt delta being seen from the controller terminal voltage to the battery terminal voltage. I have been thanking everyone for their help and learning as much as I can from this great community.

I have stated this several times now in replies to different people's posts, but yes, there are several technicalities I need to address. Basically the first thing I am doing is swapping out the 8 AWG wire for high-quality #6 AWG, and I will connect the controller directly to the battery. Then I will take the readings and report my findings.

Once I have that issue under control, I will re-evaluate my settings and parameters, for sure.

A 200Ah+ LFP battery that is halfway discharged can absorb a lot of energy; how this reflects in real-world calculations, along with all the other excellent science and math, should line up with real-world results.
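For a rough sense of the scale involved, here is a back-of-the-envelope sketch (assuming a 12.8v nominal pack and the 25A charge current mentioned elsewhere in this thread):

```python
# Back-of-the-envelope numbers (assumed: 12.8 V nominal pack, 25 A charge current
# quoted elsewhere in this thread).
capacity_ah = 200
nominal_v = 12.8
charge_current_a = 25

energy_kwh = capacity_ah * nominal_v / 1000         # ~2.56 kWh in the whole pack
to_replace_ah = capacity_ah / 2                     # half-discharged -> 100 Ah to put back
hours_to_refill = to_replace_ah / charge_current_a  # ~4 h at a steady 25 A

print(f"Pack energy: {energy_kwh:.2f} kWh")
print(f"Time to replace {to_replace_ah:.0f} Ah at {charge_current_a} A: {hours_to_refill:.1f} h")
```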

That is why I am asking people to tell me their system specs, how they use their system, and if I can ask you:

Can you please take a reading at your charger and battery terminals at peak sun tomorrow? I am trying to build a data set, and if you could double-check your actual terminal readings at peak sun and report the numbers, that would be greatly appreciated. (And if you have apps, screenshot the values in the apps too.)

Again - thank you for all the help and the great community knowledge.
 
Don't worry, it's the wire; if it's not, you're on the way to inventing an interplanetary FTL drive.
Disclaimer: the next statement is a tongue-in-cheek joke/hypothetical analogy/conceptual conveyance of a far-out concept (it's way too late tonight).

My Battery is a Black Hole and the Event Horizon is somewhere on the line from the Battery to the Charge Controller! lol
 
I picked 3.55v
After your quote I added a further explanation to clarify what I think may be happening. This is just conjecture and you may be able to verify if in fact that is what is happening.
Your setting is 3.8 volts per cell, but at a 25 Amp charging current the voltage drop reduces that to 3.55 volts at the battery. However, when the charge controller hits what it thinks is 3.8 volts per cell, the current begins tapering down from 25 Amps. The charge controller is still putting out 3.8 volts per cell, but as the current drops so does the voltage drop, which results in the voltage at the battery increasing. Eventually, as the current tapers further, the BMS sees the battery voltage hitting the disconnect voltage, which is presumably 3.65 volts per cell, and the BMS does its thing and disconnects the battery.
It's late at night and I will reflect on this in the morning, but I would be curious whether minds wiser than mine think this is a possible explanation.
 
After your quote I added a further explanation to clarify what I think may be happening. This is just conjecture and you may be able to verify if in fact that is what is happening.
Your setting is 3.8 volts per cell, but at a 25 Amp charging current the voltage drop reduces that to 3.55 volts at the battery. However, when the charge controller hits what it thinks is 3.8 volts per cell, the current begins tapering down from 25 Amps. The charge controller is still putting out 3.8 volts per cell, but as the current drops so does the voltage drop, which results in the voltage at the battery increasing. Eventually, as the current tapers further, the BMS sees the battery voltage hitting the disconnect voltage, which is presumably 3.65 volts per cell, and the BMS does its thing and disconnects the battery.
It's late at night and I will reflect on this in the morning, but I would be curious whether minds wiser than mine think this is a possible explanation.
Yea, it's late; I can't wind down, but I can't think anymore either. I hope to tear into it tomorrow - hope it's not too hot. Thanks again for the great community and all the details. I really missed a lot of points and I think growing pains are good. It will mean I have a rock-solid system - the last thing you want is a failure during... who knows what might be coming next... Will report on all progress. Thanks again.
 
After your quote I added a further explanation to clarify what I think may be happening. This is just conjecture and you may be able to verify if in fact that is what is happening.
Your setting is 3.8 volts per cell, but at a 25 Amp charging current the voltage drop reduces that to 3.55 volts at the battery.

That's a fair bit of wire drop, but OK. A good MPPT controller should have a separate battery sense line; Victron uses its VE.Smart Bluetooth networking to feed back the actual voltage at the battery terminals to the MPPT chargers. This compensates for voltage drop in the high-current cable.
However, when the charge controller hits what it thinks is 3.8 volts per cell, the current begins tapering down from 25 Amps. The charge controller is still putting out 3.8 volts per cell, but as the current drops so does the voltage drop, which results in the voltage at the battery increasing.

The solar charger can only taper the current by reducing its voltage. The normal way is that the load tapers the current: in a battery, the current tapers because the load (battery) resistance has risen. Hence for a given charger voltage the battery is determining the current, not the charger (unless the charger doesn't have enough power available).

The voltage at the battery is Ohm's law: battery resistance times the current flowing.
Eventually, as the current tapers further, the BMS sees the battery voltage hitting the disconnect voltage, which is presumably 3.65 volts per cell, and the BMS does its thing and disconnects the battery.

BMS HVC cutoff should not be used to control charging. The charger should be programmed to stop charging itself. HVC cutoff is a safety trip, not a normal operation.
It's late at night and I will reflect on this in the morning, but I would be curious whether minds wiser than mine think this is a possible explanation.
Good stuff. It's true that if you don't sense the battery voltage via a sense mechanism, then your MPPT settings will need to be adjusted up slightly to compensate for wire voltage drop. Nothing new or controversial in that.
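To make that concrete, here is a minimal sketch using numbers pulled from this thread; the ~40 mΩ round-trip resistance is an assumption chosen only to reproduce the reported ~1 V drop at 25 A:

```python
# A minimal sketch of the taper effect described above. Assumed numbers, from this
# thread: 15.2 V controller setpoint (3.8 V/cell), roughly 40 mOhm of round-trip
# wire/connection resistance (i.e. ~1 V of drop at 25 A), and a BMS high-voltage
# cutoff of 3.65 V/cell (14.6 V for a 4S pack).
setpoint_v = 15.2
wire_r_ohm = 0.040
bms_hvco_v = 4 * 3.65

for current_a in (25, 20, 15, 10, 5, 2):
    battery_v = setpoint_v - current_a * wire_r_ohm  # drop shrinks as the current tapers
    note = "  <-- above BMS cutoff" if battery_v > bms_hvco_v else ""
    print(f"{current_a:>2} A charge current: ~{battery_v:.2f} V at the battery{note}")
```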
 
Hi, when you say "ground wire length" are you talking about the "black negative wire"? If so, it is the same size as the positive wire.

As for a "ground, ground" - like in modern home AC systems - there is no ground from the panels to the controller, battery, or inverter. They are all connected ONLY by positive and negative cables. My charge controller may be grounded because it is screwed to the metal in my RV.

Also, I have been asking people to take a reading of their Charge Controller Terminals and Battery Terminals at Peak Sun, so I can build a baseline of what is actually being reported "in the wild" and have some data sets to compare. If you could do that it would be greatly appreciated.

PS: Also spec out your system, and how you use it. Thanks again for all the great information and community.

PPS: I am tearing everything apart and swapping out/upgrading the wire tomorrow, checking all the fittings and connections and I will connect the charger directly to the battery and take a reading with the larger #6 wire and report the findings - thanks again!
Yes, I was talking about the negative wire when I was asking about the ground wire. Ground and negative are often used as synonyms, but maybe it's clearer if we talk about the negative wire.

The reason I was asking is that people sometimes forget to consider the negative wire's voltage loss/drop. (Negative-wire voltage drop is really common with car "under the hood" problems but is often overlooked or totally forgotten.)

I'd strongly recommend that you find the exact cause of your voltage drop before going and "blindly" replacing components and wires!
8 AWG _should_ be enough for 30A over a 6-foot round trip (3 ft positive + 3 ft negative wire), so you have something funky going on.

Set up your system so that you have a decent charging current going to the battery and start taking voltage loss measurements over the parts:
1. CC positive terminal to battery positive. Try to measure from the terminal itself, not cable lug
2. CC negative terminal to battery negative. Try to measure from the terminal itself, not cable lug

Both readings should be ballpark 0.05 volts at 30A charge current. If one is a lot bigger than the other, start going through the connections on that side.
Next you can measure the voltage drop between the terminal and the cable lug (if you use cable lugs). Again, you should see a very small voltage, ballpark 0.03 volts at 30A.
After that, measure from the cable lug to the wire strands themselves. If a little bit of wire strand is visible between the insulation and the cable lug, you can put one probe on the strands and the other probe on the lug. I would use a sharp needle-point multimeter probe to pierce the wire insulation to get a reading from the strands. Again, you should see a very small voltage, ballpark 0.03 volts at 30A.
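For reference, here is the arithmetic behind those ballpark figures (a sketch using the published ~0.63 mΩ/ft resistance of 8 AWG copper and the 3 ft per leg assumed above):

```python
# Quick sanity check on the "8 AWG should be enough" figure above (a sketch; 0.6282
# mOhm/ft is the standard resistance of 8 AWG copper, 3 ft per leg as assumed above).
r_per_ft_ohm = 0.6282e-3
leg_length_ft = 3
charge_current_a = 30

drop_per_leg_v = charge_current_a * r_per_ft_ohm * leg_length_ft
print(f"Expected drop per leg:    {drop_per_leg_v * 1000:.0f} mV")      # ~57 mV
print(f"Expected round-trip drop: {2 * drop_per_leg_v * 1000:.0f} mV")  # ~113 mV
# A measured drop much larger than this points at a bad crimp, lug, or connection
# rather than the wire itself.
```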

I don't have "my own" system, as it is barely in the planning stages and solar power here at 61N is kind of limited with 4 months of darkness. :rolleyes:
Now, before you jump to the conclusion "that guy doesn't even have his own solar panel system, yet he is telling me my system is set up wrong" ;) I'll have to say that I have 30 years' experience in electronics design, fault finding and troubleshooting.
 
That's a fair bit of wire drop, but OK. A good MPPT controller should have a separate battery sense line; Victron uses its VE.Smart Bluetooth networking to feed back the actual voltage at the battery terminals to the MPPT chargers. This compensates for voltage drop in the high-current cable.
The central issue here is the one volt drop. The OP has verified a one volt loss from the charge controller to the battery. It could be in the wire, it could be in a connection or two, and some of it could be in the bus bar, which apparently is brass. Brass is a poor conductor, notwithstanding the fact that the bus bar is rated for 250 Amps. I can put 250 Amps through a big enough piece of steel without heating the steel, but the resistance of the steel will still cause a voltage drop. Most of my shunts use steel as the conductor because of the resistance.
It has been suggested that the OP test the voltage drop at several points along the path. So far he has not done that. Replacing one wire section may or may not fix the issue. A sense wire will only reveal what we already know; a sense wire will not fix the voltage drop issue.
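For scale, here is a rough sketch comparing a copper bar with a brass bar of the same (hypothetical) dimensions; only the resistivity ratio is the point:

```python
# Rough copper-vs-brass comparison (a sketch; the bar dimensions are hypothetical,
# the point is only that brass is roughly 4x the resistivity of copper).
rho_copper = 1.7e-8   # ohm-metres
rho_brass  = 7.0e-8   # ohm-metres, alloy dependent
length_m   = 0.15     # ~6 inch bar (assumed)
area_m2    = 0.020 * 0.003  # 20 mm x 3 mm cross-section (assumed)
current_a  = 25

for name, rho in (("copper", rho_copper), ("brass ", rho_brass)):
    drop_mv = current_a * rho * length_m / area_m2 * 1000
    print(f"{name} bar: ~{drop_mv:.1f} mV at {current_a} A")
```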
 
Much of the focus seems to be on improving the connection between his charge controller and his battery. I think this is all good advice, so I won't add to that. Other things that I've noticed, which I'm not sure have been addressed clearly:
  1. The OP has some serious misunderstandings about how charging works, even after the 8 pages of discussion. Charge controllers do not have a "high voltage disconnect" setting. Charging is stopped when it is completed, based on the charge profile settings of the charger. He should read up on and understand the concept of 3-stage charging, and the terms "bulk voltage", "constant current phase", "absorption voltage", "constant voltage phase", "tail current", "max absorption time", and "float". Many (most?) of these terms have acronyms depending on the device, but almost all adequate charging devices will have these or similar settings.
  2. On the first page of this thread the OP posted a BMS screen shot that showed his cells were getting wildly out of balance while charging. This appeared to me that he probably did not top balance the cells before assembling the battery. I searched and did not find any discussion about top balancing in the thread, so if someone did point that out already I apologize.
  3. I know it has been said before, but I want to emphasize: The purpose of the BMS is to prevent damage to the cells, and only that. A disconnect by the BMS means that something is wrong, and changes should be made to remedy whatever is wrong.
  4. For a 4S LiFePO4 battery, most experienced people here would say that a charge voltage (bulk / absorption) of 14V is high enough (that's what I use), and many prefer something more like 13.8V. The OP has posted several screen shots showing some 15V or higher, which of course will always eventually push the BMS into HVCO.
Not to take anything away from all the helpful comments that everyone has given, but I think that some losses due to wire and connections are acceptable, so the OP should not be striving to completely eliminate any warmth in the wire or voltage drop (1V is clearly too much). At some point you simply have to account for the fact that there will be some loss when charging at higher current, and that the losses will drop as the battery approaches fully charged. In one of my systems that has a high-current generator as the primary charging source, I set the bulk voltage a bit higher than the absorption voltage. The charger sees that higher voltage while the battery is still a fair amount lower. It then switches into a lower absorption voltage, and the two observed voltages approach each other.
 
That is why I am asking people to tell me their system specs, how they use their system
It is overcast today and I may only get a 20 Amp charging current. My earlier measurements do not show any loss. I do use 4/0 cable; even though the maximum I can charge at is 100 Amps, I rarely see even that many Amps into or out of the battery.

I will describe my system beyond what is in my signature. I have a 42 kWh pack in my garage connected to my hybrid inverter. My hybrid is set for Self Consumption Mode, which means I only use the grid for backup. Last night was typical in that I used about 8 kWh from when the sun went down until the solar started covering my loads and the battery stopped discharging. Normally in summer the batteries are charged before noon, and for the rest of the day my solar covers the loads and any extra gets sold to the grid. Overall I generate about 45 kWh a day and average about 26 kWh per day sold to the grid. I charge two EVs and run some other loads outside the inverter. I won't go into the nuances of Net Energy Metering because it is not germane to this thread.
 
Charge controllers do not have a "high voltage disconnect" setting. Charging is stopped when it is completed, based on the charge profile settings of the charger.
Correct; his has a Boost setting and it is set at 15.2 volts, which presumably compensates for the one volt loss in the wiring. That would mean the battery only sees 14.2 volts during the CC stage, which reportedly runs at 25 Amps.
During charging the voltage delta is 0.04 volts, which I would not call "wildly out of balance". However, that reading was taken only at the beginning of the knee (3.4 volts per cell), and we do not know what it looks like when the BMS actually cuts out. It could be a runner cell or total voltage, since his charge controller is set to 15.2 volts, or 3.8 volts per cell.
 
Correct; his has a Boost setting and it is set at 15.2 volts, which presumably compensates for the one volt loss in the wiring. That would mean the battery only sees 14.2 volts during the CC stage, which reportedly runs at 25 Amps.
Right, but he has the max boost time set to 10 hours (600 minutes). So as the battery reaches full charge the current drops, so the voltage lost in the connections decreases, and the battery voltage climbs toward 15.2V, no matter how much loss he has on the wire. So he will ALWAYS have a BMS HVCO.
 
Right, but he has the max boost time set to 10 hours (600 minutes).
I did not notice the Boost time. I did notice the float voltage of 15.2 volts but did not comment on that. Thank you for confirming the assumption in my late-night post that as the current tapers in boost, the voltage drop will decline such that the battery could see as much as 15 volts (3.75 volts per cell), thus triggering the BMS.
I still think that if the voltage drop in the wiring system could be reduced, he could lower his charge voltage setting and have the charge controller turn off like most systems do. Notice that I said "wiring system" because it could be anywhere along that path and not just one wire.
 
I found an older Renogy app that lets me set my SCC High Voltage Disconnect. My guess is that many people just can't access that setting.
Well then I guess I should temper my statement a bit, and say that very few SCCs would have such a setting. I've never seen it. My guess is that the High Voltage Disconnect in an old PWM SCC would be another term for Bulk/Absorption voltage. My personal opinion is that it is a bad term.

Thanks for the correction though.
 
With what you have said, as well as so many others in this forum (thank you, by the way) - what do you recommend I do?

I still recommend doing a 'proper' (series) voltage drop test to confirm the numbers/math add up to the measured voltages on each end; they should all add up.


What system are you running? Do you have modern hardware? Do you have LFP, or Lead Acid?

Modern Victron SCCs with DIY LFP batteries, copper bus bars, and a Victron BMV-712 shunt in between 2 negative bus bars.


What Charge Controller are you using? What is the reading at the posts of your charge controller terminals, and battery terminals during peak sun? Do you see any Vdroop?"

Thanks again.


Sure no prob...

So I did say what kind of system I have before, to some extent, but I can list it again (all in one place) real quick. I was able to go out and take a couple of quick measurements while my batteries were charging full-bore; both charge controllers were pegging at their maximum current of 100A output each. The Victrons were running very warm for sure.

My configuration:

-I have 2 Victron 250|100 charge controllers (each has its own 8-panel solar array, so 16 panels, 405w each, 4s2p wired).

-I am using 8 AWG cable on the PV solar circuit (from panel banks, each through a respective DC combiner box, then to each Victron SCC)

-I am using 2 AWG cable going from each of those Victron charge controllers to the main copper bus bars (one bus bar for + and two bus bars for - cable attachments on each side of shunt), and as such, the 2 AWG size is the largest gauge that can fit into the Victron SCC connection lugs, and what their manual says to use there (couldn't go bigger, unless I cut off some strands on a larger gauge cable).

-I am running from the copper bus bars to batteries, 3 SEPARATE sets of 2/0 cables, to each PAIR of 2x 12v battery packs

-Charging to a full bank of six, 280Ah 12v LFP DIY packs (1680Ah, 22KWh), with 6 Overkill 120a BMSs (JBD BMS)


My observations...

I do have a little bit of voltage drop in the circuit between charge controller and batteries.

-Sample voltage measured at Victron SCC terminal lugs is about 14.15v
-Sample voltage measured at the batteries connectors is about 13.79v

If I do a proper voltage drop test on the positive cable (place the + lead of voltmeter to the Victron +, and put the - lead of voltmeter to the + of the battery connection), I measure .2v drop in that run.

If I also do a proper voltage drop test on the negative cable in the same way (place the + lead of voltmeter to the battery connection -, and put the - lead of voltmeter to the - of the Victron), I measure .17v drop in that run.

So the math all adds up: add those two voltage drop numbers together from the positive and negative cable tests and you get .37v. For reference: 14.15v - 13.79v = .36v (.01v off, but oh well, it basically adds up almost nuts-on).
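Putting rough numbers on what that drop costs (a sketch, assuming roughly the 100A each controller was pushing when these readings were taken):

```python
# What that drop costs in heat (assuming roughly the 100 A each controller was
# pushing when these readings were taken).
drop_pos_v = 0.20
drop_neg_v = 0.17
current_a  = 100

total_drop_v  = drop_pos_v + drop_neg_v     # 0.37 V, matching 14.15 - 13.79 ~ 0.36 V
power_lost_w  = total_drop_v * current_a    # ~37 W dissipated in that cable run
pct_of_charge = total_drop_v / 14.15 * 100  # ~2.6 % of the charge voltage

print(f"Series drop {total_drop_v:.2f} V -> ~{power_lost_w:.0f} W of heat, ~{pct_of_charge:.1f} % loss")
```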

Measuring the temperature of the wires connecting the Victron SCC to the bus bar (the 2 AWG cables) with an infrared temperature gun, I see 105° F (the cable is rated for 221° F max).

I consider this to be about as good as I can get, since Victron doesn't make larger battery connection lugs on the 250|100. I could use bigger cables and trim out a few strands to make them fit the lugs if I wanted to try to lower the cable temperature, but I am willing to accept the minor loss; the energy from the sun is free anyway, so it's just the cost of doing business at a reasonable price-vs-expectation point.


A little bit different scale than your system, but the workflow for checking out voltage drop is the same. In your case, the loss is not acceptable, given the full 1v drop being seen here.
 
I think you will find the Renogy high voltage disconnect relates to disconnecting the battery from the load terminals; it's not related to charging.
 
Well then I guess I should temper my statement a bit, and say that very few SCCs would have such a setting. I've never seen it. My guess is that the High Voltage Disconnect in an old PWM SCC would be another term for Bulk/Absorption voltage. My personal opinion is that it is a bad term.

Thanks for the correction though.
I have a Renogy Rover MPPT charger and their older control app (Renogy BT) has many user settings that their current app doesn't. I would bet that many SCCs have settings that you just can't access.
 
I think you will find the Renogy high voltage disconnect relates to disconnecting the battery from the load terminals; it's not related to charging.
That may be true. I'll see if I can verify that. One thing that I notice is that the value of the High Voltage Disconnect differs according to battery type. Wouldn't that indicate it's related to charging, and not the load terminal? Edit: In the user manual High Voltage Disconnect is listed under battery charging parameters.
 
@Michael77
The sun came out and my inverter is charging the battery at 60 Amps, and I am seeing a 0.1 volt drop from the inverter/charger terminal to the battery. That is for a nominal 48 volt system. The actual readings were 53.9 volts at the battery and 54.0 volts at the inverter/charger. That amounts to about two tenths of one percent (0.002). Compare that with one volt on a 14 volt system, which is about seven percent (0.07). As mentioned throughout this thread, that is a lot of voltage loss.
 
Compare that with one volt on a 14 volt system, which is about seven percent (0.07). As mentioned throughout this thread, that is a lot of voltage loss.
That's also 7% of the solar energy lost to heat and not making it into the battery. I thought it was worth pointing out, since the OP's quest began with trying to get more output from his PV.
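In concrete terms (a sketch using the 25A charge current quoted earlier in the thread; the exact charge voltage is approximate):

```python
# Same arithmetic for the OP's numbers (a sketch; the 25 A charge current is the
# figure quoted earlier in the thread, 14 V is the rough charge voltage).
drop_v    = 1.0    # measured controller-to-battery delta
charge_v  = 14.0   # nominal charge voltage (roughly)
current_a = 25

print(f"Fraction of charge voltage lost: {drop_v / charge_v * 100:.0f} %")  # ~7 %
print(f"Heat dissipated in the wiring:   {drop_v * current_a:.0f} W")       # 25 W at 25 A
```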
 
I still think if the voltage drop in the wiring system could be reduced he could lower his charge voltage setting and have the charge controller turn off like most systems do.

That is pretty much what everyone has been trying to explain to him this whole thread (with limited success).
 