Acceptable voltage drop from CC to battery?

Texas-Mark

Solar Addict
Joined: Aug 4, 2021 · Messages: 1,284
I have a Renogy Rover 40 CC with 8 100W panels (4S2P) on my cargo trailer conversion. The panels are flat mounted, so obviously output is less than 800W most of the time. From the Rover, I have 4 feet of #6 cable. Note: The Rover manual states a max size of #8 for the connectors, but #8 is only rated for 40A and #6 fits in there OK. From there, I am connected to a small bus bar. From the bus bar to the batteries (3 100Ah LiFePO4 batteries in parallel), I have about 6 feet of 2/0 cable. Both the battery cable and the CC cable are on the same lug, so no drop across the bus bar. Note: I need to stay at 12V for a few reasons that are not important to get into.

So, I am typically between 35A and 40A around noon. During this time I will see a voltage drop of around 0.4V between the Rover output (what it is reporting) and the battery BMS sum voltage. Note: I have swapped out all cables just to be sure I did not have a bad crimp, and before anyone asks, all connections are clean and tight. The 2/0 cable should have practically no drop over that short run. The #6 is rated for 55A, so it should have little drop, and I am not sure if I can go any larger. The BMS cables might have a bit of a drop too. Nothing gets warm to the touch.
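
For reference, here is a rough back-of-the-envelope of what the cable alone should drop at 40A (a Python sketch; the resistances per 1000 ft are standard copper chart values, and I am assuming the lengths above are one-way, so the round trip is doubled):

```python
# Rough estimate of the resistive drop from the two cable runs alone.
# Assumes standard copper resistance (ohms per 1000 ft) and that the
# quoted lengths are one-way, so round-trip length is doubled.
OHMS_PER_1000FT = {"6 AWG": 0.3951, "2/0 AWG": 0.0779}

def run_drop(awg: str, one_way_ft: float, amps: float) -> float:
    """IR drop across the round trip of one cable run."""
    resistance = OHMS_PER_1000FT[awg] / 1000 * (2 * one_way_ft)
    return resistance * amps

amps = 40
d_cc = run_drop("6 AWG", 4, amps)      # Rover to bus bar
d_batt = run_drop("2/0 AWG", 6, amps)  # bus bar to batteries
print(f"#6 run:      {d_cc:.3f} V")    # ~0.13 V
print(f"2/0 run:     {d_batt:.3f} V")  # ~0.04 V
print(f"cable total: {d_cc + d_batt:.3f} V")  # ~0.16 V; the rest would be connections/BMS
```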

In any case, the issue is that I have to adjust the Rover settings up a bit in order to factor in this drop; otherwise it will go into float mode too soon. However, when there is little current flow, there is less voltage drop and the CC and BMS report about the same voltage, so I have to be careful not to go too high. No Rover settings are above 14.4V. I have found some settings that are acceptable, but I was curious as to what other people see for voltage drops between the CC output and the BMS cell sum. Of course, this question would not apply to a controller that has separate sense lines (which IMO they all "should" have).

On another side note, I read in several threads here that the Rover reads about 0.1V low, but at low current flow, mine reads the same as the BMS. It's just at max current that the difference is higher (thus the Rover is reporting a higher voltage than what is actually at the cells).
 

Code requirement is typically 3%. Less is better. 0.4V is likely < 3% in your situation (0.4V out of ~14.4V while charging is about 2.8%).

I would not trust a voltage drop measurement to two separate Chinese voltmeters. I would use the same voltmeter on the MPPT terminals and the battery terminals, and THAT's my voltage drop. I would also check across each component in the string for issues. If you have a switch and a fuse on the + wire, check voltage across the switch, across the fuse, end-to-end on the (-), as well as across the BMS. All of that should add up to the total measured between the MPPT and battery terminals. Any connection or component that gets warm is a likely source of voltage drop.
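
Something like this is what I mean by correlating the readings; the individual numbers below are made up, but the per-component drops should add up to the end-to-end measurement:

```python
# Hypothetical per-component readings (volts), all taken with the same meter
# at the same charge current. They should sum to the end-to-end drop measured
# from the MPPT output terminals to the battery terminals.
component_drops = {
    "breaker/switch": 0.05,   # made-up values for illustration
    "fuse + holder":  0.03,
    "(+) cable run":  0.10,
    "(-) cable run":  0.08,
    "BMS path":       0.12,
}
end_to_end = 0.40  # measured MPPT terminal to battery terminal

unaccounted = end_to_end - sum(component_drops.values())
print(f"sum of components: {sum(component_drops.values()):.2f} V")
print(f"unaccounted for:   {unaccounted:.2f} V")  # a big gap points to a missed connection
```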

Given the charge nature of LFP, if it's at 14.4V, and the battery is at 14.0V, that battery should be nearly fully charged, and I wouldn't worry about dinking with the settings. Can you simply adjust the boost duration to an additional 30 minutes?
 
I would not trust a voltage drop measurement to two separate Chinese voltmeters.

The problem is that it really does not matter if they are accurate or not; I have to adjust something to compensate for the difference. I trust the BMS readings more, because all of them read the same sum voltages.

Can you simply adjust the boost duration to an additional 30 minutes?

Yes, that is one of the changes I made.

Just wish they had sense leads, and this would not be an issue at all.
 
The problem is that it really does not matter if they are accurate or not; I have to adjust something to compensate for the difference. I trust the BMS readings more, because all of them read the same sum voltages.

If you're going to establish a voltage drop, you need correlated readings. Using a common meter accomplishes that.

Yes, that is one of the changes I made.

That should be more than sufficient to attain a full or near-full charge. Cells at 3.5V are pretty much charged and typically require a very short absorption period. Furthermore, as the current decreases while the MPPT holds at 14.4V, the voltage drop should decrease as well. Once you see 0.05C (15A) @ 3.55V/cell, the battery is poked full.
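
If you wanted to script that cutoff, a minimal sketch (assuming your 3 x 100Ah = 300Ah bank and a 4-cell pack) would be:

```python
# End-of-charge check for an LFP bank: consider it full once the tail
# current has fallen to ~0.05C while the cells are held near 3.55 V.
# The 300 Ah figure assumes the 3 x 100 Ah bank described above.
CAPACITY_AH = 300
TAIL_C_RATE = 0.05      # 0.05C ~= 15 A for this bank
CELL_FULL_V = 3.55

def is_fully_charged(charge_amps: float, cell_volts: list[float]) -> bool:
    tail_amps = CAPACITY_AH * TAIL_C_RATE
    return charge_amps <= tail_amps and min(cell_volts) >= CELL_FULL_V

print(is_fully_charged(14.0, [3.56, 3.55, 3.57, 3.55]))  # True - tail current reached
print(is_fully_charged(35.0, [3.45, 3.46, 3.44, 3.45]))  # False - still in bulk
```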

Just wish they had sense leads, and this would not be an issue at all.

Agreed. Unfortunately, that's a premium feature even with higher end stuff like my Victron. My integrated system feeds shunt measured OCV, measured amps and temperature data to all components. My wallet is still aching. :)
 
If you're going to establish a voltage drop, you need correlated readings. Using a common meter accomplishes that.

What I mean is that the CC and the BMS are each going to run their programmed routine based on what they think the voltage is, regardless of accuracy.

My main concern was the voltage drop (real or not) at max current vs. low current. Since the reported drop varies with the current, I had to find a happy middle ground. 0.4V may not mean much relative to the top-end voltage, but it can when the battery is close to it. Hence the controller started float too soon while the batteries were still taking full charge, because the CC was seeing a higher voltage than was actually at the cells.
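
Just to illustrate the compromise (numbers are only an example; I'm assuming the drop scales roughly with current, i.e. a fixed effective resistance of about 0.4V / 40A = 10 milliohms):

```python
# Why one fixed setpoint can't be right at every current: if the drop is
# roughly I*R, the setting needed to hold a given voltage at the cells
# changes with charge current. The target voltage here is illustrative only.
TARGET_CELL_SIDE_V = 14.2          # example desired voltage at the battery
R_EFFECTIVE = 0.4 / 40             # ~10 milliohms from the observed 0.4 V at 40 A

for amps in (5, 20, 40):
    setpoint = TARGET_CELL_SIDE_V + amps * R_EFFECTIVE
    print(f"{amps:>2} A charge -> controller setpoint {setpoint:.2f} V")
# 5 A -> 14.25 V, 20 A -> 14.40 V, 40 A -> 14.60 V: a single setting has to
# split the difference (or you need remote sense leads).
```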
 