Texas-Mark
Solar Addict
- Joined
- Aug 4, 2021
- Messages
- 1,309
I have a Renogy Rover 40 CC with 8 100W panels (4S2P) on my cargo trailer conversion. The panels are flat mounted, so obviously output is less than 800W most of the time. From the Rover, I have 4 feet of #6 cable. Note: The Rover manual states a max size of #8 for the connectors, but #8 is only rated for 40A and #6 fits in there OK. From there, I connect to a small bus bar, and from the bus bar to the batteries (3 100Ah LiFePO4 batteries in parallel) I have about 6 feet of 2/0 cable. Both the battery cable and the CC cable are on the same lug, so there is no drop across the bus bar. Note: I need to stay at 12V for a few reasons that are not important to get into.
So, I am typically between 35A and 40A around noon. During this time I will see a voltage drop of around .4V between the Rover output (what it is reporting) and the battery BMS cell-sum voltage. Note: I have swapped out all cables just to be sure I did not have a bad crimp, and before anyone asks, all connections are clean and tight. The 2/0 cable should have practically no drop for the short run. The #6 is rated for 55A, so it should have little drop, and I am not sure if I can go any larger. The BMS cables might contribute a bit of drop too. Nothing gets warm to the touch.
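For a sanity check, the pure cable IR drop at 40A can be estimated from standard copper AWG resistance tables. This is just a sketch (the ~0.40 mΩ/ft and ~0.078 mΩ/ft figures are typical chassis-wiring values at room temperature, and the round-trip lengths assume equal positive and negative runs):

```python
# Rough estimate of cable-only IR drop for the wiring described above.
# Resistances in ohms per 1000 ft of copper at ~20 C (standard table values).
R_PER_1000FT = {"6": 0.3951, "2/0": 0.0779}

def drop_volts(awg: str, one_way_ft: float, amps: float) -> float:
    """V = I * R, using the round-trip length (2x the one-way run)."""
    return amps * (R_PER_1000FT[awg] / 1000.0) * (2 * one_way_ft)

amps = 40
d6 = drop_volts("6", 4, amps)      # Rover -> bus bar, 4 ft of #6
d20 = drop_volts("2/0", 6, amps)   # bus bar -> battery, 6 ft of 2/0
print(f"#6: {d6:.3f} V  2/0: {d20:.3f} V  total: {d6 + d20:.3f} V")
# -> roughly 0.13 V + 0.04 V = ~0.16 V of the observed ~0.4 V
```

If those numbers are in the right ballpark, the cables only account for about 0.16V, which suggests the remaining ~0.25V is in connections, lugs, and the BMS FET path rather than the wire itself.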
In any case, the issue is that I have to adjust the Rover settings up a bit in order to factor in this drop, otherwise it will go into float mode too soon. However, when there is little current flow, there is less voltage drop and the CC and BMS report about the same voltage. So I have to be careful not to go too high. No Rover settings are above 14.4V. I have found some settings that are acceptable, but I was curious as to what other people see for voltage drops between the CC output and BMS cell sum. Of course this question would not apply to a controller that has separate sense lines (which IMO they all "should" have).
On another side note, I have read several threads here saying the Rover reads about .1V low, but at low current flow mine reads the same as the BMS. It's just at max current that the difference is higher (thus the Rover is reporting a higher voltage than what is actually at the cells).