Hi all,
I recently switched from a 12v PWM (Curtech CT15A) controller to MPPT (Victron SmartSolar 75/15) thinking it would be superior technology and slightly more efficient. However, my setup is such that there's about 6 or 7 m of (I think) 6 B&S cable between the reg and the battery (not ideal, I know), fed from a 180w panel permanently mounted on the vehicle. What I've noticed is that the PWM seemed to be able to sense battery voltage better over the long cable run than the MPPT. I was hoping someone knowledgeable could confirm or explain what I'm seeing. The voltage readout on the LCD panel on the PWM reg pretty much exactly matched the voltage I was seeing at the battery - let's say for argument's sake this was 14.4v. But the actual voltage being put out by the reg was often a lot higher - close to 15v a lot of the time. So it was as if the PWM reg could read the actual battery voltage and compensate for voltage drop across the cable, and it seemed like the bulk/absorption/float functionality worked properly because of this.
The Victron MPPT I installed recently has a Bluetooth app I can read voltage and current with, and it doesn't seem to deal with voltage drop at all. For example, I set the absorption voltage at 14.5 and float at 13.5. The Victron app shows the reg is putting out 13.5 volts and whatever amps, but the volts at the battery are only 13.1. I thought about dialling up the absorption and float values by 0.5 volts, but the voltage drop varies with current, so that simply won't work. When I cover the panel with a towel and the amps drop to zero, the battery voltage and app voltage pretty much match, which confirms it's voltage drop across the cable as current increases, I think? So when the system goes into float and the current tapers off to zero as the battery fills, that extra 0.5 volts I dial in will actually be put into the battery and it will 'float' at 14v.
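For anyone wanting to sanity-check the numbers, the drop is just Ohm's law (V = I × R), and it scales with current, which is exactly why a fixed +0.5v offset can't work. This is only a rough sketch with assumed figures - 6 B&S taken as roughly 13.5 mm² and the run guessed at 7 m each way:

```python
# Back-of-envelope cable voltage drop, V = I * R.
# All figures below are assumptions, not measurements from my setup.
RESISTIVITY_CU = 0.0172   # ohm * mm^2 / m, copper at ~20 C
AREA_MM2 = 13.5           # assumed cross-section of 6 B&S cable
ONE_WAY_M = 7.0           # assumed controller-to-battery distance
LOOP_M = 2 * ONE_WAY_M    # current flows out and back, so double the length

r_cable = RESISTIVITY_CU * LOOP_M / AREA_MM2  # total loop resistance, ohms

# The drop scales linearly with current - a fixed voltage offset
# only compensates correctly at one particular current.
for amps in (0.0, 5.0, 10.0):
    print(f"{amps:4.1f} A -> drop of about {amps * r_cable:.2f} V")
```

Interestingly, on those assumed numbers the cable alone gives less drop than the ~0.4v I'm seeing, so corroded lugs, fuse holders or connector resistance along the run may be adding to it too.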
I'm thinking of just putting the PWM back in, but I was hoping someone could explain what's going on here first. I was wondering if maybe the PWM reg reads voltage in between 'pulses', and because there's no current flowing between pulses it gets an accurate reading? I dunno - that sort of thing is beyond my technical understanding.
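To make the "measure between pulses" idea concrete: with no current flowing there's no IR drop, so the voltage at the controller's terminals equals the battery voltage. A toy sketch (all numbers made up for illustration - I don't know whether the CT15A actually samples this way):

```python
# Toy model: the controller can only measure voltage at its own terminals.
# When the PWM switch is off, current is zero, the cable drops nothing,
# and the terminal voltage equals the true battery voltage.
R_CABLE = 0.04   # assumed total loop resistance of the cable run, ohms
V_BATT = 13.1    # actual battery voltage for this example

def terminal_voltage(charge_current):
    """Voltage the controller sees at its terminals at a given current."""
    return V_BATT + charge_current * R_CABLE

v_during_pulse = terminal_voltage(10.0)   # switch on, 10 A flowing: reads high
v_between_pulses = terminal_voltage(0.0)  # switch off: reads true battery voltage

print(f"during pulse:   {v_during_pulse:.2f} V")
print(f"between pulses: {v_between_pulses:.2f} V")
```

An MPPT controller, by contrast, pushes current continuously rather than in on/off pulses, so it never gets a zero-current moment to sneak an accurate reading - which would fit what I'm observing. I gather that's why Victron sell a separate Bluetooth battery-voltage sensor, though I'd welcome correction from someone who knows.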