DC/DC Renogy Trailer Install

Also OP explicitly stated he only wants to cover idle demand while towing, not necessarily charge batteries.

And your statement that 5% loss on 24v is "half the wattage loss" of 12v bothered me. It's not. 5% is 5%.

Let's assume 120w is available for easy math:

If you have 24v at 5 amps, that's 120w. Losing 5% of 24v leaves 22.8 volts. 22.8v at 5a is 114w.

12v @ 10a is the same 120w. Losing 5% of 12v leaves 11.4 volts. 11.4v @ 10a is 114w.

So as you can see, 5% is 5%.

Though that's not at all how it works when a dc to dc charger is installed. What really happens is the current *increases* in that wire to still provide 120w of power to the device, and thus becomes 5.26a @ 22.8v and 10.52a @ 11.4v. This occurs because the charger is forcing the system to supply the 120 watts.

Or in the case of charging a battery @ 10a it's more like 136 watts, assuming 13.6 charge voltage.
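The difference between a fixed load and a DC-DC charger is easy to make concrete with a quick script. These are the same figures as the example above (5% drop, 120w load, 13.6v charge voltage); the script itself is just an illustration:

```python
# Figures from the example above: a 120 W load behind a wire with 5%
# voltage drop, fed from either a 24 V or a 12 V source.

def sagged_input(source_v, drop_pct):
    """Voltage left at the charger's input after the wire drop."""
    return source_v * (1 - drop_pct / 100)

for source_v, fixed_load_a in ((24, 5), (12, 10)):
    v_in = sagged_input(source_v, drop_pct=5)
    # A dumb load keeps drawing the same current, so it just gets less power:
    fixed_w = v_in * fixed_load_a
    # A DC-DC charger instead raises its input current to keep 120 W flowing:
    boosted_a = 120 / v_in
    print(f"{source_v} V source: {v_in:.1f} V at input, "
          f"fixed load gets {fixed_w:.0f} W, charger draws {boosted_a:.2f} A")

# Charging a battery at 10 A and 13.6 V needs 136 W on the output side,
# before the charger's own conversion losses:
print(f"battery charge power: {13.6 * 10:.0f} W")
```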

And this doesn't even begin to touch the issues present with modern "smart" alternators, which totally eliminate any possibility of charging through the trailer connector when they "intelligently" drop system voltage to save fuel.
 
Short_Shot,
Exactly. And folks, please remember I am installing the DC/DC converter to protect my alternator and wiring more than anything. And yes, the amperage did change. I started with a purchased 40 amp DC/DC and realized that, at least on the trailer, there was only 10 gauge wiring feeding the power from the tow vehicle to the battery. Clearly NOT capable of 40 amps. I then ordered a new 20 amp DC/DC converter, returned the 40 amp one, and will further take advantage of the switch (got to double check this) to limit the current to 10 amps DC. Now I am well inside my alternator's amperage, and am limiting the current to protect the wiring. It is of some concern that the 10 amps to the battery will require more than that into the DC/DC converter. But I need some protection between the tow vehicle alternator and the lithiums on the trailer.

There is currently no solar on the trailer. That is NEXT years project.

We have a saying in my business. .. A perfect airplane never flies.

My biggest issue on the testing side is finding a controlled load without spending a lot of money.

Thanks
Andy
 
Lmao since when is a dark rainy day 80% yield on solar?

And suddenly it's become clear you don't understand the issue at hand.

Reducing voltage does not "decrease watts" if a device such as a dc to dc charger is capable of increasing voltage at its output. That's the whole point of a boost converter.

Maybe you should research a few things before commenting.
If you have 10 volts at input and output is 14v @ 10a then your input current will be 14a. Including the efficiency of the dc to dc charger will bring up the input current a bit more still.
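Sketched in a few lines (the 92% efficiency is an assumed figure for illustration; the rest are the numbers from this example):

```python
# Boost-converter input current: power in must cover power out plus losses.
v_in, v_out, i_out = 10.0, 14.0, 10.0    # figures from the example above
p_out = v_out * i_out                    # 140 W delivered on the output

i_in_ideal = p_out / v_in                # lossless: 14 A at the input
efficiency = 0.92                        # assumed converter efficiency
i_in_real = p_out / (v_in * efficiency)  # ~15.2 A once losses are included

print(f"ideal input current: {i_in_ideal:.1f} A")
print(f"with {efficiency:.0%} efficiency: {i_in_real:.1f} A")
```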

You have misconceptions about how DC to DC chargers work when input voltage is reduced due to resistance in the input circuit. I strongly suggest you read this article: https://www.redarc.com.au/images/uploads/files/news/installing_dc-dc_chargers_ind.pdf


This holds true because the alternator is capable of much higher current. You aren't pulling more from "thin air". A 100 amp @ 13.6v alternator for example has 1360 watts available to use. A 5% voltage drop brings this down to 1292 watts. You might recognize this as "significantly more than necessary for a 10 amp charger". Thus, the necessary ~14.x amps needed is available.
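As a sketch of that budget (the ~92% converter efficiency is an assumption; the alternator figures are from the example above):

```python
# Alternator power budget vs. what a small DC-DC charger actually needs.
alt_a, alt_v = 100, 13.6
available_w = alt_a * alt_v              # 1360 W at the alternator
after_drop_w = available_w * 0.95        # 1292 W left after a 5% voltage drop

# A 10 A charger putting out 13.6 V needs 136 W plus conversion losses,
# roughly 148 W at its input assuming ~92% efficiency -- a small slice
# of what's available.
charger_in_w = 13.6 * 10 / 0.92
print(f"{after_drop_w:.0f} W available, charger needs ~{charger_in_w:.0f} W")
```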

It doesn't quite work that way, and where the DC to DC is located will affect charging performance too. If the DC to DC is installed on the tow vehicle up front, it makes no difference how large the cable feeding it is when the output side is limited by undersized wire and the resulting voltage drop. The house battery will also not receive a full charge due to the voltage drop of the circuit feeding it. The other problem is that the DC to DC does not see the true voltage of the battery it is charging.

If the DC to DC is installed on the trailer and fed thru a small wire with a large voltage drop, then the available watts will be reduced and charging will be limited, however the house battery could essentially receive a full charge provided the charge time is extended greatly.

There is no free wattage; the rules of electricity still apply. Resistance will always limit current flow. I deal with this on a daily basis in my business.

Furthermore, an alternator connected straight to lithium without any means of regulation is a really bad idea.
I thought the same as you when I first joined here, however considering the resistance of the circuit thru a Bargman cable, it actually works. It has the same inherent problem of voltage drop not permitting full charge of the house battery but the DC to DC will fare no better if the circuit wiring has too much voltage drop.

The money spent on a DC to DC charger to charge just 10a would be better spent on heavier gauge wiring for the circuit. Or boosting voltage high enough to compensate for lost wattage to something like 36V and then converting it back down to the nominal 12V.
 

Not quite as easy as you claim: the voltage drop on a 24V system will be 1/2 of the 12V system. Ohm's law still applies, regardless of how you attempt to rewrite it. Take this calculator https://www.rapidtables.com/calc/electric/ohms-law-calculator.html and input 2 ohms of resistance for 12V and 24V.
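The halving is easy to see if you model the same wire carrying the same power at both voltages. The 0.1 ohm wire resistance here is an assumed round number for illustration, not a figure from the thread:

```python
# Same wire (assumed 0.1 ohm), same 120 W of power, two system voltages.
r_wire = 0.1
for v in (24, 12):
    i = 120 / v                  # 5 A at 24 V, 10 A at 12 V
    v_drop = i * r_wire          # absolute drop: 0.5 V vs 1.0 V
    pct = v_drop / v * 100       # smaller still as a fraction of 24 V
    heat = i**2 * r_wire         # I^2*R heating in the wire
    print(f"{v} V: {i:.0f} A, drop {v_drop:.1f} V ({pct:.2f}%), "
          f"{heat:.1f} W of wire heating")

# Doubling the voltage halves the current and the absolute voltage drop,
# and quarters both the percentage drop and the wire heating.
```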
 
I am familiar with current limiting effects of voltage drop when connecting to a lithium battery. This is the same reason why a lead acid charges poorly, or not at all, when doing the same.

However I'm still not convinced it will be as limited as you state regarding the behavior of the charger.

So on the admittedly likely chance I'm wrong, I decided to test it. However, I'm limited to a 5 amp load, so I'm using much smaller wire instead. This should be comparable enough to illustrate my point, or yours.

Your claim is that the voltage drop across the wire causes further voltage drop as the DC to DC charger increases its demand to supply the output. Presumably you believe this will somehow prevent the charger from supplying the desired current. In OP's case this is 10a, in my case 5a.

This claim makes sense as a sort of self-reinforcing feedback loop.

So I'll test that.


I have here approximately 25 feet of 16 awg tinned ofc wire, a 10 amp power supply, a small load tester, and it's doing exactly what I expected from it.

20210719_082846.jpg
20210719_084203.jpg


It's pulling a significant voltage drop from 11.96v at the source...

20210719_083515.jpg

....To 10.48 volts at the input of the DC - DC charger. That's 12.37% voltage drop.

20210719_084203.jpg

The load is 5a constant current at 13.6 volts, or approximately 67.7 watts. The DC to DC charger is delivering this as requested. There's a small amount of error from the cheap load tester plus a small voltage drop, but...

20210719_084650.jpg

The output of the power supply is ~6.95 amps (I've measured this display to be +/- .02a) as you can see above. Given the 10.48v input voltage this equals ~72.8 watts, which indicates the charger is losing roughly 5 watts. This looks fine to me.
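The measured numbers check out if you run them through (all figures here are the measurements above):

```python
# Measured values from the bench test above.
v_src, v_in = 11.96, 10.48        # source voltage and charger-input voltage
i_in = 6.95                       # measured supply current (+/- 0.02 A)

drop_pct = (v_src - v_in) / v_src * 100   # ~12.37% voltage drop in the wire
p_in = v_in * i_in                        # ~72.8 W into the charger
p_out = 13.6 * 5.0                        # 68 W delivered to the load
print(f"drop {drop_pct:.2f}%, in {p_in:.2f} W, out {p_out:.1f} W, "
      f"efficiency {p_out / p_in:.1%}")
```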

For laughs, I taped the tester's thermocouple to the wire. You can see it getting warm in the previous photo, but we both should have expected this: the wire is long and experiencing significant voltage drop. Edit: Wire is about 27c here.

This is exactly what I expected and described.
 
Not quite as easy as you claim: the voltage drop on a 24V system will be 1/2 of the 12V system. Ohm's law still applies, regardless of how you attempt to rewrite it. Take this calculator https://www.rapidtables.com/calc/electric/ohms-law-calculator.html and input 2 ohms of resistance for 12V and 24V.
Your mistake is forgetting that you're pulling the same wattage regardless of voltage.

If the wattage stays the same (to do the same work with the system) and the voltage doubles the current halves. Something something ohms law.

This doesn't even include losses converting up to 24v, albeit minimal as they are.

Edit: I see what you mean. Same wire, same power, less current, less heating, etc. You got me there.

Edit 2: My teacher would have laughed at me for making that mistake quite a few years ago....
 
Update. After 15+ minutes the wire temperature is stable at 43c. Edit: Not 43c, that's the tester temp. Wire temp is shown at the bottom of the display and is about 28c. My mistake. Long night.

While warm, not an issue.

Though I would be concerned about trying to run this setup on a 110F day in the desert, there's no way on earth a 10awg wire would experience anywhere near this amount of heat just by doubling the current.
 
Update 2:

Disabled the low voltage shutoff of the DC-DC charger (forgot that was there) and now the voltage drop is 18.3%, down to 9.77 volts.

The load has increased to 6.7 amps on the tester and 9.96 amps on my 10 amp power supply.

20210719_091407.jpg

I'll let you do the math but... it's still not only working but providing exactly what I'm requesting from it.
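Doing that math (every figure here is a measurement from the updated test above):

```python
# Update 2 measurements: low-voltage cutoff disabled, larger drop.
v_src, v_in, i_in = 11.96, 9.77, 9.96   # source, charger input, supply current
i_out, v_out = 6.7, 13.6                 # requested load on the tester

drop_pct = (v_src - v_in) / v_src * 100  # ~18.3% voltage drop
p_in = v_in * i_in                       # ~97.3 W drawn from the supply
p_out = v_out * i_out                    # ~91.1 W delivered to the load
print(f"drop {drop_pct:.1f}%, {p_in:.1f} W in -> {p_out:.1f} W out "
      f"({p_out / p_in:.1%} efficient)")
```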

I'm not sure why you're saying this doesn't do exactly what I've described (pulling more current on the input side to compensate), but here it is doing exactly that right in front of me.

If I had a better power supply and load tester I would experience the same basic situation at 10a, but I'd also switch to 10awg to emulate OP's scenario.

Wire is 50c now. EDIT: 29c. My mistake, reading the wrong temp on the display. Definitely approaching a bad situation, but in a real setup I would never let it hit 18% vdrop, let alone 10%.
 
Update the third:

Because I've got thermal so why not.

As you can see, the heating isn't as significant when the wire is run out straight instead of sitting in a pile of spaghetti heating itself up.

The pile is upward of 35c, while the wire off by itself with the sensor is reading 29c. For reference, expect something closer to the lower temp along the vehicle frame and the higher temp in the engine bay.

received_333933135111639.jpeg
 
It's not even complaining at me for it.

I could change it to battery charger mode but it would just switch to bulk @ 13.6 automatically anyways, so I have instead configured it to constant 13.6v output.

Which it is giving me without issue, despite the 9.7v input.
 

Put a battery on it instead of the tester. We are looking at charging a battery, not powering a 5 amp load. And set the DC to DC for 10a, you can use 10 ga wire too, 25 feet. Start at 50% of SOC and run until fully charged.
 
I'm going to run a 2 hour timed test at 6.7a.

When the wire gets hot it is bouncing off the 10 amp maximum output of the power supply. If it exceeds this by a small amount the power supply will shut down the output, the charger will drop to zero, and the tester will register <1v and stop the test. This records the time stopped and the total mah drawn before a significant fault (hot wire running away) occurred.

And with that started I'm going to get lunch.
 

If you put a battery on it then OP's charger will limit it to 10a. Same result. That's *literally* the entire purpose of a DC to DC charger in this application.

I don't have a power supply that can deliver the necessary current. In real life, with an alternator, delivering the extra current for a 10 amp charger output is trivial.
 
In fact this is so exactly the intended application that the manual explicitly mentions it.

Not only the voltage drop situation but the lithium battery overcurrent protection AND smart alternators (which I mentioned earlier).


If you wish to buy me a 30 amp bench power supply however I will happily accept delivery at your expense and change to 10awg and a battery.


The point here is you said this isn't possible because the voltage drop causes it to fail. I've intentionally engineered an extreme voltage drop to recreate exactly that condition.

Yet the charger still functions fine.
 

Use an alternator then. Let's make it a real life test.
 
I have no way to limit my charger to the OP's 10 amp limit, and my truck currently has 12 awg wire in it.

You can keep specifying new requests all you want. The point has been proven:

The DC to DC charger compensates for voltage drop by increasing current draw on the input while simultaneously increasing output voltage to a charging level AND serves explicitly to limit charging current to protect the alternator from a lithium battery.


That's literally direct from the manual.


Not sure how much more clear this can be for you.

A severe voltage drop, as I have demonstrated, will instead increase the current demand from the alternator; as long as the increase is still within the limits of said alternator, it will run fine.


If 6.7a still works on 16awg then 10a damn sure will work on 10awg. Period.

Ideal? Absolutely not.

However it will not only work, but will provide the 10 amps he wants.
 
Update yet again.

My bundled up worst case section of the wire is at 40c. Rated for 60c mind you, but obviously I would avoid this in real use.

The loose wire that's in open air is 30c, which doesn't even feel warm to the touch.
 
Folks
I have a trailer install. Thus no jumpers from an ignition switch. Just the 7 pin trailer connections.
I have a couple of installation questions about my Renogy 40amp DC DC charger with 2 200AH Lithium Batteries (ampere time)
Question 1: So I have seen where folks put in a switch that sees 12V and turn it to the on position to enable the DC/DC, and turn it off when they don't want to pull from the tow vehicle. As I always drive with my running lights on to energize my rear view camera, I was thinking of tapping that voltage. That gives me control from the tow vehicle. Any reason NOT to do that?
Question 2: I was way focused on my alternator capacity and how much current the batteries could draw / need. 40 amps seems fine. I have an F350 diesel with the tow package; I believe that is a 120 amp alternator. However, now that I look at the trailer wiring... 10 ga. Hmm. Thinking I should exchange for a 20 amp. Can I get away with the 40 amp? Or was that a mistake?
Question 3: I managed to isolate the individual hot wire from the trailer umbilical to the 12V system. But damned if I can find an isolated ground. Any reason both ground terminals can't simply go to the chassis ground? I have good continuity from the chassis point to the plug. Meter fluctuates between 0 and 0.1 ohms.

Any help is appreciated
Andy
I have this same setup in my 2021 Prime Time Avenger. Here are the answers to your questions.


Question 1: So I have seen where folks put in a switch that sees 12V and turn it to the on position to enable the DC/DC, and turn it off when they don't want to pull from the tow vehicle. As I always drive with my running lights on to energize my rear view camera, I was thinking of tapping that voltage. That gives me control from the tow vehicle. Any reason NOT to do that?

ANSWER: The easiest way that I have found to switch the DC to DC charger off is by tapping the hot wire into the running lights of the camper. This is what I have done. This way, when I turn my running lights on I also turn on the DC to DC charger. If I need to turn off my DC to DC charger while driving, I simply turn off my running lights. This is how I would recommend you make the connection. Hopefully Renogy will make this a voltage-sensing switch in the future.

______________________

Question 2: I was way focused on my alternator capacity and how much current the batteries could draw / need. 40 amps seems fine. I have an F350 diesel with the tow package; I believe that is a 120 amp alternator. However, now that I look at the trailer wiring... 10 ga. Hmm. Thinking I should exchange for a 20 amp. Can I get away with the 40 amp? Or was that a mistake?

ANSWER: The DC to DC charger works very similarly to a solar charge controller in that it will regulate the voltage coming from the alternator/starter battery to the house battery. I would recommend using 2 gauge wire from the starter battery. How I did it was to run a 2 gauge wire from the starter battery into a 60 amp breaker, then continue the 2 gauge wire back to the bumper and into an Anderson plug mounted to my bumper. I run both hot and ground the entire distance. I found UV-rated 25 ft copper stranded battery cables on Amazon to make the run; just cut off the alligator clips. Then I put a second Anderson plug on the camper to connect the suburban, and it runs straight to the DC to DC charger. Having the 60 amp breaker allows me to disconnect the DC to DC charger also if I don't wish to use it. (See pics attached)

______________________


Question 3: I managed to isolate the individual hot wire from the trailer umbilical to the 12V system. But damned if I can find an isolated ground. Any reason both ground terminals can't simply go to the chassis ground? I have good continuity from the chassis point to the plug. Meter fluctuates between 0 and 0.1 ohms.

ANSWER: You can ground to the chassis if you wish.
 

Attachments: Anderson Plug-DC to DC_02.JPG, DC to DC Charger.JPG, Anderson Plug-DC to DC_01.jpg
Last update on my testing until I decide to install this thing in my setup.

Test success: 2 hours at 6.7a output (current limited to 6.7a) with 10a input (current not limited, though my power supply shuts off any higher than that). The 25 feet of 16 awg (read: grossly undersized) wire reached a max of 41.1c bundled up in a pile, under an 18+ percent voltage drop from 11.96 to 9.8 vdc.

The test pulled 181wh, or 13.4ah @ charge voltage.
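Those totals are self-consistent (2 hours at 6.7a and a ~13.6v charge voltage, as above):

```python
# Sanity check of the 2-hour test totals.
hours, i_out, v_charge = 2.0, 6.7, 13.6
ah = i_out * hours            # 13.4 Ah, matching the tester's count
wh = ah * v_charge            # ~182 Wh vs the 181 Wh logged; the small
                              # gap is meter rounding / slightly lower
                              # actual output voltage
print(f"{ah:.1f} Ah, ~{wh:.0f} Wh")
```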

Nothing burned up and the wire outside of the bundle (as it would be installed) is still only 30c, which isn't even warm to the touch.

Edit: stupid typos.
 

Short_Shot
Cool! I walk away from this forum for one day and look! Experiments! Data!
So this summer: proceed as planned.
Next summer: upgrade truck and trailer wiring to handle 40 amps. Extra good for now, plan for the future.
Following summer: add some solar.
Just like my house, the next owner will have a cool beans trailer!
 