diy solar

DIY low power, high capacity 12V battery without BMS

kalamala · New Member · Joined: May 31, 2023 · Messages: 6 · Location: kala
Don't shoot, I'm working with what I have and trying to approach this from a first-principles standpoint.

I'm making a 280Ah 12V battery from EVE cells for a camper"van". In my use case, the battery will see 20A load at MOST. It will run a car fridge that draws ~5A and occasionally charge an electric scooter at ~10A, plus a few USB outlets. I chose such big cells only to have enough juice, as I plan to be in the middle of nowhere for days.

I will also NOT charge it with solar panels.

So I'm thinking to balance-charge it with my ISDT Q8 Max (which I already have), powered by an old desktop PSU that puts out 40A @ 12V (which I also already have). Charging this way would top-balance every time I charge (or at least whenever I charge high enough; I couldn't find the Q8 specs for its balance voltage threshold).

For the low-voltage cutoff I'd use those LiPo beepers, set at 3.0 or 3.1 volts (depending on how far out of calibration the particular beeper is). I'll be testing that, but the idea is to get a warning at least 10Ah before the lowest cell goes empty. That would give me ~5h of fridge runtime in case I'm away (though being away from the battery that long will be extremely rare) and notify me otherwise.
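For the curious, that warning margin works out in a couple of lines. This is only a sketch: the 40% compressor duty cycle (hence the ~2A average draw) is an assumption for illustration, not a measured value.

```python
# Low-voltage-beeper warning margin, back-of-the-envelope.
# ASSUMPTIONS: ~10 Ah of usable capacity left once the alarm trips at
# 3.0-3.1 V/cell, and a fridge compressor cycling at roughly 40% duty.

RESERVE_AH = 10.0     # estimated capacity left below the alarm threshold
FRIDGE_PEAK_A = 5.0   # draw while the compressor runs
FRIDGE_DUTY = 0.4     # assumed duty cycle; measure your own fridge

avg_current_a = FRIDGE_PEAK_A * FRIDGE_DUTY    # ~2 A average draw
hours_of_warning = RESERVE_AH / avg_current_a  # ~5 h of runtime after the beep

print(f"average fridge draw: {avg_current_a:.1f} A")
print(f"fridge runtime after alarm: {hours_of_warning:.1f} h")
```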

I also have an old watt/ammeter that I'd use to monitor the Ah and Wh drawn from the battery after each full charge, to roughly track SOC (for example, if I know it's getting empty, I won't leave the fridge running unsupervised for hours).

That only leaves the temperature monitor/cutoff that BMSes usually have. I do have some cheap thermistors with a display lying around to stick between the cells and visually monitor temperature, but no real option to cut off power automatically. I'm wondering, though: charging at 0.1C (~30A, the maximum for the Q8 Max) and discharging at 0.05C at most, what benefit would the temperature sensor of a "real" BMS give? This battery will never see temperatures below 5-10 °C nor above 30-35 °C. If a thermal runaway happens (which doesn't really happen with LFP), how much more likely would it be while charging at 0.1C versus just sitting there (considering it's supposedly a sufficiently smart charger and all)? Or during a 0.05C-ish discharge?

All of this humble build would use just a few standard automotive fuses. No fancy inverters, no fancy BMSes with Bluetooth, no 4/0 gauge wires, nada. Just a box with a lid for terminal isolation, a few USB/XT60 outputs on top, and a handle to carry. Probably 10AWG wires for flexible bus"bars", and a short 10AWG-wired plug for charging.
 
So I'm thinking to balance charge it with my ISDT Q8 Max
I do this for several of my batteries because I charge from my always-solar-charged battery. Charging is pretty limited, so do the math on how long you expect it to take and add a bit. The finishing charge can be very, very slow, as it applies current to only a cell or two.

For monitoring discharge at the cell level I like the ISDT BattGo 8S, which has a configurable cell-level voltage alarm.

If you run without a BMS, YOU are the BMS. The BattGo alarm is mighty useful in helping you be a more reliable BMS.

Maybe your Q8 Max has a cell level alarm too?

EDIT - maybe the alarm function is no longer included in (any?) of the new ISDT units?
 
I can’t fathom running without a bms. It’s only a matter of time unless you manually balance regularly. What a hassle. Just be careful and keep an eye on it.
 
It’s only a matter of time unless you manually balance regularly.
Charging with the hobby chargers is a completely balanced charge using the balance wires.

I can’t fathom running without a bms.
Yes, this is what we preach. But for little utility batteries (22Ah (2x) and 4Ah (5x)) for power tools, lights and telescopes, where you are always present, I am willing to be the BMS and accept the consequences if I fail as a BMS. All my big batteries operate unattended and have BMSes (now).
 
Ah, hobby charger with balance leads. At least it won't overcook one.
 
As I mentioned, top balancing will happen every time I charge to a sufficient level. The Q8 Max balances every time you plug in the balance leads. Also, if the capacities and internal resistances more or less match in initial testing, there's no reason these cells will go out of balance after a few partial charges.

Anyway, what I'm thinking is that a BMS guards against three main things: over/undervoltage, heat/cold, and cell imbalance.

Balancing and overvoltage are managed by the charger (there's even an option to set a lower maximum voltage per cell, e.g. 3.55V). For undervoltage protection there would be this annoying beeper that starts screaming when any cell drops below the set threshold (it wouldn't cut power, but in my case discharge would usually be 0.02-0.05C, meaning there's plenty of time, even hours, from when the beeping starts until a cell goes critically low).

That leaves temperature, but then again, if there's a high-temperature issue (there will never be a low one) while slowly charging at 0.1C, or even more slowly discharging (with low-amp fuses for short protection), then the root cause must be something else, and I'm not sure a regular BMS would help either. Frankly, I'm not sure what could even happen, barring physical damage, that would heat up the cells in my use case.

I fully agree that in usual cases you have to have a BMS, and it's easier to just drill that into every newbie's head.

But what would cause a fire? Consider that overvoltage should be taken care of by the charger. Or do you mean that if the charger fails for some reason, the BMS is the second layer of protection?

Charging is pretty limited so do the math on how long you expect it to take and add a bit. The finishing charge can be very very slow as it applies current to only a cell or 2.
Yes, it's around 0.1C, so 10-12h to fully charge. I don't see the cells going badly out of balance in one cycle, assuming the capacities more or less match, which is the first thing I'll test when I get them. I won't be putting together cells that are too different.
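A quick sanity check on that estimate. The two-hour balance tail is a guess on my part; the capacity and charge current are the figures from this thread.

```python
# Rough charge-time estimate for the pack.
# ASSUMPTION: ~2 h extra for the CV/balance tail, where current tapers
# and the charger works on only a cell or two.

CAPACITY_AH = 280     # EVE cell rating
CHARGE_A = 30         # Q8 Max maximum, ~0.1C for this pack
BALANCE_TAIL_H = 2    # assumed, not from any datasheet

bulk_hours = CAPACITY_AH / CHARGE_A        # ~9.3 h of constant current
total_hours = bulk_hours + BALANCE_TAIL_H  # lands in the 10-12 h range

print(f"bulk: {bulk_hours:.1f} h, total: ~{total_hours:.0f} h")
```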

Maybe your Q8 Max has a cell level alarm too?
The Q8 Max is only for charging; the LiPo alarm is for discharge monitoring.
 
I would not use automotive fuses for a 280ah LifeP04.
 
Li lithium
Fe iron
P phosphorous
O oxygen (O4 is 4 of them)
PO4 phosphate

LiFePO4 = "lithium iron phosphate", a real chemical compound not a made up word with odd capitalization



In general, I have a hard time accepting the advice of folks who don't understand the terms/words they make up.
Thank you, I have been enlightened. I will watch my spell check more closely.
 
There is a good discussion here (there are many!). I don't understand everything (much?), but enough to know to use a Class T fuse with a LiFePO4 battery.

 
Soo.. regarding fuses. Reading through those threads, I see that 20kA and similar currents are calculated from cell internal resistance alone. Wires, busbars, connectors: everything is assumed to be a superconductor. Or a short is assumed right after the fuse, across the battery terminals, with a thick copper rod or something.

A quick search gives 1kA breaking capacity for usual automotive fuses. By Ohm's law, 1kA @ 12V corresponds to 0.012 Ohm, or 12 mOhm. Anything over that and the current is less than 1kA.

1m of 14AWG copper wire is supposed to have ~8 mOhm resistance; aluminium is ~13 mOhm. For 1m of 16AWG it's ~13 and ~20 mOhm respectively. An XT60 is supposed to add ~0.5 mOhm, plus ring terminals etc. I might even make the series connectors with 12 or 14AWG instead of 10, which adds another ~2 mOhm.

I know that regular batteries feeding 2kW+ inverters are wired accordingly and are capable of delivering thousands of amps. But in my case, barring a direct short across the terminals, even the wires and switches connecting the cells and the plugs on the box might have enough resistance to keep the fault current under 1kA, let alone a short at a load that's wired with 1m of 14AWG wire.

Yes, I'll be losing 2-3W at max load, but that's acceptable.
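The loop-resistance tally can be sketched in a few lines. The wire, connector, and cell figures below are the rough numbers quoted in this post plus a typical ~0.25 mOhm per EVE cell, not measurements of an actual build.

```python
# Back-of-the-envelope prospective fault current for a short at the load.
# ASSUMPTIONS: 1 m out + 1 m back of 14AWG copper, ~1 mOhm of connectors,
# and ~0.25 mOhm per cell for a 4S pack (typical EVE spec, not measured).

R_WIRE_PER_M = 0.008    # ohm per metre, 14AWG copper
LOOP_M = 2.0            # out-and-back run to the short
R_CONNECTORS = 0.001    # XT60 + ring terminals, assumed total
R_CELLS = 4 * 0.00025   # four cells in series

r_loop = R_WIRE_PER_M * LOOP_M + R_CONNECTORS + R_CELLS
i_fault = 12.8 / r_loop  # nominal 4S LFP pack voltage

print(f"loop resistance: {r_loop * 1000:.1f} mOhm")
print(f"prospective short-circuit current: {i_fault:.0f} A")
```

With these numbers the fault current comes out around 700 A, under the ~1 kA breaking capacity of a blade fuse, which is the whole argument here; a short directly across the battery terminals bypasses all of this resistance and is a different story.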

Anyway, I'll likely get a 6000A@32V breaking-capacity ANL fuse regardless, but I'm tempted to actually test an automotive fuse with some 14AWG wires. Or if someone knows a YouTube link to a similar test..
 
Don't shoot, I'm working with what I have and trying to approach this from task first principles standpoint.

are you this guy who was being talked about on this post: https://diysolarforum.com/threads/lfp-cells-gone-bad.62744/
 
are you the guy who didn't read my post?
All 5 of your posts are in this thread, nothing I see to connect this user to that thread.

Please run a BMS.

If you are going to use fuses with a lower breaking rating, be advised those fuses may not protect the wiring and, more importantly, the connections at the cells. You raise a good point about series resistances, but that means you'll more than max out your wiring and cell connections in that case. I'd trust the 6kA AIC fuse, but not so much a little blade/ATC fuse.
 
All 5 of your posts are in this thread, nothing I see to connect this user to that thread.
Sorry, I was snickering and meant asking if he read my first post, in this thread.

Anyway, yes, I know that fuses are meant to protect the wiring and basically everything that gets toasty under high amps, breaking the circuit before the toasty-getting parts get too hot and start damaging themselves (and their surroundings).

Back to Ohm's law: a 1m run of 14AWG aluminium wire (out and back, so ~26 mOhm) will drop ~10V @ 400A. I'm not even sure it could pass 1000A, maybe for a microsecond? 10V × 400A = 4kW, but fast-acting automotive fuses should break within tens of milliseconds once the current exceeds the rating by hundreds of percent (a 40A fuse will blow pretty much instantly at 400A). So how hot do the wires get during those tens of milliseconds at 4kW? Say 50ms at 4kW: that's 200 joules of energy, also known as about 56mWh. You know what you can do with 200J? Heat 50ml of water by 1 degree Celsius, for example. Even 0.5s @ 4kW (obviously the wires won't dissipate 4kW for that long, but let's be pessimistic) would release a manageable amount of heat, especially if the wires are the 200-degree silicone-insulated kind. Still, Blue Sea quotes less than 0.1s trip time at 350% of rated amperage.
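The napkin math above, spelled out. The 50 ms clearing time is the pessimistic end of published blade-fuse time-current curves, not a measurement of any particular fuse.

```python
# Energy dumped into the wire run while the fuse clears.
# ASSUMPTIONS: ~10 V dropped across the run at 400 A (so ~4 kW in the
# wire), and a fast blade fuse clearing in ~50 ms at 10x its rating.

P_FAULT_W = 4000    # 10 V * 400 A dissipated in the wiring
T_CLEAR_S = 0.05    # assumed clearing time

energy_j = P_FAULT_W * T_CLEAR_S          # joules into the copper
energy_mwh = energy_j / 3.6               # 1 Wh = 3600 J
water_dt_c = energy_j / (50 * 4.186)      # deg C rise of 50 g of water

print(f"{energy_j:.0f} J, about {energy_mwh:.1f} mWh")
print(f"enough to warm 50 ml of water by ~{water_dt_c:.2f} deg C")
```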

Now this is all napkin math, but if I haven't made any serious order-of-magnitude mistake, and if those online calculators are in the right ballpark, my intuition was correct: in my use case, with my battery build, automotive fuses shouldn't arc, explode or catch fire, nor should the wires melt and burn.

I'll still get that ANL main fuse, but for the fridge output I'm not worried about using a blade fuse (a brand-name one, not that cheap no-name shit).
 