diy solar

What is the best voltage to charge your cells to? 3.65, 3.55 or 3.45?

  • Thread starter Deleted member 9967
  • Start date
The charging voltage should be what the manufacturer recommends for a given application. In photovoltaic systems it's undesirable to turn off the charging source, because loads would then be prevented from using the PV and would draw from the battery until a subsequent charge cycle is initiated. To prevent these micro-cycles, a float stage is introduced to maintain a lower-than-maximum SOC, but one high enough to ensure around 90% of capacity is available when the array can no longer provide energy to support the loads.
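That bulk/absorb/float staging can be sketched as a tiny state machine. This is only an illustration with made-up per-cell setpoints and a hypothetical tail-current threshold, not any manufacturer's values:

```python
# Illustrative bulk -> absorb -> float staging; setpoints are hypothetical.
ABSORB_V = 3.55   # per-cell constant-voltage hold target
FLOAT_V = 3.40    # lower float target so loads run from PV, not the battery
TAIL_A = 2.0      # absorb ends when charge current tapers below this

def next_setpoint(cell_v, charge_a, stage):
    """Advance a minimal charge-stage state machine; returns (stage, target_v)."""
    if stage == "bulk" and cell_v >= ABSORB_V:
        stage = "absorb"          # reached the hold voltage: constant-voltage phase
    elif stage == "absorb" and charge_a <= TAIL_A:
        stage = "float"           # current tapered: drop to float and stay there
    return stage, FLOAT_V if stage == "float" else ABSORB_V
```

A real charge controller also handles re-bulk thresholds, temperature compensation, and absorb timeouts; this only shows why the float stage keeps the PV carrying the loads instead of the battery.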

This is a discussion I had with a well-known EV guru, who claimed it is impossible to use LiFePO4 in solar applications because the charging source needs to be terminated once an individual cell reaches 3.650V. Obviously he was only comfortable within his own knowledge base and didn't know the intricacies of charge controllers, but he was otherwise extremely knowledgeable. In EVs, bottom balancing is typically performed to align the cells at the lower end, perhaps to ensure maximum range, but in my opinion it wouldn't make any difference, top or bottom, as ultimately the capacity of the battery as a whole is only as great as its weakest cell.

Using between 20% and 90% SOC is probably a realistic long-term strategy. This is something Battle Born does with their batteries: although they advertise that 100% of capacity is available, they back that up by using a larger capacity than advertised to ensure customer satisfaction. A very smart strategy, and they are very forthcoming with information, a welcome change from the average vendor.
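As a rough sanity check on what a 20-90% window costs you, here is the arithmetic for a hypothetical 280Ah, 25.6V nominal pack (the numbers are illustrative, not Battle Born's):

```python
# Usable energy in a 20%-90% SOC window for a hypothetical 280 Ah / 25.6 V pack.
capacity_ah = 280
nominal_v = 25.6
usable_fraction = 0.90 - 0.20              # 70% of nameplate
usable_wh = capacity_ah * nominal_v * usable_fraction
print(round(usable_wh))                    # ~5018 Wh of the ~7168 Wh nameplate
```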

Included below are the charging recommendations from the manufacturer's manual I received with the cells I purchased; note the references to photovoltaic use and float.

View attachment 61655
Apparently the CALB cells are more tolerant of voltage variations than the CATL cells. Or less tolerant. I actually can't tell which, since CATL specifies a voltage cut-off and then two separate over-voltage protection limits. Confusing.

302Ah CATL cell info below. I have the 271Ah cells. I'm *assuming* the values are actually the same, as the 302Ah appears to be a physically larger version of the 271Ah, but TBH that's just one of probably many misguided assumptions, as I can't find a similar doc produced by CATL for any other CATL cells.
(Images: CATL 302Ah cell specification sheets)
 
It's not only about charge-to voltage. It also depends on the bulk charge current rate. At a low 0.05C bulk current there is little overpotential terminal voltage rise, around 20-40 mV. At a 0.05C charge rate, a cell will be fully charged when it hits 3.48v. However, this creates a side issue: if the BMS doesn't start its balancing dump until 3.40v, you need to go to 3.50 or 3.55v to get some balancing time.

At a ten-times-greater 0.50C bulk charge rate there is about 0.22 to 0.28v of overpotential terminal voltage rise, so when a cell hits 3.65v it is still not fully charged. Capped at a 3.65v terminal voltage to avoid electrolyte damage, the charge current will gradually drop off, which also reduces the overpotential voltage.

Overpotential versus cell current varies a bit from like cell to like cell, and with state of charge and cell temperature. Overpotential gets greater the cooler the cell is, rising faster below +10 degs C. Overpotential voltage also increases as a cell approaches full charge, because available ions become scarce to support the bulk charging current.

Overpotential also has a time-delayed decay to equilibrium when the cell current level is changed. Wait a couple of minutes for it to reach equilibrium and settle to a constant level.
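The figures above are consistent with a simple lumped model: terminal voltage is roughly resting voltage plus current times an effective resistance. A sketch for an assumed 280Ah cell, where the resistance is back-calculated from the ~0.25v rise at 0.5C quoted above rather than measured:

```python
# Lumped overpotential model: terminal_v ~= resting_v + I * R_eff.
# R_eff is back-calculated from the ~0.25 V rise at 0.5C quoted above.
capacity_ah = 280                          # hypothetical cell size
r_eff_ohm = 0.25 / (0.5 * capacity_ah)    # ~1.8 mOhm lumped resistance

def terminal_v(resting_v, c_rate):
    current_a = c_rate * capacity_ah
    return resting_v + current_a * r_eff_ohm

# At 0.05C the rise is only ~25 mV, so 3.48 V at the terminals means ~full:
print(round(terminal_v(3.45, 0.05), 3))   # 3.475
# At 0.5C the same cell reads ~0.25 V high, so 3.65 V is nowhere near full:
print(round(terminal_v(3.40, 0.5), 2))    # 3.65
```

A real cell's overpotential is not a fixed resistance (it grows near full charge and in the cold, as noted above), but the lumped version is enough to see why terminal voltage alone overstates SOC at high charge rates.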
 
I'm not a battery engineer, but if 3.45V is 99% SOC (according to the LiFePO4 SOC charts floating around on this forum) then I would think it would in fact harm the battery. At least to your point below, I'm convinced of this until a battery manufacturer or someone who is a chemical engineer can confirm that it's true. I dunno, I'm just a hobbyist.
We've spent too much time on this already, but...

The charge curve is different than the discharge curve. Andy from the Off Grid Garage explains this pretty well in the video posted earlier in this thread, but I don't think you've watched it. Sounds like you are busy, so that's OK.

If you go back and look at that video starting at 18:20 or so, you'll see that using voltage by itself is not a very good indication of state of charge (SoC). At exactly the same voltage you could have three different SoCs depending on whether the cell is charging, carrying no current, or discharging. And the voltage will be different during a small discharge than during a large discharge.
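The same point can be shown with a toy lumped-resistance model (the resting voltage and resistance are made up for illustration): one resting voltage yields three different terminal readings depending on current direction.

```python
# One SOC, three terminal readings: V_terminal ~= V_rest + I * R_eff.
# Positive current = charging; values are illustrative only.
def reading(resting_v, amps, r_eff_ohm=0.002):
    return round(resting_v + amps * r_eff_ohm, 2)

for label, amps in [("charging 50 A", 50), ("resting", 0), ("discharging 50 A", -50)]:
    print(label, reading(3.30, amps))
# charging 50 A 3.4
# resting 3.3
# discharging 50 A 3.2
```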
 
It seems like the common conclusion of both videos is that there really is nothing to be gained by charging to 3.65V/cell, and because cells are not going to be perfectly balanced at that point in the curve, trying to charge to 3.65V/cell will, more often than not, result in HVD by the BMS. Since both videos showed that charging to 3.5V per cell gets you to the same capacity anyway, I'm not sure why anyone would try to charge much higher. 3.4V seems to work just as well, although it takes a bit longer.
Yes, this is one of my primary concerns: to charge quickly and fully while avoiding the HVD. 3.65 would virtually guarantee HVD. Even at 3.50-3.55 the balance would have to be excellent to avoid the disconnect. I assume I will stay closer to 3.4 until I can determine that my unmatched cells will stay balanced.
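The HVD risk is just arithmetic on cell spread. A quick sketch with a hypothetical 3.65V per-cell cutoff and the worst cell sitting some millivolts above the pack average:

```python
HVD_V = 3.65   # hypothetical per-cell BMS disconnect threshold

def trips_hvd(avg_target_v, spread_v):
    """True if the highest cell (average + spread) exceeds the cutoff."""
    return avg_target_v + spread_v > HVD_V

print(trips_hvd(3.65, 0.05))   # True: any imbalance at a 3.65 average trips the BMS
print(trips_hvd(3.50, 0.05))   # False: 100 mV of headroom remains
```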
 
This is a really interesting test as shown. I just want to give a heads-up about a couple of things I noticed, specifically the long time for the 3.45V test.
This is something I found out the hard way myself; I got much better results once I used thicker (lower-gauge), shorter wires.
3V45curve.png

This most likely points to wires that were too thin and/or a shunt resistor with too high a value. The trick is to use the thickest, shortest wire you can find with this type of power supply (source regulation), or to use a power supply with destination sense, like in the Off-Grid Garage video, where the regulation sense point is at the battery cell post.
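To put rough numbers on the wire drop (copper resistance per meter is from standard AWG tables; the current and lead length are assumptions): with source-side regulation the supply holds its setpoint at its own terminals, so the cell only sees the setpoint minus the round-trip drop across the leads.

```python
# Approximate copper resistance per meter, from standard AWG tables.
OHM_PER_M = {"10awg": 0.00328, "4awg": 0.00083}

def cell_voltage(set_v, amps, gauge, lead_len_m):
    round_trip_ohm = 2 * lead_len_m * OHM_PER_M[gauge]   # out and back
    return set_v - amps * round_trip_ohm

# 20 A through 0.5 m leads: the thinner wire eats most of the charging headroom.
print(round(cell_voltage(3.45, 20, "10awg", 0.5), 3))   # 3.384
print(round(cell_voltage(3.45, 20, "4awg", 0.5), 3))    # 3.433
```

With the thin leads the cell sits 66 mV below the supply setpoint at full current, which is why a source-regulated test at 3.45V tapers so early and takes so long.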

If you repeat this test with better wiring you will find that the time drops; I'm not sure by how much. I managed to charge 100Ah cells to 90%+ before the current started to drop at 3.45V with short, low-gauge wires. I know it is not the same (a 4S battery), but I only obtained this result after switching to 4-gauge wire.
13V8_4sCharge.png
(I know my cell 3 is B grade lower capacity....)

It may also allow you to run the test with one power supply.
 

This is one of the primary reasons for using a BMS that has control of the charging/discharging components of the system.

As the first cell reaches 3.45V, the charge current is reduced to allow effective cell balancing.
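That charge-current reduction can be sketched as a simple rule the BMS applies when it controls the charge source (the threshold and currents here are illustrative, not from any particular BMS):

```python
BALANCE_START_V = 3.45    # first cell reaching this triggers the taper
BALANCE_CURRENT_A = 5.0   # hypothetical reduced current for balancing

def requested_current(cell_voltages, bulk_current_a):
    """Charge current the BMS requests from a controllable source."""
    if max(cell_voltages) >= BALANCE_START_V:
        return BALANCE_CURRENT_A   # slow the charge so the balancer can catch up
    return bulk_current_a

print(requested_current([3.40, 3.41, 3.39, 3.46], 40.0))  # 5.0
print(requested_current([3.30, 3.31, 3.29, 3.32], 40.0))  # 40.0
```

The point is that passive balancers move well under an amp, so without slowing the charger the high cell keeps running away faster than the balancer can bleed it.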
 