What is the point of charging LiFePO4 all the way to 14.4 V when we are at about 99.8% SOC at 13.88 V?

Do any of the "residential-grade" MPPT chargers have this? We're using a Victron SmartSolar MPPT with a BMV-712 whose temp sensor/voltage sample lead bolts to battery +. We networked them so that the temp and voltage from the BMV are shared with the MPPT. Is the MPPT using the battery sample or its internal voltage sensor as a reference?
 
When networked, the BMV voltage should be used as the reference.
 
Fortunately, by the time the battery is topping out and entering absorption, the charge current, and therefore the wire/connector voltage losses, should be small. In our case, I compute the voltage differential to be 0.0078 V.
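
If you want to sanity-check your own numbers, here is a minimal sketch of that calculation; the wire gauge, loop length, contact resistance, and current below are illustrative placeholders, not anyone's actual system:

```python
# Estimate the charger-to-battery voltage drop from loop resistance.
# All values are illustrative placeholders -- substitute your own.

WIRE_OHMS_PER_M = 0.0008   # roughly 4 AWG copper (~0.8 mOhm per meter)
LOOP_LENGTH_M = 1.2        # out-and-back conductor length, meters (assumed)
CONTACT_OHMS = 0.0002      # lugs/connectors total (assumed)
CHARGE_CURRENT_A = 6.0     # current near the absorption transition (assumed)

loop_resistance = WIRE_OHMS_PER_M * LOOP_LENGTH_M + CONTACT_OHMS
drop_v = CHARGE_CURRENT_A * loop_resistance   # plain Ohm's law: V = I * R

print(f"loop resistance: {loop_resistance * 1000:.2f} mOhm")
print(f"drop at {CHARGE_CURRENT_A:.0f} A: {drop_v * 1000:.1f} mV")
```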
 
It's really system-dependent. Fuses, bus bars, wire cross section, wire length, and current of course are the main variables. But low-rate charging to a low CV threshold might also mean the current stays high the whole time (mine does).

I have a drop of somewhere around 40 mV per cell at full charge current. That's enough to skew things a good bit.

But, you're right: if your circuit is not creating much drop at the decision point, the compensation is not important. I think with larger systems it's almost unavoidable, though, because geometry, bussing, and fusing become major contributors to the resistance, and currents get big in order to fill the bank on a reasonable schedule.
 
Yeah, some breakers have a slightly shocking voltage drop, also.
 
How's this for a different cultural metaphor for this discussion?

Lol, Marshall amps are great, I bought one as a gift once upon a time. I use a Behringer bass guitar amp but it only goes to 10.

So please tell me that when voltage sampling occurs to determine the actual, factual, static, for-real voltage of a battery, there is no charging current present that would skew the reading, correct? ~ Regards, a voltage purist
 
If you measure a battery under charge, you will read the charger's voltage, which is supposed to be higher than the battery's voltage, because that is how you make current flow into the battery (electricians call that "pressure").
So to know the real voltage of the battery, you should disconnect the charger, put the battery under a very small load, and check the voltage.
That is what some chargers do.
You can also measure the current flowing in the wires: when the voltage is no longer able to drive more amps into the battery, it means the battery is charged.
That is why some people charge a battery up to 4.2 V instead of 3.65 V: increasing the "pressure" lets you "push" more amps into the battery.
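
To illustrate the "watch the current, not just the voltage" idea, here is a minimal sketch of a CV-taper full-charge test; the setpoint is the usual LiFePO4 value from this thread, while the tail-current threshold is an assumed example, not a universal spec:

```python
CV_SETPOINT_V = 3.65    # constant-voltage hold level per LiFePO4 cell
TAIL_CURRENT_A = 0.5    # "done" threshold; often around C/20 (assumed example)

def charge_is_complete(cell_volts: float, charge_amps: float) -> bool:
    """Full = the charger is pinned at its CV limit AND the current has
    tapered below the tail threshold. Voltage alone is not enough while
    charging, because the charger is forcing that voltage; the current
    taper is what shows the cell has stopped accepting charge."""
    return cell_volts >= CV_SETPOINT_V and charge_amps <= TAIL_CURRENT_A

# Illustrative readings during a charge:
print(charge_is_complete(3.65, 20.0))   # False: still pushing 20 A
print(charge_is_complete(3.65, 0.3))    # True: current has tapered off
```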
 
Yes, I know what you mean: the voltage going in must be higher than the battery voltage, or current flow for the most part stops as the potentials equalize.

In simple terms, the only way to know the actual state of charge of a battery is to take a voltage reading with the battery disconnected from anything that might charge or discharge it. HOWEVER, with grandpa's old-fashioned lead-acid batteries, for example, a no-load reading might be perfectly fine, yet if the battery is placed under load and has weak or undercharged cell(s), the voltage can drop below the threshold of usability. I'm not sure if LiFePO4 and the like exhibit the same load/no-load characteristics.

Perhaps, to ascertain the "fullness" of a battery that is being actively charged or discharged, an algorithm is used that compares the current flow into the battery against the battery's known electrical profile, and the state of charge is derived from that?
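
That is essentially what shunt-based monitors (like the BMV mentioned earlier) do: coulomb counting. A minimal sketch of the idea, with made-up numbers; real monitors also correct for charge efficiency, temperature, and drift:

```python
class CoulombCounter:
    """Track state of charge by integrating current through a shunt
    over time. This is the core idea behind shunt-based monitors."""

    def __init__(self, capacity_ah: float, soc: float = 1.0):
        self.capacity_ah = capacity_ah
        self.soc = soc   # 1.0 = full

    def update(self, amps: float, dt_hours: float):
        # amps > 0 means charging, amps < 0 means discharging
        self.soc += (amps * dt_hours) / self.capacity_ah
        self.soc = min(max(self.soc, 0.0), 1.0)

# Illustrative use: a 180 Ah bank discharging at 25 A for 2 hours.
bank = CoulombCounter(capacity_ah=180.0, soc=1.0)
bank.update(amps=-25.0, dt_hours=2.0)
print(f"SOC: {bank.soc:.1%}")   # ~72.2%
```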

Regards.
 
Yes, this is due to the fact that a lead battery's internal resistance changes heavily with SOC.
That is also why you do not need a special charger for a lead battery: you just set the maximum voltage and let the internal resistance of the battery regulate the incoming current.
But this is also why you can measure a correct voltage and still get no current.
I think LFP has a permanently low internal resistance, versus the lead battery, whose resistance changes drastically with SOC.
This is probably due to the structure of the battery itself.
LFP has a huge electrode surface compared to a lead battery, so even though LFP has poor conductivity (which is why the aluminium of the electrode is coated with carbon), it compensates with that huge surface.
 
If you're trying to dial in voltage-based stopping, and your charger doesn't have differential voltage sense wires to read the voltage at the battery, then you are going to be in guessing game mode.

You can still achieve any desired stopping point with trial and error, but it will be current-specific (charge at higher or lower rates later and your offset will be wrong) and you'll be seeing charger values that don't match the charge curve data that you'll see others publishing.

It's definitely worth it to get a charger that has sense lines if you're getting serious about this stuff.
I agree. Excellent comment. You seem to understand well the problem I had. I think the problem gets worse when you have no voltage-sense wires and multiple MPPT chargers: even with large wire, the chargers read each other's voltage. The chargers tend to think the battery voltage is *higher* than actual if you don't have a dedicated battery sense lead with no load on it.
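
That "reads higher than actual" effect is just Ohm's law across the charge path, which is also why a trial-and-error offset only works at one charge rate. A quick sketch with a placeholder loop resistance:

```python
# The charger's terminals see the battery voltage PLUS the I*R drop
# across wire, breakers, and bus bars. The resistance is a placeholder.

LOOP_RESISTANCE_OHMS = 0.010   # assumed 10 mOhm charger-to-battery loop

def charger_side_volts(battery_v: float, charge_amps: float) -> float:
    return battery_v + charge_amps * LOOP_RESISTANCE_OHMS

for amps in (40, 20, 5):
    print(f"{amps:>2} A: charger reads {charger_side_volts(13.60, amps):.2f} V"
          f" (battery is really 13.60 V)")
# 40 A -> 14.00 V, 20 A -> 13.80 V, 5 A -> 13.65 V: the error shrinks as
# the current tapers, so a fixed offset is only correct at one current.
```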
 
And if you use a BMS (common-port charging), you do not really know what happens after the BMS if you measure just before it.
Right! My charger's voltage-sense negative is on the BMS B- lead and the positive is on the battery terminal. I have had issues. Great comment.
 
On my bench charger, when I top balanced, I ran the charger up to 4.2 V, but my battery-terminal cell voltage was still under 3.65 V while I was pumping in 20 A. However, when the cell was full, the voltage needed to be dialed back to 3.65 V at 10 mA. Now I don't trust chargers' default LiFePO4 settings, or chargers without positive and negative voltage-sense leads. However, if we have voltage-sense wires connected to our chargers and we are charging to a pack voltage goal of 13.8 V, then when the batteries are empty we can sometimes read the charge voltage up at 14.4 V at the charger's output, because the charger knows not to exceed a 13.8 V battery terminal voltage at float. I think my Victron MPPTs may work like this with the networked Smart Battery Sense.
 
Yes, they do: when networked, the MPPT uses the BMV's battery-voltage sample as its reference.
 
The BMV-712 is a great choice, and I love that it reads direct battery voltage. A solution for those of us without the BMV-712, on BlueSolar MPPTs, is the Victron Smart Battery Sense. This solved the voltage-reading discrepancy between the battery and my solar charge controllers.
 
Reading correct battery voltage at the charger: @nebster and @nosys70 are raising a really important issue about voltage-measurement accuracy. To summarize my impression of the conversation: 1) Who cares about the voltage set point if our chargers are reading the wrong battery voltage? 2) Some say to just increase wire size, relocate the chargers next to the batteries, etc. However, having your charger raise its output voltage to compensate for the load-dependent drop across the wire and circuit breakers is more elegant and usually a longer-lasting solution. 3) We need chargers with a voltage-sense input connected directly to the battery. Here is a video with a discussion of the battery-sense-wire topic.
 
Frankly, I do not think 0.5 V will change anything about the battery life.
An LFP cell can go up to 4.2 V (so over 16 V for the pack). If you feel the battery is heating too much at the end, you can stick to a lower voltage and see if it runs cooler.
Are you sure about that? Li-ion cells like 18650s are charged to 4.2 V. LiFePO4 cells are not exactly the same and are charged to 3.65 V.
 
Yes, but we just discussed that.
If your charger just measures what it sends, you could send 3.65 V but have only 3.4 V at the battery terminals (whether or not it passes through a BMS).
So the correct way to set your charger is to set a voltage and then measure what you really get at the battery.
De facto, LFP can be charged as high as 4.2 V, though I see no reason to recommend that, since the increase in capacity is negligible while the extra wear is practically certain.
If you charge an LFP cell to 4.2 V, it will not stay at 4.2 V (like a li-ion cell would); it will quickly fall back to about 3.7 V.
Lithium batteries have this particularity: if you charge at the nominal voltage (3.65 V), you reach the full-charge value at about 80% SOC.
So most chargers push the charge a bit longer even if the battery looks charged, and some sense full charge thermally, because the cell starts to heat faster once it is fully charged.
Another way is to measure the current absorbed, since the purpose of charging a battery is not to reach a voltage but to push amps into it. The voltage is only what allows you to transfer the amps from the charger to the battery.
Using voltage to control the charge is just the lazy way to do it, because measuring volts is a lot easier/cheaper than measuring current.
So, to make current pass from the charger to the battery, you need to apply a pressure (voltage) that makes the current flow from the charger into the battery. For chemistry reasons you have to limit that voltage to less than 4.2 V, or you could ruin the battery chemistry.
The expected behavior: if the battery is starved, it will eat current fast and pull the voltage down. When the battery is full, the voltage will rise.
So measuring voltage can reflect the SOC of the battery (and we have seen that this is not the case for a lead battery, for example).
 
I'm getting a bit frustrated, because I'm finding that many chargers are not custom-programmable: they charge to a 14.4 V pack voltage, and my CALB 180 Ah cells have a charge cutoff at 3.6 V per cell, i.e. 14.4 V pack voltage. Rod at Marinehowto.com has this chart https://marinehowto.com/wp-content/uploads/2015/09/28-LiFePO4-On-Boats.jpg and this article https://marinehowto.com/lifepo4-batteries-on-boats/ stating that 13.88 V is about 100% full. Yes, I top balanced my pack when I built it. To be safe, I just charge to 13.8 V; I'm not giving up much SOC, and my batteries will likely last longer. Am I missing something?
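
For reference, here are the per-cell values behind the pack voltages in this discussion (a nominal 12 V LiFePO4 bank is 4 cells in series); the annotations just restate claims made in this thread:

```python
CELLS_IN_SERIES = 4   # nominal 12 V LiFePO4 bank

for pack_v in (14.4, 13.88, 13.8, 13.6):
    print(f"{pack_v:>5} V pack = {pack_v / CELLS_IN_SERIES:.3f} V per cell")
# 14.40 V = 3.600 V/cell (the common "charge to max" setpoint)
# 13.88 V = 3.470 V/cell (~100% full per the marinehowto chart above)
# 13.80 V = 3.450 V/cell (the conservative target in this post)
# 13.60 V = 3.400 V/cell (the even gentler target mentioned below)
```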
I'm fairly new to building a system, but the way I have my LiFePO4 prismatic cells charged is 14 V, i.e. 3.5 V each. I'm new to LiFePO4 but not ignorant of lithium in general (I worked with lithium-ion before), and I agree that it does not seem worth it to push up to "the max": when I initially top balanced and maxed them out, completed a capacity test, then recharged at 14 V and cycled again, the difference was roughly 0.75 Ah. My cells seem to be happy going to 3.5 V. This particular set is being used without a BMS (and watched closely); I wanted to see how much one was really needed. To me, if you know what loads you are pulling and your controller is set up properly, it should never overcharge (I know nothing is perfect). My inverter is set to shut down before we get to any dangerously low voltage, and I have rarely gotten close to draining the bank unless it was on purpose. I will be the first to admit that I may not know as much as a lot of you when it comes to LiFePO4, and I'm sure some of you could find a million things wrong with my setup, but so far it has been running smoothly and the cells stay within 0-3 mV of each other.
 
For lithium, has anyone tried programming in a lower absorption voltage (e.g. 13.8 V) and then using the equalization function to absorb at a higher voltage (e.g. 14.4 V) at pre-programmed intervals (e.g. once a week)?
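
For what it's worth, the scheduling logic being asked about is simple; here is a minimal sketch using the example values from the question (this shows the concept only, not any particular charger's equalization feature):

```python
from datetime import date

NORMAL_ABSORB_V = 13.8   # everyday absorption target (from the question)
HIGH_ABSORB_V = 14.4     # periodic higher absorption (from the question)
INTERVAL_DAYS = 7        # run the high absorption once a week

def todays_absorption_target(start: date, today: date) -> float:
    days_elapsed = (today - start).days
    if days_elapsed > 0 and days_elapsed % INTERVAL_DAYS == 0:
        return HIGH_ABSORB_V   # the periodic "equalization" day
    return NORMAL_ABSORB_V

print(todays_absorption_target(date(2024, 1, 1), date(2024, 1, 8)))   # 14.4
print(todays_absorption_target(date(2024, 1, 1), date(2024, 1, 9)))   # 13.8
```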
 
I only charge to 3.4 V / 13.6 V. I barely lose any capacity. I have cell overvoltage set at 3.5 V per cell, and I programmed the BMS to read 100% when one of the cells hits 3.4 V.
So I'm very much on the safe side; my cells will stay healthy for a long time, and I barely lose capacity.
 
^^^^^^ This right here. My DC-DC/MPPT controller will switch to acting like a power supply once the battery is charged up, if there is a load on the system.
 
