Coulomb counting questions for the pointy heads!

Craig

I will admit that I am not a huge fan of coulomb counting to measure battery SOC. I feel that I can get a rough idea via voltage and have a decent idea of where I stand.

But if I were to build a coulomb counter or use an Ali one.

I understand that 100 amps out of a battery is supposedly 100 amps out (is it?). What I do not understand is how you account for the input: is 100 amps in actually 100 amps in, or since the round-trip efficiency is 90%, is 100 amps in, as counted by a coulomb meter, really only 90 amps in?

If this is the case, how does the coulomb counter know what my particular battery's efficiency is?


I love to jibe the EE types around here, saying I do not suffer from knowing too much, but in this case I would gladly like to hear your educated explanation of how this works.

@snoobler @HaldorEE
 
As you'd be aware, a decent meter detects the battery charge cycle being completed and resets its SOC to 100%. The rest of the time it is only measuring amp hours (or watt hours, etc.) in and out. Unless it has explicit support for a given battery chemistry, along with current and thermal derating of the available charge, it probably makes no accounting for the effects of those and other factors.
 
Ok, so what happens, say, if in the winter my very large battery uses 10% of its capacity per day but only makes up 9% from solar since it is winter? After 30 days my battery is down 30% in stored energy, so the battery never gets fully charged and the cycle is never completed.


Lol, sorry, I didn't include you in the pointy heads; I realized that after I posted, lol.
 
If the meter watches for the usual pull up and hold at a certain voltage, then a drop back down to float, to detect charge completed, it should realise that your battery is not being brought to full charge and not do a SOC reset. It should just track power in / out and slowly diverge from the actual SOC based on chemistry / temperature / self-discharge etc. If it doesn't have any process like that, it'll be up to the operator to periodically bring the battery to full charge and then reset the meter to 100%.
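In rough pseudo-code terms, that basic behaviour looks something like this. This is only a minimal sketch; the class name, the voltage/current thresholds and the hold time are illustrative, not any particular meter's firmware:

```python
# Minimal coulomb-counter sketch: integrate current in and out, and re-sync
# SOC to 100% when a "charged" condition holds long enough. All thresholds
# here are illustrative examples only.

class CoulombCounter:
    def __init__(self, capacity_ah, charged_volts=57.6, tail_amps=3.6,
                 charged_hold_s=180):
        self.capacity_ah = capacity_ah
        self.remaining_ah = capacity_ah       # assume full at power-up
        self.charged_volts = charged_volts    # "charge completed" voltage
        self.tail_amps = tail_amps            # current must fall below this
        self.charged_hold_s = charged_hold_s  # ...and hold for this long
        self._held_s = 0.0

    def update(self, volts, amps, dt_s):
        """amps > 0 = charging, amps < 0 = discharging; dt_s = sample period."""
        self.remaining_ah += amps * dt_s / 3600.0
        self.remaining_ah = min(self.remaining_ah, self.capacity_ah)

        # Detect charge completion: high voltage plus low current, held long enough.
        if volts >= self.charged_volts and 0 <= amps <= self.tail_amps:
            self._held_s += dt_s
            if self._held_s >= self.charged_hold_s:
                self.remaining_ah = self.capacity_ah   # sync back to 100%
        else:
            self._held_s = 0.0

    @property
    def soc(self):
        return 100.0 * self.remaining_ah / self.capacity_ah
```

Everything else (charge efficiency, Peukert, temperature) is a correction layered on top of that basic integration; without the periodic sync the integration error just keeps accumulating.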

The better ones such as Victron's BMV let you fill in more of the blanks to keep track of things more accurately. You can adjust the voltages used to detect charge completed, as well as a fudge factor to take RTE into consideration, etc. The SmartShunt would have the same options since it's a headless BMV.

Victron BMV setup
 
Interesting subject, Craig .... my head is fully round on top ... but

I agree with gnubie that the resolution to this seems to be to occasionally fully charge the battery, which will be recognized by the coulomb counter and reset the SOC to 100%.

This is definitely going to be a problem when trying to avoid doing a full charge unless the reset point can be specified at various voltage levels and SOC.

When I worked in the control industry, we sometimes used what were called floating point actuators. Their position was theoretically determined by keeping track of how long the actuator had been commanded in each direction. Invariably the actuator would get out of sync .... so the remedy was to command the actuator fully in one direction for the full actuator travel time once every 24 hours and then back to its position .... kind of a similar situation.
 
Yah. BMV seems to have the most comprehensive input:

Capacity
Peukert
Charge voltage
Tail current
Charge efficiency.

From the BMV video: "...fully charged at least twice a month to synchronize"

(bolded for @Craig ).

Most batteries are high efficiency until they approach 100% SoC during charge. Discharge efficiency at the C20 rate is by definition 100% efficient. Peukert tells you the "efficiency" at other than C20. The BMV monitors net current flow, so Ah out is at 100% efficiency, but if the net current is charging, then the charging efficiency applies, which is a single value approximating something that's variable.
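In other words, the bookkeeping is asymmetric. Roughly something like the sketch below, assuming a single fixed charge-efficiency setting (0.95 is just an example value, and this is not Victron's actual firmware):

```python
def apply_to_counter(remaining_ah, amps, dt_s, charge_efficiency=0.95):
    """Ah out is debited 1:1; Ah in is credited at the charge-efficiency factor.
    charge_efficiency=0.95 is an illustrative setting, not a recommendation."""
    delta_ah = amps * dt_s / 3600.0
    if delta_ah > 0:                      # net charging
        remaining_ah += delta_ah * charge_efficiency
    else:                                 # net discharging
        remaining_ah += delta_ah
    return remaining_ah
```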

I have yet to test the "synchronicity" of my BMV-702, but in repeated cyclic scenarios (approximately the same power uses at approximately the same times of day), a given SoC correlates very closely to a given voltage.

It also consistently hits 100% 30 minutes before float. This is about 3Ah of input in that 30 minutes, which is about 1.5% of the total capacity.
 
My JK BMS (aka Heltec) does do coulomb counting. I can't state how accurate it is yet. I am charging at about 20 amps into my 360 amp hour rated pack. The counter is going from 50% to 99% in about 6.5 hours. I am not hitting the true full charge of 58.8 volts; I am stopping short at 57.6 volts. It appears the SOC is resetting when it hangs in absorb charge for a while. I did tell it the over voltage protect is at 58.8 volts, so it may not read 100% unless I let it go that high, as it does hang at 99% for a while near the end of charge.

On the way down, it does appear to deduct straight off the listed capacity. It should be dropping 1% for every 3.6 amp hours that are removed from the battery. It does not seem to take the voltage into account: 1 amp hour near the full charge of 58 volts is 58 watt hours, but further down, one amp hour at just 50 volts is only 50 watt hours. This does not appear to be accounted for in the battery remaining percentage. My XW-Pro inverter is discharging at a constant wattage, so the current actually increases as the voltage drops. This makes the remaining % and amp hour display fall faster as the battery runs down.

The bad part about this is that the first 50% is quite a bit more energy than the second 50%. If my battery is really 360 amp hours, then the first 50% is 180 amp hours at an average of 3.95 volts per cell, or 55.3 volts, or 9,954 watt hours. The remaining 50% is also 180 amp hours, but the average voltage is now just 3.35 volts per cell, or 46.9 volts for the whole pack, which comes out to just 8,442 watt hours. So the first "half" is 54% of the energy and the second "half" is 46%.

Of course this is if the discharge is perfectly linear, but it is not. It has a bit of an "S" curve just below the 50% charge area that could make this even worse. With this information, I would not put too much trust in a battery level meter other than to get a rough idea of where it is at. Even just running my pack down to 50%, I can easily see the current rising and the capacity falling faster as it gets to the lower end of the voltage range. Any voltage regulating switching power supply will need to pull more battery current as the voltage drops to get the same amount of power out to the load.
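The Ah-versus-Wh point is easy to see if you track both side by side; a rough illustration using the numbers from the post above (nothing BMS-specific):

```python
# Compare Ah-based and Wh-based "percent remaining" for the same discharge,
# using the example figures above: 360 Ah pack, first half averaging 55.3 V,
# second half averaging 46.9 V. Purely illustrative.

first_half_wh = 180 * 55.3    # ~9,954 Wh
second_half_wh = 180 * 46.9   # ~8,442 Wh
total_wh = first_half_wh + second_half_wh

print("Ah-based: each half is 50% of capacity")
print(f"Wh-based: first half {100 * first_half_wh / total_wh:.0f}%, "
      f"second half {100 * second_half_wh / total_wh:.0f}%")
# -> roughly 54% and 46%, so an Ah-only gauge overstates the energy
#    left in the bottom half.
```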
 
Thanks for the input guys. I also realized that after a few cycles the battery monitor could "learn" the efficiencies. So if it went down to 50% and back to 100%, and used what would seem to be 55% of energy to fill that 50%, it could learn the inefficiencies. I imagine if one made a serious program it could evolve with the temperature and age of the battery as well.
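Something like that could be as simple as comparing Ah out versus Ah in between two synchronization points. A toy sketch of the idea; the helper is hypothetical, not a documented feature of any of these meters:

```python
def learned_charge_efficiency(ah_out_since_full, ah_in_to_refill):
    """Estimate charge efficiency between two consecutive 100% sync points:
    whatever went out had to be put back in, so the ratio out/in approximates
    the charge efficiency for that cycle. Hypothetical helper only."""
    if ah_in_to_refill <= 0:
        return None
    return ah_out_since_full / ah_in_to_refill

# Example with the numbers above: 50% went out, 55% was needed to refill it.
print(learned_charge_efficiency(50, 55))   # ~0.91
```

A serious program would presumably average this over many cycles and bin it by temperature and discharge rate rather than trusting a single cycle.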
 
If you're at either end of the voltage/capacity curve, I guess it's pretty simple to reset, but as mentioned previously, when you're on the flat middle part of the curve you're pretty much stuck with counting, and the cumulative errors associated with that.
 
I was thinking the same thing about how the program could learn the actual battery capacity over a period of time and as the battery ages.
The Chargery BMS has battery full and empty voltage parameters, which after the initial charge can be set differently from the HVD and LVD ... I think the program uses those values to reset to 100% and 0% .... it could also learn how much AH the battery actually has and adjust the SOC and AH values. The battery would have to occasionally reach both of those levels to keep it in calibration. First they need to correct the inaccuracies in their coulomb counter.

What I am interested in when I look at the display is to know how much capacity is remaining. I like looking at the AH value ...
kinda like the miles left to empty on gas gauges in modern vehicles.
 
And we all know how accurate those are.
I have actually found the one in my new vehicle to be pretty accurate ..... but since I have not run it out of gas, I guess it is hard to know for sure. I don't intend to run it down too low.
The trick, if getting low on fuel, is to reset the average MPG so it starts recalculating based on current driving conditions and speed.
But ... this is a diversion I shouldn't have brought up.
 
Not really knowing batteries, I'm pretty sure that with lead-acid, electrons you put in become electrons you get back out, with power lost due to difference in voltages.
Except, of course, the extent to which self-discharge occurs over time, and any outgassing where current flows to perform electrolysis such as during equalizing.
 