diy solar

troubleshooting trimetric meter inaccuracy

thaddeusk

New Member
Joined
Jun 7, 2021
Messages
11
I'm not sure if this is the best forum for this question, but here it is -
last fall I upgraded an aging AGM bank on my off grid property with a bank of LFP prismatic cells. nearly the same total capacity (~28kwh @ 24v) but of course the LFP allows much greater discharge. overall it's been a great improvement. but... the trimetric meter has trouble keeping track of the ah in/out. I have turned it up to 100% efficiency, but every day that the system cycles but doesn't reach totally 'full' (to reset the meter in the trimetric) the ah count gets further off of reality. it seems to not be counting the ah coming in (or is overcounting the loads, I guess) so that after a just a few days it will think the bank is at ~30% when it's actually at ~70%, and if I don't get a full charge for a few weeks, it can get way out of line... telling me that more ah have been removed from the bank than it can hold, by multiples. as far as I know the meter was accurate on the old bank, and nothing changed in the way the system is set up (except the addition of a BMS inline on the bank)

any ideas? TIA
 
Set your Efficiency Factor to 92%.
 
Set your Efficiency Factor to 92%.
It was set closer to that before, for the AGM bank, and the discrepancy seemed worse. Am I wrong in understanding that the efficiency factor discounts the charge amperage to account for the charging inefficiency of lead-acid batteries, and that the lower the efficiency factor, the fewer Ah it counts versus the measured number?
 
With FLA batteries, the EF is around 80%. While it gets better with LiFePO4, it doesn't go to 100%. Remember, the EF is a round-trip system number. Depending on your system, you may need to tweak that 92% number a little. Here is a link to a datasheet that further explains this:

https://www.victronenergy.com/upload/documents/Datasheet-BMS-12-200-EN.pdf

It is also possible there is some other issue with the Trimetric like a connection that has too much resistance.
 
I read through Rod's article. It's excellent; though I was already familiar with most of it, he goes into good and clear detail.

All that still just makes me more confused, though. Almost everything he talks about concerns what causes Ah counters to display a higher SOC than the battery actually has. In one example he shows, a 200 Ah bank has degraded and only holds 140. The SOC meter may show 50% and the owner thinks this is true, but the bank actually contains only 40 Ah, or about 28%, when the meter says 50%. This is the opposite of the problem I am encountering.

Just this morning I went and looked at the system. These are brand-new, tested batteries. The bank capacity is roughly 1200 Ah @ 24 V. The highest charge rate they ever see is 125 A; the highest discharge they ever see is about the same. This last week it had definitely not seen either of those 'extremes', since we are in a northern climate, the sun is still weak, and there are fewer people (and loads) right now.

A week ago we had several days of good sun and few loads, and the bank reached a 'full' charge (which is set at 27.6 V to avoid the knee of the charge curve). At this point the SOC meter reset itself to 100%. The bank was cycled up and down in the midrange of its charge over the next week as we used power and got some, but not quite enough, sun.

This morning it was cloudy and the bank was getting a meager 15 A charge. The voltage read 26.2 V, which is roughly a 60-70% charge. Based on my observations of this system, I would guess that it was showing some voltage from the charging and was probably closer to 50-55%.

So, with the voltage at 26.2 V, the SOC meter read 3%. The counter believed that 1160 Ah had been removed and not replaced into the battery. The longer it goes without resetting, the more off it gets. After three weeks and lots of use, I once saw it showing a 'negative' count of 9,000 Ah even when the battery voltage indicated closer to 70-80% SOC. Obviously this is way off; there's no way to get 9,000 Ah out of a 1200 Ah battery and still have it at near-peak voltage.

I imagine it's possible that there's some problem with the shunt connection, etc., but I want to understand what it might be based on the symptoms, and I can't wrap my head around how it could be accurately counting the outflows but not the inflows. If it's counting one, I would think it would count the other the same, except for the processor discounting the charging by the set efficiency factor. But that brings us back to the fact that the efficiency is already set to count every bit coming in and not discount anything, so it should read 1 for 1, and it's not. If it were drifting in the opposite direction, reading a fuller SOC than the voltage indicated, I would reduce the EF setting as appropriate until it became more accurate.

I am able to mostly guess at the SOC from voltage and recent history, but I want other, less knowledgeable people to be able to know whether they should run a generator in the evening to charge the bank and avoid a cutoff before dawn. Obviously that doesn't work based on the SOC meter if the meter is reading '0' for weeks at a time.
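For anyone following along, the bookkeeping at issue can be sketched as a toy coulomb counter. This is just an illustration of the logic discussed in this thread (class and parameter names are made up, not the Trimetric's internals): discharge counts 1:1, and only the charge side gets derated by the efficiency factor, so at EF = 100% the in and out flows should cancel exactly.

```python
class AhCounter:
    """Toy shunt-based Ah counter: discharge counts 1:1, charge
    is derated by the efficiency factor (illustrative only)."""

    def __init__(self, efficiency_pct: float = 100.0):
        self.ef = efficiency_pct / 100.0
        self.net_ah = 0.0  # negative = Ah removed from the bank

    def step(self, amps: float, hours: float) -> None:
        ah = amps * hours
        # Only incoming Ah (ah > 0) are discounted by the EF.
        self.net_ah += ah * self.ef if ah > 0 else ah


c = AhCounter(efficiency_pct=100.0)
c.step(-100, 5)   # 500 Ah out
c.step(+125, 4)   # 500 Ah back in
print(round(c.net_ah, 1))  # 0.0 -- at 100% EF, in and out cancel
```

With EF at 100% the counter should track a healthy LFP bank closely, which is why a large, one-sided drift points at measurement rather than settings.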
 
You're talking about the 2020, I assume? I always found the Trimetric to be finicky; even when I ~thought~ I had the settings correct, it still ended up getting lost badly. The charged-setpoint volts and amps only worked when we actually got there; otherwise errors just continued to accumulate. I can't say I was ever really confident I was setting them perfectly.

I've switched several customers over to the Victron SmartShunt and it does an amazing job from what I can tell. In my limited experience, Victron has really nailed their battery presets. So far they've stayed right on track with a ~10 kWh LiFePO4 system and another ~60 kWh of 2 V FLA cells. We are also using the Cerbo GX, which allows me to remotely see what's going on with customers' systems. Pretty fun to watch.
 
I've switched several customers over to the Victron SmartShunt and it does an amazing job from what I can tell. In my limited experience, Victron has really nailed their battery presets. So far they've stayed right on track with a ~10 kWh LiFePO4 system and another ~60 kWh of 2 V FLA cells. We are also using the Cerbo GX, which allows me to remotely see what's going on with customers' systems. Pretty fun to watch.

I might just go that route; I agree with you, I haven't always been impressed with the Trimetric. To boot, it's located in a building that is a vacation rental, and it would be nice to monitor the batteries without having to intrude.
 
I imagine it's possible that there's some problem with the shunt connection, etc., but I want to understand what it might be based on the symptoms, and I can't wrap my head around how it could be accurately counting the outflows but not the inflows. If it's counting one, I would think it would count the other the same, except for the processor discounting the charging by the set efficiency factor. But that brings us back to the fact that the efficiency is already set to count every bit coming in and not discount anything, so it should read 1 for 1, and it's not. If it were drifting in the opposite direction, reading a fuller SOC than the voltage indicated, I would reduce the EF setting as appropriate until it became more accurate.

I am able to mostly guess at the SOC from voltage and recent history, but I want other, less knowledgeable people to be able to know whether they should run a generator in the evening to charge the bank and avoid a cutoff before dawn. Obviously that doesn't work based on the SOC meter if the meter is reading '0' for weeks at a time.
I think you're onto something. All shunt-based monitors measure a tiny voltage across the shunt and use it to figure out the current. Since shunts are usually near the battery and exposed to a corrosive environment, there can be some corrosion on the voltage-sensing terminals of the shunt, which causes phantom signals. The monitor can't distinguish the real, current-induced voltage from the corrosion-induced voltage. Clean the shunt with a baking soda solution, rinse with water, and reconnect. Verify the monitor's current readings with a clamp-on meter.
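To put some rough numbers on how a phantom signal like this could explain the symptoms above: on a 500 A / 50 mV shunt, even a fraction of a millivolt of constant offset reads as several amps, and a coulomb counter integrates that around the clock. A back-of-the-envelope sketch (all values illustrative, not measured from any real meter):

```python
def ah_error_from_offset(offset_mv: float, days: float,
                         rated_amps: float = 500.0,
                         rated_mv: float = 50.0) -> float:
    """Ah mis-counted by a constant sense-line voltage offset
    integrated over a period of days (illustrative arithmetic)."""
    phantom_amps = offset_mv * (rated_amps / rated_mv)
    return phantom_amps * 24.0 * days


# A mere 0.5 mV of corrosion-induced offset reads as 5 A of
# phantom current, which adds up fast:
print(ah_error_from_offset(0.5, days=7))  # 840.0 Ah in a week
```

An error on that scale is in the same ballpark as the 1160 Ah deficit described earlier, which is why cleaning and re-verifying the shunt connections is worth doing before blaming the settings.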
 
Verify monitor's current readings with a clamp-on type meter.
Good points. You could also check the voltage drop across the shunt and compare that to what the Trimetric is reading. It's probably a 500 A / 50 mV shunt. Still working on my first cup of coffee... take your mV reading x 10 to get amps. Of course, confirm what shunt you have.
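That "mV x 10" rule of thumb is just the shunt's rated ratio; written out (assuming a 500 A / 50 mV shunt, so confirm the rating stamped on yours):

```python
# Hypothetical ratings for a 500 A / 50 mV shunt -- confirm yours.
SHUNT_RATED_AMPS = 500.0
SHUNT_RATED_MV = 50.0


def shunt_current(measured_mv: float) -> float:
    """Convert a millivolt reading across the shunt to amps."""
    return measured_mv * (SHUNT_RATED_AMPS / SHUNT_RATED_MV)


print(shunt_current(12.5))  # 12.5 mV across the shunt -> 125.0 A
```

If the amps computed from your own mV reading disagree with what the Trimetric displays, the problem is in the sense wiring or the meter, not the batteries.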
 
This may be too simple a remedy for your problem, but as I recall the "full" voltage setting should be 1/10 of a volt lower than your absorb setting to get the Trimetric to reset itself to 100%. I may be wrong here; once it gets daylight I will go get my manual and confirm this.

Edit: The manual suggests setting the voltage on the TriMetric 1% lower than the bulk/absorb voltage setting of the controller. In my case I had to get into the charge controller settings and tweak the voltage reading so that my inverter, TriMetric, and SCC all agreed on voltage. BTW, I also verified the actual battery voltage with my trusty Fluke meter, which I trust over all others.
 
The efficiency factor will not affect the reference reset to 'full'.

The full-reference reset is based on the battery voltage setting you put into the battery monitor, AND the charger must bring battery voltage to or above this setting. The monitor will then reset Ah consumed to zero and set SOC to 100%, regardless of the cumulative Ah summation to that point. Usually you want to set the reset voltage a little below the charger's absorb voltage.

It is necessary to periodically allow the battery charge to reach this reset 'full' level so the monitor can re-establish its cumulative Ah counter.

If you don't reset the monitor's 'full' reference periodically, the monitor continues to work on 'dead reckoning', and any errors in the Ah summation will continue to accumulate.

The efficiency factor can refine the accuracy of the Ah accumulation so you can go longer between resets with reasonable accuracy.

Good-condition LFP batteries will have better than 98% round-trip discharge-recharge efficiency at less than a 0.3C cell current rate. It drops to about 95% when cell current is run up to a 0.5C rate.

Many monitors have their default efficiency factor set to 85%, which is for lead-acid batteries.
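The full-reset-plus-dead-reckoning behavior described above can be sketched in a few lines. This is a toy model of the logic, not any particular meter's firmware; the threshold and capacity values are taken from the numbers mentioned earlier in this thread and are otherwise assumptions:

```python
FULL_RESET_VOLTS = 27.3    # assumed: a little below the 27.6 V absorb setpoint
BANK_CAPACITY_AH = 1200.0  # the bank size discussed in this thread


def update_soc(net_ah: float, battery_volts: float) -> tuple:
    """Return (net_ah, soc_pct). If the charger reaches the reset
    voltage, zero the Ah counter and declare 100%; otherwise keep
    dead-reckoning from the accumulated Ah count."""
    if battery_volts >= FULL_RESET_VOLTS:
        return 0.0, 100.0
    soc = 100.0 * (BANK_CAPACITY_AH + net_ah) / BANK_CAPACITY_AH
    return net_ah, max(soc, 0.0)


print(update_soc(-300.0, 26.5))   # dead reckoning: (-300.0, 75.0)
print(update_soc(-300.0, 27.6))   # reset voltage reached: (0.0, 100.0)
```

The key point is visible in the first branch: nothing except reaching the reset voltage ever corrects the counter, so any per-cycle measurement error compounds until the bank gets a full charge.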
 
The Trimetric (and I assume others, too) uses the efficiency setting to derate the charging amps only. If you draw X Ah, it'll deduct it from the remaining capacity. However, if you return X Ah to the battery, it'll record less than X Ah; exactly how much is set by the efficiency value.
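One consequence of derating only the charge side is that at any EF below 100%, netting out X Ah removed requires returning more than X Ah. A quick illustrative calculation (just the arithmetic implied above, not meter internals):

```python
def ah_to_return(ah_removed: float, efficiency_pct: float) -> float:
    """Ah that must flow back in for the derated charge count
    to cancel the Ah removed (illustrative arithmetic)."""
    return ah_removed / (efficiency_pct / 100.0)


# At a 92% efficiency factor, cancelling 500 Ah removed takes:
print(round(ah_to_return(500.0, 92.0), 1))  # 543.5 Ah back in
```

This is why setting the EF too low on an LFP bank makes the meter read emptier than the bank really is, which matches the direction of drift reported in this thread.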
 