diy solar

Epever 4215AN charging erratically and cutting out. Trials and possible solutions.

12VoltInstalls

I installed two BTRPower 140Ah 12V batteries in parallel a week or so ago. No problems other than cloudy days, which isn't a failure of course. They charged fine with 4S2P 100W panels through the existing 1012LV-MK. Previously they had charged 36+ hours at 14.5V to reach zero amps with a 10A voltage-controlled charger; adjusted to 14.6V, it ran ~2 minutes to zero amps.

Planned migration to a 4215AN Triron Xtra led me to pull the solar cables from the 1012LV and install them on the 4215AN. It charged for a while and then would cease charging at (usually) 12.9V. Reinstalled the panel input to the 1012: no problems, it charged fine.

After reading here, remembering this experience, and reading another thread and many more like it, I lowered all the settings and, against my sensibilities, set the high-voltage disconnect to 15V.
My settings are:
253F4810-08C8-47C0-AC72-6D5CEE253DC7.jpg
…which is yielding:
7DD76DC8-2B6F-4459-95DC-88F52424BD2B.jpg
Note that it did shut down charging once today, but it resumed without me turning the panels off and back on, which is what I had to do yesterday as noted at the end of this thread.

As of this moment it’s been vacillating around ~13.1V for a number of hours.

My conclusions follow.
Please question them.


Related to past high-voltage cutouts with full sun and lead acid batteries, I learned to assume that the Epever firmware has a flaw: it allows voltage spikes from the panels because it doesn't throttle them fast enough, and those spikes cause it to disable charging.
Using that logic, I made a wild guess that the BMSs were shutting down from high voltage. So I set the charge parameters lower to make a bigger HVD differential, and I imagined that I would also be keeping it off the BTRPower BMS's rev limiter. My numbers were influenced by an old post by @mikefitz - thank you.

I probably can't test what it will do in full sun for a few days because there isn't any, but seeing that I can now charge past the old 12.9V cutoff, I'm hoping the problem is solved.

Another issue others mentioned was possible cell imbalance triggering the BMSs. At last check with my Klein VOM yesterday, both batteries read 12.91V. BTRPower batteries have no comms so I can't prove it, but I think the cells are individually OK as well as balanced with each other.

Finally, a poor connection was suggested. However, testing ohms from each battery's positive and negative, I don't see that. Voltage is the same, and each battery is individually wired to a busbar through equal-length 48" cables; measuring between both positive battery bolts and both negative battery bolts (which are connected through the busbar) I get 0.00 ohms. So no issues there, as far as my small mind can tell.

Does this and my settings sound reasonable to others? Or just me?
 
Yesterday we finally had a few hours of sunshine. After a few checks of the Epever Triron Bluetooth app, the panel voltage reported was 13.1x volts at ~2A, which happened to be identical to the battery voltage and amps.
I should have screen-shot it.

Subsequently I turned off the panel breakers and turned them back on. Immediately the panel voltage came up as 88V (an expected number) at ~3A of current, and the battery began receiving 13.1x volts at ~19.xx amps.
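The failure signature described here (PV voltage pinned to battery voltage instead of sitting near the string's maximum power point) is easy to encode as a quick check. This is only a sketch: the function name, the ~71V Vmp estimate for a 4S string of ~18V panels, and the tolerances are my assumptions, not anything from the Epever app.

```python
# Encodes the failure signature above: PV voltage pinned to battery voltage
# instead of tracking near the string's maximum power point. The ~71 V Vmp
# estimate for a 4S string of ~18 V panels and the tolerances are assumptions.

def looks_stuck(pv_volts, batt_volts, expected_vmp=71.0, tol=1.0):
    """Flag an MPPT sitting at battery voltage far below the expected MPP."""
    near_battery = abs(pv_volts - batt_volts) <= tol
    far_from_mpp = pv_volts < expected_vmp * 0.5
    return near_battery and far_from_mpp

print(looks_stuck(13.1, 13.1))   # the reported reading -> True (stuck)
print(looks_stuck(88.0, 13.1))   # after power-cycling the panels -> False
```

Against the readings in this post, 13.1V PV at 13.1V battery flags as stuck, while 88V PV after power-cycling does not.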

Clearly the Epever is cutting out somehow, so no, I have not reestablished a dependable charging condition with the 4215AN, in spite of my settings being modest.
Just a reminder that the batteries charge fine with the 1012LV-MK by merely moving the solar cables from the Epever to the 1012LV.
I hope that I can figure this out - I am getting frustrated with the Epever issues as it appears to be centric to the Epever as all else is equal. Any new thoughts?
 
Well, it definitely seems like the problem is inside the Epever.

As far as high-voltage spikes to the battery shutting down BMSs, I wonder if that would show up in your 'Min Voltage' record? I can't remember from personal experience (I don't think I've triggered a BMS high-voltage shutdown in months), but do LiFePO4s read 0V across the terminals when the BMS shuts off? I would assume so, but if there is 'leakage current' in the switching doodads it might act like a 'pull-up circuit' and still show voltage when nothing is flowing; the voltage would disappear under the slightest load. I'm just not sure on that one.
 
Do LiFePO4s read 0V across the terminals when the BMS shuts off?
The BMSs are not shutting off, or at least not shutting off output; plus this time the Epever was dragging along at battery voltage with a 12VDC load. So it appears the batteries are not misbehaving.
 
Ok. My theory was that if all the battery BMSs disconnected, the event would be indirectly recorded by your Min Voltage dropping to something implausible, like <10.5V.
 
Surprise: fairly good sun this morning.
The SCC is still functioning so far after turning the panels off then back on, which I did after noticing 13.x volts and ~2.x amps on the Epever app.

The settings currently make it 'report' 100% charged; however, neither the batteries' BMSs nor the cell capacity has curtailed charge, with nearly 30A being sent to the batteries under sun filtering through high, thin clouds...

FB875214-78C3-4644-89BE-716336267C5A.png
Really hoping an event or someone with wisdom can reveal what goes wrong
 
Really hoping an event or someone with wisdom can reveal what goes wrong
No takers I guess.

This morning I noticed this:
5AB9C890-9810-4935-8FBA-55FD3C98F295.png
…and it was fair and sunny, but not optimal angles yet.
So I watched a bit, turned off the panels and back on, and immediately got this: 93CEE1A2-8219-4F25-94CB-0189F9AF4983.png
…which is basically the same watts, yet the voltage was only 24V?!
I don’t know how to explain this. Voltage from panels has been up high like I’d expect every time since I ‘reset’ the panels.
Should I give up on Epever?
 
Well, 105 is more than 78, so it may be valid that the 'maximum power point' at that moment really was at 24V and X amps vs 71V and Y amps.

But the max power point of 4x ~18V panels in series is almost never going to be at 24V... maybe when an 800W array really can't make more than 100W, its max power point really is ~24V? But if, in the same conditions, the other controller gets way more watts out of it, then that's NOT the max power point!

So you either have a major issue with the array itself, OR the Epever is crap. You have 800W of panels hooked to a battery bank that should be able to take 800W of charge rate right up until it's nearly full, which 12.anything volts certainly is not. If the array does somewhere close to its rated wattage through the other charge controller but always craps out to 100W and stops charging on this thing, I would say the Epever is trash. We've already sufficiently ruled out wiring problems, in my mind.
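The MPP argument above can be sanity-checked numerically. The IV points below are invented for a roughly 4S string (Voc near 88V, Vmp near 71V); they are illustrative only, not measured panel data.

```python
# Numeric sanity check of the MPP argument. These IV points are invented for
# a roughly 4S string (Voc ~88 V, Vmp ~71 V), not measured panel data.

iv_points = [   # (volts, amps) along a made-up IV curve
    (0, 10.5), (24, 10.3), (48, 10.0), (60, 9.6),
    (71, 9.0), (80, 5.0), (88, 0.0),
]

best_power, best_volts = max((v * i, v) for v, i in iv_points)
print(f"MPP ~{best_power:.0f} W at {best_volts} V")       # 639 W at 71 V

p_at_24v = dict(iv_points)[24] * 24
print(f"operating at 24 V yields only {p_at_24v:.0f} W")  # 247 W
```

On any IV curve shaped like this, a controller parked at 24V leaves most of the available power on the table, which is the point being made above.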
 
My experience with older Tracers (4210AN & 2210AN) has led me to believe that their processing power is not fast enough.

I never experienced the charging dropout, but I do see momentary spikes around 15-16V when the sun comes out from behind a cloud. Such spikes would cause my inverter to momentarily go into over-voltage protection. Setting the max charging voltage to 14.4V didn't seem to change anything for me; it would still spike.
 
We've already sufficiently ruled out wiring problems, in my mind.
I'm 95% there, except for the fact that the panel input is fine with the 1012LV-MK, which tells me it's not panel connections.
This seems similar to when I was getting water in the MC4s, but changing SCCs didn't fix it then. So logic says it's 100% not the cabling or panels.
My experience with older Tracers (4210AN & 2210AN) has led me to believe that their processing power is not fast enough
I drew that conclusion too, back when I had the Epever software functional (running lead acid then) and could watch in real time the over-voltage disconnect hitting 16+ volts when it should have limited at 15.1V.
When lowering the settings and disabling equalize wouldn't cure it, that conclusion was easy to reach.
Setting max charging voltage to 14.4v didn't seem to change anything for me, it would still spike
That is the same behavior I see.

Yesterday was a full-sun day all day. It was at low voltage and higher amps at ~8am, so I turned the panels off and back on. I wish I'd thought to leave it alone and observe first.
Anyway, it was still doing 4-5A as the sun began fading.

The batteries didn't trigger grid charging overnight, so they took a good charge; and running the coffeemaker twice this morning didn't trigger grid charging in spite of today's overcast. Without intervention, my panels are showing 81V at ~200W, battery at ~13V with about 10A input.

So that’s good.

It's "nice" that I have grid charging available, but I 'should' be able to function fully off-grid given sun like yesterday's. And I want to be able to depend on my system when, in a year or so, I may buy a remote property.

When it's working properly, like it did for a long time in the past and did yesterday, my measly 800W gets me by. I'm hesitant to bring the six 315W panels online if that's actually going to cause a nearly-zero charging condition. I'm pretty convinced it's a software problem, either bad predictive programming or inadequate processing speed to operate within the set parameters.

Not sure how to proceed, other than selling everything and going Victron, but I actually don't want to do that.
 
So I have mostly sunny/bright with some in-and-out. I appear to be recovering a full charge, and the batteries are accepting ~25.x amps, or 12.5A each. The Klein meter shows one battery cable at 11.74A and the other at 11.75A, so the assumption is they are balanced.

Three questions for The Wisdom:
1) Should I be safe in assuming, and having faith, that the BMSs are accepting amps to the cells because they aren't saturated/fully charged? Should I just trust the BMSs?

2) Looking at the Epever reported (purported? Ha ha!) output screen as well as the settings screenshot: is there anything wrong with my settings?
These lowered settings appear to have overcome the assumed calculation/processing-speed 'bug' in the Epever software/hardware combination, and it's been behaving for a few weeks now. I have not seen it locked at 7V or 12V and 10W or 20W since lowering the settings to these. I know these numbers are a bit lower than the typical tight basic settings range often suggested, but it's working. Today it seems on track to take a full charge, although by "math assumptions" one could conclude a max harvest for the day of 2kWh, while the theoretical assumption is that the batteries will take ~3.3kWh to charge (I hit LVD on the 1012LV-MK at the end of the coffeemaker run this morning but didn't have the grid connection turned on), and I "lose" ~28Wh or so per hour through the 1012LV.

3) Because I am going to add the six 320W panels on a 6420AN (which is way overpaneled) at some point soon, I'm thinking about C-rate again. BTRPower's recommended charging is 20A, or 40A for the pair. Six 320W panels, without factoring efficiency, temperature, or vertical-positioning losses, could potentially produce 160A at 12V nominal. Of course the 6420 will limit to 60A; but together with the 4215AN's 40A that leaves a potential 100A of charging, or 50A per battery, which is 250% of the recommended 20A.
I've often read LiFePO4 can be charged at 0.5C, which for a single 140Ah 12V nominal battery is 70A. Since 50A of charge potential per battery is less than 0.5C, am I OK here?
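For what it's worth, the C-rate arithmetic in the question checks out. The sketch below assumes the 100A worst case is the 6420AN's 60A plus the 4215AN's 40A feeding the bank at the same time; the capacities and limits are the ones stated in the post.

```python
# Ideal-case C-rate math from the question above. Assumes the "100 A" worst
# case is the 6420AN's 60 A plus the 4215AN's 40 A charging the bank at once.

capacity_ah = 140          # per BTRPower battery
batteries = 2
limit_6420an_a = 60        # 6420AN output cap
limit_4215an_a = 40        # 4215AN output cap

total_charge_a = limit_6420an_a + limit_4215an_a    # 100 A combined
per_battery_a = total_charge_a / batteries          # 50 A each

half_c_a = 0.5 * capacity_ah                        # 70 A
print(f"{per_battery_a:.0f} A per battery vs 0.5C = {half_c_a:.0f} A")
print("within 0.5C" if per_battery_a <= half_c_a else "over 0.5C")
```

50A per battery is comfortably under a 0.5C (70A) ceiling, though well over the vendor's 20A recommendation, which is the tension the question is about.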

4th question (BONUS ROUND :))
Once I connect the ~2kW of panels, should I set the 6420AN to the same settings as the 4215AN? I am thinking I should down-set the 6420AN to max out at 13.2V: basically equalize at 1 minute/13.4V, boost at 15 minutes/13.3V, float at 13.2V, boost reconnect at 12.8V, charging limit at 13.2V (if it will let me go that close), and OVD at 14.2V (because I think it might "show the bug" if I go lower).
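One thing worth checking before committing those numbers: as I read the Epever user-defined mode rules, the voltage parameters must be monotonically ordered (over-voltage disconnect >= charging limit >= equalize >= boost >= float >= boost reconnect). The sketch below just encodes that ordering rule against the proposed values; the rule itself is my reading of the Tracer/Triron manual, so verify against your unit.

```python
# Sanity check of the proposed 6420AN settings against the parameter ordering
# that (as I understand it) Epever's user-defined mode enforces: each value
# must be >= the next one down the list. Numbers are the ones proposed above.

ordered_names = [
    "over_voltage_disconnect", "charging_limit", "equalize",
    "boost", "float", "boost_reconnect",
]
proposed = {                         # volts
    "over_voltage_disconnect": 14.2,
    "charging_limit": 13.2,
    "equalize": 13.4,
    "boost": 13.3,
    "float": 13.2,
    "boost_reconnect": 12.8,
}

violations = [
    f"{a} ({proposed[a]} V) < {b} ({proposed[b]} V)"
    for a, b in zip(ordered_names, ordered_names[1:])
    if proposed[a] < proposed[b]
]
print(violations or "settings ordering OK")
```

If that ordering rule holds, the controller may reject equalize at 13.4V while the charging limit sits at 13.2V, so one of those two numbers would need to move.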
Thoughts?
C4E20658-3644-4955-8B35-5498FA4C1E11.png
A750DEE5-4C27-4D7B-A6CA-2F63AE041EE7.png
Thank you all again for your help with this.
It does appear Epever has the speculated slow processing, which kicks it off. Starting to feel better about it now that I have some stability.
 
one battery cable 11.74A the other at 11.75A so the assumption is they are balanced.

Three questions for The Wisdom:
1) should I be safe in assuming and having faith that the BMS’s are accepting amps to the cells because they aren’t saturated/fully charged? Should I just trust the BMS’s?
So near-identical 'current acceptance' means near-identical internal resistance, which is good but doesn't necessarily mean identical state of charge. The common tail-current threshold of 0.05C would be 7A on a 140Ah battery. So you either have to wait until they both hit 7A to know they are both fully charged (and hopefully they hit it at around the same time), or wait until the resting voltage (voltage not being pushed up by a charge source) is 13.6+ to be sure they are near full. But the BMS will not actively stop charging by disconnecting until the voltage hits ~14.6V. So if you don't plan to LET the batteries hit 14.6V, you've got to implement your own charge termination through the charge-source settings, or with a battery monitor controlling something, etc.
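The tail-current rule above can be written out as a tiny sketch. The 0.05C threshold is convention, not a BTRPower spec, and the function names are mine:

```python
# Charge termination by tail current, per the reply above. The 0.05C
# threshold is a common convention, not a BTRPower specification.

def tail_current_a(capacity_ah, c_fraction=0.05):
    """Current below which a battery held at absorption voltage is 'full'."""
    return capacity_ah * c_fraction

def is_full(charge_amps, capacity_ah, at_absorption_voltage):
    """True once current at absorption voltage has tapered below tail current."""
    return at_absorption_voltage and charge_amps <= tail_current_a(capacity_ah)

print(f"{tail_current_a(140):.1f} A tail current per 140 Ah battery")
print(f"{tail_current_a(280):.1f} A for the 280 Ah bank")
print(is_full(24, 280, True))   # 24 A still flowing at absorption -> False
```

By this rule, the 13.82V/24A screenshot discussed below is still mid-charge: the bank would need to taper to 14A before counting as full.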

If I understand the manual right, setting your boost voltage to 13.8 should force the controller to transition to float once the battery hits 13.8V or has been in boost mode for a default of 120 min (adjustable 10-180 min). However, you have a screenshot showing 13.82V with 24A still going in, which means the controller is putting out more voltage than the battery, or there would be no differential to push those 24A. If it were to stop charging, the 13.82V would drop to the 'real' battery voltage. I'm not sure how this controller (or any of them!) does that voltage 'sampling', since as soon as they begin charging they themselves distort that number upwards. That is a hole in my understanding of chargers in general.

Because I am going to add in the six 320W panels over a 6420AN - which is way overpaneled- at some point soon I’m thinking about C-rate again. Btrpower recommended charging says 20A charging, or 40A for the pair. Six 320W panels without factoring efficiency, temperature, or vertical positioning losses have potentially 160A production at 12V nominal. Of course the 6420 will limit to 60A; but leaves a potential 100A of charging, or 50A each battery which is 250% over 20A.
I’ve often read LiFePo can be charged at .5C, which for a single 140Ah 12V nominal battery is 70A. Since 50A charge potential per battery is less than .5C, am I ok here?
20A is ridiculously low. My brain automatically reads that as an indirect admission that "these cells are crap; please don't prove it by trying all the things they should be able to do and asking for a warranty/refund when it doesn't work". I think anything under 0.5C is fine, and 50A is less than that, so... also fine. The 20A recommendation does not instill confidence, but the likelihood of something bad actually happening is still extremely low, I would guess, although possibly still higher than with a battery whose distributor said 70A was OK... if you get what I'm saying? lol
 
a screenshot showing 13.82v and still 24amps going in, which means the controller is putting out more voltage than the battery or there would be no differential to push said 24amps.
This is partly why I'm asking these questions. That appears to be 'disobedience' to my settings, in my mind.

But when I add the ridiculous overpaneling, I don't want to finish off batteries whose cells are likely used or subprime in some way.
if you get what im saying? lol
I do - hence the questions. I know what I know, and I am starting to get an inkling of knowing what I don't know, but I'm totally NOT a LiFePO4 expert.
So thank you.
 
near-identical 'current acceptance' means near-identical internal resistance, which is good but doesn't necessarily mean identical state of charge.
Ok.
But over time, as 'good sun' becomes the norm again, won't they balance themselves, with the BMS applying steady voltage but not tipping them over HVD? My settings should get me to 95%+ fully charged?
That is my understanding of how that would work and I hope I’m correct.
 
But over time, as 'good sun' becomes the norm again, won't they balance themselves, with the BMS applying steady voltage but not tipping them over HVD? My settings should get me to 95%+ fully charged?
That is my understanding of how that would work and I hope I'm correct.
Well, I don't think you meant to imply this, but just to be clear: it would have to be the charge source pushing the battery up to HVD, not the BMS. So it's all down to charge-source settings/behavior whether you ever hit HVD.

As far as the BMS handling balancing, I doubt BTRPower gives any detail whatsoever about the balancing capability or behavior of whatever BMS they installed. But in general I think most of these BMSs don't begin balancing until at least some of the cells are near full, which means that even if your application is OK only using <70% of the battery's capacity, you should still charge it near full at least sometimes so that balancing actually occurs. I don't think they balance at all voltages/SOCs, although I have heard of BMSs that do in other applications. I don't know much about what is 'normal' for non-Bluetooth BMSs in cheap pre-built batteries, so this is another weak area where I'm not working from much. The whole "don't know what I don't know" thing. :)

So, in general, the only reason we have multi-stage charging at all is to charge faster. It takes a certain voltage differential to push current around a circuit. In theory, if the cells wanted 3.4V/cell to be full (just as an example), then setting the charge source to put out 3.4 x 4 (4 cells) would eventually fully charge the battery (and eventually actually harm it, thus the need to stop charging on tail current etc.), but as the battery voltage got closer and closer to the charging voltage, the charge rate would drop to a trickle. So we allow the charge source to use voltages that speed up the charge rate at low SoC but would be damaging if held at high SoC. "Multi-Stage Charging".
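The taper described above can be shown with a toy loop: hold the charger at a fixed voltage and let current follow I = (Vchg - Vbatt) / R. The resistance and the volts-per-amp-step constant are invented purely to show the shape of the curve, not to model these batteries.

```python
# Toy taper: fixed charge voltage, current I = (Vchg - Vbatt) / R, battery
# voltage creeping up as charge accumulates. All constants are invented to
# illustrate the shape of the curve, not to model any real battery.

v_charge = 13.6   # fixed charger output (volts)
v_batt = 13.0     # starting battery voltage
r_eff = 0.02      # pretend effective resistance (ohms)
k = 0.0005        # pretend volts gained per amp per step (toy dynamics)

currents = []
for _ in range(200):
    i = (v_charge - v_batt) / r_eff
    currents.append(i)
    v_batt += k * i   # voltage rises, so the differential (and current) shrink

print(f"first step ~{currents[0]:.0f} A, last step ~{currents[-1]:.2f} A")
```

The current starts high and decays toward a trickle as the differential closes, which is exactly why a higher "boost" voltage charges faster at low SoC.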

Which is all well and good, except most chargers aren't all that well documented as to how they ACTUALLY behave! You set your boost voltage limit to 13.8, so why is it still pushing 24A at 13.82V? Why isn't it in float? Probably because the charger has to 'pause' charging every once in a while to see the 'actual' battery voltage rather than its own effect of elevating the voltage during charging. So how often does it do this? Or does it wait until current drops below X amps at 13.8V as a sign that the battery is actually AT 13.8V? Is it a 'rate of change' thing, where amps drop by X% over X minutes? What are the behind-the-scenes conditions that aren't explained in the manual? I don't know! For the screenshot you posted, you would just have to sit there and stare at it (or datalog it somehow) for as long as it took to change to float, and then try to reverse-engineer from whatever clues were presented. That part sucks. I feel like, in some ways, paying more money for an extremely well-documented charger that electrically is all the same damn bits, but with a better manual and perhaps better firmware, is spending money just to buy peace of mind.
 
dont think you meant to imply this but just to be clear, it would have to be the charge source pushing the battery up to HVD, not the BMS. So its all down to charge source settings/behavior whether you ever hit HVD.
Yes. What I meant was that with these 'low' settings, given long enough sun, the voltage would never reach the BMS's HVD, but the continuously applied wattage would 'push' any low/unbalanced cell(s) up to the SOC of the cells that are at a higher state of charge and not accepting as much current. Which I 'thought' would be partially due to the battery management system seeking to charge the cells equally. But perhaps I'm naive.
If I were still on FLA I would not think about this.
most chargers aren't all that well documented as to how they ACTUALLY behave
Right.

It seems to be functioning without cutting off currently, so now I want to understand that:
Because you set your boost voltage limit to 13.8, so why is it still pushing 24amps at 13.82? Why isn't it in float?
… because it is an expected result.
I just want to get 7-10 years out of them, and the advertised cycles should produce that result.
I just don't want to keep charging "forever" and damage my cheap batteries - although in theory, as they reach 'full', they should resist the applied voltage and the amps should taper as a result. I think? @sunshine_eggo, from reading this:
13.6V can result in low current overcharge. 13.5V avoids this.
…I would appreciate you commenting on my settings.

Thanks to all
 
What I meant was that with these ‘low’ settings- given long enough sun- that the voltage would never reach the BMS’s HVD, but that the continuously applied wattage would ‘push’ any low/unbalanced cell(s) up to the cells SOC that are at a higher state of charge and not accepting as much current. Which I ‘thought’ would be partially due to the battery battery management system seeking to equally charge the cells.
Eventually, yes. But the practical limitation is that correcting imbalance with a purely passive method, which has a steeply diminishing rate of return as the voltages get closer together (i.e., it slows to almost nothing), can easily be outrun by the accumulation of imbalance in the first place. Thus the need to 'actively' balance. I think basically ALL lithium BMSs do balancing to some degree, but assessing the imbalance, or how effective the BMS is at addressing it, is hard when it's all in a sealed case with no access to cell-level terminals.

The BMS doesn't intercept charge current and distribute it to the cells. The cells are wired in series, and the only thing the BMS can do to affect current on the series circuit is to disconnect, i.e., open the circuit. It is either on or off and affects all cells equally in that regard. The BMS has voltage-sensing/balance leads to each individual cell, and balance current passes through those wires. Those are relatively tiny circuits with tiny current limits. So even if a battery had 100A of charge current going in AND the BMS was actively balancing, the balancing would still have a tiny current limit, because those are totally separate circuits from the one the 100A is going through.
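Some rough numbers for how slow those balance circuits are. The 50mA bleed current is a guess typical of cheap passive BMSs, not a published BTRPower figure, and the 0.5Ah imbalance is just an example:

```python
# How long passive balancing takes at typical bleed currents. The 50 mA
# figure is an assumption (typical of cheap passive BMSs), not a BTRPower spec.

imbalance_ah = 0.5         # example: one cell 0.5 Ah behind its neighbors
balance_current_a = 0.05   # assumed passive bleed current (50 mA)

hours = imbalance_ah / balance_current_a
print(f"~{hours:.0f} h of continuous balancing to correct 0.5 Ah")

# By contrast, the main series circuit at 100 A moves the same 0.5 Ah in:
seconds = imbalance_ah / 100 * 3600
print(f"~{seconds:.0f} s at a 100 A charge rate")
```

That three-orders-of-magnitude gap between the main circuit and the balance circuit is why imbalance can accumulate faster than a passive BMS can bleed it off.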

In the pic below, all the large transistors on the top half essentially are in parallel forming one large on/off switch. The bottom half of the board contains the 'brains' and balancing circuits for the 4 individual cells.

HTB1pOwOc13tHKVjSZSgq6x4QFXaK.jpg


I just don’t want to continue to charge “forever” and damage my cheap batteries - although in theory as they reach ‘full’ they should reduce accepting voltage and curtail amps as a result. I think? @sunshine_eggo from reading this:
13.6V can result in low current overcharge. 13.5V avoids this.
…I would appreciate you commenting on my settings.
So what I have read is that if you don't EVER want to 'low-current overcharge' your cells, you have to feed them no more than 3.37V per cell, which x4 = 13.48V, thus ~13.5V.

But as a practical matter, you have to hit your tail current FIRST, before you have to worry about low-current overcharge. So assuming a 280Ah bank and a 0.05C tail current, you need to be down to 14A going in before you worry about it. The example above of 13.82V and 24A means you are not there yet in that situation.
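Spelling out the two thresholds from this reply (the 3.37V/cell ceiling is the figure quoted in the thread; whether it applies to these exact cells is an assumption):

```python
# The two numbers from this reply. 3.37 V/cell is the quoted "can hold
# indefinitely" ceiling; applying it to these particular cells is an assumption.

cells = 4
safe_v_per_cell = 3.37
hold_forever_v = cells * safe_v_per_cell
print(f"hold-forever charge voltage: {hold_forever_v:.2f} V")   # ~13.48 V

bank_ah = 280                  # two 140 Ah batteries in parallel
tail_a = 0.05 * bank_ah
print(f"tail current for the bank: {tail_a:.0f} A")             # ~14 A

# At 13.82 V with 24 A still flowing, current is above the tail threshold,
# so low-current overcharge is not yet the concern.
```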
 
That is helpful
It appears I'm 'safe' and functional, just not empirically "perfect", which is fine with me as long as I have 'enough' charge, and time is indicating that I do/will.

Thank you
 
Update:
I had seen things apparently working well, so I stopped paying attention.

Both yesterday and the day before, both good-sun days, the 1012LV-MK charger came on in the evening. "What?!"

So today I paid attention and things seemed fine, but then I checked again just as I was leaving for the day and saw this:
1BBD78C1-96FC-4492-8900-B8E0E5A79908.png

After turning the panels off, and then back on, I got this:ED011D39-A9CF-4F67-A32A-8E663CFC1780.png

I had bumped up “float” to 13.4 but I now put it back to 13.2V.

Epever definitely has either a processor-speed issue or an inadequate programming subroutine for processing panel output. Or both.
What else could it be?

This is very annoying to me. I have HVD at 15V and reconnect at 13.3V, and I don't like that 15V, as it 'feels' too high.

I should be able to be entirely autonomous at this point, but I'm not. When I move to the property I'm purchasing, the grid is 50' away, but other than the shop I SHOULDN'T actually need it. The solar setups are bought and paid for, while future grid usage is never paid for; I'd like to avoid it, and I should be able to but for the Epever issues.

PS: the "instant" 100% charge in the monitoring screenshot is misleading. The Epever apparently reads 100% when system voltage (as buffered by load and panels) reaches the set 'float' voltage, not actual SOC, so ignore that if it bothers your head.
 