This is incorrect operation and will eventually damage the battery.
Are there any scientific studies documenting the degradation?
My understanding is that there could be a 3-7% reduction over the 10-15-year expected cycle life of LiFePO4 batteries, and I think Will alluded to that in a recent video.
Being off-grid, charging to 14.2V and holding it on my ~5kWh 12V battery bank during the winter months 'added' enough usable capacity here in northern New England that it eliminated my historical generator usage. Off-gridders need every watt-hour they can get, and even if it "costs" me 7% of the lifespan, the 0.2-0.3V increase is worth it.
With weather patterns that regularly bring 2-5 cloudy days in a row during The Dark Months, being able to fully use my system without constant SOC monitoring is a daily benefit that's worth the "$75" 'lost' over a 12-15-year period; before raising my charge boost and float voltages I was spending $60-$100 on generator gas in a single winter.
Conclusion: fully using my batteries, including hammering them all day with a higher charging voltage, costs roughly 10% of what being conservative with them does.
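For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch in Python using my numbers from above. The ~$1,100 bank cost is my assumption (it roughly reproduces the "$75" figure at a 7% reduction); none of these inputs are measured data.

```python
# Back-of-envelope tradeoff: lifespan given up by charging at 14.2V
# vs. the generator gas the extra usable capacity replaces.
# All inputs are rough personal estimates, not measured data.

bank_cost = 1100.0          # assumed battery-bank cost; at 7% this roughly
                            # reproduces the "$75" figure quoted above
lifespan_reduction = 0.07   # worst case of the 3-7% range
years = 13.5                # midpoint of a 12-15 year expected life

lifespan_cost = bank_cost * lifespan_reduction
gas_low, gas_high = 60.0 * years, 100.0 * years  # $60-$100/winter, historical

print(f"Lifespan 'cost' of the higher voltage: ~${lifespan_cost:.0f}")
print(f"Generator gas avoided over {years:.1f} winters: ${gas_low:.0f}-${gas_high:.0f}")
print(f"That's ~{lifespan_cost/gas_high:.0%}-{lifespan_cost/gas_low:.0%} "
      f"of the conservative approach's cost")
```

With those assumptions it comes out to roughly 6-10% of the gas cost, which is where my ~10% figure comes from.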
I bought my solar power products to fully use. So I do that.
Ignoring the power company's installation fees (more than my entire solar system cost), being solar-only is $350-$400 less costly per year than just the monthly minimum charges, before consuming a single powerco kWh. And my power has not gone out in a storm since my first setup in spring 2019.
Am I wrong?
I am curious about what The Usual Suspects think about this.
PS, in the FWIW department: I can fully replace my system for ~$5k, or I can get grid power installed for $5-$6k. NOT paying the monthly minimum grid charges alone (no kWh consumed) saves nearly $5k over ten years of usage.
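Same kind of sketch for the grid-vs-solar side, if anyone wants to poke at it. The ~$40/month minimum is back-solved from my "nearly $5k in ten years" figure, so treat everything here as an assumption rather than a utility quote.

```python
# Ten-year cost sketch: staying solar-only vs. getting grid installed.
# The ~$40/month minimum charge is back-solved from "nearly $5k in ten
# years"; all numbers are rough assumptions, not utility quotes.

years = 10
monthly_minimum = 40.0            # ~$480/yr, consistent with "nearly $5k"/10yr
grid_install = (5000.0, 6000.0)   # quoted grid installation range
solar_replacement = 5000.0        # cost to fully replace my solar system

minimum_charges = monthly_minimum * 12 * years
print(f"Grid minimum charges over {years} years (zero kWh used): ${minimum_charges:.0f}")
print(f"Grid install + minimum charges: ${grid_install[0] + minimum_charges:.0f}-"
      f"${grid_install[1] + minimum_charges:.0f}")
print(f"Fully replacing the solar system instead: ~${solar_replacement:.0f}")
```

Under those assumptions the grid option runs roughly double the cost of replacing the solar system outright, before a single kWh is consumed.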