It's a matter of degree - if contact resistance is high enough to run hot at full load, but not high enough to fail immediately, does that stand out in the readings?
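For a rough sense of scale (illustrative numbers, not anything measured here): an extra 1 mΩ at a joint carrying 100 A dissipates I²R = 100² × 0.001 = 10 W and drops 0.1 V. Ten watts concentrated in one terminal runs it noticeably hot without immediate failure, while the 100 mV shows up as an apparent extra 1 mΩ of IR on that cell under load - so per-cell logging of the kind described below should make it stand out.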
When I started one project, there was zero support from software or firmware. I made everything either straight hardware or set by DAC and read back by ADC, with the control loops in hardware. Settable tuning parameters would have been a nice addition.
Software/firmware either needs a reliable time-sharing OS/RTOS or a tight enough loop. This does get implemented in some products, but I had no faith in the team even once it was formed. Especially for things like keeping a filament from burning out, I wanted fast hardware protection.
I'm thinking that if you just rotate through measuring current and all cell voltages, and log the high and low of each over windows of time, you can detect outlier cell voltages and also calculate IR (internal resistance) from the voltage swing divided by the current swing.
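A minimal sketch of that scheme in Python - the read_current()/read_cell_voltage() functions and every number in them are simulated stand-ins I made up for illustration, not anything from this thread:

```python
import math, random, time

N_CELLS = 16          # assumed pack size
WINDOW_S = 0.5        # min/max logging window

def read_current():
    # Stand-in for a real shunt/Hall ADC read: 50 A mean with 120 Hz ripple.
    t = time.monotonic()
    return 50.0 * (1.0 - math.cos(2 * math.pi * 120.0 * t)) + random.gauss(0, 0.2)

def read_cell_voltage(c):
    # Stand-in for a muxed cell ADC: ~3.3 V, sagging with current through an
    # assumed per-cell IR of 0.5 mOhm (cell 3 made deliberately worse here).
    ir = 0.0005 if c != 3 else 0.002
    return 3.3 - ir * read_current() + random.gauss(0, 0.001)

def scan_window():
    """Rotate through current and all cells, logging the high/low of each."""
    i_lo, i_hi = math.inf, -math.inf
    v_lo = [math.inf] * N_CELLS
    v_hi = [-math.inf] * N_CELLS
    t_end = time.monotonic() + WINDOW_S
    while time.monotonic() < t_end:
        i = read_current()
        i_lo, i_hi = min(i_lo, i), max(i_hi, i)
        for c in range(N_CELLS):
            v = read_cell_voltage(c)
            v_lo[c], v_hi[c] = min(v_lo[c], v), max(v_hi[c], v)
    di = i_hi - i_lo
    # IR per cell = voltage swing / current swing; the inverter's own ripple
    # conveniently supplies the current swing. Outliers flag bad cells/joints.
    return [1000.0 * (v_hi[c] - v_lo[c]) / di for c in range(N_CELLS)]  # mOhm

print(["%.2f" % r for r in scan_window()])
```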
The mean of the current (times pack voltage) gives the power drawn from the battery, while the RMS sets the I²R heating of wire/fuse/MOSFET.
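To put numbers on that difference, here's a toy waveform (assumed 50 A mean with full 120 Hz ripple, per the inverter behavior discussed below):

```python
import math

# Toy DC-side current: 50 A mean with full 120 Hz ripple (0..100 A),
# roughly what a loaded single-phase inverter presents to the battery.
def i_batt(t, i_mean=50.0, f_ripple=120.0):
    return i_mean * (1.0 - math.cos(2 * math.pi * f_ripple * t))

n = 1000                                    # samples over one ripple period
samples = [i_batt(k / (120.0 * n)) for k in range(n)]

i_mean = sum(samples) / n
i_rms = math.sqrt(sum(s * s for s in samples) / n)

print(f"mean = {i_mean:.1f} A")             # ~50 A: sets power from battery
print(f"rms  = {i_rms:.1f} A")              # ~61 A: sets I^2*R heating
print(f"heating vs. smooth DC: {(i_rms / i_mean) ** 2:.2f}x")   # ~1.5x
```

That ~1.5x heating factor is one reason a fuse or breaker sized off the mean current can run hotter than expected.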
The 4th scope image shows a current transformer around the battery cable, with the inverter 40% loaded by a resistance heater.
I think at 100% load the DC current (which can be inferred from the AC CT measurement plus the DC reported by the inverter, or measured with a clamp-on DC ammeter) would dip to about 0 A at the lows, since a single-phase inverter draws its power in pulses at twice the line frequency (120 Hz). So there's a large signal in both your current and cell voltage readings that you have to average out in some manner.
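One simple way to average it out, assuming you can pace the ADC at a known rate: take the mean over an exact whole number of ripple periods, so the 120 Hz component cancels instead of biasing the result. A sketch:

```python
import itertools, math

def ripple_free_mean(read_sample, fs=6000, f_ripple=120.0, n_cycles=12):
    """Mean over an exact whole number of ripple periods.

    read_sample is a hypothetical callable returning one ADC reading;
    a real version would be paced at fs by a timer or ADC DMA (untimed
    here for brevity). With fs an integer multiple of f_ripple, the
    120 Hz component sums to zero instead of biasing the average.
    """
    per_cycle = round(fs / f_ripple)    # 50 samples per ripple cycle here
    n = per_cycle * n_cycles            # whole cycles only
    return sum(read_sample() for _ in range(n)) / n

# Quick check against a toy 50 A mean, full-ripple waveform:
t = itertools.count(step=1 / 6000)
reader = lambda: 50.0 * (1.0 - math.cos(2 * math.pi * 120.0 * next(t)))
print(ripple_free_mean(reader))         # ~50.0 - the ripple cancels
```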
After reading about someone tripping battery breakers at well below their rating, I realized that could be due to the pulses of current drawn by the inverter. Ideally, capacitors would smooth out the current draw so the high-frequency switching pulses and the 120 Hz draw all came from the caps, and the current from the batteries would be nearly smooth DC.
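A back-of-envelope check on why that ideal is hard to hit (all numbers assumed for illustration: 3 kW load, 51.2 V pack, 2 V allowed peak-to-peak ripple). For the battery to supply perfectly smooth DC, the caps' stored energy must swing by dE = P / (2*pi*f_line) every ripple cycle:

```python
import math

P = 3000.0        # W, assumed inverter load
f_line = 60.0     # Hz line frequency (ripple is at 120 Hz)
V = 51.2          # V, assumed nominal pack voltage
dV = 2.0          # V, assumed allowable peak-to-peak bus ripple

dE = P / (2 * math.pi * f_line)     # ~8 J energy swing per ripple cycle
C = dE / (V * dV)                   # dE ≈ C * V * dV for small ripple
print(f"C ≈ {C * 1000:.0f} mF")     # ~78 mF - far more than typical
```

Tens of millifarads rated for that ripple current is impractical, so in practice the battery and cabling end up supplying much of the 120 Hz ripple - which would line up with breakers seeing peak currents well above the mean.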