Semi-DIY BMS: Active Balancer + Microcontroller

schroederjd · Solar Enthusiast · Joined Nov 12, 2020 · Messages: 175
Hi Everyone,

After spending the last 2 months building my battery (48V, Lishen 272Ah cells, build thread here), I'm considering abandoning the TinyBMS s516 that I'm currently using. I like a lot of the features the TinyBMS has, but it has a few quirks that I'm struggling to get around. Rather than shell out another $250 for a replacement TinyBMS (which may not behave any better than the last one), I'm considering a different option. After reading through a few of the 'full' DIY BMS threads, I'm confident that option is way beyond my skill/knowledge level. I do love the idea of controlling things with an ESP32-based MCU (or similar), and that has led me to the option of using an off-the-shelf 'smart' active balancer for cell-level voltage measurement and balancing, plus the MCU and relays for protection features.

Here's where I'm at with the design. I would really appreciate any feedback on this.

Functional Requirements:
  • At least 1A of balancing current (the quality of my cells is questionable, at best...)
  • At least 80A continuous charge current and at least 150A continuous discharge current
  • Separate-port operation with an all-in-one inverter/charger
  • Cell-level low-voltage load disconnect
  • Cell-level high-voltage charge disconnect
  • Low-temperature charge disconnect
  • Remote monitoring via WiFi, minimally including individual cell voltages, pack voltage, SOC, pack current, pack temperature, and ambient temperature
Main components (total cost: $173, excluding the relays, which are already part of my TinyBMS build):
System design overview:
As previously stated, cell-level voltage measurement and balancing would be performed by the balancer. The newest version of the JK balancers has the option for RS485 communication. Thanks to upnorthandpersonal, I've already got the RS485 protocol, so it should be fairly straightforward to get the cell-level information from the balancer to the ESP32.

The other key inputs for the MCU would be the current readings (from the LEM HOYS) and temperature readings (from the thermistors). Those would go through the ADS1115 16-bit ADC (not for the extra bits, but because the ESP32's internal ADC is pretty terrible).

Load and charge disconnects would be separately controlled by the ESP32 using the relays listed above, likely driven by optocouplers. The 5A relays would be wired in series with the inverter on/off switches to disconnect loads in the event of a low-voltage protection (LVP) event (and to allow for remote shut-down of the inverters). The 40A relays would go on the positive side of the incoming PV lines (one for each MPPT) to disconnect charging in high-voltage protection (HVP) or low-temperature protection (LTP) events. The battery box heater would also be controlled by the MCU.
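To make the disconnect logic concrete, here is a purely illustrative sketch of the per-cycle comparison the ESP32 could run. The thresholds, function names, and relay grouping are hypothetical placeholders, not values from this build:

```python
# Illustrative protection pass for the MCU: thresholds and names are
# hypothetical placeholders, not build values.
CELL_LV_CUTOFF = 2.80    # V: below this, open the load side (LVP)
CELL_HV_CUTOFF = 3.65    # V: above this, open the charge side (HVP)
CHARGE_MIN_TEMP_C = 0.0  # block charging below freezing (LTP)

def protection_actions(cell_voltages, pack_temp_c):
    """Return the set of relay groups to open this cycle."""
    actions = set()
    if min(cell_voltages) < CELL_LV_CUTOFF:
        actions.add("open_load_relays")    # 5A relays in series with inverter switches
    if max(cell_voltages) > CELL_HV_CUTOFF or pack_temp_c < CHARGE_MIN_TEMP_C:
        actions.add("open_charge_relays")  # 40A relays on the PV positives
    return actions

# One high cell -> charge disconnect only, loads stay up
print(protection_actions([3.31, 3.30, 3.70, 3.32], 15.0))
```

The real loop would of course add hysteresis and debounce before toggling relays; this only shows the shape of the decision.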

Anyway, I'm still pretty early in the design process, so I would very much appreciate feedback.

Thanks in advance.
 
The only thing I don't like about the idea is that critical protection functions will be dependent on RS485 communications .... See, I was careful to specify RS485 there, since many of the design types on here define communications differently than I do.
I prefer that critical protection data gathering and controls all happen in the same device.

I do really like the idea of having total control via a custom programmable device.
 
Interesting. Is it specifically RS485 that concerns you, or more that it's two 'independent' devices? They do offer that same active balancer with CAN bus. Do you think that might be a more reliable option?
 
No ... I have a background in controls and just don't consider it a good idea to be reliant on any type of communications for critical functions. I consider the comm link to be a somewhat weak link.
 
Got it. I'll have to think through loss of, or errors in, communication as a potential failure mode.

Something I've thought of but have absolutely no clue on: would there be any way to leverage the circuitry of a 'dumb' active balancer to get cell-level voltage readings into an MCU in a more direct way (i.e., no device-to-device communication required)?

Thanks for the quick feedback Bob.
 

There are ways to mitigate it to some degree by setting up some sort of "heartbeat" check that is dependent on communications .... If you lose that, then take some emergency action.
 
If it's not getting the signals it should, it shuts things down. That's beautifully simple. I'm going to assume that you just came up with that idea right now...
 
No ... I've done it 100 times .... Really

There has to be some state or value that changes in a way that can be predicted .... or you have to have a programmable device on the other end to send the heartbeat signal.
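A minimal sketch of that heartbeat idea, assuming the MCU polls the balancer over the serial link. The class name, timeout, and payload shape are illustrative assumptions, not part of the actual design:

```python
import time

# Hypothetical comm watchdog: the MCU feeds it every balancer reading, and
# the protection code checks tripped() each cycle. A reading only counts as
# a heartbeat if the payload is actually changing, so a hung balancer that
# repeats stale data still trips the watchdog.
class CommWatchdog:
    def __init__(self, timeout_s=5.0, now=time.monotonic):
        self.timeout_s = timeout_s
        self.now = now               # injectable clock, eases testing
        self.last_seen = now()
        self.last_payload = None

    def feed(self, payload):
        if payload != self.last_payload:
            self.last_payload = payload
            self.last_seen = self.now()

    def tripped(self):
        return (self.now() - self.last_seen) > self.timeout_s
```

On a trip, the safe action in this design would presumably be opening the charge and load relays, mirroring what a standalone BMS does on an internal fault.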
 
So, I'm going to assume that the lack of interest in this thread is NOT due to the fact that it's a complete crap idea, and push forward.

I've decided to purchase the balancer through Heltec. They have a 16s, 2A version with RS485 comms for $117 plus shipping (tbd).

I've also done some testing of the current, voltage, and temperature measurement circuits. I'll have to post a detailed wiring diagram later, but here are a couple of pics of the setup so far. I wasn't able to find a HOYS-100 current sensor in stock, so I'm using a LEM HASS 50-S (that's the blue thing in the picture on the right) for the time being. The HASS runs off 5V, so I'm running the ADS1115 (the blue breakout board on the left side of the breadboard) at 5V along with a logic-level shifter (the red board between the ADS1115 and the ESP32) to take the I2C signal down to 3.3V.

IMG_7656.jpg IMG_7657.jpg

Here are some test results from the current-sensing setup. For this run, I'm taking readings at 1-second intervals; at each interval, I take the average of 50 ADC readings. This does a nice job of filtering out the ripple, which is quite significant (without averaging, I see about ±10% ripple in the readings).

These first two plots show sensitivity and linearity at very low current levels. Note that I'm running the ADS1115 at the default '2/3' gain setting (0.1875 mV/step), which gives me the full ±5V to work with. At that resolution, the current sensor, which has a sensitivity of 12.5 mV/A, has a theoretical resolution of 0.015 A (0.1875 mV/step ÷ 12.5 mV/A = 0.015 A/step), which is more than I need. To produce this trace, I used a 30-ohm resistor and just went down one side of my bank, essentially increasing the voltage by 3.28V at a time. As you can see, the readings are incredibly stable, even at the lowest level (theoretical current of 0.109 A). Linearity is spot on (R² of 0.999), and the intercept isn't too bad either. Note that for all the plots, "Readings" are in Amps.
Lin1.png Lin2.png
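The resolution arithmetic above can be sanity-checked in a few lines. The ±6.144 V full-scale range is what the ADS1115's 2/3 gain setting corresponds to; the 12.5 mV/A sensitivity is the HASS 50-S figure quoted in the post:

```python
# Checking the post's arithmetic. Gain 2/3 on the ADS1115 means FSR = ±6.144 V,
# i.e. 0.1875 mV per 16-bit step; the HASS 50-S outputs 12.5 mV/A.
FSR_V = 6.144
lsb_mv = 2 * FSR_V * 1000 / 65536    # mV per ADC step across the bipolar range
sensor_mv_per_a = 12.5               # LEM HASS 50-S sensitivity

amps_per_step = lsb_mv / sensor_mv_per_a
test_current_a = 3.28 / 30           # one ~3.28 V cell-group step across 30 ohms

print(round(lsb_mv, 4), round(amps_per_step, 4), round(test_current_a, 3))
# -> 0.1875 0.015 0.109
```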

The next test I did was 15A and 30A discharge, followed by 15A and 30A charge. Discharge was done through a Growatt inverter using a space heater as the load. Charging was also done by the Growatt, with AC mains supplying the charging power. Note that negative values are discharging, and positives are charging. Once again, rock solid readings at all levels. Noise is extremely low (in my opinion), with values rarely deviating by more than +/- 0.1A, even at 30A discharge/charge (yeah, I realize you 12V guys probably think that's way low, but this is a 48V system). Overall, I'm really happy with how this is coming along. I think the true test may be thermal stability, but we'll see. Much more testing to come.
discharg-charge test.png
discharge-noise.png charge-noise.png
 
Nice progress. Just to confirm my understanding, you are reading data at the pack level, capturing data at regular intervals. Are you capturing data on your PC/Mac?
 
I would also second the suggestion to have the system fail safe in the event of lost communication. Should be fairly easy to implement.
 
Yes, I'm reading current, voltage, and temperature at the pack level at this point (I don't have the balancer yet). I'm just using the serial monitor to capture data for these short tests (at 1-second intervals). For longer tests, I'll probably move to SD card logging, or for really long tests, use Blynk.

I really want precise/accurate/reliable voltage and current measurements at the pack level for coulomb counting (which is what's lacking in my TinyBMS). I'm planning to use direct measurements of pack voltage, rather than data from the balancer, for those calculations, as I want the voltage and current readings to be measured concomitantly (well, as much as possible). At this point, I'm just using a simple voltage divider (R1 = 100k, R2 = 10k) to get the pack voltage into the ADS1115's range of ±5V. With no filtering or smoothing of any kind, here's what the voltage readings looked like over that same discharge/charge test run from above. I haven't confirmed the accuracy of these readings yet, and still have work to do to figure out how to get accurate pack readings.

voltage.png
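Coulomb counting itself is just integrating the current samples over time. A hedged sketch (the sign convention follows the plots above, where negative is discharge; the 272 Ah figure is the cell rating from the first post, and the function name is my own):

```python
# Hedged sketch of coulomb counting: integrate pack-current samples to
# track Ah in/out. Negative amps = discharging, positive = charging.
PACK_AH = 272.0  # Lishen cell rating from the first post

def update_soc(soc_pct, amps, dt_s, capacity_ah=PACK_AH):
    """Advance state of charge by one current sample of duration dt_s."""
    delta_ah = amps * dt_s / 3600.0
    return max(0.0, min(100.0, soc_pct + 100.0 * delta_ah / capacity_ah))

# One hour at a steady 27.2 A discharge should cost exactly 10% SOC
soc = 50.0
for _ in range(3600):          # 1-second samples, as in the tests above
    soc = update_soc(soc, -27.2, 1.0)
print(round(soc, 2))           # -> 40.0
```

In practice the counter also needs periodic re-synchronization (e.g. to 100% at a full-charge voltage hold), since sensor offset error accumulates; that is exactly why the offset/intercept quality measured above matters.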
 
Thinking further, you will need to code some fault-detection and handling logic. It's hard to do without knowing the common failure modes, though. An open balance lead, for example, and handling transients.

Is your serial connection going to be isolated? It's probably not necessary?
 
No, I wasn't planning to isolate the serial connection. Can you think of any reason why I wouldn't want the balancer to share a common ground with the ESP32?
 
FWIW, with my Orion BMS, if comms are lost, the contactor is opened. Even the Victrons will shut down if they lose comms with the battery.
Interesting .... didn't know that.
 
Can you think of any reason why I wouldn't want the Balancer to share a common ground with the ESP32?
Nothing immediately comes to mind. I don't think a balancer failure would put high voltage on the serial lines.
 
For pack voltage measurement, I'm planning to use the same ADS1115 that I'm using for current measurement. That ADS1115 is currently running off a 5V supply, which means it can measure up to 5.3V. To get the pack voltage into that range, I could just use a simple voltage divider. With that approach, it seems I'll need to balance accuracy against leakage current (not sure if that's the right term) through the R2 resistor.

As I understand it, the ADS1115 has an input impedance in the neighborhood of 6M ohm. I'm currently using a voltage divider with R1 = 100k and R2 = 10k. With R2 at 10k, my divider current is 0.465 mA ((51.2V / 11) / 10k ohm), which seems OK? Again, if I'm understanding this correctly, with some current leaking through the ADS, the ADS and R2 basically become another voltage divider. So my 'ideally' divided voltage of 4.655 becomes 4.647 (4.655 × (6M / (6M + 10k))), which corresponds to an error of -0.166%. So, for a true value of 51.20, my reading would be 51.11.
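The divider-loading arithmetic can be worked exactly in a few lines (a sketch; the 6 MΩ is the typical input-impedance figure, which, as the replies point out, is not guaranteed):

```python
# Reproducing the divider-loading arithmetic from the post. The 6 MOhm is
# the ADS1115's *typical* input impedance; worst case can differ.
R1, R2, R_ADC = 100e3, 10e3, 6e6
V_PACK = 51.2

divider_current = V_PACK / (R1 + R2)   # current through the divider itself
v_ideal = V_PACK * R2 / (R1 + R2)      # unloaded divider output

# Exact loaded output: the ADC input sits in parallel with R2
r2_loaded = R2 * R_ADC / (R2 + R_ADC)
v_loaded = V_PACK * r2_loaded / (R1 + r2_loaded)
error_pct = (v_loaded / v_ideal - 1) * 100

print(round(divider_current * 1e3, 3))        # -> 0.465 (mA)
print(round(v_ideal, 4), round(v_loaded, 4))  # -> 4.6545 4.6475
print(round(error_pct, 3))                    # -> -0.151
```

Treating only R2 against the 6 MΩ (as in the post) slightly overestimates the error at about -0.17%; the exact Thevenin source impedance is R1 ∥ R2 ≈ 9.1 kΩ, giving roughly -0.15%. Either way it is a scale error that calibration could mostly remove, except that the input impedance is not a stable, calibratable resistor.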

First off, do I have ANY of that right? If so, I'm thinking that I can probably live with both that level of leakage current, as well as that level of accuracy. Or am I missing something in these calculations? I see others using op-amps to improve the accuracy of voltage measurements, but I'm not sure that's needed in my case, as I'm only measuring a single voltage. I also have no clue how to use an op-amp...

First one to check my math on this gets a gold star.
 
I think your idea is great.

Rely on the heartbeat for failure shutdown.

This is a great thread!!

Using the balancer and its data is a great idea, IMO.
 
First off, do I have ANY of that right?

Yes, but you can't ignore the 100k resistor; the error you can get is higher than what you calculated. Also, the 6M figure is a typical value; worst case can be worse. And the leakage current is random within the specified range. It's not like a nice, fixed 6M resistor, so you can't really calibrate it out.

Often you'll find the recommended maximum value for input resistors in the datasheet section about the anti-aliasing filter (but it also applies to any other resistance, like your voltage divider). Here it says "Limit the filter resistor values to below 1 kΩ," and I would highly recommend following that. But that means your voltage divider will have a very low impedance and thus a high current draw. That's why you want to use an op-amp: so you can have a high-impedance divider ;)

Also, the Vdd + 0.3 V figure is from the absolute maximum ratings table; never design using those values (read the warning just under the table for more info). The recommended operating conditions table says Vdd is the max input voltage (and personally I would avoid going near Vdd anyway, since you can expect most of the non-linearities there).

Note: the datasheet says "If a VDD supply voltage greater than 4 V is used, the ±6.144V full-scale range allows input voltages to extend up to the supply. Although in this case (or whenever the supply voltage is less than the full-scale range; for example, VDD = 3.3 V and full-scale range = ±4.096V), a full-scale ADC output code cannot be obtained. For example, with VDD = 3.3 V and FSR = ±4.096V, only signals up to VIN = ±3.3 V can be measured. The code range that represents voltages |VIN| > 3.3 V is not used in this case." so if you power the ADC with 5 V the recommended configuration would be to use the 4.096 V reference and scale your input signal to be just under that.

Note²: the datasheet also says "Single-ended configurations use only one-half of the full-scale input voltage range." So in single-ended configuration it's like you have a 15-bit ADC at best. It's not a big deal, but you can't ignore it in your accuracy and resolution calculations.
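Putting those two datasheet notes together: with a 5 V supply and the suggested ±4.096 V full-scale range, a single-ended input only reaches the positive half of the code space, leaving roughly 15 usable bits:

```python
# Assumes the ADS1115's FSR = ±4.096 V setting; a single-ended input can
# only reach the 0..+FSR half of the code space (2^15 codes).
FSR_V = 4.096
lsb_mv = 2 * FSR_V * 1000 / 65536   # one 16-bit step across the bipolar range
usable_codes = 2 ** 15              # single-ended: positive half only

print(round(lsb_mv, 6), usable_codes)
# -> 0.125 32768
```

So for the 48V pack measurement, the divider would be scaled so the maximum pack voltage lands just under 4.096 V, not under 5 V.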
 