Chargery BMS News / Update (July-27-2020)

Bits do not deliver accuracy; they only deliver resolution.
The quantity of bits also contributes to accuracy, not just resolution. The ADC is not a perfect measuring device; it may have an accuracy spec of 2 LSB, and that means a 12-bit ADC is only accurate to 10 bits.
 
Good Day Folks,

I am posting this in ADVANCE as I have been fielding questions by PM, and I figured I should share what's happening and answer the questions for everyone, so a small information update is due for the membership here. When the New Manuals etc. are posted, I will update the resource section.

Jason has been listening to the input from the BMS users, and things have changed up a bit as a result.
- New 4.1 Owner's Manuals will be out after some corrections this week (I hope). [Just looking at the new drafts now]
- Firmware updates have been done and will be posted as well (if not already).
- New Shunts have been tested & selected and will now be available.
- New DCC Contactors have been developed by & for Chargery, with some great features that were very much needed. Sorry, I don't know what the pricing will be.
I hope to be able to test the new DCC Contactors soon, but that is out of my hands at this time.

The new Shunts:
View attachment 18556
The new DCC Contactors:
View attachment 18555

The new DCC-300 showing Big Lug usability.
View attachment 18557

For the Really Curious!
View attachment 18558
I really LIKE seeing Chargery's ongoing effort towards improvements. A question about the new Shunts (noticing I could update my 300 amp version at a reasonably low cost): does the newer shunt version somehow allow the BMS to be more accurate than the older shunt versions, or does it serve some improvement other than looking better and providing a better battery lug connection? ... just wondering / been studying the threads a bit
 
I really LIKE seeing Chargery's ongoing effort towards improvements. A question about the new Shunts (noticing I could update my 300 amp version at a reasonably low cost): does the newer shunt version somehow allow the BMS to be more accurate than the older shunt versions, or does it serve some improvement other than looking better and providing a better battery lug connection? ... just wondering / been studying the threads a bit
It allows for safer mounting, and it is also "likely" more accurate. For me, upgrading to 4.2 was a large improvement in the Amps accuracy dept.
 
The quantity of bits also contributes to accuracy, not just resolution. The ADC is not a perfect measuring device; it may have an accuracy spec of 2 LSB, and that means a 12-bit ADC is only accurate to 10 bits.

Cell voltage accuracy is +/- 5 mV. Therefore you will see jumps of 10 mV.

ADC specifications and performance are a deep and thick conversation. In the context of BMS applications, things are simpler because the frequency response requirements are extremely low - 10 Hz is plenty. This means that you are oversampling by something like 1000x, giving the 12-bit ADC and analog front-end system an ENOB of 12 bits. In a higher-speed application it may top out at 10 bits, but there is no reason to do that in a BMS, where you are only looking for very slow trends with actual voltage accuracy as a priority.

As for the +/-5 mV specification - that is generally an accuracy specification, not a stability specification. That means it could read +/-5 mV off, but it should be expected to have stability (repeatability) FAR better than that. Even an ultra-cheap DMM is stable even if it is not particularly accurate. In a cell-balancing application, the precise voltage is not as important as being able to read each cell with the same error - the goal is primarily to match the cell voltages. The jumping around is either a hardware design deficiency or a firmware problem - or both, and that results in some strange behavior of the balancing circuits that rely on stable information.
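Here is a minimal C sketch of that "match the cells" idea; the function name and the 10 mV start-balancing threshold are made up for illustration. A measurement offset shared by every channel cancels out of the comparison, while channel-to-channel jumping does not - and that is exactly what upsets the balancers:

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_CELLS        8
#define BALANCE_DELTA_MV 10   /* assumed start-balancing threshold */

/* Turn on a bleed resistor for any cell more than BALANCE_DELTA_MV above
 * the lowest cell.  Only cell-to-cell differences matter here, so a constant
 * error shared by every channel cancels out - channel-to-channel jitter
 * does not, and it will toggle the balancers erratically. */
void update_balancers(const uint16_t cell_mv[NUM_CELLS], bool bleed[NUM_CELLS])
{
    uint16_t min_mv = cell_mv[0];
    for (int i = 1; i < NUM_CELLS; i++)
        if (cell_mv[i] < min_mv)
            min_mv = cell_mv[i];

    for (int i = 0; i < NUM_CELLS; i++)
        bleed[i] = (cell_mv[i] - min_mv) > BALANCE_DELTA_MV;
}
```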

The firmware is ultimately what decodes the data coming from the ADC and decides how it correlates to a specific voltage. In very low frequency response applications, a multitude of ADC sins can be overcome in firmware to get the full resolution in an accurate manner.
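As an illustration of that last point, here is a minimal oversample-and-decimate sketch in C, assuming a hypothetical read_adc_raw() function and enough broadband noise for the usual rule of thumb (4x oversampling per extra effective bit) to apply:

```c
#include <stdint.h>

uint16_t read_adc_raw(void);   /* hypothetical low-level single-conversion read */

/* Gain k extra effective bits (rule of thumb, noise-dependent):
 * sum 4^k raw samples, then right-shift the sum by k. */
uint32_t read_adc_oversampled(unsigned k)
{
    uint32_t n = 1u << (2u * k);     /* 4^k samples */
    uint32_t acc = 0;
    for (uint32_t i = 0; i < n; i++)
        acc += read_adc_raw();
    return acc >> k;                 /* result has (native bits + k) bits */
}
```

Note this does nothing for offset, gain, or reference errors - those still have to be calibrated out.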
 
In the context of BMS applications, things are simpler because the frequency response requirements are extremely low - 10 Hz is plenty. This means that you are oversampling by something like 1000x, giving the 12-bit ADC and analog front-end system an ENOB of 12 bits. In a higher-speed application it may top out at 10 bits, but there is no reason to do that in a BMS, where you are only looking for very slow trends with actual voltage accuracy as a priority.
Doesn't oversampling need to be done at a rate higher than the Nyquist rate, or twice the bandwidth? At 10 Hz bandwidth, can you make 1000 ADC conversions in 50 ms? No way will you be able to do this. The LTC2452 is an ADC I'm currently looking at; it takes up to 23 ms to make one measurement. If you're dealing with 8 cells, then there's a huge amount of processor time dedicated to oversampling. It's so much easier to start with a higher-resolution ADC than to oversample to improve resolution.

If you have to oversample 100 times to improve resolution, then I think you are putting lipstick on a pig. Oversampling does nothing for offset errors, linearity errors, voltage-reference drift, and thermal drift, which are a huge portion of the +/- 2 LSB specification. At 2 LSB error, a 10-bit ADC has only 10 bits of resolution.
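For anyone who wants to check those numbers, a quick back-of-envelope in C (purely illustrative; the 23 ms figure is the worst-case conversion time quoted above):

```c
#include <stdio.h>

int main(void)
{
    double bw_hz       = 10.0;               /* assumed signal bandwidth */
    double nyquist_sps = 2.0 * bw_hz;        /* minimum sample rate: 20 S/s */
    double window_s    = 1.0 / nyquist_sps;  /* 50 ms per Nyquist interval */
    double rate_needed = 1000.0 / window_s;  /* 1000 conversions in that window */
    double ltc2452_s   = 0.023;              /* ~23 ms per LTC2452 conversion */

    printf("Rate needed for 1000x oversampling: %.0f S/s\n", rate_needed); /* 20000 */
    printf("LTC2452 max rate: about %.0f S/s\n", 1.0 / ltc2452_s);         /* ~43 */
    return 0;
}
```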
 
Doesn't oversampling need to be done at a rate higher than the Nyquist rate, or twice the bandwidth? At 10 Hz bandwidth, can you make 1000 ADC conversions in 50 ms? No way will you be able to do this. The LTC2452 is an ADC I'm currently looking at; it takes up to 23 ms to make one measurement. If you're dealing with 8 cells, then there's a huge amount of processor time dedicated to oversampling. It's so much easier to start with a higher-resolution ADC than to oversample to improve resolution.

If you have to oversample 100 times to improve resolution, then I think you are putting lipstick on a pig. Oversampling does nothing for offset errors, linearity errors, voltage-reference drift, and thermal drift, which are a huge portion of the +/- 2 LSB specification. At 2 LSB error, a 10-bit ADC has only 10 bits of resolution.

Many valid points....
A quick look at the LTC2452 reveals that it does a LOT of internal processing, so you don't really need to do much other than provide a stable Vref. The very slow conversion time is after internal oversampling and error correction; I think it even does temp compensation too. The design is also optimized for muxed input switching, so you can measure inputs with a shared analog and digital path. All inputs would essentially have the exact same errors, and since a BMS balancing circuit is all about nullifying the differences, all other errors become insignificant. Offset errors and temperature compensation can easily be done in software based on a simple LUT.
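A minimal sketch of the LUT idea in C, assuming made-up table values and coarse 10 °C temperature bins - the real table would come from calibration data:

```c
#include <stdint.h>

#define LUT_BINS 8   /* assumed bins covering -20 C to +60 C in 10 C steps */

/* Placeholder offsets (in ADC counts) that calibration would populate. */
static const int16_t offset_lut[LUT_BINS] = { 3, 2, 1, 0, 0, -1, -2, -3 };

int32_t correct_reading(int32_t raw_counts, int temp_c)
{
    int bin = (temp_c + 20) / 10;        /* map temperature to a table index */
    if (bin < 0)         bin = 0;        /* clamp below the table */
    if (bin >= LUT_BINS) bin = LUT_BINS - 1;
    return raw_counts - offset_lut[bin]; /* subtract the stored offset */
}
```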

For $2, you can get a 16-bit, 153 ksps ADC, the MCP3461, which is available with up to 4 differential channels or 8 single-ended channels. It's a modest ADC, but plenty of performance for an application like this. It has an internal Vref along with an external option, which means you can share an external reference among many ADCs and get 8 inputs on each one. All the muxing would be done inside the ADCs.

Most converters do internal oversampling, and many have other error-correction schemes for the usual sources of error. That keeps the application processor from having to deal with any of it: just program the ADC to do the heavy lifting and give you the final result. The balance of errors left to deal with are slow and reasonably easy to manage in software. In the BMS world, we are only talking about low frequencies and millivolts, so the challenge is modest.

My point on this thread is not to dig deep into ADC circuits and signal processing - but rather to point out that the performance I see on the Chargery is not very good, and I don't think the 12-bit converter is the glaring issue. My estimate is that there are numerous circuit design, layout, and firmware issues that make up the bulk of the errors and instability I see. If those same errors persist - no number of bits will help improve the final result.
 
For $2, you can get a 16-bit, 153 ksps ADC, the MCP3461, which is available with up to 4 differential channels or 8 single-ended channels. It's a modest ADC, but plenty of performance for an application like this.
Being in the electronics arena, ... and a little over my head to fully understand; I found PDF specs showing you are talking about a microchip. Just wondering: were you suggesting a DIYer could use such a chip to make a BMS more accurate, or maybe suggesting the manufacturer of the BMS could make their BMS more accurate with a $2 microchip ... ??? Question for my curious side :+)
 
Definitely more of a suggestion that a manufacturer has no excuse for fiddly results with modern components available. DIY, not so much - but at least end users have a sense of what they should expect in a commercially released product.
 
Doesn't oversampling need to be done at a rate higher than the Nyquist rate, or twice the bandwidth? At 10 Hz bandwidth, can you make 1000 ADC conversions in 50 ms? No way will you be able to do this.

You totally can, but it'll cost you money for a really nice ADC ^^


The LTC2452 is an ADC I'm currently looking at; it takes up to 23 ms to make one measurement.

Yep, it's a nice ADC because it's pretty accurate and stable, but it's slow; that's why it's not too expensive ;) When I selected it I knew I didn't need a super fast ADC, so anything with a few dozen S/s or more would be okay.


If you're dealing with 8 cells, then there's a huge amount of processor time dedicated to oversampling.

Well, the ADC takes a long time to do the conversion, but your MCU is totally free to do anything you want during that time. Then, when the ADC has finished, you just have to extract the 16 bits, which is super fast (for example, with a 1 MHz clock on the SPI comm it would take less than 20 µs). So you can have a timer for reading the ADC every 25 ms, for example; that would take less than 0.08 % of the CPU time with my 20 µs example ;)
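The arithmetic behind that, spelled out (illustrative only - the 20 µs figure already includes some assumed software overhead on top of the 16 SPI clocks):

```c
#include <stdio.h>

int main(void)
{
    double spi_hz   = 1.0e6;           /* assumed SPI clock */
    double bits     = 16.0;            /* one conversion result */
    double xfer_s   = bits / spi_hz;   /* ~16 us on the bus */
    double read_s   = 20e-6;           /* ~20 us including overhead */
    double period_s = 25e-3;           /* one read every 25 ms */

    printf("SPI transfer time: %.1f us\n", xfer_s * 1e6);
    printf("CPU share: %.3f %%\n", 100.0 * read_s / period_s);  /* 0.080 % */
    return 0;
}
```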


It's so much easier to start with a higher-resolution ADC than to oversample to improve resolution.

That's probably true, especially as finding a low-res ADC with a low tempco will be harder than using a higher-res ADC, which should already have a low tempco.


If you have to oversample 100 times to improve resolution, then I think you are putting lipstick on a pig. Oversampling does nothing for offset errors, linearity errors, voltage-reference drift, and thermal drift, which are a huge portion of the +/- 2 LSB specification.

Very true ;)


At 2 LSB error, a 10-bit ADC has only 10 bits of resolution.

I think you meant 8 bits :)


Many valid points....
A quick look at the LTC2452 reveals that it does a LOT of internal processing, so you don't really need to do much other than provide a stable Vref. The very slow conversion time is after internal oversampling and error correction; I think it even does temp compensation too. The design is also optimized for muxed input switching, so you can measure inputs with a shared analog and digital path. All inputs would essentially have the exact same errors, and since a BMS balancing circuit is all about nullifying the differences, all other errors become insignificant. Offset errors and temperature compensation can easily be done in software based on a simple LUT.

Yep, you basically summed up why I chose this ADC, but I think I'll do coefficient-based corrections instead of using LUTs (simpler and, most importantly, I only have 256 bytes of EEPROM, which will be pretty full with all the other settings and calibration data).
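For reference, a coefficient-based correction can be as small as one gain and one offset term per channel; a rough C sketch (made-up field names and fixed-point scaling, with values assumed to come from calibration):

```c
#include <stdint.h>

typedef struct {
    int32_t gain_ppm;   /* gain error in parts per million, from calibration */
    int32_t offset;     /* offset in ADC counts, from calibration */
} cal_t;

/* corrected = raw * (1 + gain_ppm / 1e6) - offset, in integer math */
int32_t apply_cal(int32_t raw_counts, const cal_t *cal)
{
    int64_t scaled = (int64_t)raw_counts * (1000000 + cal->gain_ppm);
    return (int32_t)(scaled / 1000000) - cal->offset;
}
```

Eight bytes per channel fits a small EEPROM far more easily than a full LUT.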


My point on this thread is not to dig deep into ADC circuits and signal processing

But that's always fun and interesting!


My estimate is that there are numerous circuit design, layout, and firmware issues that make up the bulk of the errors and instability I see. If those same errors persist - no number of bits will help improve the final result.

That would be my guess too.
 
My point on this thread is not to dig deep into ADC circuits and signal processing - but rather point out that the performance I see on the Chargery is not very good and I don't think the 12bit converter is the glaring issue. My estimate is that there are numerous circuit design, layout, and firmware issues that make up the bulk of the errors and instability I see. If those same errors persist - no number of bits will help improve the final result.
I still believe the 12-bit ADC is a major issue with Chargery's current measurement. Chargery is redesigning the BMS and now using a 16-bit ADC.

Take a look at a popular current-monitor IC: the INA226 uses a 16-bit ADC. The LSB step size is 2.5 µV. Using a 300 A, 75 mV shunt, the current granularity is 10 mA. That's a reasonable step size.

From my measurements (see chart), Chargery's current resolution appears to be 250 mA (with the 300 A, 75 mV shunt). The difference between 16 bits and 12 is a factor of 16. If we scale the 10 mA resolution (with the 16-bit ADC) to a 12-bit unit, then we should get 10 mA * 16 = 160 mA. So yeah, you're right: Chargery could have done a better job designing the current measurement. They could have gotten 160 mA.

Still, why settle for 160 mA (if correctly designed) when you could get 10 mA resolution?
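The scaling above, written out (illustrative C; same assumptions as this post - a 300 A / 75 mV shunt and the INA226's 2.5 µV shunt-voltage LSB):

```c
#include <stdio.h>

int main(void)
{
    double shunt_a  = 300.0, shunt_mv = 75.0;   /* 300 A / 75 mV shunt */
    double a_per_mv = shunt_a / shunt_mv;       /* 4 A per mV of shunt voltage */

    double lsb_16bit_mv = 2.5e-3;               /* 2.5 uV LSB, expressed in mV */
    double res_16bit_ma = lsb_16bit_mv * a_per_mv * 1e3;  /* 10 mA per step */
    double res_12bit_ma = res_16bit_ma * 16.0;  /* 4 fewer bits = 16x coarser */

    printf("16-bit resolution: %.0f mA\n", res_16bit_ma);
    printf("12-bit equivalent: %.0f mA\n", res_12bit_ma);
    return 0;
}
```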

[Attachment: chargeryresolution.jpg]
 
I still believe the 12-bit ADC is a major issue with Chargery's current measurement. Chargery is redesigning the BMS and now using a 16-bit ADC.

Take a look at a popular current-monitor IC: the INA226 uses a 16-bit ADC. The LSB step size is 2.5 µV. Using a 300 A, 75 mV shunt, the current granularity is 10 mA. That's a reasonable step size.

From my measurements (see chart), Chargery's current resolution appears to be 250 mA (with the 300 A, 75 mV shunt). The difference between 16 bits and 12 is a factor of 16. If we scale the 10 mA resolution (with the 16-bit ADC) to a 12-bit unit, then we should get 10 mA * 16 = 160 mA. So yeah, you're right: Chargery could have done a better job designing the current measurement. They could have gotten 160 mA.

Still, why settle for 160 mA (if correctly designed) when you could get 10 mA resolution?

[Attachment: chargeryresolution.jpg]

As a side note - I have been using the INA226 in my designs for years. It is an excellent device, especially when very closely coupled to the sense resistor. In a good design, the Kelvin connection to the sense resistor, along with any RC filters, is a few millimeters from the input pins of the ADC. This offers usable performance all the way down to the noise floor of the ADC. This is similar to how Victron implemented their Smart Shunt, which appears to have rather good performance.

Keep in mind, I am not saying that 12 bits is the same as 16 bits. What I am saying is that bits do not matter if an ADC is implemented poorly as a system. Think of a 500 hp turbo Porsche engine mounted to a golf cart with sugar in the gas tank - it's a dysfunctional contraption.
If you replace the engine with a 1200 hp W16 Bugatti, you still have a dysfunctional contraption.
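Going back to the INA226 side note for a second - for that same 300 A / 75 mV shunt, the calibration arithmetic works out roughly like this (per my reading of the datasheet's CAL equation; the 10 mA-per-bit current LSB is just a convenient rounding):

```c
#include <stdio.h>

int main(void)
{
    double r_shunt   = 0.075 / 300.0;      /* 75 mV / 300 A = 0.25 milliohm */
    double i_lsb_min = 300.0 / 32768.0;    /* max current / 2^15, ~9.2 mA */
    double i_lsb     = 0.01;               /* round up to a tidy 10 mA per bit */

    double cal = 0.00512 / (i_lsb * r_shunt);   /* INA226 CAL register value */
    printf("Minimum current LSB: %.4f A\n", i_lsb_min);
    printf("CAL register value:  %.0f\n", cal); /* 2048 */
    return 0;
}
```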
 
To better match my identity on YouTube - I changed my name from Rx8Pilot to Factory400.

Hope that is not too confusing......
 
Most productive 400 sq ft in the world?

Raise your sights, go for the universe!

 
To better match my identity on YouTube - I changed my name from Rx8Pilot to Factory400.

Hope that is not too confusing......
I think we can handle that. ;)

Welcome to the discussion. I can learn a lot from you guys. Electronics has passed me by over the last 20 years while I was out mountain biking 100 miles a week. Now that age has drastically limited my bike climbing abilities, I'm back fooling around in electronics.

So I got some LTC2450 on order but haven't picked a voltage reference yet. Any suggestions? (eBay or AliExpress?) A leaded part would be best as I'm hand-wiring everything; a SOIC package would be OK. Got some solder paste on order - never used it. Also have a SOIC-to-DIP circuit board adapter on order.
 
So I got some LTC2450 on order but haven't picked a voltage reference yet. Any suggestions? (eBay or AliExpress?) A leaded part would be best as I'm hand-wiring everything; a SOIC package would be OK. Got some solder paste on order - never used it. Also have a SOIC-to-DIP circuit board adapter on order.

What voltage and minimum specs do you want for things like initial accuracy, tempco, ...?

Also, the 2450 is SPI, the 2451 is I2C. Not sure if it's a typo, but you wanted an I2C one, so I'm just making sure...
 
I tend to get all of my parts from DigiKey and Mouser - at least for development and pilot manufacturing runs. eBay and AliExpress are littered with fake parts, used parts, poorly handled parts, etc. DigiKey and Mouser are very reliable and very well stocked. The amount of time you put into projects like this is not worth fiddling with a bad component (ask me how I know o_O)

There is an enormous selection of references on the market. For this application, I don't see much value in the high-end models with temperature compensation, ultra-low noise, etc. Even the low-ish cost parts have usable performance for test-bench learning experiments. If you build something that eventually needs more exotic specs - it is not hard to drop one in. Just keep in mind that (like ADCs), fancy parts usually require some fancy design and build considerations to actually get the fancy out of them.

I have been using this reference in a commercial product for about a year - nothing amazing, but they are low-cost and performant enough for this application. You can spend huge money on Vrefs if you want - but I tend to start small and slowly move up if the circuit needs better performance.
DigiKey Voltage Reference
 
I tend to get all of my parts from DigiKey and Mouser - at least for development and pilot manufacturing runs. eBay and AliExpress are littered with fake parts, used parts, poorly handled parts, etc. DigiKey and Mouser are very reliable and very well stocked. The amount of time you put into projects like this is not worth fiddling with a bad component (ask me how I know o_O)

I can't agree more...
 
What voltage and minimum specs do you want for things like initial accuracy, tempco, ...?

Also, the 2450 is SPI, the 2451 is I2C. Not sure if it's a typo, but you wanted an I2C one, so I'm just making sure...
I have the LTC2452 on order. I don't know the difference between the 2450 and 2452; they both use 3-wire SPI. The plan is to use one 2452 per cell (4 cells total). The unit gets power from the cell under test (voltage >2.5 V). I'll use an ADuM1201 for serial clock and MISO isolation, and a simple opto-isolator for chip select. The two SPI signals will be daisy-chained 4x; not sure whether that's a problem. Chip select gets a direct (isolated) signal from the controller. Does the LTC2452 Vcc require a regulated voltage, or is a direct cell connection adequate?

Reference voltage should be around 2 V, or use a higher voltage with a resistor divider; I don't think that should be an issue. Initial accuracy shouldn't be a problem as calibration will take care of that. Tempco? As good as it gets.
Edit: Ref voltage can't be greater than LVD; 2.5 V is the max.
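A hypothetical sketch of the per-cell read sequence being described - assert that cell's (isolated) chip select, clock 16 bits over the shared SPI lines, release it. The spi_transfer_byte() and cs_set() calls are placeholder HAL functions, not any specific vendor's API:

```c
#include <stdint.h>
#include <stdbool.h>

uint8_t spi_transfer_byte(uint8_t out);   /* placeholder: one full-duplex byte */
void    cs_set(int cell, bool asserted);  /* placeholder: per-cell isolated CS */

uint16_t read_cell_adc(int cell)
{
    cs_set(cell, true);                    /* select only this cell's ADC */
    uint16_t hi = spi_transfer_byte(0x00); /* clock out the 16-bit result */
    uint16_t lo = spi_transfer_byte(0x00);
    cs_set(cell, false);
    return (uint16_t)((hi << 8) | lo);     /* raw code; scaling/cal applied later */
}
```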

So far I've been getting all my parts from China and haven't had any issues. The parts are just for my own use; I'm not selling anything. I get a bit turned off by shipping costs - I paid more for shipping to get some inductors and chip capacitors from Mouser than the cost of the parts. I just had the option of purchasing a part (from the same web site) shipped from the US (where I live) or from China. US shipping was about $7 and China shipping was $1. WTF? It's not like I have an organized plan and make one large purchase; purchases come in dribbles.
 