> What did that model cost you?

Around $200 including shipping (the FedEx shipping was very fast). They are under $150 without shipping.
> So knowing what you know about these cheapo 150W/180W testers and the components they use, would you have a recommendation for 'safe' current levels to run when testing an 8S string? Or said another way, the temperature readout on these boards seems pretty reliable; is there a temperature you would recommend staying under?

Ok, testing on the new and improved model: the one with Bluetooth, a color display, remote voltage sense, and two included temperature probes. It is very similar to the $30 models, just $20 to $30 more. At 15 amps, the temperature sensor under the MOSFET reads low 30s Celsius, while the diode right next to it (no heatsink, with its heat-dissipation side soldered to the board) measures low 70s Celsius. That same diode gets over 100 Celsius when run at 20 amps (with a Raspberry Pi heatsink attached, just not on the correct side). That was measured on the PCB with both the temperature sensor they ship with it (taped to the spot) and a thermal temperature gun (not the most accurate; it's only $10).
Oddly enough, I asked the manufacturer some questions. They said that I might have a defective model and are sending me two new models free. They made it sound like an older model had a problem but the new model doesn't; however, I have the newest model. First time I have had a Chinese company offer free stuff. All it really needs is a voltage sense reading.
Yeah, I think you’ve nailed why these units fail at 20A or even 15A: no cooling for the diodes, so they overheat.
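For a rough sense of why that diode runs so hot, here is a back-of-the-envelope dissipation sketch. The 0.5V forward drop is an assumed typical Schottky value; the actual part in these testers is not identified in this thread.

```python
# Diode dissipation estimate, P = Vf * I. The 0.5V forward drop is an
# assumed typical Schottky figure, not a measured value from these boards.
VF = 0.5  # volts, assumed forward drop

for amps in (10, 15, 20):
    print(f"{amps}A -> ~{VF * amps:.1f}W dissipated in the diode")
```

With no heatsink, even several watts in a board-mounted package can plausibly push it past 100 Celsius, which lines up with the measurements above.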
I wouldn't run it above 15 amps if you have a lot of battery cells.
You might have to wait a couple of days, but I can test it and run it. I'll let you know. At 10 amps I think I measured about 50 degrees Celsius, but I will check.
I’ve been limiting my current to 10A, and if you want to be a real hero, it would be great to know what temps you see at 10A with your new fancy-dancy model with temperature probes.
I’m essentially using my 150W tester two ways:
- 1S @ 10A (~32W)
- 8S @ 5A (~128W)
No rush, really. Whenever convenient, since you're using your tester anyway.
> He didn’t say it clearly. It’s 10A at low voltage to calibrate current, and 30V at low/no current to calibrate voltage. The calibration for current and voltage don’t happen together; they are two separate calibration steps...

I have the same unit. I have been trying to calibrate it, and now it is stuck on the wrong cell voltage (0.15V lower than my voltmeter) and won't go above 5.64 amps, no matter how much I turn the two knobs. Also, with the knobs all the way counter-clockwise, as soon as I connect it to the cell it immediately jumps to the readings above (3.09V / 5.64A) and just stays there.
In one attempt at 'calibrating' it, with my power supply set at 30V and 0 amps, then moving to the 10-amp setting while keeping the voltage as low as possible, it seemed to register. I then pressed the tester's button while plugging in the power; the menu showed 30V, but when I hooked it up to the cell it displayed the discharge voltage as 30 volts at 5.64 amps. All this time the knobs did nothing. I must have confused it somewhere by not doing the procedure right. I started everything with the knobs fully 'off', and blew a couple of 12V 5- and 10-amp fuses along the way.
> I hope you're adjusting voltage and current of the supply with battery disconnected!

When you say 'supply', are you referring to the ‘Power Supply’ or the ‘capacity tester’? I need to calibrate my Capacity Tester.
- Supply terminals not connected to anything: adjust the voltage to the target (e.g., 3.65V × number of cells).
- Supply terminals shorted: adjust the current to the target (e.g., 10A).
- Once adjusted, switch off the supply, connect it to the battery (with a BMS that includes a disconnect) or cell, then turn it on.
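The voltage step above just scales with cell count; a minimal helper sketch, using the 3.65V-per-cell and 10A figures from the example (both are just the example values, not universal targets):

```python
# Power-supply pre-set targets for an nS string, battery disconnected.
# 3.65V/cell and 10A are the example values from above, not universal.
CELL_CHARGE_V = 3.65  # volts per cell, the example charge target

def supply_setpoints(series_cells: int, charge_amps: float = 10.0):
    """Return (volts, amps) to pre-set before connecting the battery."""
    return (series_cells * CELL_CHARGE_V, charge_amps)

v, a = supply_setpoints(8)  # the 8S case discussed above
print(f"Open-circuit: set {v:.2f}V; terminals shorted: set {a:.1f}A")
```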
It appears many supplies end up regulating the voltage as seen internally, while the battery sees a lower voltage. That slows the charging process, but it's best to just be patient. Good low-resistance connections and thick wires will minimize the voltage drop outside the power supply itself.
If anybody cared and was willing to spend the money, they would get a supply with separate sense terminals.
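The size of that external drop is just Ohm's law across the leads; here is a sketch with illustrative numbers (the ~13 mΩ/m figure is a typical handbook value for 16 AWG copper, and the wire length is made up, not measured from anyone's setup here):

```python
# Estimate the voltage the battery actually sees after the I*R drop in
# the leads. 13 mOhm/m is a typical handbook value for 16 AWG copper;
# the lengths are illustrative, not measurements from this thread.
R_PER_M_16AWG = 0.013  # ohms per meter, approx. 16 AWG copper

def battery_voltage(supply_v: float, amps: float, round_trip_m: float,
                    r_per_m: float = R_PER_M_16AWG) -> float:
    """Supply setpoint minus the I*R drop across the leads."""
    return supply_v - amps * round_trip_m * r_per_m

# 29.2V setpoint, 10A, 2m of wire total (1m out, 1m back):
print(f"{battery_voltage(29.2, 10, 2.0):.2f}V at the battery terminals")
```

A few tenths of a volt disappearing in the leads at 10A is easy to get, which is exactly what remote sense terminals are designed to correct.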
If your supply is acting flaky and doesn't behave even open-circuit or with a simple load like a resistor, it could be damaged. That's not surprising if it's cheap, or if it experienced overloads while trying to charge batteries.
You should only pay attention to the Ah readings on these testers and ignore the Wh readings: the Ah count only integrates current, while the Wh count also multiplies in the (miscalibrated) voltage.
Now it is saying the voltage is 3.17V while my meter says the cell is at 3.605V (I am testing another cell). So will the tester compute the running capacity as 3.17V times the current drawn, which would be wrong, since the true voltage is 3.605V?

Several hours later my capacity tester is showing 3.09V and the meter is showing 3.306V. Such a difference! Is the capacity being computed from this lower voltage or from the actual voltage?
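On the capacity question: the Ah figure only integrates current, so a low voltage reading skews the Wh number but leaves the Ah number intact. A small sketch using the 3.09V-shown vs 3.306V-actual readings above, with an assumed steady 5A load for illustration:

```python
# Ah integrates current alone; Wh integrates voltage * current. So a
# tester that reads voltage low reports Wh low but Ah correctly. The 5A
# load is an assumed illustration; the voltages are the readings quoted
# above (tester vs voltmeter).

def integrate(samples, dt_h):
    """samples: list of (volts, amps) taken every dt_h hours -> (Ah, Wh)."""
    ah = sum(a for _, a in samples) * dt_h
    wh = sum(v * a for v, a in samples) * dt_h
    return ah, wh

hour_shown = [(3.09, 5.0)] * 10   # tester's (low) voltage reading
hour_true = [(3.306, 5.0)] * 10   # voltmeter's reading

for label, run in [("shown", hour_shown), ("true", hour_true)]:
    ah, wh = integrate(run, 0.1)
    print(f"{label}: {ah:.2f}Ah, {wh:.2f}Wh")
```

Both runs report the same Ah; only the Wh figures diverge, which is why the Ah reading is the one to trust on a miscalibrated unit.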
Cheers