diy solar

CTs vs. Toroids

Ordered some CTs off Amazon. Suspected I was overpaying versus the $0.43 ones on AliExpress, but maybe the Amazon ones were different and of better quality.
Nah.

Written on them is dl-ct-08cl5 and 0-120A 2000 turns. The measured capacitance was 0.255nF and the resistance ~150Ω.

If it were 30 AWG, 2000 turns should be ~12 Ω. 40 AWG is about 10x higher, but 40 AWG can only carry 13.7 mA. It's listed at 2000 turns & 120 amps max, which gives 60 mA on the secondary, so it should be at least 33 AWG.

Could there be a resistor built in? Ah, the datasheet says “secondary internal resistance 145 ±10 Ω”. So, with ~5Ω of winding resistance it might be 26 AWG.

My preliminary math went:

The transducer is rated for 0 to 120 amps with a built-in 145Ω resistor. V=IR, so V = (120/2000) x 150 = 9V open circuit.
The test wouldn't go over 20 amps (the circuit breaker is only 15A), so the max power the resistor would bear was I²R = (15/2000)² x 150 ≈ 8 mW, which is far less than a 1/4W resistor can handle.
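Redoing that arithmetic as a quick sanity check (a sketch assuming an ideal 2000:1 ratio and the ~150 Ω measured resistance):

```python
# Sanity-check the back-of-envelope CT math, assuming an ideal 2000:1
# ratio and the ~150 ohm measured internal resistance.
TURNS = 2000
R_INT = 150.0  # ohms, as measured on the leads

def secondary_current(i_primary):
    """Ideal secondary current for a given primary current."""
    return i_primary / TURNS

# Voltage across the internal resistance at the full 120 A rating:
v_max = secondary_current(120) * R_INT   # (120/2000) * 150 = 9.0 V

# Worst case on a 15 A breaker: power dissipated in that resistance
i_sec = secondary_current(15)            # 7.5 mA
p_int = i_sec ** 2 * R_INT               # ~8.4 mW, far under 1/4 W

print(v_max, i_sec, p_int)
```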

As I don't trust datasheets from countries with no consumer protections and my math is notoriously bad, I added a 5.1kΩ resistor to the circuit. I was also going to start with a low-wattage LED bulb.

Last time I did an experiment you all were much more interested in the setup than you were in the CT, so this time I photographed it showing all the bits and bobs:

1676737471994.png

The brown wire is the primary (an old extension cord that has no ground) with 120V@60Hz. You can see a Wago is used to split one of the two wires (yes, only 1 wire goes through the CT). The primary power consumption is measured via the WattMeter. The two white alligator clips connect the CT leads across the 5.1kΩ resistor. The red leads connect the volt meter across the resistor.

1676739706198.png
1676738168142.png

The results also appear less linear than I expected for something rated 0-120A. I doubt that's insurmountable; I'd expect the same manufacturing lot to have a similar profile, so you could match against it.

On the upside, 0 to 7 amps gave a range of 0.4 to 14.3V, so a voltage divider could reduce that to 0.04 to 1.4V, and chips like the ESP32 (which don't have the best ADCs out there) could probably get measurements accurate to 6 watts.

My thinking is that on 15 amp circuits you're more interested in accuracy for low-watt items, whereas circuits over 15 amps probably only have one big thing on them, so you can get high precision by tuning for the actual expected range. For example, a hot water tank on a 30 amp circuit might have two elements and pull either 12 or 27 amps. So you can tune the low end of the voltage range to 10 amps and the high end to 30. Below 10 amps the voltage would be too low for the ADC, but you wouldn't care, as the circuit only draws less than 10A when the tank is off.
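As an illustration of the divider idea, here is the ideal (unsaturated) model with a hypothetical 10:1 divider; the divider ratio is made up, and the real output goes nonlinear near saturation:

```python
# Ideal (unsaturated) CT output into a burden resistor, plus a
# hypothetical 10:1 divider to bring it into an ESP32-friendly range.
TURNS = 2000
R_BURDEN = 5100.0  # the 5.1k resistor used in this test

def ct_voltage(i_primary, r_burden=R_BURDEN):
    """Ideal output voltage across the burden resistor."""
    return (i_primary / TURNS) * r_burden

DIVIDER = 0.1  # assumed 10:1 resistive divider
for amps in (0.5, 3.5, 7.0):
    print(amps, "A ->", ct_voltage(amps) * DIVIDER, "V at the ADC")
```

Note the ideal model predicts ~17.9 V at 7 A, versus the 14.3 V measured, consistent with the CT running into saturation.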

Update: $1 on AliExpress with the same part number. +$7 shipping so perhaps I didn't do so bad ; -)
 
Last edited:
From your test it doesn't seem to have any built-in resistor. And that's also probably why it's not linear (the resistor value is too high).

And 2.55 F? unless there's a supercap somewhere I guess it's a typo ^^ also, it's the capacitance between what and what?
 
And that's also probably why it's not linear (the resistor value is too high).
Good thought!

And 2.55 F? unless there's a supercap somewhere I guess it's a typo ^^ also, it's the capacitance between what and what?
Oh wow... sure is! Sorry about that... .255 nF on the leads of the CT as measured by the meter without any primary load.
 
The data sheet is quite specific, and everything you need to know is right there.

First thing, the suggested current measurement range for full rated accuracy is stated as 20 amps rms.
The ratio is 2,000:1 so at 20 amps rms, the secondary current will be 10mA.
The maximum value of the burden resistor is specified as 100 ohms.

The maximum output voltage you can get without falling below the rated accuracy will be 10mA x 100 ohms = 1.0v rms.
Saturation voltage, where it falls flat on its face is at 4.0 volts output.

You can push it to 120 amps at reduced accuracy, but you must reduce the load resistor to 20 ohms to do that; you would then get 60mA and 1.2v output, where it again starts to go non-linear much above 1.0 volts.
Internal power dissipation at 120 amps and 20 ohms would be .06A x .06A x 145 ohms = 522 milliwatts.
It's going to run rather hot like that, and that pretty much sets the maximum at 120 amps and 20 ohms.

So you can set any measurement range you need up to 120 amps, as long as the load resistor is chosen to stay below 1.0 volt output.
Best guaranteed accuracy will be at 20 amps or less though.

The secondary 145 ohm internal resistance is not really relevant to anything, except for estimating power dissipation and internal temperature rise.
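Those limits can be rolled into a couple of helper functions (a sketch assuming the datasheet figures quoted above: 2000:1 ratio, 145 Ω internal, 1.0 V rms accuracy limit):

```python
# Burden-resistor sizing from the datasheet figures quoted above.
TURNS = 2000
R_INT = 145.0
V_ACCURATE = 1.0  # rms, the accuracy limit

def max_burden(i_primary_max, v_limit=V_ACCURATE):
    """Largest burden resistor keeping output at or below v_limit."""
    return v_limit / (i_primary_max / TURNS)

def dissipation(i_primary, r_burden):
    """Total power in the internal plus burden resistance (in series)."""
    i_sec = i_primary / TURNS
    return i_sec ** 2 * (R_INT + r_burden)

print(max_burden(20))        # 100 ohms -> 1.0 V at 20 A
print(dissipation(120, 20))  # ~0.59 W total, ~0.52 W of it internal
```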

Running it up to 17v output with a 5.1K resistor is just nuts!
Try again with a lower value load resistor, and do not go over 1.0v rms output.

As the data sheet says, linearity and accuracy will be 0.1%. If it's worse than that at or below 1.0 volt output, your load resistor may not be dead accurate, quite likely if it's a 1% tolerance resistor.
 
...The secondary 145 ohm internal resistance is not really relevant to anything, except for estimating power dissipation and internal temperature rise.
Thanks! The internal resistance numbers still don't really make a whole lot of sense to me.

I did see the part about 4V saturation, so I shouldn't have been surprised really; I definitely didn't understand what the "0-80 (RL<100Ω)" meant.
Not sure what the "Quarantine" is about, 3 kV?

The big resistor was to limit current on the first pass, since I didn't know what I was doing and didn't trust anything. Although, I have to admit I am curious as to how well it does as you approach saturation.

Pretty sure I can dig up a 100Ω resistor, if I have time I'll try it this afternoon (but now it's time to go out on the water ; -).
 
Only a guess, but "Quarantine" to me suggests "isolation" between the primary and secondary?
Someone probably looked up isolation in a Chinese/English dictionary and quarantine popped up.
A bit odd to our Western eyes, but the Chinese dude would not know that.

The secondary winding puts out a current, and that current flows through both the internal resistance and the external load resistor which are effectively in series.
When the rated 10mA is flowing, there will be 1.45v dropped across the internal 145 ohm resistance, and 1.0 volt across our 100 ohm load.
Try to imagine that the secondary winding has to produce 2.45v total developed EMF across its 2,000 turns to do that.

Its that 2.45v and 2,000 turns that determines the flux swing in the core to do the job at 20 amps.
That flux swing is linear enough to produce the quoted better than 0.1% linearity figure.
It will be a quite small flux swing.

Now if we go for broke and run the beast at 120 amps, we develop 60mA through the secondary.
That will be 8.7 volts drop across the internal 145 ohms, and 1.2v drop across the external 20 ohm load resistor.

Total EMF developed across the 2,000 turns needs to be 8.7 + 1.2 = 9.9 volts.
That requires a much greater flux swing in the core than at 20 amps, and it will still work quite well.
It will almost certainly no longer meet the tight 0.1% linearity spec, but should still be perfectly usable.
Internal heating is what very likely sets the 120 amp rated maximum for that CT.
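The EMF reasoning above as a one-liner (same assumed figures: 2000:1 ratio and 145 Ω internal resistance):

```python
TURNS, R_INT = 2000, 145.0

def core_emf(i_primary, r_burden):
    """Total EMF the 2000-turn secondary must develop: the secondary
    current flows through the internal and burden resistances in series."""
    return (i_primary / TURNS) * (R_INT + r_burden)

print(core_emf(20, 100))  # ~2.45 V -> small flux swing, best linearity
print(core_emf(120, 20))  # ~9.9 V  -> much larger flux swing
```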
 

header.jpg
Testing commences...


1676836498083.png1676836503493.png
The "0-80 (RL<100Ω)" spec lines up with the saturation voltage: with a 100Ω burden resistor, 80 amps would hit the 4V saturation, and 20 amps on the primary generates 1 volt on the secondary. 60 amps would still be under saturation at 3V, but the curve might not be linear (although in the prior test it was 3x saturation and still fairly linear, so perhaps saturation is at a different frequency?). If it's consistently repeatable, it might be perfectly acceptable to run closer to saturation. (If you're wondering why 99Ω instead of 100Ω: the bands indicated it was 100Ω, but I used the measured value.)

The voltages were quite slow to steady, on the order of seconds. Not sure how much of that is the CT capacitance and how much is the meter, but guessing a bit of both. As you can see, at 100Ω the accuracies weren't the best.

An 80 amp range is pretty big, so to try to increase the accuracy I found a 510Ω resistor (which, no surprise, when measured wasn't 510Ω).
1676836509976.png1676836514876.png

So, what did I think I learned? If any of these takeaways are wrong please let me know!
  1. CTs have saturation voltages (I believe that's the magnetics), so the closer you get to it the more inaccurate the readings probably are. But the prior results are pretty interesting, so that might be a datasheet error or just a coincidence.
  2. CTs have capacitance. Again, it might be the meter; but, if you need something that measures very rapid changes (e.g., fingerprinting like the Sense energy monitor), you probably want less capacitance (e.g., fewer windings or some other trick).
  3. A high burden resistor (e.g., 5.1k) can get very good resolution (e.g., 0.5W), but has a narrow range of the primary current that it can operate accurately over.
  4. For this particular CT on a 15 amp circuit, a 330Ω burden resistor would give a 0 to 2.5V range, suitable for low voltage ADCs, with an accuracy around 8 watts. A 2.5V range on a 30 amp circuit would need about a 160Ω resistor and have an accuracy of about 15 watts.
  5. Knowing a priori the actual primary range allows the CT to be tuned for that range, greatly increasing sensitivity. For example, this CT on a 30 amp circuit could be tuned for 20-30 amps with a 250Ω resistor, using a voltage divider to reduce the 3.75V @ 30 amps down to 2.5V for the ADC. Assuming the curve approaching saturation is repeatably consistent, you could get down to 10 watt accuracy.
  6. If you're trying to measure extremely low currents on the primary, you can loop the primary through the CT. For example, a second loop on this CT essentially changes it from 2000:1 to 4000:1.
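Takeaways 4 and 5 amount to solving for the burden resistor that puts the top of your expected current range at the top of the ADC range (a sketch using the 2000:1 ratio; it ignores saturation):

```python
TURNS = 2000

def burden_for_range(i_primary_max, v_adc_max):
    """Burden resistor putting i_primary_max at the top of the ADC range."""
    return v_adc_max / (i_primary_max / TURNS)

print(burden_for_range(15, 2.5))  # ~333 ohms (the ~330 in takeaway 4)
print(burden_for_range(30, 2.5))  # ~167 ohms (the ~160 in takeaway 4)
```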

Generic devices like the IotaWatt must sacrifice quite a bit of accuracy, as their lowest CTs are 0 to 50 amps. I wonder how the device distinguishes between the different transducers (e.g., 50, 100, 200, and 400 amps)? Possibly a software setting, with the burden resistor in the CT assembly so the voltage is always constrained to an acceptable range. Quite possibly their CTs have much higher saturations, allowing for better resolution.

I know Warpspeed said the maximum burden resistor value was 100Ω based on the datasheet, but I don't think the CT cares as long as the voltage is well below the saturation.

The CT "quarantine" is 3 KV, but I've no idea what that is about. Possibly they mean the insulation's dielectric strength won't hold over 3KV?
Update: Like Warpspeed in his post above, I believe this to be a translation error.
 
The suggested operating conditions are 20 amps, 100 ohms, and 1 volt output to reach the specified accuracy.

You can run it outside those parameters with higher current or higher output voltage, but pay for that with much worse non linearity.
That might not matter if it's just to detect some huge overload condition, for example.

If you need to measure 50 amps, or more, use a current transformer that has a larger core and thicker wire specified for that current.

And as said above, if you wish to increase sensitivity, loop several turns through the hole, you can do that without losing any accuracy.

Wide range measuring instruments will almost certainly use an amplifier between the CT and any analog to digital converter, and the gain might also be switched for autoranging.
 
Wide range measuring instruments will almost certainly use an amplifier between the CT and any analog to digital converter, and the gain might also be switched for autoranging.
I was thinking about that too, I see how it can be done if the saturation is high enough.

But, after looking at some spec sheets on expensive CTs most of them seem to want to be under 1V for the range.

Guessing it probably makes sense to get a precision ADC rather than a CT with a higher saturation. Pity, a lot of microcontrollers have spare ADC pins, but it's doubtful they're accurate under a mV.
 
It's pretty rare for a CT to be rated for more than one volt output; often it's only 100mV to drive an analog meter movement through a bridge rectifier.

You can always make your own CT for any voltage output you want.
It will not be thumbnail size though.

A small transformer core of about 5VA might have a couple of thousand turns on the 230v primary. Strip off the existing secondary and replace it with one turn which becomes the new primary of your CT.
A bit of experimentation should get you something that will produce several volts easily across a suitable load resistor.
Just be mindful of the current rating of that primary. 5VA at 230v = only 22mA.
If it has 2,000 turns, a 44 amp CT primary would be feasible, based on what the secondary can cope with.

It may actually have far more or far less than 2,000 turns depending on mains voltage and power rating of the particular transformer.
If you count the turns when you remove the secondary, that should give a pretty good idea of the turns per volt on the primary.
It's a bit clunky, but it definitely works.

If the original transformer has 230 v across the primary without saturating, it sure is not going to saturate with say 5v across the same winding used as a CT.
 
The voltages were quite slow to steady, on the order of seconds. Not sure how much of that is the CT capacitance and how much is the meter, but guessing a bit of both. As you can see, at 100Ω the accuracies weren't the best.

Did you do a point-slope linear calibration?
An offset will go to infinite % error at zero input.
Tolerance requirements should be % of reading +/- some number of LSB, or similar.

I was thinking about that too, I see how it can be done if the saturation is high enough.

An analog circuit could monitor voltage and switch burden resistors in and out with hysteresis.

Guessing it probably makes sense to get a precision ADC rather than a CT with a higher saturation. Pity, a lot of microcontrollers have spare ADC pins, but it's doubtful they're accurate under a mV.

Or an op-amp.
At work we're playing with circuits that measure uA ripple in A circuits, and amplify it to where we can capture with DC coupled scope, having noise floor of maybe 0.1 mV. When I used a current probe on that scope, it had noise floor of 1 mA (corresponding to 1 mV into scope.)
 
CTs have saturation voltages (I believe that's the magnetics), so the closer you get to it the more inaccurate the readings probably are. But the prior results are pretty interesting, so that might be a datasheet error or just a coincidence.

The voltage isn't the source of the problem, it's the core which saturates if the magnetic field is too high.


CTs have capacitance. Again, it might be the meter; but, if you need something that measures very rapid changes (e.g., fingerprinting like the Sense energy monitor), you probably want less capacitance (e.g., fewer windings or some other trick).

Given it's seconds here I'd say it comes from the meter (did you have some averaging function enabled?). The CT should be good up to at least the kHz region giving it a less than 1 ms response time, far more than enough for typical applications.


I know Warpspeed said the maximum burden resistor value was 100Ω based on the datasheet, but I don't think the CT cares as long as the voltage is well below the saturation.

Yes, within reasonable limits of course.


The CT "quarantine" is 3 KV, but I've no idea what that is about. Possibly they mean the insulation's dielectric strength won't hold over 3KV?
Update: Like Warpspeed in his post above, I believe this to be a translation error.

Yes, it's very likely it is the guaranteed isolation voltage between the primary (assuming a bare wire) and secondary.
 
The voltage isn't the source of the problem, it's the core which saturates if the magnetic field is too high.


What actually happens when the core saturates?

From Wikipedia:
Seen in some magnetic materials, saturation is the state reached when an increase in applied external magnetic field H cannot increase the magnetization of the material further, so the total magnetic flux density B more or less levels off.

That description matches pretty well with the results in #62. Different core materials have different falloffs. From what I've read, silicon steel is the preferred core material for CTs.

But there are two big takeaways for me from the curves. The first is that a CT should be able to be operated pretty close to its saturation without sacrificing much accuracy.

The second is that readings beyond it aren't total gibberish. It is still fairly linear past the saturation point, although the flatter curve means a lot less sensitivity.
1676895083328.png


Given it's seconds here I'd say it comes from the meter (did you have some averaging function enabled?). The CT should be good up to at least the kHz region giving it a less than 1 ms response time, far more than enough for typical applications.
Thanks! I scratched that thought out in my earlier conclusion, I may have been influenced by past calculations for capacitor charging times.
What effects would capacitance have if any then? Other than as filters, capacitors in AC circuits are always a little mind-bending for me.

Hmm, is that what the datasheet is trying to say by phase difference, which they specify as "<5 (20A, 0Ω) Branch"? I suspect another translation error, but possibly they mean the delay is <5 degrees of phase? Which would be 0.23 ms?
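For reference, 5 degrees of phase at 60 Hz works out to roughly that figure:

```python
F_MAINS = 60.0                   # Hz
period = 1.0 / F_MAINS           # ~16.67 ms per cycle
delay = (5.0 / 360.0) * period   # 5 degrees of phase as a time delay
print(delay * 1e3, "ms")         # ~0.23 ms
```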

Did you do a point-slope linear calibration?
No calibration, just raw data from which it could be calculated. The "err" column is (theoretical-actual)/theoretical.

Or an op-amp.
That's a really good idea!

At work we're playing with circuits that measure uA ripple in A circuits, and amplify it to where we can capture with DC coupled scope, having noise floor of maybe 0.1 mV. When I used a current probe on that scope, it had noise floor of 1 mA (corresponding to 1 mV into scope.)
I should probably start a thread on minimizing noise.

The thing that immediately hits me when using an op-amp or precision ADC is does it really have any value to do so? In my test case, the length of the wires, using a wago, proximity of the neutral, proximity to a PC... any of that can start to alter voltage measurements in the sub mV range.

The IotaWatt page doesn't make any claims as to the accuracy. This person reports their IotaWatt as having significant errors... but that's so far off there's probably something significantly wrong going on. Emporia's tech sheet doesn't list it either, but they say they're revenue grade, which per ANSI C12 is +/- 2% accuracy. Hopefully that's +/-2% at any level of current; hate to think it would be +/-2% of the meter's maximum rated throughput. ; -)

But anyway, since the CT by its nature would most likely be distant from the measuring apparatus, probably just as important is the circuitry to eliminate unwanted AC frequencies picked up. Not sure how you'd do that as devices like the IotaWatt would be deployed inside the breaker box where all the wires are radiating at the same frequency.

The accuracy of the CTs I tested was listed as 0.1, but with no units, so it's meaningless. At very low and high currents it was well off from the predicted values. But, and this isn't confirmed, it looked like it had fairly consistent behavior, i.e. possibly good precision, so accuracy might be very good with calibration and curve fitting.
 
Regarding noise, EMI, etc... I'd recommend to amplify as close to the CT as possible (an op-amp is enough) so the SNR is higher on the rest of the circuit.

Regarding accuracy the general rule is that a percentage means it's a fraction of the measured value, not a fraction of the meter maximum. Other units (V, A, digits, ...) usually means it's a constant value regardless of the measured value.
 
Regarding noise, EMI, etc... I'd recommend to amplify as close to the CT as possible (an op-amp is enough) so the SNR is higher on the rest of the circuit.
Given the nature of the beast (measuring current through external circuits) I'm not sure how possible that is.
Although, shielded wires might be a good requirement for a CT.
 
Given the nature of the beast (measuring current through external circuits) I'm not sure how possible that is.
I'm not sure what you think is not possible?

Although, shielded wires might be a good requirement for a CT.
Yep, shielded twisted pair is probably the best here without going crazy on complexity and cost ;)
 
I'm not sure what you think is not possible?
For example, on the IotaWatt the CT cables are 5'. They have to be long enough to string from whatever group of wires you want to measure back to the device that reads the voltage and reports it.
 

What actually happens when the core saturates?

If you put a coil of wire around ferrous material, then apply DC voltage, for a while magnetic domains get messed with. During that time, there is an impedance which slows rise of current. Once the metal has been magnetized as strongly as it is able, impedance drops and current shoots the rest of the way up to what DC resistance limits it to. Remove applied voltage, and you're left with a permanent magnet.

Reverse polarity and domains are changed back to demagnetized, then get remagnetized with opposite polarity. Toggle polarity and you've got AC, with magnetization going back and forth, tracing out a hysteresis curve. Low enough voltage and period, it draws a loop without saturating. Higher voltage or longer period, it becomes fully magnetized in each polarity (saturated) after which current shoots up. Shape of curve varies with material and treatment, good for different applications.


Compare peaky current at nominal voltage, vs. more rounded at reduced voltage:



Thanks! I scratched that thought out in my earlier conclusion, I may have been influenced by past calculations for capacitor charging times.
What effects would capacitance have if any then? Other than as filters, capacitors in AC circuits are always a little mind-bending for me.

Phase shift. And frequency-dependent attenuation.

Hmm, is that what the datasheet is trying to say by phase difference, which they specify as "<5 (20A, 0Ω) Branch"? I suspect another translation error, but possibly they mean the delay is <5 degrees of phase? Which would be 0.23 ms?

Phase would matter if calculating real and reactive power from V and I.
Given a repeatable phase shift, you could pick out data from a different time in the sample array to correct for it. That works for the fundamental; harmonics shift differently in degrees, so you have to figure out whether they also shift in time.

No calibration, just raw data from which it could be calculated. The "err" column is (theoretical-actual)/theoretical.

So do a sweep, store a point-slope calibration. Then see how accurate and repeatable the measurements are. It should really flatten the steep curve in error at low values, but the error band (of both data and calibration) will be larger there.
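A minimal two-point (point-slope) calibration sketch; the readings here are made-up numbers for illustration:

```python
def point_slope_cal(raw_lo, true_lo, raw_hi, true_hi):
    """Return a function mapping raw meter readings to corrected values
    using a two-point linear (point-slope) calibration."""
    slope = (true_hi - true_lo) / (raw_hi - raw_lo)
    return lambda raw: true_lo + slope * (raw - raw_lo)

# Hypothetical: meter reads 0.38 V at a true 0.40 V, 14.1 V at 14.3 V.
cal = point_slope_cal(0.38, 0.40, 14.1, 14.3)
print(cal(0.38), cal(14.1))  # recovers ~0.40 and ~14.3
```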

The thing that immediately hits me when using an op-amp or precision ADC is does it really have any value to do so? In my test case, the length of the wires, using a wago, proximity of the neutral, proximity to a PC... any of that can start to alter voltage measurements in the sub mV range.

I fed one CT and several thermocouples into my DAS. Thermocouple signals got clobbered; they are very low.

But anyway, since the CT by its nature would most likely be distant from the measuring apparatus, probably just as important is the circuitry to eliminate unwanted AC frequencies picked up. Not sure how you'd do that as devices like the IotaWatt would be deployed inside the breaker box where all the wires are radiating at the same frequency.

Radiated 60 Hz pickup might be fairly small. Or not, after converting the large CT current signal to a small voltage. Twist and shield helps. Also balanced impedance, i.e. terminate both leads into resistors, not one into ground and the other into a resistor.
But high frequencies can be filtered out (Nyquist) while passing 60 Hz without attenuation. To deal with lower harmonics of 60 Hz, you could sample at a sufficiently high rate, but that becomes computationally intensive. Better to ship a high-level signal around and keep sampling simple. Or, put the signal into an RMS detector. I used a fancy (Analog Devices?) IC in an RF design; that was good for DC to 500 MHz. Some DMMs use a thermal sensor, measuring the temperature rise of a resistor, which responds to RMS. That would not be fooled by a brief harmonic peak (which could otherwise violate Nyquist), but the contribution from the square of a small number is diminished.
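The RMS idea in code form (true RMS of a sampled waveform; a sketch, not any particular detector IC):

```python
import math

def rms(samples):
    """True RMS of a sampled waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle of a 60 Hz sine, amplitude 1.0, sampled 1000 times:
n = 1000
wave = [math.sin(2 * math.pi * k / n) for k in range(n)]
print(rms(wave))  # ~0.7071, i.e. 1/sqrt(2)
```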

The accuracy of the CTs I tested was listed as 0.1, but no units so it's meaningless.

10%?

Regarding noise, EMI, etc... I'd recommend to amplify as close to the CT as possible (an op-amp is enough) so the SNR is higher on the rest of the circuit.

Driven current should be fairly low impedance, more immune. Following burden resistor, converted to small voltage more sensitive.
Run wires from current transformer, put burden resistor fairly close to amp or ADC?

Regarding accuracy the general rule is that a percentage means it's a fraction of the measured value, not a fraction of the meter maximum. Other units (V, A, digits, ...) usually means it's a constant value regardless of the measured value.

My first thought was that "0.1" meant "10%"

Many sensors also have an offset, but I don't think there would be one for a CT.
 
Driven current should be fairly low impedance, more immune. Following burden resistor, converted to small voltage more sensitive.
Run wires from current transformer, put burden resistor fairly close to amp or ADC?

That's a good idea; you're basically turning a voltage sense into a current loop, which should be far more immune (hence why industrial stuff uses 4-20 mA signaling), without needing an op-amp right at the CT.


My first thought was that "0.1" meant "10%"

Many sensors also have an offset, but don't think there would be for CT.

Possible, but 10% would be very low accuracy...

Yep, I was talking in general, not for CTs specifically.
 