The lowest resistance most multimeters will show is ~0.2 ohms, unless they have a zeroing function, in which case you could get it to read 0.0 ohms. But in either case you are still measuring in tenths of ohms, not hundredths or thousandths.
It's an issue of scale. Batteries HAVE resistance, but it is TINY. If you think about a car battery that might be rated at 1000 cranking amps, Ohm's law says 12.6v / 1000a = 0.0126 ohms. In other words, even if you hooked a battery to something with ZERO resistance (not possible), the only way you'd get 1000 amps to flow is if the battery itself had no more than ~0.0126 ohms of internal resistance!
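Here's that back-of-the-envelope math as a quick Python sketch (the 12.6v and 1000a figures are just the illustrative ratings from above):

# Ohm's law: the most internal resistance a 12.6 V battery could have
# while still pushing 1000 A into a dead short (R = V / I)
voltage = 12.6          # volts, roughly a charged 12 V battery
cranking_amps = 1000.0  # amps, the rated cranking current
max_internal_r = voltage / cranking_amps
print(f"{max_internal_r:.4f} ohms")  # -> 0.0126 ohms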
So in any battery circuit, the VAST MAJORITY of the total circuit resistance is simply the wiring/conductors/connectors. That's why seemingly tiny changes in circuit resistance can result in huge changes in charging current, AND why a regular multimeter set to measure ohms is not very helpful in assessing what is good or bad. Making a tenth-of-an-ohm difference on a circuit with a 100 ohm 'load' would do practically nothing. Making a tenth-of-an-ohm difference on a circuit where the 'load' (the battery in this case) is in the hundredths or thousandths of an ohm gives a huge result, because that tenth of an ohm is a huge proportion of the total circuit resistance.
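A minimal sketch of that scale effect, assuming a simple 12.6v source and pure Ohm's-law loads (this ignores the battery's own counter-voltage, which matters in the real charging case below, but it shows the proportion argument):

# Removing 0.1 ohm from a 100 ohm circuit vs. a 0.04 ohm circuit
voltage = 12.6  # volts, assumed source for illustration
for r_load in (100.0, 0.04):
    i_before = voltage / (r_load + 0.1)  # with an extra 0.1 ohm in the loop
    i_after = voltage / r_load           # after removing that 0.1 ohm
    change = (i_after - i_before) / i_before * 100
    print(f"{r_load} ohm circuit: current rises {change:.1f}%")
# -> 100.0 ohm circuit: current rises 0.1%
# -> 0.04 ohm circuit: current rises 250.0%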
To get a good assessment of a resistance that tiny with a 'regular' multimeter, you need to make current flow in the circuit, then measure the voltage drop (aka difference) between the two ends of the circuit, and do Ohm's law with those voltage and current numbers.
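As a sketch, that voltage-drop method boils down to a little helper like this (the function name and parameters are just mine for illustration):

def circuit_resistance(v_charger_end, v_battery_end, current_amps):
    """Estimate total circuit resistance from the voltage drop.

    v_charger_end: volts measured at the charger end of the circuit
    v_battery_end: volts measured at the battery end
    current_amps:  charge current flowing at that same moment
    """
    drop = v_charger_end - v_battery_end
    return drop / current_amps  # Ohm's law: R = V / I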
So just taking some numbers you posted and using them for the sake of example: 13.7 - 13.35 = 0.35v of 'drop'. 0.35v / the 9 amps you have seen (if these numbers were simultaneous or taken under identical conditions) would equal ~0.039 ohms. If you cut your voltage drop in half (aka cut your resistance in half), that would be 0.039 ohms / 2 = ~0.019 ohms. That would be a ~50% drop in the TOTAL resistance of the circuit, with a commensurate increase in charge current as the result. But also, if we reduce that 0.35v drop by half, that means the charger is effectively 'pushing' ~0.175v harder! So you would get more of an increase than JUST the reduction in circuit resistance would have you think, because cutting the drop also increases the effective voltage differential you are working with to push current through the battery. So, a huge increase in charge current from taking ~0.02 ohms out of a circuit.
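The same arithmetic as a self-contained snippet, using the 13.7v / 13.35v / 9a figures quoted above:

# Voltage-drop math from the numbers quoted above
drop = 13.7 - 13.35       # 0.35 V lost across the wiring
r_circuit = drop / 9.0    # ~0.039 ohms total circuit resistance
r_halved = r_circuit / 2  # ~0.019 ohms if the drop were cut in half
extra_push = drop / 2     # ~0.175 V more differential across the battery
print(round(r_circuit, 3), round(r_halved, 3), round(extra_push, 3))
# -> 0.039 0.019 0.175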
I would not say this is 'useful' math (too many approximations), except to build a mental framework for why things happen the way they do. You can improve your wiring, see no difference in ohms on a multimeter, yet get huge results. It's just that the scale of resistance determining these results is much smaller than the layperson is used to thinking about, or than a meter set to ohms can effectively measure.