Hi,
I wanted to play with building a power meter so I ordered a nice shunt resistor.
The Fluke 179 unhelpfully said 0.1 ohm, and when I tried with the Fluke 8840A in 2-wire mode the reading was about 6 ohm out of calibration.
I looked at the specs for the usual sub-$1k bench multimeters (Agilent, Keithley, etc.), but surprisingly their resistance ranges often started at 100 ohm and up.
So, what's the best way to get accurate measurements of sub 1ohm shunts? Preferably <1% error.
Thanks,
Marcus
Use 4-wire measurement.
If you don't have a bench meter, use an accurate current source (say 1 A) and measure the voltage drop across your resistor.
A lot of high-end meters don't have specific low-resistance ranges, but they have enough accuracy and resolution to measure far lower. For example, the classic HP 34401A's 100 ohm range is accurate to 0.004 ohm.
As you noticed, DMMs don't really specialize in low resistance measurements. Part of the problem is that they simply don't source enough current: usually only 1 mA, which means the DMM only sees 1 uV per milliohm. So, as suggested, the easy thing to do is source the current yourself and then measure the voltage across the resistor with your meter. If you have two meters, you don't even need a decent current source; just measure the current and voltage simultaneously.
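Putting numbers on that: a quick sketch of the two-meter approach, and of why the ~1 mA ohms-mode test current is the limitation. All readings here are made-up example values.

```python
# Two-meter method: measure current and voltage simultaneously,
# then apply Ohm's law. Example readings are illustrative only.

def shunt_resistance(v_volts: float, i_amps: float) -> float:
    """Ohm's law: R = V / I."""
    return v_volts / i_amps

# Example: 1.000 A through the shunt, 10.2 mV across it.
r = shunt_resistance(10.2e-3, 1.000)
print(f"R = {r * 1e3:.2f} mOhm")            # 10.20 mOhm

# Why a DMM's ohms mode struggles: at a 1 mA test current, a
# 10 mOhm shunt drops only 10 uV -- down near the noise floor.
v_drop_dmm = 1e-3 * 10e-3                   # I * R
print(f"DMM-mode drop: {v_drop_dmm * 1e6:.0f} uV")
```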
They make milliohm and microohm meters. I think Robrenz has one. You can find some Chinese made ones for under $500.
Most ohm meters have at least milliohm resolution.
Method One
========
If you have a lab power supply that can generate a constant current, set it to supply 1.0 A with your resistor attached, then measure the voltage across the resistor (NOT at the supply). Ohm's law does the rest.
Method Two
========
Take a single 1.5 V D cell and wire a 20 A ammeter and your resistor in series, with a voltmeter across the resistor. Quickly connect the battery, note the voltage and current readings, then disconnect; again, Ohm's law does the rest. I have measured down to one milliohm using this method, but take care because things get very hot very quickly.
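A rough sketch of the numbers behind Method Two, to show why it gets hot: the current is set mostly by the cell's internal resistance. The ~0.1 ohm internal resistance is an assumed figure for a fresh D cell, not a measured one.

```python
# D-cell method: the cell's internal resistance dominates the loop,
# so the current is large. All component values here are assumptions.

def cell_current(emf, r_internal, r_shunt, r_leads=0.0):
    """Loop current from a simple series model of the circuit."""
    return emf / (r_internal + r_shunt + r_leads)

emf = 1.5          # V, nominal D cell
r_int = 0.1        # ohm, assumed internal resistance
r_shunt = 0.001    # ohm, 1 mOhm shunt under test

i = cell_current(emf, r_int, r_shunt)
v_shunt = i * r_shunt
p_shunt = i**2 * r_shunt
print(f"I ~ {i:.1f} A, drop ~ {v_shunt*1e3:.1f} mV, "
      f"P in shunt ~ {p_shunt:.2f} W")

# The measured resistance is still just V/I from the two meters,
# regardless of what actually sets the current.
```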
Below 1 ohm must be done in 4-wire (Kelvin) mode. You must also use some means of removing thermal-EMF errors. These occur because each junction of dissimilar materials with a thermal gradient acts as a thermocouple, generating small voltages that add to or subtract from the actual voltage drop across the resistor. The simplest method is to leave all connections untouched and reverse the current direction at the source, then subtract one reading from the other and divide by two (the EMF has the same sign in both readings, so it cancels). If you are buying something to measure below 1 ohm, make sure it has an offset-compensated-ohms feature, which eliminates the thermal EMFs from the readings automatically. The current-source-and-voltmeter method works fine, but when you do the math on the worst-case errors of your current and voltage measuring instruments at the value you are measuring, it is difficult to get below 1% OF READING accuracy.
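The current-reversal trick above is simple arithmetic; a minimal sketch, with an assumed 5 uV thermal EMF polluting a 10 mOhm measurement:

```python
# Current reversal: with the thermal EMF constant between readings,
#   V_fwd = +I*R + V_emf,   V_rev = -I*R + V_emf
# so R = (V_fwd - V_rev) / (2 * I) and the EMF cancels.

def offset_compensated_r(v_fwd, v_rev, i_amps):
    """Resistance with the thermal-EMF offset removed."""
    return (v_fwd - v_rev) / (2.0 * i_amps)

# Example values (assumed): 10 mOhm true, 1 A source, 5 uV EMF.
v_emf = 5e-6
r_true = 10e-3
i = 1.0
v_fwd = i * r_true + v_emf
v_rev = -i * r_true + v_emf
print(f"{offset_compensated_r(v_fwd, v_rev, i) * 1e3:.3f} mOhm")
```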
I always suggest charting the accuracy of meters as PERCENT OF READING; everything else is bogus. That means combining the percent-of-range and percent-of-reading terms for each range. The example below has meters that perform well in the low-ohm ranges; notice how which meter is best depends on the range you are in.
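The percent-of-reading calculation works like this; the spec numbers below are placeholders for illustration, not any particular meter's datasheet:

```python
# Convert a DMM spec of the form (% of reading + % of range) into an
# effective worst-case percentage of the value actually measured.

def pct_of_reading(value, rng, pct_rdg, pct_rng):
    """Worst-case error expressed as a percentage of the measured value."""
    err = value * pct_rdg / 100.0 + rng * pct_rng / 100.0
    return 100.0 * err / value

# e.g. 0.1 ohm measured on a 100 ohm range spec'd 0.01% rdg + 0.004% rng:
print(f"{pct_of_reading(0.1, 100.0, 0.01, 0.004):.2f} % of reading")
```

Note how the fixed percent-of-range term dominates at the bottom of a range: a 0.1 ohm reading on a 100 ohm range is roughly 4% of reading even with a very good spec.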
Thank you for all your responses! I'll try to test them out as best I can.
I am a programmer, not quite an EE, so apologies if this is a silly
question; but is it a stupid idea to have n resistors,
R0 = x ohm
R1 = x*2^1 ohm
Rn = x*2^(n-1) ohm
and use an op-amp as a comparator? I.e., first try Rn: if it overflows, close (short) it; otherwise let it be. Then continue with the next one down. When done, take the resulting binary value and multiply by x.
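The idea above is essentially a successive-approximation search, like a SAR ADC. Here is a pure simulation sketch of it (ideal resistors, ideal comparator), which is exactly the part the replies point out is hard to realize in hardware:

```python
# Successive-approximation sketch: try the largest binary-weighted
# resistor first, keep it if it doesn't exceed the unknown, and read
# the kept pattern off as a binary multiple of x. Idealized model only.

def sar_measure(r_unknown, x, n):
    """Approximate r_unknown as a multiple of x using n binary steps."""
    acc = 0.0
    for bit in reversed(range(n)):        # R(n-1) down to R0
        trial = acc + x * 2**bit
        if trial <= r_unknown:            # the "comparator" decision
            acc = trial                   # keep this resistor in
    return acc

# Example: a 13.7 mOhm unknown, x = 1 mOhm steps, 8 bits:
print(f"{sar_measure(0.0137, x=0.001, n=8) * 1e3:.1f} mOhm")   # 13.0 mOhm
```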
If you have a lab power supply that can generate a constant current, set it to supply 1.0 A with your resistor attached, then measure the voltage across the resistor (NOT at the supply). Ohm's law does the rest.
Or, without a constant-current source, insert a known series resistance and use the multimeter to measure the voltage across the known resistance and the voltage across the unknown resistance. Trivial math will then give the value of the unknown resistance.
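That "trivial math" is a ratio: the same current flows through both resistors, so the unknown scales off the known one. A sketch, with assumed example readings:

```python
# Known-series-resistor method: R_unknown = R_known * V_unknown / V_known.
# No calibrated current source needed, only a voltage ratio.

def ratio_method(r_known, v_known, v_unknown):
    """Unknown resistance from the voltage ratio across a series pair."""
    return r_known * v_unknown / v_known

# Example: a 1.000 ohm reference drops 0.500 V, the shunt drops 5.1 mV:
r = ratio_method(1.000, 0.500, 5.1e-3)
print(f"R = {r * 1e3:.2f} mOhm")
```

A nice property of this method is that the absolute accuracy of the voltmeter matters less than its linearity, since the same meter reads both drops.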
but is it a stupid idea to have n resistors
Presumably in a bridge? I think it would be very difficult to get good accuracy from that setup. You'd need n+2 really good resistors, some of which would be "unusual" values so you'd likely have to get them special order. In addition, you'd have to compensate for a lot: the resistances of the switches, resistance of the series connections, any voltages from the many connections involved, and the characteristics of the comparator; all of which together could be more than the resistance you're trying to measure.
I wanted to play with building a power meter so I ordered a nice shunt resistor.
The Fluke 179 unhelpfully said 0.1 ohm, and when I tried with the Fluke 8840A in 2-wire mode the reading was about 6 ohm out of calibration.
I looked at the specs for the usual sub $1k bench multimeters (Agilent, Keithley etc) but they surprisingly often started at 100ohm and up.
So, what's the best way to get accurate measurements of sub 1ohm shunts? Preferably <1% error.
How nice is your shunt? Is it a 2-terminal or a 4-terminal shunt? The 4-terminal ones are fairly straightforward to measure, by feeding a well-defined current through them and measuring the voltage across the sense connections. The resistance of a 2-terminal shunt only really makes sense in circuit, because the connections affect the resistance so much. Beware of even the tiniest amount of copper in your sense path: when you are measuring milliohms, the resistance of copper no longer looks so low, and it has a temperature coefficient of 0.4% per degree C.
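To put that copper tempco in perspective, a small sketch with an assumed 5 mOhm of copper lead/trace resistance in the sense path:

```python
# Copper's tempco is about +0.4 %/degC (0.004/K), so even a small
# series copper resistance drifts noticeably with temperature.

def copper_r(r_at_20c, temp_c, tempco=0.004):
    """Linear temperature model for copper resistance, referenced to 20 C."""
    return r_at_20c * (1.0 + tempco * (temp_c - 20.0))

# Assumed: 5 mOhm of copper in the sense path, warmed from 20 C to 45 C:
r_cold = 5e-3
r_warm = copper_r(r_cold, 45.0)
print(f"{(r_warm - r_cold) * 1e3:.2f} mOhm shift")   # 0.50 mOhm
```

A 0.5 mOhm shift is a 5% error on a 10 mOhm shunt, which is why the sense connections have to land on the shunt element itself.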
I made my own power meter. I bought the shunt resistor and never bothered to measure it. I built up the meter and calibrated the final result.
Is the resistance of the shunt the only error you are going to have in your system?
Others have suggested putting a known current through it and measuring the voltage on the sense terminals. Seems straightforward enough.