G'day all,
I've been toying with the methods in Conrad's Mini Metrology Lab articles for a while now, using them to learn about my equipment and its limitations (in addition to my own).
One of the things I've been experimenting with is resistor matching, and I've tried a couple of methods. The obvious one is matching with direct-reading ohmmeters of various accuracy and resolution. The most accurate method available to me is the one described in the Mini Metrology Lab series, part 3, using a makeshift bridge. I have a question about the maths, though: I think I must have made either a beginner's mathematical error or a fundamental misunderstanding of basic principles, because my calculations are out by almost exactly a factor of 2.
I'm going to preface this whole bit with "If I'm not mistaken" and then proceed to explain my calculations, hoping that someone will point out where I *am* mistaken.
This is based on the tightest match required. The first decade uses 10K resistors, and they need to be matched to 37 PPM (0.0037%). That means a maximum spread of 0.37 Ohms across the resistors (10,000 * 37 / 1,000,000). For the purpose of this example I'm going to select two resistors of 10,000.00 Ohms and 10,000.37 Ohms.
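Just to show my working, here's the PPM-to-Ohms conversion as a couple of lines of Python (the values are simply the ones from my example above):

```python
# Allowed spread for a 37 PPM match on a 10K resistor
nominal_ohms = 10_000.0
match_ppm = 37

spread_ohms = nominal_ohms * match_ppm / 1_000_000
print(spread_ohms)  # 0.37 Ohms
```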
I'll assume the known bridge resistors are all 10,000.00 Ohms. So with a 27V input, the left side of the bridge gives us 13.5V.
With the 10,000.00 Ohm unknown, the right side of the bridge also gives us 13.5V: 10,000/(10,000+10,000)*27.
With the 10,000.37 Ohm unknown, the right side of the bridge gives us 13.500250V: 10,000.37/(10,000+10,000.37)*27.
So we have roughly a 250uV difference. The Mini Metrology Lab article specifies resistors must be matched to within a window of 1/2mV == 0.5mV == 500uV.
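The divider arithmetic above, sketched in Python (this assumes an ideal bridge: a perfect 27V source, exact known resistors, and no meter loading):

```python
# Wheatstone bridge: each arm is a simple voltage divider
V_in = 27.0
R_known = 10_000.00    # value of all three known bridge resistors
R_unknown = 10_000.37  # the resistor under test

v_left = R_known / (R_known + R_known) * V_in       # 13.5 V
v_right = R_unknown / (R_known + R_unknown) * V_in  # ~13.500250 V

diff_uV = (v_right - v_left) * 1e6
print(f"{diff_uV:.1f} uV")  # ~249.7 uV, about half the 500 uV window
```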
This is almost exactly double the value I came up with. Can someone please point out my error?