Author Topic: Testing and calibrating a milliohm meter  (Read 2010 times)


Offline OM222O (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 768
  • Country: gb
Testing and calibrating a milliohm meter
« on: September 29, 2019, 02:21:14 pm »
I designed and built a DIY milliohm meter a while back and just finished writing the software for it. The response times and overall accuracy of each building block are to spec and perform really nicely, so I decided to test some resistors with it. I only have some basic 1% resistors (no idea what their tempco is). I first measured them with a multimeter and compared the readings against the milliohm meter. The rough values agree, but there is quite a lot of drift, and I'm not sure whether that's just the tempco of the resistors or an error in my design. The multimeter doesn't show as much variation, though there is some drift there too; maybe it's just averaging more samples and filtering the results better?
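For what it's worth, this is the kind of averaging I mean; a minimal sketch (the readings below are made-up numbers, not from my meter):

Code: (Python)
# Simple boxcar average over N readings; random noise drops by roughly sqrt(N).
# The readings are made-up values, just to show the idea.
readings = [0.1002, 0.0998, 0.1001, 0.0999, 0.1000, 0.1003]  # ohms
average = sum(readings) / len(readings)
print(average)  # ~0.10005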

I wanted to get some reference resistors to test with; the theoretical range is 10 µΩ to 200 kΩ. What is the best resistor type for low tempco? I searched on Mouser and I'm not sure how to set the filters: some of the values are a single number (e.g. 100 ppm/°C) and some have two or more values (e.g. -2000 ppm/°C, 0 ppm/°C). Also, setting the filter to 0 ppm/°C only brings up results for 0 Ω resistors, which isn't really useful! I don't want them to cost over $10 per resistor either, so please leave recommendations for resistors that are well suited to testing and calibrating my device without breaking the bank.
« Last Edit: September 29, 2019, 02:25:19 pm by OM222O »
 

Offline Vgkid

  • Super Contributor
  • ***
  • Posts: 2727
  • Country: us
Re: Testing and calibrating a milliohm meter
« Reply #1 on: September 29, 2019, 02:45:31 pm »
What are the full-scale ranges of your milliohm meter, your budget, and the accuracy you need (as the resistance gets lower, the cost goes up)? How is the measurement circuit implemented, and what parts are used? All of these affect what is needed to do the job.
If you own any North Hills Electronics gear, message me. L&N Fan
 

Offline OM222O (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 768
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #2 on: September 29, 2019, 02:51:08 pm »
I already specified the theoretical full-scale range, which is 10 µΩ to 200 kΩ, but to be honest I doubt I will be able to get the 10 µΩ range working! I also can't find any low-tempco resistors for testing that range. My issue is not with the design itself but with finding suitable reference resistors for testing and calibration. Like I said before, I don't want each resistor to cost more than $10, since I need about 10 or 12 of them to cover all of the ranges.
 

Offline MosherIV

  • Super Contributor
  • ***
  • Posts: 1530
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #3 on: September 29, 2019, 06:43:47 pm »
From the fact that you are talking about tempco, the design must be using a fixed constant current, probably something like 1 A.
Yes, this will cause significant heating, and hence the measured resistance changes as the resistor heats up.
I would keep the maximum current to 100 mA and amplify the small voltage across the resistor instead.
10 µΩ is very ambitious. At that range, contact resistance is going to be a problem even with a 4-wire measurement technique.
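A rough back-of-the-envelope to show the scale of the self-heating (the tempco and thermal resistance here are assumed numbers, not from your design):

Code: (Python)
# Back-of-the-envelope self-heating estimate. Tempco and thermal resistance
# are assumed values for illustration only.
I = 1.0           # test current, A
R = 1.0           # resistor under test, ohms
tempco = 100e-6   # assumed 100 ppm/degC
r_theta = 50.0    # assumed degC/W for a small resistor

power = I ** 2 * R              # 1 W at 1 A, but only 10 mW at 100 mA
delta_t = power * r_theta       # ~50 degC temperature rise
delta_r = R * tempco * delta_t  # ~0.005 ohm, i.e. a 0.5% shift
print(power, delta_t, delta_r)

Drop the current to 100 mA and the dissipation falls by a factor of 100, so the same resistor would shift by only about 0.005%.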

Do you have access to another bench DMM with 4-wire resistance measurement?
If so, you can compare your results with that instead of using lots of reference resistors.
You would have to measure each test resistor before you try it anyway, unless you have known reference resistors.
 

Offline OM222O (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 768
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #4 on: September 29, 2019, 08:01:22 pm »
Quote
From the fact that you are talking about tempco, the design must be using a fixed constant current, probably something like 1 A.
Yes, this will cause significant heating, and hence the measured resistance changes as the resistor heats up.
I would keep the maximum current to 100 mA and amplify the small voltage across the resistor instead.
10 µΩ is very ambitious. At that range, contact resistance is going to be a problem even with a 4-wire measurement technique.

Do you have access to another bench DMM with 4-wire resistance measurement?
If so, you can compare your results with that instead of using lots of reference resistors.
You would have to measure each test resistor before you try it anyway, unless you have known reference resistors.

Exactly! I use a constant current source, a universal shunt that can be switched between 1 Ω and 100 kΩ, and a 24-bit ADC to measure the values. This keeps everything relative and avoids issues such as ADC offset, voltage reference errors, etc. As for 1 A heating up the components: I only take 20 samples and average them at a 90 SPS rate with 30 ms of settling time, so the whole process takes about 500 ms, and there is a cooldown timer of about 5 to 10 seconds depending on the current used. I tested this over a long period of time and it seemed pretty stable; nothing overheated.
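In rough pseudo-Python the measurement cycle looks like this (set_current and read_adc are just stand-ins for my actual drivers, and the fake reading is only there so the sketch runs):

Code: (Python)
import random, time

# Stand-ins for the real hardware drivers.
def set_current(amps):
    pass

def read_adc():
    return 0.010 + random.gauss(0, 1e-5)  # fake ~10 mOhm reading

def measure(test_current):
    set_current(test_current)
    time.sleep(0.030)                          # 30 ms settling time
    samples = [read_adc() for _ in range(20)]  # 20 conversions at ~90 SPS
    set_current(0.0)                           # current off for the cooldown period
    return sum(samples) / len(samples)

print(measure(1.0))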

10 µΩ seems a bit far-fetched, I agree! But from the tests I can see I have at the very least 100 µΩ accuracy (not just resolution).
Unfortunately I don't have access to any other lab equipment more accurate than what I've built  :-DD |O so I can't pre-test the resistors and compare them, hence wanting to buy reference resistors for the different ranges and calibrating against them.

Here is a picture showing measurements of a 1 Ω resistor and of a 10 mΩ resistor.

The error in that range is particularly bad (about 5%), but I noticed that the 1 Ω shunt used to generate the 1 A current is also bad, so replacing it should fix the error (even testing that range at 100 mA yields results under 1%). I tested all the other ranges with 1% resistors and they were all within 0.3%, which was a really good sign!
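Roughly speaking, an error in that shunt goes straight into the reading: the test current scales as 1/Rshunt, so a shunt that is 5% off gives a current, and hence a reading, that is about 5% off as well (the 5% figure is just for illustration, but it is about the size of the error above).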
« Last Edit: September 29, 2019, 08:09:09 pm by OM222O »
 

Offline MosherIV

  • Super Contributor
  • ***
  • Posts: 1530
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #5 on: September 29, 2019, 09:09:05 pm »
Quote
  specified the theoretical full scale range which is 10 µΩ to 200 kΩ,
That will be very hard to achieve!
1 mΩ to 2 kΩ would be more achievable.

As to how to calibrate, what do you have access to?
Do you have a 4- or 5-digit DMM?

Assuming you have a single range on the milliohm meter:
Get 10 Ω, 500 Ω and 1 kΩ resistors to calibrate with,
in 0.1% precision; CPC do them for a few £ each.
Get a 1 Ω and another sub-1 Ω precision resistor to test with.
Measure the resistors greater than 9 Ω with your accurate, calibrated DMM.

Now use these to check and adjust your DIY milliohm meter.
With a single range, and assuming good linearity, the three calibration resistors let you calibrate for resistances you cannot accurately measure with the DMM (see the sketch below).
You can now see whether the milliohm meter works for the 1 Ω and the sub-1 Ω resistor; you will not know for sure until you can get them measured properly, but the 3-point cal should give you good confidence.

Keep it simple: single range, limited to no more than a few kΩ, and this method should work.
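The adjustment itself can be as simple as a straight-line fit through the known points; a minimal sketch, assuming the meter is linear (all the values are made up):

Code: (Python)
# Gain/offset correction from a few known resistors, assuming linearity.
# 'known' are values measured on a trusted DMM, 'raw' is what the DIY
# meter reports. All values are made up for illustration.
known = [10.0, 500.0, 1000.0]   # ohms
raw   = [10.3, 510.1, 1020.1]   # ohms

n = len(raw)
mean_x = sum(raw) / n
mean_y = sum(known) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, known))
den = sum((x - mean_x) ** 2 for x in raw)
gain = num / den
offset = mean_y - gain * mean_x

def corrected(reading):
    return gain * reading + offset

print(corrected(10.3))  # -> 10.0

Once the gain and offset are known from resistors you can measure on the DMM, the same correction applies to readings below what the DMM can resolve, which is the whole point of relying on linearity.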
« Last Edit: September 29, 2019, 09:13:14 pm by MosherIV »
 

Offline OM222O (Topic starter)

  • Frequent Contributor
  • **
  • Posts: 768
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #6 on: September 29, 2019, 09:37:58 pm »
Actually, the higher ranges are pretty simple to achieve! Perhaps I should post a schematic, but with the current limited to 100 mA these are the results I got:
0.995 Ω when measuring a 1% 1 Ω resistor, which is within tolerance (0.5% error), and 0.010039 Ω (10.04 mΩ) when measuring a 1% 10 mΩ resistor, which is again well within range! The resistors I used in the DIY meter are 0.1% themselves, so I think it's a good idea to use reference resistors with better tolerance than that, since I don't want the error from my reference to be "calibrated" into my meter, resulting in a poor calibration. I have heard of a few precision resistor series such as the Vishay VHP100, but those cost way too much! I was just searching for resistor series that make good references for under $10 each, since I didn't have much luck on Mouser / Digi-Key. The best Mouser has are some of those bent-wire shunts specified at 1%, which again is no good for calibration.
 

Offline MosherIV

  • Super Contributor
  • ***
  • Posts: 1530
  • Country: gb
Re: Testing and calibrating a milliohm meter
« Reply #7 on: September 29, 2019, 09:53:52 pm »
Precision resistors below $100 will not come with a test report stating what the resistance actually is.
You just know that the resistance will be within the specified tolerance.
So a 1 Ω 0.1% resistor could be 0.999 Ω or 1.001 Ω or anywhere in between.

To calibrate, you need a 'reference standard'. You can get resistance standards, but they cost more than the Vishay resistors you are looking at.

I am suggesting you buy precision resistors that you can measure yourself with your current DMM.
Based on these now-known resistors (your own resistance reference standard) you can adjust your DIY resistance meter to match. Doing 2 or more points on each range should give you high confidence in the results for each range.

As I said, you can buy 0.1% precision resistors for a couple of £ from CPC.
Get ones you can easily measure with your DMM and you have some pretty good resistance standards.
The web page says they are 15 ppm/°C.
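To put that tempco in perspective: 15 ppm/°C over a 10 °C swing in room temperature is only 150 ppm, i.e. 0.015%, so the 0.1% tolerance dominates, not the tempco.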
« Last Edit: September 29, 2019, 09:59:09 pm by MosherIV »
 

