Author Topic: How to pick an input resistor for a millivoltmeter  (Read 1499 times)


Offline ckocagilTopic starter

  • Contributor
  • Posts: 15
  • Country: tr
How to pick an input resistor for a millivoltmeter
« on: August 07, 2019, 01:05:11 pm »
All millivoltmeter designs I've seen -- from the 34401A schematic in The Art of Electronics to various DIY meters -- have two things in common:

- A non-inverting op amp configuration
- A resistor to ground at the non-inverting input pin, usually 10M (i.e. a resistor between IN+ and IN-)

What's the point of having a 10M resistor? Doesn't it just degrade the sweet gigaohm+ input impedance of the non-inverting op amp? How is the value determined? Is it only necessary for range-switching? Does it help discharge stray/gate capacitance somewhere?

Thank you and sorry for asking a beginner question here.
 

Offline blackdog

  • Frequent Contributor
  • **
  • Posts: 768
  • Country: nl
  • Please stop pushing bullshit...
Re: How to pick an input resistor for a millivoltmeter
« Reply #1 on: August 07, 2019, 01:19:07 pm »
Hi ckocagil,

Think about this remark: where do you want the bias current to flow?
There is always a bias current; with modern multimeters it is usually below 50 picoamperes.

There are also measuring instruments that do not have a 10M resistance on the low ranges; then the bias current flows into or out of the measuring instrument through the circuit being measured.

When nothing is connected to the mV meter, the instrument usually indicates a random value, depending on the direction of the bias current and the charge on the capacitor at the input of the measuring instrument.

10M is a standard value for many measuring instruments, but if you are building something yourself and the bias currents are small, 100M is also possible.
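
As a quick sketch of the trade-off (illustrative numbers, not from any datasheet): the zero offset you see with an open input is roughly the bias current times the input resistance, which is why the larger resistor only works when the bias current is small.

Code: [Select]
# Zero offset with an open input is roughly I_bias * R_input.
# Illustrative numbers only, not from any particular datasheet.

def offset_voltage(i_bias_a, r_input_ohm):
    """Ohm's law: the bias current develops this voltage across R_input."""
    return i_bias_a * r_input_ohm

for i_bias in (50e-12, 1e-12):        # 50 pA and 1 pA bias currents
    for r_in in (10e6, 100e6):        # 10M and 100M input resistors
        v = offset_voltage(i_bias, r_in)
        print(f"I_bias = {i_bias*1e12:>4.0f} pA, R_in = {r_in/1e6:>4.0f} M"
              f" -> offset = {v*1e6:>5.0f} uV")

With 50 pA of bias, 100M already means 5 mV of zero offset; at 1 pA it is only 100 uV.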

Kind regards,
Bram



Necessity is not an established fact, but an interpretation.
 

Offline splin

  • Frequent Contributor
  • **
  • Posts: 999
  • Country: gb
Re: How to pick an input resistor for a millivoltmeter
« Reply #2 on: August 07, 2019, 11:40:28 pm »
Quote from: blackdog on August 07, 2019, 01:19:07 pm
Hi ckocagil,

Think about this remark: where do you want the bias current to flow?
There is always a bias current; with modern multimeters it is usually below 50 picoamperes.

There are also measuring instruments that do not have a 10M resistance on the low ranges; then the bias current flows into or out of the measuring instrument through the circuit being measured.

Hi blackdog,

Think about this remark: where do you think the bias current is going to go?

If the source resistance is 10Mohms then half the bias current will go through the source - but the measured voltage will be only 50% of the real voltage. If you want to measure with an accuracy of 1% - pretty mediocre accuracy - the source resistance must be less than about 100kohms, in which case 99% of the bias current will flow through the source.

A meter with a 10Mohm input resistance is only suitable for measuring low-resistance sources, for which the bias current introduces negligible error anyway.
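
To put numbers on the loading error, a minimal sketch; the 1%-at-100k figure above falls straight out of the divider formula:

Code: [Select]
# Loading error of a 10 MOhm input resistance for various source
# resistances: measured/real = R_in / (R_in + R_source).
R_IN = 10e6  # 10 MOhm input resistor

def loading_error_pct(r_source, r_in=R_IN):
    """Percent of the real voltage lost to the divider formed with R_in."""
    return 100.0 * r_source / (r_in + r_source)

for r_s in (1e3, 100e3, 1e6, 10e6):
    print(f"R_source = {r_s/1e3:>6.0f} k -> error = {loading_error_pct(r_s):6.2f} %")
# 1 k: 0.01 %   100 k: 0.99 %   1 M: 9.09 %   10 M: 50.00 %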

So to address the original question: the 34401A has an input resistance which can be selected between 10Mohm and giga-ohms. The 10M option serves no real purpose but appears to be there for backward compatibility with older meters. Bizarrely and irritatingly, it is the default setting on power-up, requiring a number of setup operations to change to the sensible >10Gohm mode. (Again, this is probably for backwards compatibility.)

About the only use for 10Mohms I can think of is for measurement setups which rely on the 10Mohm resistor. For example, the 34401A's lowest current range is 10mA - but using the 100mV range with the 10Mohm input resistor gives a 10nA range (10nA x 10Mohm = 100mV). The accuracy isn't great, as the 10Mohm resistor tolerance is only moderate and the input bias current (<50pA) adds error, but it can nevertheless be useful if you don't have a 10Mohm shunt resistor to hand.
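
The arithmetic for that improvised range, as a minimal sketch (the 100mV/10Mohm figures are from above; everything else is Ohm's law):

Code: [Select]
# Using the 10 MOhm input resistor as a current shunt on the 100 mV range.
R_IN = 10e6          # nominal; tolerance is only moderate
V_RANGE = 100e-3     # 100 mV range

i_full_scale = V_RANGE / R_IN
print(f"full scale: {i_full_scale*1e9:.0f} nA")   # 10 nA

# Worst-case error from the <50 pA input bias current:
i_bias = 50e-12
print(f"bias error: up to {100*i_bias/i_full_scale:.1f} % of full scale")  # 0.5 %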

Many, if not most, high resolution meters only have a 10Mohm input resistance on the high voltage ranges, where it is necessary to use a resistive divider.

Yes, you have to allow for errors caused by the meter's input bias current, but slapping a 10Mohm resistor across the input is a really dumb way to deal with it.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 17604
  • Country: us
  • DavidH
Re: How to pick an input resistor for a millivoltmeter
« Reply #3 on: August 08, 2019, 01:19:11 am »
The 10 megohm input shunt resistor does not do anything useful except provide a known input resistance.  It is not required for operation, and some meters have an internal jumper or relay to disable it, converting the input to a gigaohm input.

The common value of 10 megohms comes from the input resistance of the switchable input divider in a general purpose voltmeter.  Some of these voltmeters only use the input divider on the higher voltage ranges, where they have a 10 megohm input resistance.  On the lower voltage ranges, the divider is removed and the input resistance is at gigaohm levels.
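
A sketch of such a divider, assuming illustrative tap values rather than any particular schematic: one 10 megohm string tapped at different points, so the source sees the same 10 megohms on every divided range.

Code: [Select]
# One 10 MOhm string with switched taps: the source always sees 10 MOhm
# on the divided ranges.  Tap values illustrative, not from any schematic.
R_TOTAL = 10e6

taps = {"10:1": 1e6, "100:1": 100e3, "1000:1": 10e3}  # R from tap to ground

for name, r_bottom in taps.items():
    print(f"{name:>7}: {(R_TOTAL - r_bottom)/1e6:5.2f} M over "
          f"{r_bottom/1e3:6.0f} k, gain = {r_bottom/R_TOTAL:g}")
# 10:1 -> 9.00 M over 1 M, gain 0.1; 100:1 -> 9.90 M over 100 k, gain 0.01; ...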

Like Splin says, a 10 megohm input resistance is only suitable for high precision measurements when the source resistance is low.  If you want to make precision high voltage measurements at higher impedances, then a design without an input divider is required, like an electrometer input configuration.  Otherwise most meters are limited to +/-2 volts, or not much higher than +/-10 volts for the good ones.
 

Offline guenthert

  • Frequent Contributor
  • **
  • Posts: 780
  • Country: de
Re: How to pick an input resistor for a millivoltmeter
« Reply #4 on: August 08, 2019, 04:39:54 pm »
Some null meters (Fluke 845, Keithley 155) use 1MOhm, so they can do double duty as a picoammeter.
The HP 419A uses an input R of only 100kOhm, which necessitates a different scale when it is used as one.
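
A sketch of why the input resistance sets the current scale; the 1 mV range here is an assumed illustration, not a quoted specification:

Code: [Select]
# Full-scale current when a null meter doubles as a picoammeter:
# I_fs = V_range / R_in.  The 1 mV range is assumed for illustration.
V_RANGE = 1e-3

for name, r_in in (("1 MOhm (845/155 style)", 1e6),
                   ("100 kOhm (419A style)", 100e3)):
    print(f"{name}: {V_RANGE/r_in*1e9:5.1f} nA full scale")
# 1 MOhm -> 1 nA full scale; 100 kOhm -> 10 nA, hence the different scale.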
 

