Author Topic: Dynamic gain/automatic ranging for maximizing bit depth in measuring equipment?  (Read 621 times)


Offline cape zolohTopic starter

  • Contributor
  • Posts: 38
  • Country: se
Is this something that is used in, let's say, oscilloscopes? I have next to no knowledge of how they work, only that they use fast but low-resolution ADCs.

Now let's talk multimeters.
The high-resolution (and low-res) multimeters I have used have a handful of ranges, usually [max ADC input] ×10, ×1, ×1/10, ×1/100 and so on.
The ×1 range is the optimum; the others are "the next best thing". You lose the most bits when your measurement sits furthest below the top of its range.
Wouldn't it sometimes be nice to have a feature that automatically amplifies the input signal, so that a ×0.1 (or whatever) signal fills the ×1 range, and then compensates for that gain in post?
Does this cause more problems than what is (sorry..) gained?
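For what it's worth, the "amplify, then divide the gain back out" idea can be sketched in a few lines. This is a hypothetical illustration, not any real meter's firmware: the gain steps, the 1 V ADC full scale, and the 16-bit resolution are all assumptions.

```python
# Hypothetical autorange sketch: pick the largest programmable gain that
# doesn't clip the ADC, then compensate for that gain "in post".
GAINS = [100, 10, 1, 0.1, 0.01]   # assumed PGA/attenuator settings
ADC_FULL_SCALE = 1.0              # assumed volts at the ADC input
ADC_MAX_COUNTS = 2**15 - 1        # assumed 16-bit signed ADC

def autorange(v_in):
    """Return (gain, voltage at the ADC) for the best non-clipping gain."""
    for g in GAINS:
        if abs(v_in) * g <= ADC_FULL_SCALE:
            return g, v_in * g    # v_in * g is what the ADC actually sees
    raise OverflowError("input exceeds the highest range")

def measure(v_in):
    g, adc_v = autorange(v_in)
    # Quantize to ADC counts, then divide the gain back out.
    counts = round(adc_v / ADC_FULL_SCALE * ADC_MAX_COUNTS)
    return counts / ADC_MAX_COUNTS * ADC_FULL_SCALE / g
```

A 5 mV input gets gain 100 and uses half the ADC's span, so the quantization step after compensation is 100× finer than it would be on the ×1 range. The catch, as always, is that the preamp's own noise, offset, and gain error get divided out along with the signal.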
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 18745
  • Country: us
  • DavidH
Some multimeters (and oscilloscopes) do exactly that; the most sensitive range is produced by switching in a preamplifier without adjusting the attenuation.  Another trick multimeters sometimes use is to attenuate the reference for the analog-to-digital converter to 1/10th of its normal value, which has the same effect.

Oscilloscopes use frequency-compensated attenuators to achieve wide bandwidth, while multimeters use a decade divider, which is acceptable without frequency compensation at lower frequencies.
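The reference-attenuation trick works because a ratiometric ADC reports counts proportional to Vin/Vref, so shrinking Vref makes each count worth proportionally less voltage. A toy sketch (the 16-bit resolution and voltages are assumptions, not any particular part):

```python
# Sketch of the reference-attenuation trick: for a ratiometric ADC,
# counts ~ v_in / vref, so dividing vref by 10 gives ~10x more counts
# for the same small input -- no preamp in the signal path needed.
FULL_COUNTS = 2**16 - 1  # assumed 16-bit ADC

def adc_counts(v_in, vref):
    """Ideal ratiometric ADC, saturating at full scale."""
    return min(FULL_COUNTS, round(v_in / vref * FULL_COUNTS))

def lsb_volts(vref):
    """Voltage represented by one count."""
    return vref / FULL_COUNTS
```

With Vref = 1 V, a 50 mV input only exercises ~5% of the code range; drop Vref to 0.1 V and the same input uses ~50% of it, with each count now worth a tenth as much voltage. The trade-off is that reference noise and the ADC's own error floor become relatively larger.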
 

