Author Topic: Op amp offset measurement/digital compensation  (Read 1451 times)


Offline daqqTopic starter

  • Super Contributor
  • ***
  • Posts: 2302
  • Country: sk
    • My site
Op amp offset measurement/digital compensation
« on: December 03, 2013, 11:10:01 am »
Hi guys,

Assume you have a simple non-inverting amplifier and a source with a fairly high impedance (10 kOhm+) and a low output voltage (<1 mV, such as an infrared thermocouple) that doesn't mind being short-circuited. Could you measure the offset voltage by periodically shorting the source with a MOSFET? In theory you'd get only the amplified offset voltage at the output, which you could measure with your ADC and then simply subtract from the normal measurement. If you did this often enough, you'd compensate for both the offset voltage itself and its drift.
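The subtract-the-shorted-reading idea can be sketched in a few lines (a minimal illustration only; `read_adc`, `short_input`, and the gain value are hypothetical placeholders, not a real driver API):

```python
# Sketch of periodic auto-zero compensation (all names hypothetical).
# read_adc() would return the amplifier output in volts;
# short_input(True/False) would drive the shorting MOSFET's gate.

GAIN = 1000.0  # non-inverting amplifier gain (example value)

def measure_compensated(read_adc, short_input):
    """Return the input-referred voltage with the op amp offset removed."""
    short_input(True)            # short the source: output = GAIN * Vos
    zero = read_adc()            # amplified offset alone
    short_input(False)           # normal operation: output = GAIN * (Vin + Vos)
    signal = read_adc()
    return (signal - zero) / GAIN   # offset (and its slow drift) cancels
```

Because the zero reading is refreshed on every cycle, slow offset drift is cancelled along with the static offset, at the cost of halving the effective sample rate.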

It works in theory and in simulation; how about in practice? Would this be a bad idea?

Thanks,

David
Believe it or not, pointy haired people do exist!
+++Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
 

Offline Harvs

  • Super Contributor
  • ***
  • Posts: 1202
  • Country: au
Re: Op amp offset measurement/digital compensation
« Reply #1 on: December 04, 2013, 12:58:39 am »
This is commonly done in DMMs.  Often, at some point in the analog signal chain, an analog mux switches from the signal source to GND; the meter then performs a conversion cycle and subtracts that result from the signal reading.

For auto-zeroing bench meters like the HP 3456A, this is done every conversion cycle by default, or you can switch it off and double your sample rate.

As to whether you should short your sensor out, about the only issue I can think of is the leakage current through the MOSFET.  You'd need to check the specs of the FET's off-state leakage and see how much error this would introduce to your measurement. Be aware this is unlikely to be linear.
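To put a rough number on the leakage issue (illustrative values only, not from any datasheet): the FET's off-state leakage flows through the source impedance during normal measurement and appears directly as an input error voltage:

```python
# Illustrative error-budget arithmetic (example values, not a real part's spec).
R_SOURCE = 10e3   # source impedance, ohms (the 10 kOhm+ case above)
I_LEAK   = 1e-9   # assumed MOSFET off-state leakage, 1 nA

v_error = I_LEAK * R_SOURCE   # leakage develops a voltage across the source
print(v_error)                # 1e-05 V, i.e. 10 uV
```

Against a sub-1 mV full-scale signal, 10 uV is already a 1% error, and since leakage varies strongly with temperature and drain voltage, it won't subtract out as a constant.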
 

