Hi guys,
Assuming you have a simple non-inverting amplifier and a source with a fairly large impedance (10 kOhm+) and a low voltage output (<1 mV, such as an infrared thermocouple) that doesn't mind being shorted, could you measure the offset voltage by periodically shorting the source with a MOSFET? In theory, the output would then be just the amplified offset voltage, which you could measure with your ADC and subtract from the normal measurement. If you did this often enough, you'd compensate for both the offset voltage itself and its drift.
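To make the scheme concrete, here's a toy simulation of what I mean (all numbers are made up, and the amplifier model is deliberately idealized):

```python
# Toy simulation of the periodic auto-zero idea: short the input,
# record the amplified offset, then subtract it from normal readings.
# GAIN, V_OS and the signal level are arbitrary illustrative values.

GAIN = 1000.0   # non-inverting amplifier gain
V_OS = 150e-6   # input offset voltage, 150 uV (unknown in practice)

def amp_out(v_in):
    """Idealized amplifier: output = gain * (input + offset)."""
    return GAIN * (v_in + V_OS)

def read_adc(v_source, input_shorted):
    """Simulated ADC reading; the MOSFET short forces the input to 0 V."""
    return amp_out(0.0 if input_shorted else v_source)

v_signal = 500e-6  # thermocouple output, 0.5 mV

# Step 1: short the source and store the zero reading (= GAIN * V_OS).
zero_reading = read_adc(v_signal, input_shorted=True)

# Step 2: normal measurement, minus the stored zero reading.
raw = read_adc(v_signal, input_shorted=False)
corrected = raw - zero_reading

print(corrected / GAIN)  # recovers ~500e-6 V, the true input
```

Of course this ignores MOSFET on-resistance, charge injection, and bias current flowing through the source impedance, which is part of what I'm asking about.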
It works in theory and in simulation, but how about in practice? Would this be a bad idea?
Thanks,
David