What are these internal offset voltages and offset currents? Why not just cancel them in DSP once and be done with it for the rest of the life of the instrument?
They change with temperature, humidity, pressure, age, cosmic rays, whatever ... basically the only thing you know about offsets is that they don't change very fast, most of the time.
So multimeters often auto-calibrate. You short the input to ground, measure the offset, and digitally subtract it from the subsequent signal measurement. Since you want a steady stream of measurements, you simply do this once per measurement cycle. This is closely related to correlated double sampling, although that term is mostly used in the context of image sensors.
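A minimal sketch of that auto-calibration cycle, with a made-up `read_adc` standing in for the real hardware (the offset and signal values are illustrative, not from any actual instrument):

```python
import random

def read_adc(shorted=False):
    """Hypothetical ADC read: a slowly drifting offset plus the signal.
    Stand-in for real hardware; all values are illustrative."""
    offset = 0.37   # mV; in reality this drifts with temperature, age, ...
    signal = 0.0 if shorted else 5.0
    noise = random.gauss(0, 0.01)
    return signal + offset + noise

def autocal_measurement():
    # 1) Short the input and measure the offset alone.
    zero = read_adc(shorted=True)
    # 2) Measure the signal (offset still riding on it).
    raw = read_adc(shorted=False)
    # 3) Subtract digitally: the slow-moving offset cancels.
    return raw - zero

print(autocal_measurement())  # close to 5.0, offset removed
```

Because the offset moves slowly, the value captured during the shorted phase is still valid when the signal is measured a moment later, which is exactly why this only works for "low" frequency errors.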
"Auto-zero" is most commonly used in the context of amplifiers, for schemes more sophisticated than the auto-calibration described above. The advantage is that you don't have to waste an entire measurement cycle every time, and you also remove some of the "higher" frequency flicker noise (offset is basically "low" frequency flicker noise; auto-calibration as above can only remove this "low" frequency flicker noise and relies on simple low-pass filtering for the "higher" frequency flicker noise). The downside is that it creates a small AC current at the input (in theory auto-calibration does so too, but at such a low frequency that it's unlikely to matter).