Hi. I'm trying to use my oscilloscope to measure a DC voltage, but even without a probe connected, channel 3 reports 800 mV. When the probe is connected and measuring 5 V from a linear PSU, the oscilloscope reports 5.8 V; for 10 V it reports 10.8 V, etc. There is a consistent offset of 800 mV.
Compared to the other channels, channel 1 has the least offset, followed by channels 2 and 4. Channel 3 has the most.
I've tried performing a "self calibration", but this did not help.
Is there something wrong with my scope? It's 2 years old but has been used maybe a dozen times. Is there a way to manually "zero" the voltage on a channel?
Thanks
----------------------------------
Edit: Might have figured this out... the oscilloscope datasheet specifies a DC offset accuracy of "+/- 0.1 div, +/- 2 mV, +/- 1%". At higher vertical scales this offset becomes more apparent, since the 0.1 div term scales with the volts-per-division setting. I was trying to cram a lot onto the screen, so I had set channel 3 to 10 V/div, which leads to a very noticeable offset of around 500 mV. Decreasing the vertical scale reduces the offset. Interestingly, 20 V/div shows less apparent offset than 10 V/div... hmmm
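If I'm reading the spec right, here's the worst-case offset worked out per scale. Note my assumption: I'm summing all three terms (some datasheets instead mean "whichever is greater"), and I'm treating the reading as 0 V for the no-probe case:

```python
# Worst-case DC offset per a spec of "+/- 0.1 div, +/- 2 mV, +/- 1%".
# ASSUMPTION: the three terms add; check your scope's datasheet wording.

def worst_case_offset(volts_per_div: float, reading_v: float = 0.0) -> float:
    """Worst-case DC offset in volts for a given vertical scale and reading."""
    return 0.1 * volts_per_div + 0.002 + 0.01 * abs(reading_v)

for scale in (0.5, 1, 2, 5, 10, 20):
    print(f"{scale:>4} V/div: up to {worst_case_offset(scale) * 1000:.0f} mV")
```

Under that reading of the spec, 10 V/div allows roughly a volt of offset with nothing connected, so the ~500 mV I saw is within tolerance. It doesn't explain why 20 V/div looked better than 10 V/div, though.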