I am seeing a ~1 pixel positive offset, especially on CH1 and CH2, which tends to come and go with vertical position, even when the inputs are GND-coupled (is this a "real" ground anyhow?) and regardless of the volts/div setting. Probably just expecting too much from the front-end!
Isn't this funny? Folks buy a dirt-cheap entry-level scope from a Chinese manufacturer and then rant about things that experienced professionals would not expect even from a midrange DSO manufactured by one of the "big boys" and costing more than 10 times as much.
Just like the majority of modern DSOs, this one does not have a "real" input ground. So checking anything with input coupling set to GND is completely meaningless.
If you want to check the offset with a grounded input, you could use a BNC terminator for that (with DC coupling). Or just leave the input open; the difference is really negligible.
If you do so, please check the specifications for offset and gain error in the data sheet (and compare them with those of much more expensive instruments) before you start questioning the quality of the frontend (which is excellent: high sensitivity, low noise) just because you spot a few pixels of deviation somewhere.
Self-calibration is there to bring the instrument to the calibrated state. Specifications can only be guaranteed after a self-cal has been performed following at least half an hour of warm-up. You can bet your bottom dollar that the factory performs such a self-cal before checking the instrument's calibration state and then issuing the certificate once it passes. Of course you won't need another self-cal anytime soon unless there have been major temperature changes or the firmware has been updated. Yet the temperature in your lab might differ from the one used for the calibration, so a self-cal could improve things.
Self-cal cannot provide infinite accuracy though. It relies on an internal reference and a highly accurate DAC, but the adjustments don't have infinite resolution. This is true for the offset DAC (for offset calibration) as well as the PGA (for gain calibration). So especially if you look at the trace at high sensitivities (below 5 mV/div, which is just "fake" software zoom in many other, even much more expensive DSOs, but is very real in the SDS1000X-E series), the limited resolution of the offset DAC might leave some visible offset on the order of a couple of hundred µV.
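To put a number on that residual, here is a back-of-the-envelope sketch. The DAC resolution and offset range below are purely assumed for illustration (I don't know Siglent's actual design); the point is only that a finite-resolution offset DAC leaves a worst-case residual of about half an LSB after self-cal:

```python
# Hypothetical figures only -- NOT the actual SDS1000X-E design values.
dac_bits = 12                 # assumed resolution of the offset DAC
dac_span_v = 2.0              # assumed +/-1 V offset range referred to input

lsb_v = dac_span_v / (2 ** dac_bits)   # smallest correctable offset step

# After self-cal, the worst-case residual offset is about half an LSB.
residual_uv = (lsb_v / 2) * 1e6
print(f"LSB = {lsb_v * 1e6:.0f} uV, worst-case residual ~ {residual_uv:.0f} uV")
```

With these assumed numbers the residual comes out around 244 µV, i.e. the "couple of hundred µV" regime, which becomes a visible fraction of a division only at the highest sensitivities.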
The frontend of a modern DSO has to use an op amp for the DC/LF path of the input buffer. Since it needs to be high impedance, it has to have a (MOS)FET input. But this means that it cannot be low-offset with a low temperature coefficient (TC), and you don't want a chopper-stabilized amp in the frontend of a DSO. This is the reason why a DSO frontend cannot be compared to a multimeter: the latter usually doesn't need a separate op amp, and even if it did, it would be of the "zero-offset" species. It's essentially the drift and temperature stability of this aforementioned op amp that causes the offset error (changing with temperature) in a DSO, hence makes the self-cal necessary. The SDS1000X-E can even provide a "quick cal" that kicks in automatically as soon as the instrument detects a condition that might increase the offset error. The gain, on the other hand, is much more stable anyway.
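The size of that temperature-driven offset is easy to estimate. The drift figure and temperature swing below are assumptions picked as plausible for a FET-input buffer, not measured values for this scope:

```python
# Illustrative figures only: a FET-input op amp may drift several uV/degC,
# far worse than a chopper-stabilized ("zero-drift") part would.
tc_uv_per_degc = 10.0   # assumed input-offset drift of the frontend buffer
delta_t_degc = 15.0     # assumed warm-up / ambient temperature change
vdiv_uv = 1000.0        # 1 mV/div, the highest real sensitivity

drift_uv = tc_uv_per_degc * delta_t_degc   # total offset drift in uV
drift_div = drift_uv / vdiv_uv             # expressed as a fraction of a div
print(f"{drift_uv:.0f} uV drift = {drift_div:.2f} div at 1 mV/div")
```

With these numbers the drift is 150 µV, i.e. 0.15 div at 1 mV/div, which is exactly the kind of few-pixel wander that a self-cal (or the automatic quick cal) takes back out.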
If you are interested in an exemplary performance verification regarding DC and AC accuracy, you can have a look at the document "SDS1104X-E Review 50-70.pdf" that is attached in reply #1 of this thread:
https://www.eevblog.com/forum/testgear/siglent-sds1104x-e-in-depth-review/
In this document, on page 52, you will find a demonstration of measuring a DC voltage of 205 V to an accuracy of better than 0.08 %.
Compare this to the specifications and you will learn that this instrument can actually be vastly better than its specs, especially when it comes to offset accuracy. Still, you'll have to live with a small additive error, only visible at high sensitivities.
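For a sense of scale, here is the arithmetic behind that comparison. The 205 V reading and the 0.08 % figure come from the linked review; the 3 % gain-accuracy spec is an assumed, typical DSO figure used only for contrast:

```python
reading_v = 205.0      # DC voltage measured in the linked review
achieved_pct = 0.08    # accuracy demonstrated there (better than 0.08 %)
spec_gain_pct = 3.0    # typical DSO DC gain-accuracy spec (assumed here)

achieved_err_v = reading_v * achieved_pct / 100   # absolute error achieved
spec_err_v = reading_v * spec_gain_pct / 100      # worst case the spec allows
print(f"achieved: <= {achieved_err_v:.3f} V, spec allows: +/- {spec_err_v:.2f} V")
```

So the demonstrated error is under about 0.16 V, while a typical spec would permit several volts of error on the same reading, which is what "vastly better than spec" means in practice.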