Ok, this is an excellent question (I actually wanted to address it in my initial post...).
Here are the reasons why I know it's not reading correctly:
-I measured a decade resistance box I built myself. All resistors within a given group have the same value (rated at 1% tolerance, but actually better than that, as I hand-selected them to match).
-I can measure each resistor individually. For instance, on the 100 Ohm set, each resistor (measured in the Ohms range) reads spot on 100 Ohms (plus the residual resistance of the probes and traces).
-When I measure them in series (as they are meant to be used in such a device), I get these exact readings from 1 up to 600 Ohms. Since I have determined the residual resistance of my probes and traces to be 0.56 Ohms, I get two decimal places of precision: for any value I select on the device, I read almost exactly that value plus 0.56 Ohms, i.e. from 1.56 Ohms up to 600.56 Ohms (see the quick sketch after this list).
-But above 600 Ohms, the DMM automatically switches to the kOhms range, and where I should ideally read 0.700 kOhms, I read 0.696 kOhms. It's not a huge difference, but it's quite noticeable, and I am quite certain this DMM can be more accurate than that.
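To make the arithmetic explicit, here is a quick sketch (Python) of what I expect to read in the Ohms range. The 0.56 Ohm offset is the residual I measured on my own setup, and the names are just for illustration; the rest is simply the arithmetic described above:

# Expected Ohms-range readings for my decade box settings.
# 0.56 Ohms is the lead/trace residual I measured myself.
LEAD_RESISTANCE = 0.56  # Ohms (probes + traces)

def expected_reading(selected_ohms):
    # Reading I expect for a given decade-box setting in the Ohms range.
    return selected_ohms + LEAD_RESISTANCE

for setting in (1, 100, 600):
    print(f"{setting} Ohm setting -> expect {expected_reading(setting):.2f} Ohms")
# 1 Ohm setting -> expect 1.56 Ohms
# 100 Ohm setting -> expect 100.56 Ohms
# 600 Ohm setting -> expect 600.56 Ohms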
Even if we set aside absolute accuracy here (i.e. the DMM could actually be wrong in the Ohms range too), the consistency of the readings changes quite dramatically when it switches its range from Ohms to kOhms. Hence the need to recalibrate it.
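To put a number on that jump between ranges, here is the same comparison in Python, taking the 0.700 kOhms I expect and the 0.696 kOhms I actually read (the 0.56 Ohm lead residual is only about half a display count on this range, so I leave it out):

# Relative error of the kOhms-range reading vs. what the Ohms range predicts.
expected_kohm = 0.700   # kOhms, assuming the decade box and the Ohms-range readings are right
measured_kohm = 0.696   # kOhms, what the DMM shows after the automatic range change

error_pct = (measured_kohm - expected_kohm) / expected_kohm * 100
print(f"kOhms range is off by {error_pct:+.2f}%")  # roughly -0.57%

That is well beyond the consistency I see within the Ohms range, which is why I think a recalibration is justified.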
Apart from that, I have confirmed the problem with my other DMMs (Brymen and UEI).