After getting a voltage and current reference, plus building my own high-accuracy resistor box, I started to test my meters.
In my opinion, it looks like the chance of something being off in a meter's accuracy depends on the quantity and range being measured.
For example, it is less likely for a meter to have issues on the DC ranges.
Or it is more likely for a meter to handle Ohm-level resistors well, yet have issues at the MOhm ranges.
AC measurement (just mains) is also one where the true accuracy is questionable.
My general advice would be: if you own a non-auto-ranging multimeter, get at least a few high-quality 0.1% resistors and test its ranges.
For example, a meter with resistance ranges like:
2 - 20 - 200 Ohm, 2 kOhm - 200 kOhm - 2 MOhm, 20 MOhm, 200 MOhm
can be tested with eight resistors: 1 Ohm, 10 Ohm, 100 Ohm, 1 kOhm, 100 kOhm, 1 MOhm, 10 MOhm, 100 MOhm.
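To show the arithmetic behind such a check, here is a minimal Python sketch; the readings and the 0.5% meter spec in it are hypothetical placeholders rather than values from any particular meter, so substitute your own.

```python
# Minimal sketch of the error arithmetic for a resistance-range check.
# The nominal values are the 0.1% reference resistors from the example above;
# the readings and the 0.5% meter spec are hypothetical placeholders.

references = [1, 10, 100, 1e3, 100e3, 1e6, 10e6, 100e6]          # Ohm, 0.1% parts
readings   = [1.0, 10.01, 100.1, 1.001e3, 100.2e3, 1.002e6, 10.05e6, 100.9e6]

spec_percent  = 0.5   # hypothetical "0.5% of reading" spec for the Ohm ranges
ref_tolerance = 0.1   # tolerance of the reference resistors themselves, in %

for nominal, measured in zip(references, readings):
    error_pct = (measured - nominal) / nominal * 100.0
    verdict = "OK" if abs(error_pct) <= spec_percent + ref_tolerance else "out of spec?"
    print(f"{nominal:>12,.0f} Ohm  read {measured:>14,.1f}  error {error_pct:+.2f}%  {verdict}")
```

The extra 0.1% in the verdict simply allows for the tolerance of the reference resistor itself.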
95% of the multimeters out there will never see a calibration lab, mostly due to cost, so my suggestion is a cheap and easy-to-do alternative.
Currently missing from my collection of accuracy-verification tools is an AC source that can produce a base frequency above 60 Hz, so as to be able to test accuracy over bandwidth (True RMS).
If someone can point out an easy-to-build design, please do so.
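For what it is worth, here is a small Python sketch of the kind of check such a source would make possible: a 1 Vrms sine is generated at a few test frequencies, its true RMS is computed numerically, and the result is compared with what a meter might display. The test frequencies and meter readings listed are hypothetical placeholders.

```python
import numpy as np

# Sketch of a bandwidth check: compare the numerically computed true RMS of a
# 1 Vrms sine at several frequencies against (hypothetical) meter readings.

def true_rms(samples):
    return np.sqrt(np.mean(samples ** 2))

fs = 1_000_000                      # sample rate of the simulated source, Hz
t = np.arange(0, 0.1, 1 / fs)       # 100 ms of signal

test_freqs     = [60, 400, 1_000, 10_000]       # Hz, hypothetical test points
meter_readings = [1.000, 0.999, 0.995, 0.930]   # V, hypothetical meter display

for f, shown in zip(test_freqs, meter_readings):
    v = np.sqrt(2) * np.sin(2 * np.pi * f * t)  # 1 Vrms sine at frequency f
    ref = true_rms(v)
    error_pct = (shown - ref) / ref * 100.0
    print(f"{f:>6} Hz  true RMS {ref:.4f} V  meter {shown:.3f} V  error {error_pct:+.2f}%")
```

A table like that, taken against a real source of known level, would show at which frequency the meter's AC reading starts to fall outside its spec.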