It is not a commercial meter, but one I am designing. The current source is 100 mA, controlled by a buried-zener voltage reference. I'm not too worried about the source, as the components are very tight tolerance. I am using a precision instrumentation amp to multiply the sense voltage by x1, x10, or x100 to increase the sensitivity, but this is the part that worries me. I have no accurate way to verify that the gain is actually correct. For a $30 IC it had better be, but I'd like to know.
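One partial check that doesn't need a calibrated standard: measure the same stable resistor at two adjacent gain settings and compare the ratio of the readings to the nominal gain ratio. The resistor's true value cancels out, so this tests relative gain accuracy only. A minimal sketch, with hypothetical readings:

```python
# Sketch: checking relative gain accuracy of the in-amp without a
# calibrated standard. Measure one stable resistor at two gain
# settings; the ratio of the readings should equal the nominal gain
# ratio (e.g. x100 / x10 = 10), independent of the resistor's true
# value. The readings below are hypothetical.

def gain_ratio_error(reading_low, reading_high, nominal_ratio=10.0):
    """Fractional error of the measured gain ratio vs nominal."""
    measured_ratio = reading_high / reading_low
    return (measured_ratio - nominal_ratio) / nominal_ratio

# Hypothetical meter readings of the same resistor at x10 and x100:
err = gain_ratio_error(reading_low=0.10312, reading_high=1.03135)
print(f"gain ratio error: {err * 100:.4f}%")
```

This won't catch an error common to all three gain settings, but it will flag a bad gain-setting resistor inside the in-amp.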
There are 0.01% resistors available in 100 ohms and up (I'm using one in the current source), but those values are too high for my design to measure. Once you get down into the fractional-ohm range, tight-tolerance resistors get very expensive.
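The arithmetic shows why the low end is hard: at 100 mA, a fractional-ohm resistor only develops microvolts to millivolts across it, which is why the gain stages are needed. A quick Ohm's-law sketch (the resistance/gain pairings are illustrative, not from the actual design):

```python
# Sketch: expected sense voltages at the 100 mA test current, showing
# why milliohm-range measurement needs amplification. Resistance and
# gain pairings are illustrative assumptions.

I_SOURCE = 0.100  # test current in amperes

for r_ohms, gain in [(1.0, 1), (0.010, 100), (0.001, 100)]:
    v_sense = I_SOURCE * r_ohms   # raw voltage across the DUT (V = I * R)
    v_out = v_sense * gain        # after the instrumentation amp
    print(f"{r_ohms * 1e3:>8.1f} mohm: sense = {v_sense * 1e6:>9.1f} uV, "
          f"x{gain} output = {v_out * 1e3:.3f} mV")
```

At 1 milliohm the raw sense voltage is only 100 µV, so even with x100 gain the meter is reading 10 mV, where offsets and noise matter.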
The eBay seller claims that the standards are tested working, but does not mention their being checked for calibration. His eBay reviews are positive, and one reviewer posted that the resistor he received was within 0.005% of nominal. So they seem legitimate. I just don't know much about these things.
The leads are 4-wire and shielded. There is a current-reversal switch. The meter will be a bench Fluke model; I'm not sure of the model number as it is not in front of me right now. I'm not expecting 0.1% accuracy from an instrument that costs $100 to build, but I would like to be able to check everything.
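The current-reversal switch is worth having at these voltage levels: thermal EMFs at the lead junctions add a fixed offset to the sense voltage, and averaging readings taken with the current in both directions cancels it. A minimal sketch of the arithmetic, with made-up numbers:

```python
# Sketch: why current reversal helps. A thermal EMF adds the same
# offset to the sense voltage in both current directions, so
# R = (V_forward - V_reverse) / (2 * I) cancels it. All values below
# are hypothetical.

def resistance_from_reversal(v_forward, v_reverse, i_amps=0.100):
    """DUT resistance from forward/reverse readings; fixed offsets cancel."""
    return (v_forward - v_reverse) / (2.0 * i_amps)

# Hypothetical: true R = 5 mohm, plus a 20 uV thermal-EMF offset.
v_fwd = 0.100 * 0.005 + 20e-6    # reading with +I
v_rev = -0.100 * 0.005 + 20e-6   # reading with -I (offset unchanged)
print(f"R = {resistance_from_reversal(v_fwd, v_rev) * 1e3:.3f} mohm")
# The 20 uV offset drops out and the result comes back to 5.000 mohm.
```

A 20 µV offset would be a 4% error on a single 500 µV reading, so the reversal buys real accuracy at the bottom of the range.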