There are three possible ways to calibrate the meter:
1) a resistor of known value (it must be less than the maximum range of the meter): in 4-wire mode, put the meter into calibration mode and measure the known resistor. The LCD will display the "measured" resistance and the modified current reading, in uA. Adjust the trimmer and repeat until the LCD displays the correct resistance (see the first sketch below the list).
2) a uA meter: put the uA meter on the milliohm meter as the DUT and let the milliohm meter measure its "resistance". The uA meter then reads the current flowing through itself. Adjust the trimmer until the milliohm meter's displayed current matches the uA meter's reading. For this you want a uA meter with as low a burden voltage as possible.
3) two resistors: pick two resistors of known value - they don't need to be precision resistors, and their exact values don't matter much. Pick the first resistor (R1) to be small (<1 ohm), 0.47 ohm or 0.22 ohm for example. Pick the second resistor (R2) in the 10 - 20 ohm range (again, no great precision needed). Measure R2 with your multimeter - most meters give a fairly good reading on a resistor of that value.
Put the meter in calibration mode - it will display measured resistance + modified current reading.
Put R1 on the meter - it should show one reading; then parallel it with R2 - the reading should drop by a small amount: R1 - R1 * R2 / (R1 + R2) = R1 * R1 / (R1 + R2) = R1 / (1 + R2 / R1). Since R2 >> R1, the change is small, and because the meter's own displayed reading stands in for R1, the only value you need to know accurately is R2 - which you have a fairly good measurement of.
Now you just need to adjust the trimmer to reproduce the expected change, re-reading R1 and recomputing as you go (see the second sketch below the list).
This approach is more tedious, but it works when your equipment is limited, and it works in 2-wire mode as well - I actually showed this approach a while back.
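To make the calibration-mode display concrete, here is a minimal firmware-side sketch of the model all three methods rely on: the meter computes R = V / I from the Kelvin-sensed DUT voltage and the firmware's idea of the test current, and the trimmer effectively adjusts that current constant. Everything here is an assumption for illustration - the ADC resolution, reference voltage, nominal test current, trimmer model, and the HAL names (adc_read_dut(), adc_read_trimmer(), lcd_printf()) are all hypothetical, not the meter's actual code.

```c
#include <stdint.h>

#define VREF_UV        2500000UL  /* assumed 2.5 V ADC reference, in uV   */
#define ADC_FULLSCALE     4096UL  /* assumed 12-bit ADC                   */
#define I_NOMINAL_UA    100000UL  /* assumed 100 mA nominal test current  */

/* hypothetical HAL hooks - replace with the target chip's drivers */
extern uint16_t adc_read_dut(void);      /* 4-wire DUT sense voltage      */
extern uint16_t adc_read_trimmer(void);  /* trimmer position, 0..4095     */
extern void lcd_printf(const char *fmt, ...);

void calibration_mode(void)
{
    for (;;) {  /* real code would pace and average the readings */
        /* DUT voltage in microvolts; 4-wire sensing means lead and
           contact resistance drop out of this measurement */
        uint32_t v_uv =
            (uint32_t)((uint64_t)adc_read_dut() * VREF_UV / ADC_FULLSCALE);

        /* the "modified current reading" shown on the LCD: nominal
           current scaled roughly +/-10% by the trimmer position */
        uint32_t trim = adc_read_trimmer();
        uint32_t i_ua = (I_NOMINAL_UA * 9) / 10
                      + (I_NOMINAL_UA / 5) * trim / ADC_FULLSCALE;

        /* measured resistance in milliohms: R = V / I
           (v_uv * 1000 still fits in 32 bits for a 2.5 V reference) */
        uint32_t r_mohm = v_uv * 1000UL / i_ua;

        lcd_printf("R=%lu.%03lu ohm  I=%lu uA\n",
                   (unsigned long)(r_mohm / 1000),
                   (unsigned long)(r_mohm % 1000),
                   (unsigned long)i_ua);
    }
}
```

Under this model, method 1 is simply: turn the trimmer until the displayed R matches the known resistor, at which point the firmware's current constant agrees with the real test current.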
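And for method 3, a small host-side helper (plain C, nothing chip-specific) that computes the target numbers; the values in it are just the example figures from above, not measured data. The point is that the expected parallel reading is predicted from the meter's own displayed R1 reading plus the multimeter-measured R2, so the true value of R1 never enters: if the meter's scale were off by a factor k, the displayed drop k*R1*R1/(R1+R2) only matches the drop predicted from the displayed reading, (k*R1)^2/(k*R1+R2), when k = 1.

```c
#include <stdio.h>

int main(void)
{
    /* example values: the meter's displayed reading for R1 alone,
       and R2 as measured on your multimeter (both in ohms) */
    double m1 = 0.470;
    double r2 = 10.0;

    double m_par = m1 * r2 / (m1 + r2);  /* expected R1 || R2 reading */
    double drop  = m1 - m_par;           /* = m1*m1 / (m1 + r2)       */

    printf("expected parallel reading: %.4f ohm\n", m_par);
    printf("expected drop:             %.4f ohm (%.1f mohm)\n",
           drop, drop * 1000.0);
    return 0;
}
```

With these example numbers it prints an expected parallel reading of about 0.4489 ohm and a drop of about 21.1 mohm. Turn the trimmer, re-read R1, recompute, and repeat until the displayed drop agrees with the prediction.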
Now, I just need to code all that into the chip.