I've built myself a milliohm meter. It measures ratiometrically, nothing fancy: two INA219s in series, one measuring the current through a known 1% 0.1 Ω reference resistor, the other measuring the voltage drop across the leads plus the DUT. The current direction switches back and forth to cancel some of the measurement errors.
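For concreteness, here is a minimal sketch of that ratiometric computation with current reversal. All names (`read_shunt_mv`, `read_dut_mv`, `set_polarity`, `R_REF`) are my own assumptions, not the actual firmware:

```python
R_REF = 0.1  # known 1% reference resistor, ohms (assumed value from the setup)

def measure_ratiometric(read_shunt_mv, read_dut_mv, set_polarity):
    """Ratiometric resistance: R_dut = (V_dut / V_shunt) * R_REF.

    Averages a forward and a reversed-current reading so that fixed
    offsets (e.g. thermal EMFs, ADC offset) largely cancel out.
    """
    samples = []
    for polarity in (+1, -1):
        set_polarity(polarity)                 # flip the test current
        v_shunt = polarity * read_shunt_mv()   # drop across R_REF -> current
        v_dut = polarity * read_dut_mv()       # drop across leads + DUT
        samples.append(v_dut / v_shunt * R_REF)
    return sum(samples) / len(samples)
```

Because only the ratio of the two voltage readings enters the result, gain errors common to both channels drop out; the reference resistor's 1% tolerance sets the scale accuracy.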
Now, the resistance of the leads - this is where people usually reach for 4-wire Kelvin test leads, and I get why.
Instead, I short my two-wire test leads, take a measurement, and store it in software as an "offset" (or "error") which I later subtract from the actual DUT reading.
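In code, the zeroing scheme I mean looks roughly like this (a sketch with hypothetical names, not my actual implementation):

```python
class MilliohmMeter:
    """Two-wire meter with software lead-resistance compensation."""

    def __init__(self, read_resistance):
        self._read = read_resistance  # returns raw ohms (leads + DUT)
        self._offset = 0.0            # stored lead resistance

    def zero(self):
        """With the leads shorted together, record their resistance."""
        self._offset = self._read()

    def measure(self):
        """Raw reading minus the stored lead offset."""
        return self._read() - self._offset
```

The assumption baked into `measure()` is that the lead resistance at zeroing time equals the lead resistance during the real measurement, which is exactly what I'm questioning below.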
What is the flaw in my approach? Why use 4-wire Kelvin if I can just short the leads? Convenience? The inconsistency of the shorting point? Contact resistance that changes from measurement to measurement?
Q2:
Most 4-wire Kelvin setups use shielded cable - where does the shield terminate inside the measurement equipment? Does it connect to ground? If so, which one: digital ground, analog ground, the star point where all grounds meet, or the metal chassis of the enclosure? How does this work?