Measuring a 0.1ohm resistor:
A reading of 35mV corresponds to 0.35ohm; subtracting the 0.25ohm offset gives 0.35ohm - 0.25ohm = 0.1ohm.
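The arithmetic above can be sketched as follows; note that the 100mV-per-ohm scale factor and the 0.25ohm offset are inferred from this one example reading, not separately specified, so treat them as assumptions:

```python
# Sketch of the reading-to-ESR arithmetic above.
# MV_PER_OHM and ZERO_OFFSET_OHM are inferred from the 35 mV example,
# not measured values from the actual meter.
MV_PER_OHM = 100.0       # assumed meter scale: 100 mV per ohm
ZERO_OFFSET_OHM = 0.25   # assumed offset subtracted from every reading

def reading_to_esr(mv):
    """Convert a millivolt reading to ESR in ohms."""
    return mv / MV_PER_OHM - ZERO_OFFSET_OHM

print(round(reading_to_esr(35), 3))  # the 35 mV example above -> 0.1 ohm
```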
Having built the two ESR meters, I can think of the following pros / cons:
1) analog read-out:
pros: no calibration required. Simple and intuitive read-out. The non-linear scale is actually desirable, as it provides more resolution at the low end of the ESR range. It is a full-range meter;
cons: you need an analog meter with a resistance range, and you are limited by the center-scale resistance of your meter - generally not a problem.
2) digital read-out:
pros: can be used with a digital multimeter; readings are easy to read off; "appearance" of precision;
cons: limited upper range; needs calibration.
edit1: the digital meter is slightly more current-hungry: 10mA vs. 8mA for the analog ESR meter.
edit2: to obtain accurate readings at the lower end, the digital meter needs a very fast op-amp in its precision rectifier, unlike the analog meter. The NE5532 (the op-amp I used here) has a practical lower limit of 0.1ohm. An LM358 / TL0xx would not work in this design.
With SMD parts, something like this can easily be built into an el-cheapo DMM (an 830, for example). Alternatively, an adapter can be made for either analog or digital meters.
I did both zero scale and full scale calibration.
Here is the meter measuring 5ohm (10ohm // 10ohm) on the left (the calibration point) and then measuring 10ohm on the right.
With a reading of 5.10ohm on the left, we should expect a reading of 10.20ohm on the right, vs. 10.18ohm actual.
Practically spot on.
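The calibration check above can be sketched in a few lines: the 5.10ohm reading at the 5ohm calibration point implies a scale factor of 1.02, which predicts 10.20ohm for a true 10ohm, against the 10.18ohm actually observed:

```python
# Sketch of the single-point calibration check above.
cal_true, cal_reading = 5.0, 5.10       # calibration point: 10 // 10 ohm
scale = cal_reading / cal_true          # meter scale factor, 1.02
predicted = 10.0 * scale                # expected reading for a true 10 ohm
observed = 10.18                        # reading actually obtained
error_pct = 100 * (observed - predicted) / predicted  # about -0.2%

print(round(predicted, 2), round(error_pct, 2))
```

An error on the order of 0.2% is well inside the meter's own tolerance, which is why the result reads as "practically spot on."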
Did a simulation: ESR measurement error is within 1-2% for ESR from 0.1ohm through 2ohm, 10% at 5ohm, and 29% at 10ohm -> the latter could be due to errors in measuring the current. At those levels of ESR, a small change in current corresponds to a large change in ESR.
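The sensitivity argument can be illustrated with an assumed constant-voltage model (not the actual meter circuit): a test voltage V drives the DUT through a series resistance R_S, the meter measures the current I, and ESR = V/I - R_S. Then |dESR/dI| = V/I^2, which grows rapidly as ESR rises and I shrinks. All component values below are illustrative assumptions:

```python
# Illustrative sensitivity sketch under an assumed constant-voltage test model.
# V and R_S are made-up values, not the actual meter's; the point is the trend.
V, R_S = 0.05, 1.0   # assumed test voltage (V) and series resistance (ohm)

sensitivity = {}
for esr in (0.1, 1.0, 2.0, 5.0, 10.0):
    i = V / (R_S + esr)            # current through the DUT at this ESR
    sensitivity[esr] = V / i**2    # |dESR/dI|: ohms of ESR error per amp of current error
    print(f"ESR={esr:5.1f} ohm  I={1000*i:6.2f} mA  dESR/dI={sensitivity[esr]:7.1f} ohm/A")
```

Under these assumed values the sensitivity at 10ohm is about 100 times that at 0.1ohm, which is consistent with the simulated error climbing from 1-2% at the low end to 29% at 10ohm.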