For current measurement on VI (voltage/current) instruments, I've found that the current-sense resistor value is always chosen so that R x Max current = 2 V. For the 1 A current range, a 2 Ohm high-precision resistor is used for current measurement; for the 100 uA range, a 20 kOhm resistor is used.
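To make the sizing rule concrete, here is a minimal Python sketch (the 2 V full-scale drop and the two ranges are taken from the values above; the loop is just for illustration):

```python
# Sense-resistor sizing for a fixed 2 V full-scale drop.
V_FULL_SCALE = 2.0  # volts across the shunt at range full scale

for i_max in (1.0, 100e-6):         # the 1 A and 100 uA ranges
    r_sense = V_FULL_SCALE / i_max  # R = V_fs / I_max
    print(f"I_max = {i_max:g} A -> R_sense = {r_sense:g} Ohm")
```

This reproduces the 2 Ohm and 20 kOhm values mentioned above.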
There are some drawbacks to using a 2 V drop:
1. Large voltage drop, which requires the amplifier to deliver a higher output voltage; 2 V is 10% of a +/-10 V output range.
2. High power consumption and heating in the resistor. At 1 A, the dissipation in the resistor is P = V x I = 2 V x 1 A = 2 W. The resistor needs a large package to handle that power/heat, and its value may drift as it warms up, degrading measurement accuracy.
Why not use a smaller voltage drop, for example 0.2 V, for current measurement? Here is a quick comparison of what I mean:
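A minimal Python sketch, assuming the 1 A range and the two full-scale drops discussed above (0.2 V being my proposed alternative; everything else follows from Ohm's law):

```python
# Compare shunt value and full-scale dissipation on the 1 A range
# for the usual 2 V drop vs. a proposed 0.2 V drop.
I_MAX = 1.0  # amps, full scale of the 1 A range

for v_fs in (2.0, 0.2):
    r_sense = v_fs / I_MAX   # required shunt: R = V_fs / I_max
    p_max = v_fs * I_MAX     # full-scale dissipation: P = V * I
    print(f"V_fs = {v_fs} V -> R = {r_sense} Ohm, P = {p_max} W")
```

On the 1 A range, the 0.2 V drop would cut the full-scale dissipation tenfold (0.2 W instead of 2 W).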
Thanks.