Why is a "2V voltage drop" used by VI instruments to measure current?
David Hess:
The old standard 3-1/2 digit ADCs accepted either a 2.000 volt or a 0.200 volt input, and the 4-1/2 digit ADCs accepted 2.000 volts. The change to 0.200 volts is done by dividing the reference by 10, but dynamic range considerations make this difficult with a 4-1/2 digit ADC. For a long time, precision ADCs were most often 2000 or 20,000 counts, making 2 volts the most natural input range. Only later did 4000, 6000, and 10,000 count converters become more available, with inputs of 4 volts, 6 volts, and 10 volts.
desert:
There are many kinds of VI instruments; to be specific, the VI instruments I am talking about are SMUs (Source and Measurement Units) for semiconductor wafer and component testing. The voltage source and measurement ranges run from ±50 V maximum down to ±5 V minimum, and the current source and measurement ranges from ±1 A maximum down to ±10 µA minimum. This kind of instrument has tough accuracy requirements, especially on the low current ranges. For the ±10 µA current range, the accuracy specification is ±(10 nA + 0.05% of setting/reading + 0.2 nA/V × |V|). I guess using a "2 V voltage drop" for current measurement is the way to achieve high accuracy on the low current ranges, and keeping the same voltage drop across the different current ranges allows the same current measurement circuit to be reused. What is the main problem with achieving good current accuracy while using a smaller voltage drop (such as 0.2 V)? Any example to show the calculation and lineup?
David Hess:
--- Quote from: desert on December 21, 2022, 04:02:28 am ---
What is the main problem with achieving good current accuracy while using a smaller voltage drop (such as 0.2 V)? Any example to show the calculation and lineup?
--- End quote ---
2000 counts (3-1/2 digits) at 0.2 volts gives 100 microvolt resolution. 20,000 counts (4-1/2 digits) at 0.2 volts gives 10 microvolt resolution, which was much more difficult to achieve in the past, especially with an integrated CMOS converter. Automatic zero corrects for drift but not flicker noise, which might exceed that level. (1) This is why old multimeters that relied on these converters had a minimum 200 millivolt range at 3-1/2 digits and a 2 volt range at 4-1/2 digits, for the same best resolution of 100 microvolts.

(1) Old datasheets do not give details on the noise levels but, excluding flicker noise, those old CMOS designs had peak-to-peak noise of at least several microvolts. The 4-1/2 digit converters might also have had linearity problems at a resolution of 10 microvolts.
Kleinstein:
The general problem with current measurements is measuring relatively small voltages. 5.5 digits with a 200 mV burden already requires 1 µV resolution. At that level, amplifier drift, noise, and thermal EMF can become a problem. One issue here is heat from the shunt causing thermal gradients near the shunt, so the shunts need not only a low TC but also low thermal EMF. Modern zero-drift amplifiers can help quite a bit, but the thermal design to avoid gradients in the wrong places still needs care (e.g. good symmetry).

For the SMUs, the problem is that the very low current ranges may not work well with an AZ amplifier and may have to use classic JFET or similar amplifiers. In that case, drift of a few µV is hard to avoid. Using separate amplifiers for the higher currents is extra effort, but could still be worth it.

For the early handheld meters, ADC counts correspond directly to display steps. 100 µV accuracy was reasonably easy; 10 µV was already demanding for a low-power design. So a 200 mV range for the ADC, and the same drop at the shunt, was the natural choice, as it avoids extra amplification.