AFAIK the Datron 1281 and Solartron 7071/81 use chopping the whole way - at least they can. The Prema 6048 also has only the one chopper amplifier in the input path. The reference path uses precision BJT parts, though, and in a way that an offset there mainly affects the gain, not so much the zero point. The point here is to have low drift not only for the input amplifier, but also for the ADC.
Many other multi-slope ADC versions (e.g. the HP3458) have quite some drift and 1/f noise, so they need the AZ switching not just for the amplifier, but also for the ADC.
For the common ADC type with switching at the integrator input, the offset drift comes not just from the op-amps, but also from resistor drift. So an AZ op-amp alone would not help much with the ADC drift.
An electrometer input with a high voltage range (e.g. some 100 V) may be lower noise than a more classic DMM with a 1:100 divider and then an amplifier. Here it is not only the amplifier noise, but also the thermal noise of the divider resistor (e.g. 100 K at the low end has some 40 nV/sqrt(Hz) of noise). For the comparison it still depends on the frequency of interest, as the electrometers usually have quite some 1/f noise. For lower noise it would help to have less initial division and then use a larger input range (e.g. +-20 V), even if this needs an extra divider after the buffer. Also the usual 1:100 divider plus x10 gain for the 100 V range is not ideal.
The reason I was thinking of an electrometer style input is that the divider following the bootstrapped high impedance buffer can be lower impedance, and thus have lower Johnson noise, while the bootstrapped buffer has the same noise as a non-bootstrapped buffer.
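For anyone who wants to check the divider noise numbers, a quick Python sketch (room temperature of about 290 K assumed):

```python
import math

def johnson_noise_density(r_ohms, temp_k=290.0):
    """Thermal (Johnson) noise voltage density of a resistor in V/sqrt(Hz)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4.0 * k_B * temp_k * r_ohms)

# 100 k at the low end of a 1:100 divider
print(johnson_noise_density(100e3))  # ~4.0e-8, i.e. ~40 nV/sqrt(Hz)
# a 10x lower divider tail is sqrt(10) quieter, ~12.7 nV/sqrt(Hz)
print(johnson_noise_density(10e3))
```

This reproduces the ~40 nV/sqrt(Hz) figure for 100 K, and shows why a lower impedance divider behind a bootstrapped buffer helps.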
.. Does anybody apply chopper stabilization to the whole signal path? ..
There are integrated chopper stabilized amplifiers with a clock input. One can run several of them in sync, let's say input buffer, reference current source and integrator. Don't know whether the two devices in an OPA2189 run in sync. Thermal EMF between the amplifiers will still be left, so an oven is good to have.
If we believe the Zürich paper linked by Echo88, temperature controlled amplifiers may be good enough without autozero / chopper stabilization. With their 20 °C FET pair they got 100 nVpp at 100 s integration time over several days. Don't know how this compares to the OPA189 mentioned above and what it means for 1 second integration time.
Regards, Dieter
Don't know whether the two devices in an OPA2189 run in sync. Thermal EMF between the amplifiers will still be left, so an oven is good to have.
I have used chopper stabilized duals and I think they do run using the same clock because I never saw idle tones show up in the output when they were cascaded. If they used separate clocks then there would be problems.
A circuit sketch for a voltmeter front end with four ranges in a factor 2x pattern, e.g. 3 V, 6 V, 12 V and 24 V. It shows how to implement force-sense drive. In my opinion the factor 10 pattern of many commercial meters is a budget version; I mean, in audio we also accept 6 dB of headroom, but not 20 dB.
Regards, Dieter
Edit: For this to work one needs a 2 x 4 MUX that is good for 24 V.
A high resolution ADC can cover quite some range, so the 1:10 steps are not that bad, even if one operates at only 10% of the range. More switches for the gain setting also add leakage current, and a divider for a gain of 2 has a relatively high output impedance, which makes it sensitive to leakage and also adds some noise.
Finer spaced ranges may be a slight advantage with an ACAL procedure to measure the gain steps relative to each other: more but smaller steps may give slightly better accuracy and maybe a way to do extra checks.
In the implementation shown, the leakage of the switches may be an issue for the higher ranges. The 4052 switch is only rated for a limited voltage.
It also depends a lot on how the ADC is built, and this one looks rather specific to the Prema type ADC with continuous integration. In other versions one needs to turn the input off, or use symmetry by having the switch resistance also in the reference path. It may in some cases be possible to have 2 ranges at the ADC (e.g. 5 V and 10 V) and then keep the more classic 1:10:100 as amplifier gain settings.
To work with the AD7177, one can connect the lowest range resistor to ground and use its voltage instead of current, probably with a buffer plus an inverting buffer.
Regards, Dieter
This is the simplest front end I can imagine (I had a thread about it in the past). What you may need in addition are the input switches for AZ and calibration..
Bootstrapped OPA140 (9V, 3.4mApp output current in this example), ultra low leakage input protection. LT5400 (18k/2k) divider, and differential buffer (2x ADA4528).
With +/-12V input you get 2.4Vpp differential low impedance output for the ADCs (CM is 2.5V).
Not elaborated in HW yet, however.
Here is a drawing of an 8.5 digit voltmeter I made.
Does not make sense to me. The AD7177 does have internal buffers on the inputs & references, so all it takes to make a nice voltmeter is to put a 4:1 (7.5 : 2.5 MOhm) resistive divider in front. The money is already paid for that luxury.
A design of the AFE is only required with an ADC chip w/o buffers.
BTW, my tests with the AD7172 show that I can't improve on the internal buffers with external circuitry, even using the best of the best ICs on the market. Moreover, any external front end significantly degrades SNR, especially when it comes to AZ op-amps.
The internal buffers, especially at the input, are not ideal; in particular the INL can be an issue. With the typical 3.5 ppm INL with the internal buffers the ADC is no longer that interesting for a high grade DMM.
A 7.5 M : 2.5 M divider at the input would also add quite some noise from the Johnson noise alone, and chances are the input switching would be slow with such a high source resistance. Such high resistors also tend to be not very stable. So at the very least one would need a buffer at the input and then a lower resistance (e.g. 30 K and 10 K) divider. There are better buffers than the internal ones, though this is not a simple task.
The other point missing is that with only pseudo-differential drive (one side essentially fixed at a virtual ground) one only gets half the range, and the INL may also not be as good (the INL specs are for fully differential input signals). One can add differential drive by moving the input low side (= the other input of the ADC). It may be a bit hard to understand and can add some complications (e.g. with gain at the input and electronic switching for current ranges), but it is a nicely working way to drive a differential ADC.
I don't have a full reverse engineering of the SDM3065 front end, but it looks like it uses a similar system to get a +-20 V input range with only a +-15 V supply.
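A toy model of the two drive schemes may help make the "half the range" point concrete (illustrative only; an ideal driver limited to +-2.5 V around a 2.5 V common mode is assumed):

```python
def pseudo_diff(vin, vcm=2.5):
    """Pseudo-differential drive: AINM held at the common mode voltage,
    AINP carries the full signal swing."""
    return (vcm + vin, vcm)                # differential = vin, full swing on one pin

def true_diff(vin, vcm=2.5):
    """True differential drive: the low side is driven to the inverse
    of the high side, so each pin moves only half as far."""
    return (vcm + vin / 2, vcm - vin / 2)  # differential = vin, half swing per pin

# With each driver limited to +-2.5 V around the common mode,
# pseudo-differential reaches only +-2.5 V differential,
# while true differential reaches +-5 V differential (twice the range).
ainp, ainm = true_diff(5.0)
print(ainp, ainm, ainp - ainm)  # 5.0 0.0 5.0
```

Same differential result, but the true differential version gets there with only half the swing per pin, which is why it doubles the usable range on a given supply.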
..One can add differential drive by moving the input low side (= the other input of the ADC). It may be a bit hard to understand and can add some complications (e.g. with gain at the input and electronic switching for current ranges), but it is a nicely working way to drive a differential ADC.
I don't have a full reverse engineering of the SDM3065 front end, but it looks like it uses a similar system to get a +-20 V input range with only a +-15 V supply.
Could you elaborate a bit?
Would this be the case you have mentioned?
Would ~1 k (or 500) seen by the ADC be low enough?
For a true differential drive one would drive the virtual ground to the inverse of the output of U1. This does not have to be super precise as the error there would only cause a small common mode signal to the ADC that is reasonably well suppressed.
It depends on the ADC how low the driving impedance should be. For the internal buffers 1 K and even 10 K should be OK. Without internal buffers 500 Ohm may be OK, but not ideal (e.g. higher INL and gain error) - one has to check the ADC's datasheet on this. So one may need buffers between the divider and the ADC.
The BS supply as shown for U1 may be borderline when it comes to stability against oscillation. Usually the BS supply part should be slowed down a bit, so one may want some RC filtering before the bases of the transistors. It may work in the simulation, but this could be due to a bug in quite a few op-amp models that don't properly handle a moving supply and use a GND reference instead of the true supply somewhere in the model.
For a true differential drive one would drive the virtual ground to the inverse of the output of U1. ..
I think the ADC in the above v42 (and below v42a) sees the inputs as truly differential.. The ground there is the ADC ground.
Or am I mistaken somehow?
Below v42a with some stability improvement (and yes, LTSpice has shown oscillation with some inputs without the BS compensation, but primarily the ADC input is the case we have to solve)..
The last circuit produces the differential drive by moving the GND point around. So the ADC's supply would have to be bootstrapped relative to the GND symbol, and the GND symbol is not the power supply ground. One could do it this way, but would not get the relatively large possible input range (e.g. 20 V) for the given supply.
What about this config.. (v43)..
Input +/- 20V
Output +/-2V diff
Vcc +/-15V
LT5400 18k/2k divider (or 4.5k/0.5k with more current off the opamps).
That circuit is about what I suggested, with the extra inverter. Chances are one would need some extra capacitance (e.g. 1 nF range) in parallel to R11 to slow down the inverter part a little and reduce ringing / gain peaking. Due to the divider the negative side of the ADC signal now moves too much and would also need a divider. So the divider should not be 18 K and 2 K, but more like 9 K : 2 K : 9 K with the ADC signals taken across the 2 K resistor. The AD7177 has an FS range of up to +-5 V, and the divider would thus ideally be more like 1:1:1 (15 V FS range) or 3:2:3 (20 V FS range) and not 9:2:9.
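The divider variants can be compared with a small helper (a sketch; a symmetric string driven from both ends with the signal and its inverse, tapped across the middle resistor, is assumed):

```python
def sym_divider(r_top, r_mid, r_bot):
    """Differential attenuation and output impedance of an R_top : R_mid : R_bot
    string driven from both ends, with the ADC connected across R_mid."""
    total = r_top + r_mid + r_bot
    ratio = r_mid / total                    # differential attenuation
    z_out = r_mid * (r_top + r_bot) / total  # impedance seen by the ADC
    return ratio, z_out

for r in [(9e3, 2e3, 9e3), (1e3, 1e3, 1e3), (3e3, 2e3, 3e3)]:
    ratio, z = sym_divider(*r)
    print(r, f"ratio={ratio:.3f}", f"Zout={z:.0f} ohm")
# 9k:2k:9k -> ratio 0.100, Zout 1800 ohm
# 1k:1k:1k -> ratio 0.333, Zout  667 ohm
# 3k:2k:3k -> ratio 0.250, Zout 1500 ohm
```

So the 1:1:1 and 3:2:3 versions attenuate less (making better use of the ADC's FS range) and also present a lower impedance to the ADC than the 9:2:9 string.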
The OPA140 is a good op-amp, but one may still want some way to check its offset or do auto zero switching at the input.
Another point is that it can be tricky to add gain for lower voltage ranges.
Maintaining the low output impedance, the divider ratio and the low op-amp output current - that is not an easy task with the two op-amps, imho..
AZ input switch - any suggestion, btw?
PS: below with a 40 Vpp square wave. You may see that with DC the output error is much smaller compared to the 5 Hz sine wave. I wonder how the big boys handle the propagation delay through all those (RC compensated) op-amps..
1.
The internal buffers, especially at the input, are not ideal; in particular the INL can be an issue. With the typical 3.5 ppm INL with the internal buffers the ADC is no longer that interesting for a high grade DMM.
2.
A 7.5 M : 2.5 M divider at the input would also add quite some noise from the Johnson noise alone, and chances are the input switching would be slow with such a high source resistance. Such high resistors also tend to be not very stable. So at the very least one would need a buffer at the input and then a lower resistance (e.g. 30 K and 10 K) divider. There are better buffers than the internal ones, though this is not a simple task.
3.
The other point missing is that with only pseudo-differential drive (one side essentially fixed at a virtual ground) one only gets half the range, and the INL may also not be as good (the INL specs are for fully differential input signals). One can add differential drive by moving the input low side (= the other input of the ADC). It may be a bit hard to understand and can add some complications (e.g. with gain at the input and electronic switching for current ranges), but it is a nicely working way to drive a differential ADC.
I don't have a full reverse engineering of the SDM3065 front end, but it looks like it uses a similar system to get a +-20 V input range with only a +-15 V supply.
1. The internal buffers of the AD7177 are about as good as the ADC itself. Picture below. About 1 ppm at best, so I don't know if it's any good for 8.5 digits, more like 6 digits?
2. Regarding the stability of 7.5 M resistors: a 75 x 100 k resistor matrix is still lower in cost than an AFE.
Input buffers should have ultra low bias current to work with 10 M (in case it is not a divider, but the output impedance of the voltage source being measured), and here we have a complication: I'm sure the overall stability / linearity / noise of the OPA189 is much worse when interfaced with 10 M than the cheapest 7.5 M resistors one can get. There was a topic recently regarding input bias problems with AZ amps, the OPA18x.
3. This is the only argument I agree with: differential drive versus INL. But this solution still needs to be checked mathematically, whether it is worth building an SE-DIFF converter if the ADC never breaks 1 ppm INL anyway. And what would the difference be? Since AD doesn't provide SE INL data, there is a chance there would be no difference at all.
AZ input switch - any suggestion, btw?
I am just thinking out loud here.
If the switching is also bootstrapped, then it does not need to handle high voltages, *and* zeroing of the input buffer can occur *at* the common mode input voltage removing all common mode errors like the old Intersil designs. The control signals will need to be level shifted, but that is easier than implementing a low leakage high voltage switch.
Of course with bootstrapping there should be no common mode errors anyway, but is that really the case? As I recall, the common mode rejection of the operational amplifier will be multiplied by the open loop gain in this case, so nonexistent.
I wish there was a way to remove the flicker noise. There are dual path designs which do, but then the input bias current of the chopper is added to the input bias current of the buffer defeating the advantage of the buffer. The OPA140 has such a low input bias current that several could be used in parallel to make some improvement.
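The paralleling trade-off can be put in numbers: uncorrelated voltage noise averages down as 1/sqrt(N), while the input bias currents simply add. A quick sketch with illustrative single-amplifier figures (assumed round numbers, not datasheet values):

```python
import math

def paralleled(n, en_single, ib_single):
    """Effective input noise density and total bias current of n paralleled
    buffers whose outputs are averaged (uncorrelated voltage noise assumed)."""
    return en_single / math.sqrt(n), ib_single * n

# illustrative figures: 5 nV/sqrt(Hz) voltage noise, 1 pA bias current
en, ib = 5e-9, 1e-12
for n in (1, 2, 4):
    e, i = paralleled(n, en, ib)
    print(f"n={n}: {e * 1e9:.2f} nV/sqrt(Hz), {i * 1e12:.0f} pA")
# four in parallel: voltage noise halves, bias current quadruples
```

With a pA-class part the quadrupled bias current may still be tolerable, which is the point made above about the OPA140.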
Using only a pseudo-differential input limits the voltage range to +-2.5 V and thus limits the SNR. In addition, the single ended mode may well have even order INL parts. The true differential mode tends to suppress the even order INL contributions, as seen in the INL curves with a relatively point symmetric curve and thus little even order INL part. The datasheets tend to show the good points and, unless extra bad or normally expected, rather leave out the weak points.
The typical specs are 1 ppm without the buffer and 3.5 ppm with the buffer. The difference may not be that visible in the graph, but the curve without the buffers still looks a little better.
The CERN digitizer showed even better INL, at least for positive voltages. So the 1 ppm from the datasheet may be based on a not so ideal layout / circuit. One weak point of the switched capacitor ADCs is that the ADC chip itself is only part of the solution, and the buffers / amplifiers for the signal and reference can affect the INL, noise and drift quite a bit. This especially means that one should still check the INL of the actual circuit unless it is a 1:1 copy of the test circuit used for the specs, which is often not shown in detail. The requirements with the reaction to short charge pulses are not that easy to tell from specs alone - it may need real world tests.
I totally agree that a 1 ppm INL spec is not really what one expects at 8.5 digits, but still better than most 6 digit meters. So maybe more like a compromise at 7 digits. 3.5 ppm is more like 6 digits, and not even a good one.
At the high end, and already at 6 digits, one usually wants a high (e.g. > 10 Gohm) input impedance. So a 10 Mohm input divider for all ranges is not really an option. It is kind of needed for the high voltages, like > 20 V, where it gets tricky to build an amplifier. So I don't see a point in a 10 M divider from 10 V to 2.5 V. Using multiple resistors combined still does not solve the problem with possible leakage on the PCB. It does not take much leakage to get a few ppm error across a 7 M resistor. Up to some 20 or 25 V there is the option to use a driven low side and conventional +-15 V supply switches. Another point is the input impedance of the buffer - this may not be that linear compared to some 2 Mohm from the 7.5 : 2.5 M divider output. Already with a more conventional 9.9 M + 100 K divider the input impedance of the amplifier / buffer and, if present, FET switches can be an issue.
So even there the OPA189 is likely not a good choice. More suitable may be an MCP6V76 or AD8628 with a bootstrapped supply (to get very high input impedance).
For the signal source one often still has way less than 10 M, usually even less than 100 K. With the common > 10 Gohm specs one would need less than 10 K to keep at least 1 ppm accuracy for sure.
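The loading error from a finite input impedance is just a voltage divider, error ≈ R_source / (R_source + R_in), so the numbers above are easy to check:

```python
def loading_error_ppm(r_source, r_in):
    """Relative reading error (in ppm) from the divider formed by the
    source resistance and the meter's input resistance."""
    return 1e6 * r_source / (r_source + r_in)

print(loading_error_ppm(10e3, 10e9))   # ~1 ppm: 10 k source into a 10 Gohm input
print(loading_error_ppm(100e3, 10e9))  # ~10 ppm
print(loading_error_ppm(10e3, 10e6))   # ~1000 ppm: same source into a 10 M input
```

This is why a 10 K source against a > 10 Gohm input keeps the loading error at the 1 ppm level, while a 10 M divider input would not.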
If the switching is also bootstrapped, then it does not need to handle high voltages, *and* zeroing of the input buffer can occur *at* the common mode input voltage removing all common mode errors like the old Intersil designs. The control signals will need to be level shifted, but that is easier than implementing a low leakage high voltage switch.
Of course with bootstrapping there should be no common mode errors anyway, but is that really the case? As I recall, the common mode rejection of the operational amplifier will be multiplied by the open loop gain in this case, so nonexistent.
Bootstrapping the switches only works part of the way. When using classical AZ switching like in the HP meters, there still needs to be a switch to block the full voltage. One can use bootstrapped switches for the first part / precharging phase at least. In the parallel thread
https://www.eevblog.com/forum/metrology/analog-frontends-for-dmms-approaching-8-5-digits-discussions/msg4454926/#msg4454926 there are suggestions and even some preliminary test results that look very promising. I see the main point in reducing the switching spike, which can be quite annoying and hard to compensate with classical switches.
For a chopper stabilized amplifier one can use bootstrapping for the switching part, e.g. as used in the Datron 1281 and likely internally in many of the high voltage AZ op-amps like the OPA189.
The bootstrapped amplifier supply should essentially avoid the CM error and also increase the input impedance (voltage dependent input bias current). Bootstrapping is relatively easy for a buffer, and a bit more tricky, but still possible, for an amplifier.
~~~~~~~~~~~~~~~
At the high end, and already at 6 digits, one usually wants a high (e.g. > 10 Gohm) input impedance. So a 10 Mohm input divider for all ranges is not really an option. It is kind of needed for the high voltages, like > 20 V, where it gets tricky to build an amplifier. So I don't see a point in a 10 M divider from 10 V to 2.5 V.
What you are saying, it's very nice to have:
1 G input impedance + ultra linear + low noise (bandwidth & 1/f).
The problem is you can't have all 3 sides. Dual stages (a non-inverting high impedance buffer followed by a low voltage second stage right in front of the ADC, with a 1/4 divider between the stages, or an inverting second stage) degrade the 1/f and broadband noise to an unacceptable level. An AZ op-amp is not a cure, since it introduces wide bandwidth noise, and the 1 G is in question as well as the linearity - the chopping frequency may disrupt the ADC completely. Building an AZ stage out of discrete parts is also not an option: there are no switches with low enough charge injection, and there is no access to the internal clock chain of the ADC.
So my point is you can't make a buffer that outperforms the internal one, simply because it is synchronized to the internal clock structure of the ADC. It is "integrated" in the signal processing path. And it has a switch matrix in front of it, providing offset / full scale / linearity calibration capability whenever the end user likes.
..thus it seems no new particles will be discovered at CERN after the installation of the HPM7177 units..
The highest accuracy is only needed and achievable with a relatively low source impedance (e.g. < 10 K to maybe 100 K). If the buffer has >> 1 Gohm input impedance, this does not mean it must also work with a comparable source impedance. Even if it does not work well with a 1 M source, this could be acceptable. There would be the option to have multiple inputs for different sources.
A configuration with buffer - lower resistance 1/4 divider - buffer (ADC internal or external) does not degrade the noise performance very much, though the divider resistors add some noise, and too low a resistance can run into thermal INL issues.
I see 2 ways to implement this:
1) A bootstrapped AZ op-amp (e.g. AD8628) at the input and, if needed, extra switching (ADC internal or external) between the divider and buffer.
2) Switching at the front and then using non-AZ buffers (e.g. OPA140 with BS supply and maybe current driver support).
The first way is about how the Keithley 2002 and Datron 1281 work on their 20 V range, though with a multislope ADC and a 1/2 divider.
The Keithley 2002 does however suffer from some LF noise: I am still not sure about the origin: could be the AZ part of the input buffer, thermal fluctuations or a software oddity in how AZ is handled.
I have tested the AD8628 with some moderate RC input filtering, and this is sufficient to largely isolate the AZ amplifier from the source impedance. It still works OK with a 10 M source - not great, but still good enough. The measured input impedance, including protection and switching at the input, is in the 300 Gohm range (that is some 3 pA of change in the bias per 1 V change in the voltage). The divider and 2nd buffer would isolate the input buffer from the ADC, so one should not expect interference in either direction. Switching at the relatively low resistance divider should be no problem.
I still have some LF noise (~ 20 nV, likely thermal and supply), but this is not an issue for the 20 V range, mainly for 200 mV.
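For completeness, the ~300 Gohm figure follows directly from the measured bias change per volt (a trivial check, but it shows the conversion):

```python
def effective_input_resistance(delta_v, delta_i):
    """Effective input resistance from the change in bias current per
    change in input voltage."""
    return delta_v / delta_i

# some 3 pA of bias change per 1 V of input change
r = effective_input_resistance(1.0, 3e-12)
print(f"{r / 1e9:.0f} Gohm")  # ~333 Gohm
```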
The 2nd way is how the HP3458 works, though with only 1 amplifier stage and no divider, as there is only a 10 (12) V range, just like the ADC provides. Switching at the input can suppress LF noise and offsets from both buffer stages. The somewhat tricky part can be to keep the switching spike small, so that it works with more sensitive sources. The full swing switching and dead time make filtering a bit problematic.
So I don't see a point in a 10 M divider from 10 V to 2.5 V.
A 10M input resistance on all ranges, including the low voltage ones, can be important for consistency of measurements. It is disconcerting when a range switch from say 2 to 20 volts results in a change of reading way outside of the accuracy specifications because the source resistance was not zero ohms.
I ran into this most recently when using an external high voltage divider probe which expected a 10 megohm input resistance on the voltmeter. I had 4 voltmeters and 2 results because one of them had a high resistance input instead of 10 megohms. When that one voltmeter was manually switched to its 20 volt range, forcing a 10 megohm input, then all 4 readings were consistent.
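To illustrate why the probe cares about the meter's input resistance: a hypothetical 100:1 probe that relies entirely on the meter's 10 M input as its bottom leg (an assumed construction, real probes may include an internal bottom resistor) reads grossly wrong into a high-Z input:

```python
def probe_ratio(r_series, r_meter):
    """Division ratio of a probe's series resistor against the meter's
    input resistance acting as the bottom leg of the divider."""
    return r_meter / (r_series + r_meter)

r_series = 990e6  # hypothetical series resistor for a 100:1 probe into 10 M

print(1 / probe_ratio(r_series, 10e6))  # 100.0 : the designed-for ratio
print(1 / probe_ratio(r_series, 10e9))  # ~1.1  : almost no division with high-Z
```

This matches the observation above: forcing the meter onto a range with the 10 megohm input restores the expected ratio and brings all readings back into agreement.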