I asked essentially the same question here:
https://www.eevblog.com/forum/testgear/frequency-response-121046/msg1654637/#msg1654637
I checked that page out and yeah, I was also looking for a meter that would be acceptable. The only one I found so far with ±0.2% of reading + 0.05% of range was that Siglent, but they want 500 bucks!
I'm trying to do this under $150 if possible. I'm not sure a used meter would still be within the manufacturer's specifications, since the voltage reference is only guaranteed for about a year.
Thanks for the suggestion.
Just did some simulations, and they are in parallel.
It really increases the error quite a bit.
hmmm
I will ask just one question re. the DIY route: how would you calibrate your converter? That is, how could you be sure the result is correct to your required accuracy unless you can check it over the needed level and frequency ranges, either with a calibrator or with a generator and a calibrated meter several times more accurate than what you are trying to achieve?
Cheers
Alex
I have a 10 V TI reference IC coming, and I was going to calibrate my meters to it. I'm hoping the Vrms readings on all three will then match up too.
Then, once I have the 10 V reference, I was going to check where that stands on my scope in reference to the probes, DC.
Then I was going to apply a 20 Vpp 60 Hz signal, referenced against the 10 V, and check that against my scope.
I really wish I knew how to change the actual voltage reference point on the scope in relation to the divisions.
The old analog scopes used to allow you to do that.
Oh well, I will simply have to observe the peak value on the scope, check it against the 10 V ref, and then add or subtract the difference to any measurement, which sucks.
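The arithmetic behind that check can be sketched as follows. For an ideal, distortion-free sine (an assumption; a real generator will deviate slightly), a 20 Vpp signal should peak exactly at the 10 V reference, and the meters should read Vp/√2:

```python
import math

# Expected readings when checking a 20 Vpp sine against a 10 V DC reference.
# Assumes an ideal sine wave (an assumption; real generator output differs slightly).
vpp = 20.0
vp = vpp / 2               # peak should line up with the 10 V reference
vrms = vp / math.sqrt(2)   # what an AC RMS meter should read

print(f"Vp   = {vp:.4f} V")    # 10.0000 V
print(f"Vrms = {vrms:.4f} V")  # 7.0711 V

# If the scope shows a peak offset dv at the 10 V point, the same fractional
# correction applies to RMS readings of that waveform (dv here is hypothetical):
dv = 0.05
corrected_vrms = vrms * (vp + dv) / vp
print(f"corrected Vrms = {corrected_vrms:.4f} V")
```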
Anyways, I took that voltage supervisor chip off of the uCurrent, and the noise was still there.
I did measure the correct Vrms on the meters from the uCurrent. The ground of the scope was feeding the meters 2 mVrms, probably from probe noise.
So I was able to successfully measure 4.8 mArms (4.8 mVrms) at 60 Hz, and the noise can be cancelled out by putting the scope in average mode, increasing the measurement depth, and setting the trigger to video.
The signal came out pretty clean.
After that I increased the frequency up to 100 kHz; the peaks were the same on the scope, but the Vrms on both the scope and the meters was registering only about 3.6 mVrms.
Edit: 100 kHz
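If that drop from about 4.8 to 3.6 (same units) at 100 kHz were explained by a single-pole low-pass response, which is an assumption, not something the tests establish, the implied -3 dB corner can be back-calculated:

```python
import math

# Back-calculate the -3 dB corner implied by a reading dropping to 3.6/4.8
# of its low-frequency value at 100 kHz, assuming a single-pole rolloff
# (an assumption: |H| = 1/sqrt(1 + (f/fc)^2)).
f = 100e3
ratio = 3.6 / 4.8  # 0.75 of the low-frequency reading
fc = f / math.sqrt((1 / ratio) ** 2 - 1)
print(f"implied -3 dB corner: {fc/1e3:.1f} kHz")  # ~113.4 kHz
```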
So the scope seems like a fair way of measuring Vp and reverse-calculating, but I'm not sure how accurate that is, even if I used a single-polarity measurement of Vp across all 8 divisions.
Even after all of that, I wouldn't actually be able to measure an error of 0.1% unless I had some kind of reference that was 10x more accurate than this DIY try.
I did do some measurements of some large resistors, and found that one of my 10 Mohm 1/4 W 5% resistors has about 1 pF of parasitic capacitance, and a 1.6 Mohm 1/4 W 10% resistor has about 0.5 pF.
That's kind of the opposite of what I was expecting for the ratio of capacitance to resistance.
Any divider network would need some type of capacitor network, like @Kleinstein was saying.
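The compensation condition for such a network is that both legs have equal RC time constants (R_top·C_top = R_bot·C_bot), so the parasitics measured above dictate the capacitance the other leg must carry. A sketch, using the 10 Mohm / 1 pF measurement and a hypothetical 10:1 ratio:

```python
# Frequency-flat divider condition: R_top * C_top = R_bot * C_bot.
# Top-leg values taken from the measured resistor; the 10:1 ratio is
# just an illustrative choice, not from the thread.
r_top, c_top = 10e6, 1e-12       # 10 Mohm with ~1 pF parasitic
ratio = 10                        # desired division ratio (hypothetical)
r_bot = r_top / (ratio - 1)      # ~1.111 Mohm bottom leg
c_bot = r_top * c_top / r_bot    # capacitance the bottom leg must carry
print(f"R_bot = {r_bot/1e6:.3f} Mohm, C_bot = {c_bot*1e12:.1f} pF")  # 1.111 Mohm, 9.0 pF
```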
I took a look at the capacitors at Digi-Key, and the through-hole types only go down to 1% tolerance.
That's too high if 0.1% is supposed to be the total error.
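A rough sensitivity check shows why 1% caps can't deliver a 0.1% ratio without trimming. For a divider with transfer ratio H, the fractional ratio error from a mismatch between the two caps scales by (1 - H); the 10:1 ratio below is an illustrative assumption:

```python
import math

# Ratio-error bound for a capacitively-set 10:1 divider built from 1% caps.
# Sensitivity: dH/H ~ (1 - H) * (dC1/C1 - dC2/C2), so for H = 0.1 the
# ratio error is ~0.9x the cap mismatch. (10:1 ratio is an assumed example.)
tol = 0.01                      # 1% capacitor tolerance
H = 0.1                         # 10:1 division
sens = 1 - H                    # 0.9
worst = sens * 2 * tol          # both caps off in opposite directions
rss = sens * math.sqrt(2) * tol # statistically likely (RSS) error
print(f"worst-case ~{worst:.1%}, RSS ~{rss:.2%}")  # ~1.8%, ~1.27%
```

Either way, that is an order of magnitude above a 0.1% budget, so trimmers or selected/measured parts look unavoidable.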
You got any ideas?
sooo...
Does anyone know how to calibrate the DC voltage measurement of a Mastech MS8040?
@med6753 on that tear down thread said they were able to, but never actually said how it was done.
https://www.eevblog.com/forum/testgear/part-2-teardownevaluation-mastech-ms8040-bench-dmm/msg1089068/#msg1089068

Once you start worrying about high impedance dividers and accurate AC division, you have entered oscilloscope design territory, so there may be some lessons there.
High impedance dividers are a real problem for low frequency noise at low division ratios, where noise matters more. If your input voltage range is not too great and you must have the lowest noise, consider a bootstrapped input buffer so that no high impedance input divider is needed.
If you have unexplained errors in your properly compensated high impedance divider, consider dielectric hook from the printed circuit board as an explanation. If you do not have any way to control this, consider air-wiring the high impedance dividers.
Error in the amplifier increases as open loop gain falls with frequency; 40 dB of excess gain is needed for an error of 1%, and 60 dB for an error of 0.1%. So for a 0.1% error at 100 kHz and a gain of 10, the open loop gain needs to be 80 dB, and the gain-bandwidth product needs to be 1 GHz for a typical unity gain compensated amplifier. (1) 0.1% is a little less than 0.01 dB, so calibration is going to be a problem, and I think the high impedance attenuators are going to create greater errors than this anyway, as described above.
Common unity gain compensated amplifiers have limited open loop gain at higher frequencies, which increases error. Consider decompensated amplifiers, or adding a fast fixed voltage gain stage (2) within the feedback loop of a precision gain stage. As a bonus, either of these things also raises the full power bandwidth. High precision voltmeters may do either or both of these things, even at the low frequencies they operate at. Some audio designs go to extreme lengths to minimize distortion.
(1) I have not looked at this in a long time but I think I got it right and the results are consistent with the design and specifications of precision wideband instruments.
(2) Current feedback operational amplifiers are good for this, as shown in the application note I linked, and they also unload the precision amplifier. Thermal feedback within an operational amplifier limits low frequency open loop gain, limiting precision.
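The excess-gain arithmetic above can be sketched as follows. Closed-loop gain error of a feedback amplifier is roughly 1/(1 + loop gain), so loop gain ≈ 1/error, and for a single-pole (unity gain compensated) op amp the open-loop gain falls at -20 dB/decade from the gain-bandwidth product:

```python
import math

# Required open-loop gain and GBW for a given closed-loop gain error.
# Gain error ~ 1/(1 + loop_gain), so loop_gain ~ 1/error.
error = 0.001   # 0.1% target error
gain = 10       # closed-loop gain
f = 100e3       # highest frequency of interest

loop_gain = 1 / error       # ~1000x excess gain (60 dB)
aol = loop_gain * gain      # open-loop gain needed at f: 10000x
gbw = aol * f               # single-pole amp: Aol(f) = GBW / f

print(f"open-loop gain at {f/1e3:.0f} kHz: {20*math.log10(aol):.0f} dB")  # 80 dB
print(f"required GBW: {gbw/1e9:.1f} GHz")                                  # 1.0 GHz
```

This reproduces the 80 dB / 1 GHz figures quoted in the post, which is why a decompensated amplifier or a composite (fast gain stage inside a precision loop) becomes attractive.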
That makes sense about the scope dividers; the old scopes used variable trimmer resistors and caps.
I don't think the noise is going to be a problem anymore after seeing the results of today's tests: the input to the LTC1968 would be either between 50 mVrms and 500 mVrms, or between 100 mVrms and 200 mVrms.
I really don't like bootstrapping; doesn't it limit the frequency range?
The phase vs. frequency errors of the inverting op amp would be significant, but after measuring the resistors today, it would require balancing capacitors.
I'm starting to remember something like this. I actually made a 1000x scope probe once to measure 100 kV, and it actually worked well at 5 kHz. I'm not sure if it still works, but I still have it.
Anyways, because the capacitors are required, the first idea with just the 2 or 3 gain op amps and the resistor/capacitor divider network is probably the cheapest, and it could use some variable resistors and caps to offset the tolerances. Maybe also pot the fixed resistors and caps in corona dope.
That's not really the best solution, because those things never stay put. Basically, as a rule of thumb for the "add-on", the divider network would have to be calibrated every time you use it, just to be sure the resistors and caps didn't change.
I'm not sure I'm using the op amp's open loop gain.
I thought the gain was produced by positive feedback with the non-inverting op amp, and by negative feedback with the inverting op amp for unity gain and attenuation.
Would you provide a rough schematic of what you are talking about?
Thanks.