EEVblog Electronics Community Forum
Products => Test Equipment => Topic started by: henk on November 04, 2020, 12:45:55 am
-
I've been wanting to get a proper DMM for a while and came across the DMM6500, which I find especially interesting because of the additional digitizing function: it looks fast enough for anything I could want to use a scope for, and it also functions as a 1-channel DAQ. What I cannot seem to find, however, is information on the accuracy of the digitizer.
The manual has only one page on the accuracy of digitizing, and it only specifies the accuracy at 1000 samples per second with a 100-reading filter, which seems to be within the range of normal, non-digitizing operation.
I'm trying to figure out how it would fare in 100kS/s continuous streaming over USB, or 1MS/s triggered measurements.
Thanks in advance for any advice.
-
How much accuracy do you need?
-
In another DMM6500 thread here there are a few reports on the digitizer performance. AFAIR the input amplifier, and especially the divider for the higher voltage ranges, have a limited bandwidth. So while one gets a high sampling rate, the accuracy in the time domain is nowhere close to a scope. Expect distortion like from a badly de-compensated probe.
With scopes you look for the BW and not so much the sampling rate (as long as it is sufficient).
Giving the accuracy only for the 1000 SPS mode is kind of odd. It makes sense to assume averaging for amplitude / gain accuracy; that is OK. However, this leaves the AC performance of the front end totally open. Ideally I would want at least a typical gain and phase versus frequency plot, or failing that something like the step response.
From what I have seen here in the forum I would consider the extra-fast ADC of limited use. In principle I like the idea, if they used the fast ADC for digital RMS and provided a compensated input divider and an input amplifier with sufficient bandwidth. However, this would add to the cost. Maybe they will get it better next time.
-
@exe: I don't have a particular accuracy in mind that I would need. To judge whether the feature is worth the extra money for me, I'm interested in getting a feel for its performance.
@Kleinstein: Yeah, I looked through other threads but was unable to find a clear spec. I get what you're saying about it being far away from a proper scope at this sample rate, though that's not really what I'd be looking for. When I say scope, I mean that I'd be interested in the high-speed logging.
I don't see myself measuring any harmonic signals above 100Hz anytime soon. I may be interested in ripple from DC power supplies/boost converters, microcontroller current while switching energy modes, and maybe investigating harmonics in the 230V 50Hz mains power.
The question for me remains, though: for a given range and SPS, I have no clue what order of magnitude the accuracy would be.
And yes, that would only be of limited use, but I'd still like to know which measurements are feasible and which aren't.
-
The extra specifications sheet gives a BW (-3dB) for the digitizer mode:
210 kHz for the 100 mV and 1 V ranges, 440 kHz for 10 V, and 17 kHz for 100 and 1000 V. So the usefulness of the 100 and 1000 V ranges is relatively limited. There are also SFDR and THD+Noise numbers.
Specifying AC performance is not as simple as with DC.
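If it helps, the usual single-pole estimate t_r = 0.35 / BW turns those -3 dB numbers into rough rise times. This is only a sketch under the assumption that the front end behaves roughly like a first-order low-pass, which it does not have to:
```python
# Rough 10-90 % rise-time estimate from the -3 dB bandwidths in the
# DMM6500 extra specifications sheet, assuming a single-pole response
# (t_r ~ 0.35 / BW). The real front end can deviate from this.
bandwidths_hz = {
    "100 mV / 1 V": 210e3,
    "10 V":         440e3,
    "100 V / 1 kV": 17e3,
}

for rng, bw in bandwidths_hz.items():
    t_rise = 0.35 / bw  # seconds
    print(f"{rng:>14}: BW {bw/1e3:6.0f} kHz -> ~{t_rise*1e6:5.1f} us rise time")
```
So roughly 1.7 µs on the low ranges and over 20 µs on the 100 / 1000 V ranges.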
-
Thanks for replying, and for taking the effort to look some things up.
I guess the issue mostly lies with my inability to understand AC specs, especially if these AC specs have to be applied to semi-DC conditions (for example a 5V DC voltage with some drift and AC ripple, or microcontroller current, where there are steps in the current).
I had hoped for specs similar to the DC specs, just at a higher digitizing frequency and without a filter.
Right now I'm trying to interpret the AC specs you refer to (also attached to this post), but despite looking at several definitions of SFDR and THD+noise, I cannot figure it out completely.
The bandwidth I get: any harmonics above the cutoff frequency are not properly measured because the internal amplifiers are not fast enough, or something like that.
SFDR is a bit more vague. Am I correct in stating that an SFDR of 75dB (30e6) for 1kHz and 100mV would mean that a small 2kHz signal on top of a 1kHz 100mV signal can be detected if it is larger than 100mV/30e6 = 30nV? That looks unreasonably precise to me. Does this also mean that on a DC level of 100mV, a 30nV ripple can be detected?
I am familiar with SNR when it comes to generating and describing analog signals, less so when measuring them. For THD+Noise (SNDR) the value is 65 for 100mV and 1kHz; am I right in describing this as aliased harmonics and noise that are "created" by sampling an analog signal, and that these measured "fake" signals are at least 65dB lower (100nV)? And does this mean that this error is of no concern when sampling DC or step functions?
Finally, how would the effective number of bits be applied? 100mV at 1kHz is specced at only 9 effective bits, or 512 levels, which would suggest to me that a resolution of 0.2mV is attainable.
-
The effective number of bits should be another way to express the SNDR. Each bit gives some 6 dB, plus there is some offset to start with. It does not mean that the quantization steps are that large; it means the total error from noise and nonlinearity can be as large as the noise of a perfect ADC with that number of bits.
Somehow the numbers don't seem to fit, not even for the lowest frequency.
I am a little confused by such different numbers, and especially by the much lower performance with the 50 kHz signal. That signal may start to create some nonlinearity in the amplifiers, but it looks like quite a large effect.
The dB numbers give the attenuation of the power, so 60 dB is only a factor of 1000 in voltage. That makes 75 dB down from 100 mV some 18 µV, which sounds a lot more reasonable, if a bit on the high side.
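As a quick sanity check of those numbers (assuming the usual SNDR = 6.02 * N + 1.76 dB rule of thumb, which the datasheet may not use exactly):
```python
def db_to_voltage_ratio(db):
    # The dB values are power ratios, so the voltage ratio is 10^(dB/20)
    return 10 ** (db / 20.0)

def enob_from_sndr(sndr_db):
    # Usual ideal-ADC rule of thumb: SNDR = 6.02*N + 1.76 dB
    return (sndr_db - 1.76) / 6.02

full_scale = 100e-3  # 100 mV range

# SFDR of 75 dB below 100 mV full scale
print(full_scale / db_to_voltage_ratio(75))   # ~1.8e-05 V, i.e. about 18 uV

# SNDR of 65 dB expressed as effective bits
print(enob_from_sndr(65))                     # ~10.5 bits
```
The roughly 10.5 bits from 65 dB versus the 9 ENOB in the sheet is part of why the numbers don't seem to fit.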
-
Ah yeah, whoops, I looked at the power decibels instead of the amplitude. With regard to the bits I should indeed have said accuracy, not resolution. The effective bits not fitting the SNDR is still a bit of a mystery then.
I maybe forgot to ask explicitly, but is my description of the different accuracies correct? Just to make sure I understand the specs.
You note that 18µV is a bit high for a 100mV signal/range, but I feel that with the possibility to log the signal to a PC at 100kHz, such accuracy is quite nice - or am I just not familiar with the alternatives?
The 18µV also matches up with the specified 2-year DC accuracy (0.02% of range = 20µV).
-
The SNDR does not include the gain error, so those DC accuracy specs are separate. In addition there can be extra errors in the digitizing mode, as the signal and reference take a different path. I would expect some additional TC from the reference divider going from the 7 V reference down to some 2.5 V for the fast ADC.
A noise level of 18 µV in a 220 kHz bandwidth is not really impressive. That would be some 38 nV/sqrt(Hz). It is not bad, but not really good either.
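For reference, that conversion is just the following (assuming the noise is roughly white across the band):
```python
import math

v_noise_rms = 18e-6   # V RMS, from the SFDR estimate above
bandwidth   = 220e3   # Hz, digitizer -3 dB bandwidth

# Noise density assuming white noise over the full bandwidth
density = v_noise_rms / math.sqrt(bandwidth)
print(f"{density * 1e9:.0f} nV/sqrt(Hz)")   # ~38 nV/sqrt(Hz)
```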
-
That's a very interesting twist :). The Tek guys tried to convince me that their digitizer is like an oscilloscope, but "much better". Well, my 8-bit oscilloscope has somewhere around 500µV p-p noise (I don't remember at which bandwidth, unfortunately; it has a variable filter on the input, AFAIK 30kHz-100MHz), so I guess they are correct.
-
The DC specs I noted are for the DC digitizer, not the normal multimeter mode. You also note the 18µV noise is not that impressive - do you mean for a scope, or for a datalogger/digitizer?
For now I'm pretty much inclined to buy this DMM, as with these specs I think it'll do for anything I could want a regular scope for, and it also has the benefit of long-term high-speed datalogging. Oh, and the fact that it's a pretty good 6.5 digit DMM. If only the price... Ah well, due to a certain pandemic I didn't go on holiday this year, so I have some spare funds to waste...
-
You also note the 18µV noise is not that impressive.
Actually, I don't know :). For all my purposes that would be enough. I think it gives a lot of value for the price. I wouldn't expect miracles from it because, according to a Tek guy on this forum, the hardest challenge was to fit it into a certain price. So it's not built from top-performance parts.
-
Scope noise specs are often for a 20 MHz BW, so that is 100 times more bandwidth than the DMM6500 and thus naturally about 10 times the noise. The other point is that this is pp noise - the RMS number is about 6 times lower. So this would be some 80 µV RMS, and if that is for a 20 MHz BW it would translate to some 8 µV RMS noise in a 200 kHz BW. This sounds about reasonable for a scope.
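Roughly like this (assuming white noise, the usual ~6:1 pp-to-RMS factor, and that the scope noise spec really is for a 20 MHz BW):
```python
import math

scope_noise_pp = 500e-6   # V pp, the figure quoted above
scope_bw       = 20e6     # Hz, assumed noise-spec bandwidth of the scope
dmm_bw         = 200e3    # Hz, roughly the DMM6500 digitizer bandwidth

# Peak-to-peak to RMS with a ~6:1 factor, then scale with sqrt(BW)
scope_noise_rms = scope_noise_pp / 6
scaled          = scope_noise_rms * math.sqrt(dmm_bw / scope_bw)

print(f"{scope_noise_rms * 1e6:.0f} uV RMS in 20 MHz")   # ~83 uV
print(f"{scaled * 1e6:.0f} uV RMS in 200 kHz")           # ~8 uV
```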
The DMM6500 still has the higher vertical resolution (e.g. 16 bits, though often only some 12 bits effective) compared to the usual DSO with 8 bits nominal and maybe 7 bits ENOB. If a high-res mode with digital filtering / BW reduction is available, one may gain some 3 bits going down from 20 MHz to some 300 kHz.
The scopes usually have more and finer-spaced ranges and can thus make up for some of the limited resolution.
However, the scope has the better timing accuracy and BW. The DMM6500 in this comparison is slow, and it may be more than just the BW limit (e.g. 220 kHz); there may also be additional phase errors and different BW between ranges. The obvious test would be to use the 1 MS/s mode of the DMM6500 to look at a clean rectangular signal of maybe some 1 kHz or 100 Hz.
Fast transients may be an additional difficulty with the limited slew rate.
Because it is built to a price, the input lacks fine-tuned compensation - the step one does with the x10 probe on a scope.
-
You also note the 18µV noise is not that impressive.
Actually, I don't know :). For all my purposes that would be enough. I think it gives a lot of value for the price. I wouldn't expect miracles from it because, according to a Tek guy on this forum, the hardest challenge was to fit it into a certain price. So it's not built from top-performance parts.
Yep; I am familiar with the hardware design of this (I'm the designer). The design is completely optimized for low cost. That includes the choice of hardware components, architecture, calibration, supporting firmware, and specifications (including what is NOT specced AC-wise). It should not at all be compared to a scope. Scopes are optimized for time domain performance and accuracy. Apples to oranges comparison here. Designing to (and verifying) AC accuracies is a tricky endeavor which wasn't suitable for this class of instrument.
What we implemented allows the user some time domain insight into their signal. The specs were limited deliberately. The goal was to maximize bang for your buck. :)
-
The obvious test would be to use the 1 MS/s mode of the DMM6500 to look at a clean rectangular signal of maybe some 1 kHz or 100 Hz.
I've ordered one, so once it arrives I'll do the next best thing available to me: a "step" function using a switch/wire. The cleanness of this is disputable, but I'll see how it looks.
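Something like this is roughly what I have in mind to script the capture and a quick rise-time check via pyvisa. Completely untested: the SCPI command names are just my reading of the DMM6500 reference manual, so treat them as assumptions, and the VISA address is a placeholder.
```python
# Untested sketch: capture a 1 MS/s digitized step on the DMM6500 over USB
# and estimate the 10-90 % rise time. SCPI command names are from my reading
# of the reference manual and may need adjusting; the VISA address is a
# placeholder.
import numpy as np
import pyvisa

N = 10_000  # samples; at 1 MS/s that is a 10 ms window

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("USB0::0x05E6::0x6500::XXXXXXX::INSTR")  # placeholder
dmm.timeout = 10_000  # ms

dmm.write("*RST")
dmm.write(':SENS:DIG:FUNC "VOLT"')          # digitize voltage (unverified syntax)
dmm.write(":SENS:DIG:VOLT:RANG 10")         # 10 V range
dmm.write(":SENS:DIG:VOLT:SRATE 1000000")   # 1 MS/s
dmm.write(f":SENS:DIG:COUN {N}")

# Trigger the acquisition, then fetch the whole buffer (again, command names
# as I understand them from the manual, not verified on hardware).
dmm.query(':READ:DIG? "defbuffer1"')
data = np.array(dmm.query_ascii_values(f':TRAC:DATA? 1, {N}, "defbuffer1", READ'))

# Crude rise-time estimate, assuming a single clean low-to-high step
lo, hi = data[:100].mean(), data[-100:].mean()
t10 = np.argmax(data > lo + 0.1 * (hi - lo))
t90 = np.argmax(data > lo + 0.9 * (hi - lo))
print(f"~{t90 - t10} us 10-90 % rise time at 1 MS/s")
```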
It should not at all be compared to a scope. Scopes are optimized for time domain performance and accuracy. Apples to oranges comparison here.
...
What we implemented allows the user some time domain insight into their signal.
Call it my ignorance, but I'm not really sure what to compare it to if not a scope. The only other things I can think of with these kinds of sample rates are DAQs, which don't have a display (and often cost a similar amount without including a 6.5 digit DMM), or scopes, which have worse accuracy on the voltage scale but a higher sample rate (and accuracy) in the time domain, plus higher BW.
Then again, maybe I'm just not used to higher-frequency AC signals and think of them too much as a changing DC signal. For me, the use cases I'd be interested in for the foreseeable future would be square waves/step functions at ~µs resolution (microcontroller current/outputs), the more detailed shape of sub-100Hz AC signals, and maybe, just maybe, some rough info on ~kHz PWM signals.
And if I've interpreted the specs correctly, this should all be attainable with the DMM6500. Also, I personally call 1µs resolution, or 10µs when streaming to a computer, more than just "some" time domain insight.