Author Topic: How fast should a multimeter be?  (Read 17705 times)


Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
How fast should a multimeter be?
« on: May 03, 2011, 08:29:23 pm »
The answer to the question of how fast a multimeter should be?

It partially came to me by Googling the word "microsecond", which is frequently used in many sales papers about test equipment.

On the Wikipedia page I found a nice description that helped me understand that speed, by itself, is not a measure of quality.

Link : http://en.wikipedia.org/wiki/Microsecond

In those examples of time versus what can happen within it, I realized that for many electrical signals to get measured, the DMM actually has to wait for them, as they are slower than the 250 µs that a professional, modern DMM unit can easily handle.

What actually impressed me was this reference:

250 microseconds – cycle time for highest tone in telephone audio (4 kHz)

I am still exploring this subject of time versus the time that signals need to form, so as to get a picture of what speed we actually need in multimeters.
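As a sanity check on those Wikipedia figures, the cycle time is just the reciprocal of the frequency. A minimal sketch (`cycle_time_us` is my own helper name, not from any meter's spec):

```python
# Cycle time (period) of a signal is the reciprocal of its frequency.

def cycle_time_us(freq_hz: float) -> float:
    """Return the period of a signal in microseconds."""
    return 1_000_000 / freq_hz

# 4 kHz (highest tone in telephone audio) -> 250 us per cycle
print(cycle_time_us(4_000))    # 250.0
print(cycle_time_us(10_000))   # 100.0
print(cycle_time_us(100_000))  # 10.0
```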


Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
Re: How fast should a multimeter be?
« Reply #1 on: May 03, 2011, 08:52:52 pm »
On a second note, the example of:

100 microseconds (0.1 ms) – cycle time for a 10 kHz signal

translates to: four samples needed for an accurate measurement = 4 × 100 = 400 microseconds, plus the time the DMM needs to process the data and put the reading on the screen.
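That back-of-the-envelope budget can be written down directly. `min_acquisition_time_us` is a hypothetical helper, and the processing time is left as a parameter since we don't know it:

```python
def min_acquisition_time_us(freq_hz: float, n_cycles: int = 4,
                            processing_us: float = 0.0) -> float:
    """Rough time to capture n_cycles full periods of a signal,
    plus whatever processing/display time the meter needs."""
    period_us = 1_000_000 / freq_hz
    return n_cycles * period_us + processing_us

# Four cycles of a 10 kHz signal: 4 x 100 us = 400 us before processing
print(min_acquisition_time_us(10_000))  # 400.0
```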

« Last Edit: May 03, 2011, 08:54:38 pm by Kiriakos-GR »
 

Offline allanw

  • Frequent Contributor
  • **
  • Posts: 343
    • Electronoblog
Re: How fast should a multimeter be?
« Reply #2 on: May 04, 2011, 01:52:05 am »
The time response isn't that relevant unless you're talking about the multimeter's ability to measure high-frequency signals in AC mode.

You want a decent display refresh rate so that it updates often enough that you don't have to wait a second for it to show a voltage; this is probably something like 4 Hz for a good handheld multimeter. A multimeter trades acquisition speed for accuracy, because the method used to measure the voltage is an integrating ADC: the longer you let it run, the better it averages out noise and such. And it's cheaper than measuring faster for no gain.
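As a toy illustration of why a longer integration window helps (this is just averaging of noisy samples, not a model of any real ADC's firmware):

```python
import random

def integrate_reading(v_true, noise_amp, n_samples, rng):
    """Average n_samples noisy readings, mimicking how an integrating ADC
    averages the input (and its noise) over its integration window."""
    return sum(v_true + rng.uniform(-noise_amp, noise_amp)
               for _ in range(n_samples)) / n_samples

rng = random.Random(42)  # fixed seed so the sketch is repeatable
fast = integrate_reading(5.0, 0.5, 10, rng)      # short window: noisier
slow = integrate_reading(5.0, 0.5, 10_000, rng)  # long window: noise averages out
print(f"short window: {fast:.3f} V  long window: {slow:.3f} V")
```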

Good bench multimeters can take something like 1000+ readings a second, which is useful for data acquisition but pointless for the display, where 4 Hz suffices.

And of course, you can always buy dedicated data-acquisition units that can sample at 100 MHz+ at high resolution. Faster still are oscilloscopes.
 

Offline insurgent

  • Regular Contributor
  • *
  • Posts: 78
Re: How fast should a multimeter be?
« Reply #3 on: May 04, 2011, 02:51:30 am »
Which brings up another question: why do all these high-tech pieces of equipment with USB ports use RS232 emulation to connect to the PC? It can't be *that* hard for the likes of Rigol, Agilent, etc. to write a functional USB driver in this day and age. Talk about a waste of 1.4 or 57 MB/sec!
 

Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
Re: How fast should a multimeter be?
« Reply #4 on: May 04, 2011, 03:10:21 am »
OK, I will bring up one example so we can all talk about it.

Say we have a multimeter that claims AC measurements up to 100 kHz.

The wiki page gives 10 microseconds (µs) as the cycle time for a 100 kHz signal.

So the DMM needs to wait 10 microseconds for the signal to pass through, just once.

The average Joe will never measure AC at 100 kHz, not even at 50 kHz, and cares mostly about the display refresh time. That is 4-5 updates per second, at most, in a modern hand-held DMM.

At the maximum of five updates per second, one refresh takes 200.000 µs.

And my poor brain tells me that if the DMM takes four samples of a 100 kHz signal, it needs just 40 microseconds (µs) to do it.

If from the single refresh time of 200.000 µs we subtract the 40 µs that the DMM needs to do the job, it has 199.960 µs of free time, enough to smoke even a cigarette.. :D

I am trying to simplify things so as to gain a better understanding; I do not know if it works that way. :)
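Taking the figures above at face value (five updates per second, i.e. one refresh every 200,000 µs, written 200.000 µs above with a thousands-separator dot), the arithmetic works out as stated. A minimal sketch:

```python
UPDATES_PER_SECOND = 5
refresh_us = 1_000_000 / UPDATES_PER_SECOND  # 200,000 us per display refresh

period_100khz_us = 1_000_000 / 100_000       # 10 us per cycle at 100 kHz
sampling_us = 4 * period_100khz_us           # 40 us for four cycles

idle_us = refresh_us - sampling_us           # "free time" per refresh
print(idle_us)  # 199960.0
```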

« Last Edit: May 04, 2011, 03:33:22 am by Kiriakos-GR »
 

Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
Re: How fast should a multimeter be?
« Reply #5 on: May 04, 2011, 03:21:23 am »
Which brings up another question: why do all these high-tech pieces of equipment with USB ports use RS232 emulation to connect to the PC? It can't be *that* hard for the likes of Rigol, Agilent, etc. to write a functional USB driver in this day and age. Talk about a waste of 1.4 or 57 MB/sec!

It's called compatibility, my friend, and yes, it does have limits. :)

Today I found even worse data-logger software and a worse driver, supplied with a cheap $100 DMM, and after seeing them I stopped blaming the other ones, because from now on they look like the better ones. :D
« Last Edit: May 04, 2011, 03:28:51 am by Kiriakos-GR »
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7563
  • Country: au
Re: How fast should a multimeter be?
« Reply #6 on: May 04, 2011, 03:30:27 am »
Regardless of the display rate, the A-D converter in the DMM still needs to sample at a fast enough rate to measure the real 100 kHz signal, and not just some result of aliasing, just as in a DSO.

Diverging a bit, have you ever had the situation where your DMM, on DC volts, reads a series of pulses as a DC voltage?

Some years back, I was looking at the +150 V boost HT on my mother-in-law's TV set, and read about +120 V. A bit low, but it didn't look that bad.

When I put an oscilloscope on the same point, the problem was clear: the filter capacitor was dead, and what I was seeing was unfiltered, rectified 15 kHz pulses.

The Fluke 77 wasn't even rated to read 15 kHz on the AC range, but it could read it well enough on the DC range to confuse me! :D


VK6ZGO
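The confusing DC reading has a simple explanation: a DC-coupled averaging meter reports roughly the mean of the waveform, and the mean of an unfiltered rectified sine is well below its peak. A rough numerical sketch (the numbers are illustrative, not the actual TV waveform or a model of the Fluke 77):

```python
import math

def dc_reading_of_rectified_sine(v_peak: float, n: int = 100_000) -> float:
    """Approximate the mean of a full-wave rectified sine of the given
    peak, which is roughly what a DC-coupled averaging meter reports."""
    step = 2 * math.pi / n
    return sum(abs(v_peak * math.sin(i * step)) for i in range(n)) / n

# Mean of a full-wave rectified sine is 2*Vp/pi ~ 0.637*Vp, so unfiltered
# "150 V" pulses read well below 150 V on the DC range.
print(round(dc_reading_of_rectified_sine(150.0), 1))  # 95.5
```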
 

Offline Psi

  • Super Contributor
  • ***
  • Posts: 9889
  • Country: nz
Re: How fast should a multimeter be?
« Reply #7 on: May 04, 2011, 04:52:05 am »
Which brings up another question: why do all these high-tech pieces of equipment with USB ports use RS232 emulation to connect to the PC? It can't be *that* hard for the likes of Rigol, Agilent, etc. to write a functional USB driver in this day and age. Talk about a waste of 1.4 or 57 MB/sec!

I much prefer it when they use generic USB-to-RS232 emulation. It's so much easier to watch the traffic and build your own software to manipulate the data when it's just an RS232 port as far as the OS is concerned.
« Last Edit: May 04, 2011, 04:53:47 am by Psi »
Greek letter 'Psi' (not Pounds per Square Inch)
 

Offline Neilm

  • Super Contributor
  • ***
  • Posts: 1545
  • Country: gb
Re: How fast should a multimeter be?
« Reply #8 on: May 04, 2011, 05:58:53 am »
Which brings up another question: why do all these high-tech pieces of equipment with USB ports use RS232 emulation to connect to the PC? It can't be *that* hard for the likes of Rigol, Agilent, etc. to write a functional USB driver in this day and age. Talk about a waste of 1.4 or 57 MB/sec!

Which pieces in particular? A multimeter that measures at 1 kHz will probably be able to transfer the data down an RS232 emulation, so why waste money producing a USB driver? It would not add any benefit to the product.

Neil
Two things are infinite: the universe and human stupidity; and I'm not sure about the universe. - Albert Einstein
Tesla referral code https://ts.la/neil53539
 

Offline scrat

  • Frequent Contributor
  • **
  • Posts: 608
  • Country: it
Re: How fast should a multimeter be?
« Reply #9 on: May 04, 2011, 11:28:46 am »
There is a reason why the measurement takes a long time: the meter filters the individual readings to achieve the highest accuracy (and resolution, I guess).
One machine can do the work of fifty ordinary men. No machine can do the work of one extraordinary man. - Elbert Hubbard
 

Offline sacherjj

  • Frequent Contributor
  • **
  • Posts: 993
  • Country: us
Re: How fast should a multimeter be?
« Reply #10 on: May 04, 2011, 01:52:34 pm »
There is a reason why the measurement takes a long time: the meter filters the individual readings to achieve the highest accuracy (and resolution, I guess).

And averaging of the measurements over the short term, to eliminate odd noise. Plus you have to add any calculation time, display generation and driving, etc.
 

alm

  • Guest
Re: How fast should a multimeter be?
« Reply #11 on: May 04, 2011, 08:53:36 pm »
Good bench multimeters can take something like 1000+ readings a second, which is useful for data acquisition but pointless for the display, where 4 Hz suffices.
I disagree; 4 Hz is not fast enough in my opinion. It's certainly acceptable in many cases, but I'd take faster if I could get it. I notice that bench meters sampling at faster rates are much more responsive, which is useful when dealing with unstable, intermittent or fast-changing signals (e.g. when adjusting a trim pot). If I'm adjusting a trim pot, the faster the feedback the better; delay in the feedback loop is bad for any control system, including humans. This is similar to the fast continuity response that Dave cares so much about. A slow-sampling DMM will just show an average value when presented with, e.g., a low-frequency (few Hz) oscillation (too low for ACV). This was shown in Dave's video blog about the AVRISP LM317 mod, I think.

Very few hand-held meters offer sampling rates beyond 4 Hz for anything but the bargraph (which is of limited value if you need precision), however, so I guess either the cost or the power usage would be prohibitive.
 

Offline tekfan

  • Frequent Contributor
  • **
  • Posts: 385
  • Country: si
Re: How fast should a multimeter be?
« Reply #12 on: May 04, 2011, 10:13:57 pm »
Good bench multimeters can take something like 1000+ readings a second, which is useful for data acquisition but pointless for the display, where 4 Hz suffices.
I disagree; 4 Hz is not fast enough in my opinion. It's certainly acceptable in many cases, but I'd take faster if I could get it. I notice that bench meters sampling at faster rates are much more responsive, which is useful when dealing with unstable, intermittent or fast-changing signals (e.g. when adjusting a trim pot). If I'm adjusting a trim pot, the faster the feedback the better; delay in the feedback loop is bad for any control system, including humans. This is similar to the fast continuity response that Dave cares so much about. A slow-sampling DMM will just show an average value when presented with, e.g., a low-frequency (few Hz) oscillation (too low for ACV). This was shown in Dave's video blog about the AVRISP LM317 mod, I think.

Very few hand-held meters offer sampling rates beyond 4 Hz for anything but the bargraph (which is of limited value if you need precision), however, so I guess either the cost or the power usage would be prohibitive.

I don't even care anymore. I use all sorts of meters, from ones showing 1000 displayed readings per second, which is pretty useless since all you see is 8888 on the display unless you're doing some serious datalogging, to ones where you wait 10 seconds for a reading (but you do get 7.5 digits).

I think probably the slowest multimeter was the Solartron 7081. It uses some sort of double-integration technique, which results in such high performance, but it takes 40 seconds to get a reading; in return you get an amazing 8.5 digits of resolution. This was the first multimeter of its kind. There have been massive improvements in the design of the newer precision meters.

I just had to attach a picture of it. It's such a cool design, with the soft buttons and blue VFD.
One can never have enough oscilloscopes.
 

Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
Re: How fast should a multimeter be?
« Reply #13 on: May 06, 2011, 02:46:13 pm »
Well, it looks like I found my answer to the question of how fast it should be, by reading some Fluke technical documents about industrial-type measurements.

The key point seems to be that DMM products aimed at a specific field are as fast as they need to be to perform specific, known measurements (routine checks).

Electronics and control systems are another planet, and need gear with different specs.

With all that in mind, advice like "choose the right DMM for you" now makes sense to me.

(Soon I will write a very detailed topic about my findings.)


Offline allanw

  • Frequent Contributor
  • **
  • Posts: 343
    • Electronoblog
Re: How fast should a multimeter be?
« Reply #14 on: May 06, 2011, 06:55:10 pm »
OK, I will bring up one example so we can all talk about it.

Say we have a multimeter that claims AC measurements up to 100 kHz.

The wiki page gives 10 microseconds (µs) as the cycle time for a 100 kHz signal.

So the DMM needs to wait 10 microseconds for the signal to pass through, just once.

The average Joe will never measure AC at 100 kHz, not even at 50 kHz, and cares mostly about the display refresh time. That is 4-5 updates per second, at most, in a modern hand-held DMM.

At the maximum of five updates per second, one refresh takes 200.000 µs.

And my poor brain tells me that if the DMM takes four samples of a 100 kHz signal, it needs just 40 microseconds (µs) to do it.

If from the single refresh time of 200.000 µs we subtract the 40 µs that the DMM needs to do the job, it has 199.960 µs of free time, enough to smoke even a cigarette.. :D

I am trying to simplify things so as to gain a better understanding; I do not know if it works that way. :)

I think you must mean 200 ms instead of 200 µs.

A multimeter measures AC down to some minimum frequency, definitely less than 50 Hz. That's what limits its sampling time: if its lowest frequency is, say, 10 Hz, then it must take at least 0.1 seconds to sample the signal. That's why AC update rates are so slow.

If there were some sort of switch to dictate the minimum frequency, you could sample much faster. For example, if there were a 100 kHz to 1 MHz range, the meter would only need to sample based on the lowest frequency, 100 kHz, i.e. 10 µs per cycle like you said.
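In other words, the lowest specified AC frequency sets a floor on how long the meter must observe the input. A quick sketch (`min_gate_time_s` is a hypothetical helper, not any meter's actual firmware):

```python
def min_gate_time_s(lowest_freq_hz: float, cycles: int = 1) -> float:
    """Minimum time an AC measurement must observe the input to capture
    `cycles` full periods of the lowest frequency it is specified for."""
    return cycles / lowest_freq_hz

print(min_gate_time_s(10))       # 0.1  -> 0.1 s if specified down to 10 Hz
print(min_gate_time_s(100_000))  # 1e-05 -> 10 us on a hypothetical 100 kHz+ range
```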
« Last Edit: May 06, 2011, 06:57:23 pm by allanw »
 

Offline Kiriakos-GRTopic starter

  • Super Contributor
  • ***
  • !
  • Posts: 3525
  • Country: gr
  • User is banned.
    • Honda AX-1 rebuild
Re: How fast should a multimeter be?
« Reply #15 on: May 06, 2011, 07:53:22 pm »
Well, my (let's call it) investigation was about:

1) The Min/Max/Average sampling rate (how fast the fastest can be)
2) The Peak Min/Max capture time = 250 microseconds (fixed in all DMMs)
3) The sampling rate vs. wavelength (related to the time the signal is alive)

And I agree with you: the lower the frequency, the higher the measurement delay. :)
« Last Edit: May 06, 2011, 08:00:23 pm by Kiriakos-GR »
 

