GPIB readout time of 3458A DVM
imisaac:
I was reading the 3458A voltage via a GPIB-to-USB interface in a Labview program. The data format was set to SREAL and the auto zero function was ON. In each loop, I took only 1 reading from the DVM. It turned out that the loop time needed to take a single reading from the DVM depended on the NPLC cycle that was used to average the reading. The dependence plot is attached.
The plot shows that the minimum loop time was 20 ms regardless how few of the NPLC cycle used. This implied that the maximum sampling rate was 50 Hz.
Anyone knows how to improve the sampling rate further?
bson:
I'm not directly familiar with the 3458A, but manual triggering may be limited to one reading per power-line cycle. Commands like SCPI ...:READ? trigger manually and then wait for a sample, so if you issue them in a loop your trigger rate will be capped at the PLC rate; in that case the NPLC setting may only affect the sample aperture.
Look into automated time-based triggers, and then wait for a value without triggering (I forget the command sequence offhand). More modern instruments like the 3446x series can also take time-based samples of a given aperture into a buffer, and you can drain the buffer in whatever way is convenient: fetch whatever is available in a loop, or fetch it all at the end if the instrument has enough memory. I'm not sure what the 3458A-era instruments offer in this regard, but the 3458A is probably modern enough to have similar functionality.
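The time-based-trigger idea above could be sketched roughly like this in Python with pyvisa, using the 3458A's native (non-SCPI) command set as I remember it from the manual: TIMER sets the reading interval, NRDGS arms a count of TIMER-paced readings, and TARM SGL starts the run. The command names and the GPIB address are assumptions to verify against your manual revision, not a tested recipe.

```python
# Sketch: let the 3458A pace its own readings instead of triggering per-read
# from the PC. Command strings are assumptions from the 3458A manual.

def timer_setup(interval_s: float, n_readings: int) -> list[str]:
    """Build the command sequence for n internally paced readings."""
    return [
        "PRESET FAST",                # fast-readout defaults (assumed)
        "OFORMAT SREAL",              # 4-byte binary output, as in the OP
        "AZERO OFF",                  # autozero roughly doubles reading time
        f"TIMER {interval_s}",        # seconds between readings
        f"NRDGS {n_readings},TIMER",  # n readings, paced by TIMER
        "TARM SGL",                   # arm once; readings then free-run
    ]

if __name__ == "__main__":
    import pyvisa                     # only needed with a live instrument
    rm = pyvisa.ResourceManager()
    dvm = rm.open_resource("GPIB0::22::INSTR")  # address is an assumption
    for cmd in timer_setup(0.001, 1000):        # e.g. 1 kHz for one second
        dvm.write(cmd)
```

The point of the sketch is that the PC sends the setup once and the instrument then spaces the readings itself, so the 20 ms per-loop round-trip no longer sets the sample rate.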
macboy:
I believe I've read that the 3458A can deliver 100000 (one hundred thousand) samples per second at 4.5 digits over GPIB. I don't know what settings or commands are used for that, but you can definitely do better than 50 Hz.
If you can only get 50 Hz, you probably have line sync turned on. This arms the trigger exactly once per line cycle (probably at a zero crossing) in order to help remove the effect of AC line interference. Since the trigger must be armed before it can be fired, the maximum sample rate is then limited to the line frequency of 50 Hz. Unlike simple DMMs, a complex system multimeter like this doesn't take readings continuously at the maximum rate unless you specifically configure it to do that.
To get the maximum rate from the instrument, you definitely want to avoid having to repeatedly tell it to take a reading, or to ask it for the most recent reading. Instead, configure it to read continuously at the rate you want (with the trigger settings you want), use a binary format (not ASCII), and then set up the GPIB host to read continuously from the instrument. Configured correctly, the instrument should send data samples over the bus as they become ready, for as long as it is addressed to talk. Ideally it should be a dedicated GPIB bus with no other devices that might raise a service request (GPIB SRQ). You might need to get into lower-level GPIB programming for this, as higher-level APIs usually hide things like sending "talk" and "untalk" commands to the devices.
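The host side of that binary streaming could look something like the sketch below. It assumes, per my reading of the 3458A manual, that SREAL output is a stream of 4-byte IEEE-754 single-precision floats sent big-endian; treat the byte order as an assumption to verify on your setup. The drain loop is left as a comment because it needs a live, already-triggered instrument.

```python
import struct

def decode_sreal(buf: bytes) -> list[float]:
    """Decode a block of big-endian 4-byte IEEE floats into readings."""
    n = len(buf) // 4
    return list(struct.unpack(f">{n}f", buf[: 4 * n]))

# Typical drain loop (needs pyvisa and a live, already-triggered instrument):
#   while running:
#       chunk = dvm.read_bytes(4 * readings_per_chunk)
#       readings.extend(decode_sreal(chunk))
```

Reading fixed-size binary chunks and decoding them on the PC keeps the bus busy with data rather than with per-reading command round-trips.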
Tony_G:
There are a number of examples of this in the User Manual, page 156 onwards.
In fact, page 163 talks specifically about doing this over GPIB.
It's worth a read (the code samples are in HP BASIC, but you can use the same commands from any VISA application).
TonyG
imisaac:
Thanks for the suggestions!
It may be true that the sampling rate can be increased if the DVM digitises the signal quickly (e.g. at 100 kHz) and saves a batch of data points in its buffer, waiting to be retrieved. However, this only works if the buffer is large enough to hold the measurement data. If the measurement data exceeds the capacity of the internal buffer, then one has to:
(a) send further retrieval commands to the DVM to fetch the data batch by batch, and/or
(b) use a lower sampling rate so that the whole measurement fits in a single buffer.
Neither scenario is ideal for me. With solution (a), the time interval between successive batches of data is undefined. With solution (b), I cannot carry out a long measurement at a high sampling rate. So I am stuck where I am now: I rely on the PC to define the sampling interval instead of the DVM.
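One possible mitigation for the "undefined interval" worry in (a): if the DVM paces the readings with its own internal timer (e.g. via the 3458A's TIMER/NRDGS commands, an assumption noted above), then the timestamp of reading k is fixed by the instrument, not by when the PC happens to fetch each batch, so repeated retrievals need not blur the timebase. A minimal sketch of reconstructing the timestamps under that assumption:

```python
def sample_times(t0: float, interval_s: float, n: int) -> list[float]:
    """Timestamps implied by instrument-paced sampling: reading k occurred
    at t0 + k * interval_s, regardless of when the PC fetched it."""
    return [t0 + k * interval_s for k in range(n)]
```

The PC then only needs to fetch fast enough on average that the buffer never overflows; any jitter in the fetch loop does not appear in the data.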
The 50 Hz sampling rate limitation may have originated from:
(a) the DVM/PC communication protocol (GPIB-USB in my case),
(b) the programming language (LabVIEW in my case),
(c) the DVM internal triggering rate (as macboy already mentioned).
It is still unclear which of these is the actual bottleneck. I have attached a screenshot of my simple LabVIEW program to facilitate discussion. My ultimate goal is to continuously sample a DC voltage at a well-defined sampling interval for a long time (e.g. weeks).