What exactly are you trying to do? That's a lot of info you're asking for. Take DCV bandwidth, for example: it varies depending on a handful of settings and isn't spec'd, so if you tell me why you want it or how you'll use the info, I can estimate a figure for you. For something like a block diagram, what are you looking to find?

I am an old-school engineer.
I used to understand the instruments I work with. It is very difficult for me to work with a black box that I do not understand.
For example, I measure the noise of a low-noise power source, and I want to know the bandwidth of the instrument, since the measured noise level depends on it. I want to have that answer in my head rather than ask a Keithley engineer every time.
Or: I do not want to hear the relays switching during dual measurements. I had written that mode off as unusable, and it turns out that under certain conditions I can use it after all.
That is, I really want to work with the instrument on my own, without pulling Keithley engineers away from their work.
Moreover, the programmers received a detailed 1000-page reference for working with the instrument, while the engineers have nothing but a very modest verbal description that is difficult to understand.
I am an old-school engineer.
I want to know when the relays will click, and when they won't.
What is the input bandwidth for DCV?
Is it possible to somehow find out the exact values of the current shunts? The instrument itself must know them.
How are the processors and the trigger system connected inside?
And any other information on the instrument: schematics, block diagrams, sketches.
- the actual data rate / time spacing
- the actual aperture and measurement sequence in the Ohms modes
Also, for the digital filtering used in the meter, it would be nice to know the exact filter function, as this affects noise estimates made from the readings. With filtering in use, the simple standard-deviation values calculated from the readings have to be taken with a grain of salt, as the raw data may be correlated.
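To illustrate the point, here is a sketch under the assumption of white input noise and a simple m-point moving-average filter. This is not the meter's actual filter function (that is exactly the information being asked for), just the kind of correction you would need it for:

```
-- Sketch. Assumption: white input noise through an m-point moving average.
-- The std. deviation of the filtered readings then understates the raw
-- input noise by about sqrt(m), and adjacent readings are correlated over
-- ~m samples, so only about N/m of N readings are independent.
local function raw_noise_estimate(filtered_std, m)
  return filtered_std * math.sqrt(m)
end

local function stdev_of_mean(filtered_std, n_readings, m)
  local n_eff = n_readings / m   -- effective number of independent readings
  return filtered_std / math.sqrt(n_eff)
end
```

With the real filter function in hand, the sqrt(m) factor would be replaced by the filter's actual noise-equivalent-bandwidth ratio.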
Not knowing such details might lead to problems like not always waiting long enough for settling. That could cause things like the odd outliers seen once every 160 readings in some cases, though in that case it looks more like a problem in the DMM's internal software.
Since the DMM6500 can take simultaneous readings of current and voltage, is there some script to measure power other than using a shunt resistor?

The instrument cannot measure current and voltage simultaneously. Voltage only, or current only, and switching between them uses up relay life.
How was I able to measure DC voltage using the rear ports with DC current as a secondary measurement, then?
There were no relays switching between the two modes.

MegaVolt is right, the DMM6500 can't measure voltage and current simultaneously, but you might not always hear relays clicking. There are a couple of range combinations that don't require a relay switch, typically because the signal paths use the same range resistors. 10 V and 1 A is one example combo, but there are a few others. This behavior was the same for the 7510 as far as I know, though the range combinations might be different.
I noticed that, but for the ranges where a measurement without relay switching is possible, is there a script that can multiply the current and voltage and give me a power reading, instead of using a shunt resistor and the voltage-ratio method?
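In the absence of a true simultaneous mode, the closest a script can get is back-to-back readings multiplied together. A minimal TSP sketch (ranges chosen to match the 10 V / 1 A no-relay combination mentioned above; command names follow the DMM6500 TSP model and should be checked against the reference manual):

```
-- Sketch: sequential V and I readings multiplied into a power figure.
-- Fixed ranges avoid autorange relay activity; readings are NOT simultaneous.
dmm.measure.func = dmm.FUNC_DC_VOLTAGE
dmm.measure.range = 10                 -- 10 V range
local v = dmm.measure.read()

dmm.measure.func = dmm.FUNC_DC_CURRENT
dmm.measure.range = 1                  -- 1 A range
local i = dmm.measure.read()

print(string.format("P = %g W", v * i))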
Oh well in that case: I gathered some info from the design engineers. We're really quite happy to share details, but it helps us if you're specific in what you want to know and why you want to know it. I think I addressed all the questions below:
Quote: What is the input bandwidth for DCV?

From a purely hardware perspective, up to but not including the ADC: up to hundreds of kHz, to support the DCV digitizer function on the 10 V and lower ranges. The 100 V and 1000 V ranges are a lot lower than that due to the internal 10 MΩ divider. Generally similar to the digitizer specs.
Quote: How are the processors and the trigger system connected inside? And any other information on the instrument: schematics, block diagrams, sketches.

The processors/trigger/measurement relationship is too complicated to document here, and it changes depending on the specifics of how you're measuring. Is there some case you had in mind? We want to try to answer your question, but we could make hundreds of different block diagrams of the box.
Some internal pictures of the DMM6500 can be found here. Perhaps they can shed some more light on how it is designed and how it works.
Are there any plans to release a calibration/service manual for the DMM6500? I'd be interested to see the calibration points for ACV.
Quote: Oh well in that case: I gathered some info from the design engineers. We're really quite happy to share details, but it helps us if you're specific in what you want to know and why you want to know it.

Many thanks to Brad O for the information gathered.
Then, if you do not mind, I will continue asking questions.
Quote: What is the input bandwidth for DCV? From a purely hardware perspective, up to but not including the ADC: up to hundreds of kHz, to support the DCV digitizer function on the 10 V and lower ranges. The 100 V and 1000 V ranges are a lot lower than that due to the internal 10 MΩ divider. Generally similar to the digitizer specs.

Does this mean that I must add an external anti-aliasing filter to eliminate the effect of spectrum overlap, and at the same time reduce the input noise level?
Quote: The processors/trigger/measurement relationship is too complicated to document here, and it changes depending on the specifics of how you're measuring. Is there some case you had in mind?

For example, I absolutely do not understand how I can get all the signal samples through the remote interface. Not just one buffer, but all the data, without gaps. I do not understand how the data flow. How do screen updates or network communication affect data gathering? How should I communicate over the network so as not to interfere with the measurements? I do not understand how to set up triggers and buffers so that no data are lost while I still have time to read them out over the network.
I am also very interested in the type of the 10 MΩ resistor and the way it is connected to the input.
For the digitize functions there is a high-frequency low-pass filter ahead of the ADC, but it is not sufficient (with respect to the noise floor) to eliminate all alias effects for a signal with content above 500 kHz. Whether or not that matters to your measurement depends on what the signal looks like (FFT-wise).
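For a quick feel of where out-of-band content lands, here is a small sketch. It is plain arithmetic assuming an ideal sampler; the instrument's real response also includes the low-pass filter mentioned above:

```
-- Sketch: apparent (aliased) frequency of a tone at f when sampled at fs.
local function alias_freq(f, fs)
  local r = f % fs
  if r > fs / 2 then r = fs - r end
  return r
end

print(alias_freq(900e3, 1e6))   -- a 900 kHz tone appears at 100 kHz
print(alias_freq(300e3, 1e6))   -- in-band content is unaffected
```

So whether an external anti-aliasing filter helps depends on whether your source actually has content above fs/2; for broadband noise measurements it will also reduce the apparent noise level simply by limiting the bandwidth.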
It is definitely possible for the instrument to take data faster than it can transmit it. No remote interface has a true "streaming" option (that is possible with KickStart, though it still isn't magically faster than the bus). Which interface are you trying to use (LAN/USB/GPIB)? Which measurement function are you using? What kind of sample rate are you trying to achieve?
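If capture-then-dump is acceptable, the usual pattern is to fill a buffer and read it back in chunks afterwards. A sketch (the chunk size is arbitrary; printbuffer and the defbuffer1 attributes follow the TSP command set, so check the reference manual):

```
-- Sketch: read a large reading buffer back in chunks after the capture.
local chunk = 10000
local n = defbuffer1.n                     -- number of stored readings
for first = 1, n, chunk do
  local last = math.min(first + chunk - 1, n)
  printbuffer(first, last, defbuffer1.readings, defbuffer1.relativetimestamps)
end
```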
However, nearly 1 ms would be relatively long.
There may be some extra adjustment measurements (e.g. AC scale, temperature, etc.) going on too. For a more sigma-delta-like ADC there can also be some time for the "soft" start of the aperture window.
In principle, an SD-like ADC could get away without an extra delay in a non-auto-zero mode, if slight overlap/correlation is accepted.
Another possible reason may be the time for data transfer from the ADC to the display/output part of the meter.
Are these directly read delays (e.g. from the time stamps in the data file), or are these numbers only calculated back from the data rate?
Disable temperature correction
Before you start your adjustment, you must turn off temperature correction. Run the following commands to turn it off.
cal.adjust.step.setup("TC_EN")       -- select the temperature-correction step
cal.adjust.step.execute("TC_EN", 0)  -- execute with 0 to disable it
Input impedance is forced to 10 MΩ by default to prevent apparent errors like "my DMM is measuring 10 V with nothing connected", which our support engineers used to get many, many calls about.
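The default can be changed per function if the 10 MΩ load matters for your source. A sketch (the inputimpedance attribute and enum names follow the TSP command model; verify against the reference manual):

```
-- Sketch: select automatic input impedance (>10 G on the low DCV ranges)
-- instead of the default fixed 10M.
dmm.measure.func = dmm.FUNC_DC_VOLTAGE
dmm.measure.inputimpedance = dmm.IMPEDANCE_AUTO
-- dmm.measure.inputimpedance = dmm.IMPEDANCE_10M   -- the default
```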
I measured the time per reading in the digitize voltage mode. The reading rate I actually achieved at each sample-rate setting was:
1 kS/s setting - 924 readings/s
10 kS/s - 5506 readings/s
100 kS/s - 11096 readings/s
1 MS/s - 13199 readings/s
Can anyone tell me what combination of settings is needed to actually see 1 MS/s?
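One way to read those numbers (simple arithmetic on the figures above; interpreting the excess time as per-trigger overhead is my assumption):

```
-- Sketch: per-reading overhead implied by nominal vs. achieved rates.
-- If 1/achieved = 1/nominal + overhead, then overhead = 1/achieved - 1/nominal.
local data = {
  { 1e3, 924 },
  { 1e4, 5506 },
  { 1e5, 11096 },
  { 1e6, 13199 },
}
for _, d in ipairs(data) do
  local nominal, achieved = d[1], d[2]
  local overhead_us = (1 / achieved - 1 / nominal) * 1e6
  print(string.format("%g S/s: ~%.0f us overhead per reading", nominal, overhead_us))
end
```

All four settings work out to a roughly constant 75 to 82 µs per reading, which fits the explanation given further down: per-reading triggering, not the ADC, is the bottleneck, and within a single count that overhead disappears.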
I haven't seen any problem filling a 7M buffer in exactly 7 seconds.
You can't capture in real-time at that speed through any external link. You have to use a large buffer and dump it afterward.
I guess it starts making sense when you have a trigger model running instead of auto-triggering, because between capture operations there's a lot of code to run. It seems logical that the samples per second are stable only within a capture count.
It's a bit like a scope's segmented memory, the rate of triggering is not the same as the ADC rate.
It's because it cannot run the trigger script a million times per second. To get a guaranteed timing there's no way around setting the size of your segments.
Quote: It's because it cannot run the trigger script a million times per second. To get a guaranteed timing there's no way around setting the size of your segments.

For the digitize V mode, is it possible to somehow disable the trigger and possibly the screen?
Are there any other ways to get continuous data into the buffer? I am particularly interested in the DCV mode.
Is there a tutorial somewhere on setting up triggers and writing scripts?
You mean continuously overwriting itself? It doesn't look like it, not at 1MS/s.
I'm not sure how it works exactly, but the buffer needs to be processed in software after each acquisition (it says "processing backlog" if it cannot do it in real time). So in the very high-speed digitizing modes it cannot be a free-running loop regardless of the buffer size. In fact, when I set a massive buffer it automatically disables the continuous trigger, which is fine considering that processing millions of samples (calculating averages, standard deviation, etc.) is a pretty big operation. But yes, it would have been nice to sample continuously and process only when stopping.
The trigger interface is like visual block programming directly on the device: relatively easy when you know what you're trying to accomplish, and much more powerful than the hard-coded schemes I'm used to. I haven't read the manual yet; I'm still playing around.
MikeP:
Probably you set the digitize count to 1 and went into continuous trigger mode? When you do that, triggering is handled by the display processor, so when that processor gets busy it will stop triggering and catch up with whatever else it's being told to do (like updating the graph image). That's where those gaps in the data come from. A couple of ways around this come to mind:
- Set a higher count to capture all the data you need. Triggering within a count set is handled by a separate processor that won't get caught up in display work; you also won't see the lines connecting separate groups of data, which can be a little confusing. The graph's smart x-scaling will by default show you the latest group of readings.
- Use a trigger model. Triggering from a trigger model is also handled by a separate processor, and a very simple model can allow continuous data capture. I'm attaching a script (change the .txt to .tsp to use it) whose last lines set up a trigger model that starts a digitize voltage measurement (which continues indefinitely) and then stops the measurement when the TRIGGER key is pressed. The display might lag slightly depending on your other settings, but there won't be any gaps in the data. If you digitize at a really high rate you may see a pop-up like "Processing reading backlog...". That message means the display processor needs to catch up with the data buffer; it will stop other activities until it does, usually no more than a second or two.
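The essential part of such a script might look like this. This is a sketch reconstructed from the description above, not the attached script itself; the block layout, the COUNT_INFINITE/COUNT_STOP pairing, and the EVENT_DISPLAY name for the TRIGGER key should all be checked against the reference manual:

```
-- Sketch: digitize voltage continuously until the TRIGGER key is pressed.
dmm.digitize.func = dmm.FUNC_DIGITIZE_VOLTAGE
dmm.digitize.samplerate = 1e6
defbuffer1.capacity = 7000000              -- large buffer for high-rate capture

trigger.model.load("Empty")
-- block 1: start digitizing into defbuffer1 and keep measuring indefinitely
trigger.model.setblock(1, trigger.BLOCK_MEASURE_DIGITIZE, defbuffer1, trigger.COUNT_INFINITE)
-- block 2: wait here until the front-panel TRIGGER key is pressed
trigger.model.setblock(2, trigger.BLOCK_WAIT, trigger.EVENT_DISPLAY)
-- block 3: stop the infinite measurement started in block 1
trigger.model.setblock(3, trigger.BLOCK_MEASURE_DIGITIZE, defbuffer1, trigger.COUNT_STOP)
trigger.model.initiate()
```

Because the measure/digitize block runs on the dedicated trigger processor, the display can lag without creating gaps in the buffer.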