For my current project, I want to automate a number of tests and measurements. I'm starting with a Rigol DS2072 oscilloscope and a Rigol DG4062 signal generator. To get set up, I figured I'd do something simple like profiling the frequency response of an RC low-pass filter. I'm working in Python with PyVISA and the NI-VISA backend, and the test equipment is connected over Ethernet.
Here's what I've been able to do: connecting to the equipment is easy, and changing configurations works without a problem. I was able to build the profile by stepping the signal generator output, allowing some settling time, and then querying the amplitude of the oscilloscope input. After adding some automatic gain control to keep the measured sine wave within the scope's accurate measurement region, I got a response curve that very closely matches the predicted one.
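The stepped sweep loop is essentially the following sketch. The SCPI command names are from memory of Rigol's DG4000/DS2000 programming guides and should be double-checked there; the settling delay is the hypothetical knob that dominates the run time.

```python
import time

import numpy as np

def sweep_points(f_start: float, f_stop: float, n: int) -> np.ndarray:
    """Logarithmically spaced test frequencies for a Bode-style sweep."""
    return np.logspace(np.log10(f_start), np.log10(f_stop), n)

def measure_response(gen, scope, freqs, settle_s: float = 0.5):
    """Stepped sweep: set frequency, wait for settling, read amplitude.

    `gen` and `scope` are open PyVISA instrument handles. The 0.5 s
    settle time is an illustrative guess, not a measured requirement.
    """
    response = []
    for f in freqs:
        gen.write(f":SOUR1:FREQ {f}")                    # DG4062: set output frequency
        time.sleep(settle_s)                             # the slow part of the whole scheme
        vamp = float(scope.query(":MEAS:VAMP? CHAN1"))   # DS2072: amplitude readback
        response.append(vamp)
    return np.array(response)
```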
Here's the problem: doing the measurement this way is **slow**, mostly because of the significant time needed to let the generator make the change and settle. I've also noticed spurious readings from the scope, so I take a number of readings and average only those within the interquartile range (IQR).
What I would like to do is have the generator sweep over the frequency range and remove the settling-time issue entirely. To do that, though, I need to read the waveform from the scope and make the calculation myself. But every time I read the waveform data, I get only an array of zeros.
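The read sequence I've been attempting looks roughly like this (a reconstructed sketch; the exact `:WAVeform` command names should be verified against the DS2000 programming guide). The parser also strips a leading IEEE-488.2 definite-length block header, in case the zeros are a parsing rather than an acquisition problem:

```python
def parse_ascii_waveform(raw: str):
    """Parse a comma-separated ASCII waveform response into floats.

    Strips a leading IEEE-488.2 block header like '#9000001234'
    if one is present at the front of the response.
    """
    raw = raw.strip()
    if raw.startswith("#"):
        ndigits = int(raw[1])       # how many digits encode the payload length
        raw = raw[2 + ndigits:]     # skip '#', the digit count, and the length field
    return [float(v) for v in raw.split(",") if v]

def read_waveform_ascii(scope):
    """One attempt at an ASCII waveform read on the DS2072."""
    scope.write(":WAV:SOUR CHAN1")   # select the source channel
    scope.write(":WAV:MODE NORM")    # screen data rather than deep memory
    scope.write(":WAV:FORM ASC")     # ASCII, comma-separated values
    raw = scope.query(":WAV:DATA?")  # this is where I get all zeros
    return parse_ascii_waveform(raw)
```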
I installed marmad's Rigol UltraVision Utilities software and was somehow able to get it to send me a series of frames, the first being all zeros and the rest containing real data that matched the input signal. So clearly I am doing something wrong.
Could someone familiar with Rigol SCPI programming explain the correct sequence of commands to read a full waveform in ASCII?