Referring to the user guide, this scope doesn’t have sin(x)/x interpolation and the display mode is vector only; to my knowledge it’s the same as in the MSO5000 scopes.
Check section 19.1, page 238 in the user guide.
(Attachment Link)
I meant a display where the signal isn't being represented using signal theory math.
(or, at least, the best approximation to it which can be done within the constraints of the device)
If you turn off sin(x)/x then what do you use to display the signal? Linear interpolation?
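For what it's worth, the difference between the two is easy to see numerically. Here is an illustrative Python sketch (not any scope's actual implementation) comparing linear interpolation against a truncated sin(x)/x (Whittaker-Shannon) sum, evaluated halfway between samples of a sine at 0.4x the sample rate:

```python
import math

def sinc(x):
    # Normalized sinc: sin(pi*x) / (pi*x), with sinc(0) = 1
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_interp(samples, t):
    # Truncated Whittaker-Shannon reconstruction at fractional index t
    return sum(s * sinc(t - n) for n, s in enumerate(samples))

def linear_interp(samples, t):
    # Straight line between the two samples bracketing t
    n = int(math.floor(t))
    frac = t - n
    return samples[n] * (1.0 - frac) + samples[n + 1] * frac

f = 0.4                          # signal frequency, as a fraction of fs
samples = [math.sin(2 * math.pi * f * n) for n in range(200)]

t = 100.5                        # halfway between two mid-record samples
true_val = math.sin(2 * math.pi * f * t)
err_lin = abs(linear_interp(samples, t) - true_val)
err_sinc = abs(sinc_interp(samples, t) - true_val)
# Linear interpolation misses badly this close to Nyquist;
# the sinc reconstruction stays near the true value.
```

At 0.4x the sample rate the linear estimate is off by more than half the signal amplitude, while the sinc sum lands close to the true value; at frequencies low relative to fs the two converge.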
The 1M tests used an open input. If I used an external 50 ohm termination, that would effectively be the same as the 50 ohm mode, unless there are actually different paths for the 50 ohm and 1M inputs, which is usually not the case for scopes.
So that graph you just posted is correct. I thought you wanted an open input?
For the comparison of the RMS noise with the 20 MHz filter enabled, the type of filter can make a difference, as the noise BW is different from the -3 dB BW.
Did you even watch Dave's video?
The HDO4000 doesn’t have a bode plot option, nor sin(x)/x.
It's got Sinx/x, but you can't turn it off.
Can anyone explain to me what Agilent/Keysight does differently than anyone else that lets their scopes have highly responsive UIs? I watched Dave's video of the Rigol, and I'm just immensely disappointed by how clunky the UI seemed to operate. Input lag on the touchscreen, moving waveforms around causes all of the updating to stop, the UI seems to slow to like 8 FPS, and so on.
...then you go watch Dave's video of the Agilent 3000 series from 11 years ago, and the UI is consistently fast/responsive. The scope keeps updating even when moving things around on the screen, and the whole thing seems to run at a constant 30 (60?) FPS.
11 years later, and other scopes still don't match it. What's the deal here? You'd think 11 years of embedded CPU improvements (I saw the Rigol has a 6-core CPU!) would allow other manufacturers to catch up. Does Keysight just do something inherently different with their architecture compared to everyone else?
It's got Sinx/x, but you can't turn it off.

Turning it off would produce a mathematically incorrect display.
The Keysight MegaZoom IV ASIC does the dedicated work of drawing the screen.
The only thing that Keysight are "stuck" with is the internal 4M sample memory.
If the MegaZoom V just had external memory, it would still be a killer today.
The noise bandwidth is always different from the -3 dB bandwidth because of the shape factor of the filter, unless you manage a brick wall. A single-pole rolloff, which is typical for a 20 MHz filter, yields a noise bandwidth about 1.6 times the -3 dB bandwidth. Some old oscilloscopes even used higher-order filters, but it was pretty rare. DSOs could of course do all kinds of weird things, but it is important to have a physical filter to prevent aliasing of the noise, unless filtering is done during decimation.
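The single-pole factor can be checked numerically: integrating the filter's power response over frequency gives an equivalent noise bandwidth of (pi/2), about 1.57 times the -3 dB frequency. A small Python sketch (the 20 MHz value is just an example figure):

```python
import math

# Single-pole low-pass: |H(f)|^2 = 1 / (1 + (f/f3)^2).
# The equivalent noise bandwidth is the integral of |H(f)|^2 over
# frequency, which for one pole works out to (pi/2) * f3.

f3 = 20e6              # -3 dB frequency, e.g. a 20 MHz BW limit
steps = 1_000_000
upper = 1e10           # integrate far out on the tail
df = upper / steps

# Midpoint-rule integration of the power response
enbw = sum(1.0 / (1.0 + ((i + 0.5) * df / f3) ** 2)
           for i in range(steps)) * df

print(enbw / f3)       # close to pi/2, i.e. about 1.57
```

This is why two scopes with identical -3 dB bandwidth limits can still show different RMS noise: a steeper filter shape brings the ratio closer to 1.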
Yes and no. MegaZoom scopes render the waveform on tiny screens, in an even tinier screen area; they would need to make a new one with higher resolution. The MegaZoom IV screen would look like a thumbnail photo on the screen of any of the new touchscreen scopes: its hardcoded plotting area is 640x400, so it would need to quadruple the pixel count (screen area).

MegaZoom is fast because it deals with little data and because what it does is hardcoded: it calculates on decimated data, has a very limited number of FFT points, etc. What many people call virtues (fast response time, for those who think that is the most important feature) was achieved by sacrificing other capabilities, while plenty of other people think large memory, advanced measurements on the full record, etc. are more important. The market seems to need both, and both are being sold. Tool for the job.
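The "calculates on decimated data" point can be illustrated with min/max decimation, a common way to collapse a deep capture onto a fixed-width raster so that narrow peaks still show up. This is a generic sketch of the technique, not MegaZoom's actual algorithm:

```python
import math

def decimate_minmax(samples, width=640):
    """Collapse a long record into (min, max) pairs, one per pixel column.

    Drawing a vertical line from min to max in each column preserves
    narrow peaks that plain every-Nth-point decimation would drop.
    """
    n = len(samples)
    columns = []
    for x in range(width):
        lo = x * n // width
        hi = max(lo + 1, (x + 1) * n // width)
        chunk = samples[lo:hi]
        columns.append((min(chunk), max(chunk)))
    return columns

# 40k points of a 1 kHz sine sampled at 4 MSa/s collapse to 640 columns
capture = [math.sin(2 * math.pi * 1000 * n / 4_000_000)
           for n in range(40_000)]
cols = decimate_minmax(capture, 640)
```

Because only 640 (min, max) pairs ever reach the plot stage, the drawing cost is fixed regardless of record depth, which is part of why this style of pipeline stays fast.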
WTF is signal theory math?
If you turn off sin(x)/x you get the raw samples, to treat how you will. If you turn on sin(x)/x you get one filtered version of the samples, which is probably the best kind of filter for a broad range of applications. However, like all filters it has plus and minus points, especially with regard to how the phase gets mangled. There is nothing magically "correct" about sin(x)/x.
No, sin(x)/x is not a filter! sin(x)/x is a method for constructing a visible signal from samples. By definition, the trace you get from sin(x)/x reconstruction goes through all the sample points. There is no phase mangling, no adding of fictional information, or whatever.
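The goes-through-all-the-sample-points property follows directly from the kernel: a shifted sinc is 1 at its own sample index and 0 at every other integer. A quick Python check of the Whittaker-Shannon sum:

```python
import math

def sinc(x):
    # Normalized sinc: sin(pi*x) / (pi*x), with sinc(0) = 1
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, t):
    # Whittaker-Shannon sum: each sample contributes a shifted sinc
    return sum(s * sinc(t - n) for n, s in enumerate(samples))

samples = [0.0, 0.7, -0.3, 1.2, 0.4]   # arbitrary sample values
# At integer t, every other sample's sinc term vanishes, so the
# reconstruction reproduces the stored sample exactly.
for n, s in enumerate(samples):
    assert abs(reconstruct(samples, float(n)) - s) < 1e-12
```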
At low frequencies it works really well. The closer you get to the Shannon rate, the more fiction occurs between the actual samples, and most displays don't even highlight where the actual samples are. You can get very misleading images.
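The effect is easy to reproduce: sample a full-scale sine just below the Nyquist rate and look only at the raw sample values, as a display without sample markers would show them. An illustrative Python sketch at 0.48x the sample rate:

```python
import math

f = 0.48                         # signal frequency as a fraction of fs,
                                 # just under the Nyquist limit of 0.5
samples = [math.sin(2 * math.pi * f * n) for n in range(50)]

# The true amplitude is 1.0 throughout, but the sample values trace a
# slow beat envelope: near n = 0 and n = 25 they pinch toward zero.
# Without proper reconstruction (and with no markers on the actual
# samples) the trace looks amplitude-modulated when it isn't.
print(max(abs(s) for s in samples[:4]))    # far below full scale
print(max(abs(s) for s in samples))        # elsewhere nearly full scale
```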
The interpolation filter cannot deal with aliasing. One may get something close to a sin(x)/x reconstruction if part of the 200 MHz BW limit is actually realized digitally rather than in analog.
The manual that was posted earlier appears unfinished; I wouldn't rely on it for a complete and accurate list of features.