... comes with Danaher grade non-support.
As for the RTE, we had an offer (for 350 MHz) that was well below 7k, so it's quite cheap, just 1k extra over the RTM.
And a good 3.5k below an HDO6k.
Sadly, as always, bosses decide only on numbers; guess why I have a 100 MHz DPO3k on my bench every day. It's f.ing cheap, but well, it DOES suck.
In the real world, 2 MS is a lot of sample data, 64K is a long FFT record, and sub-millivolt signals are often best observed with said FFT.
Not in 2015, no, and even less so for a 1 GHz+ scope with sample rates in excess of 4 GSa/s. 2 Mpts of memory was pretty good (not stellar, though) 15 years ago, but even back then a 64k FFT was pretty poor.
To offer a scope like the Keysight DSOX4k in the price range it sits in with 4M memory and 64k FFT is a pretty bad joke in this day and age.
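To put rough numbers on why a short FFT record hurts at scope sample rates, here is a back-of-the-envelope Python sketch (the function name is mine, and "64k" is taken as 65,536 points): FFT frequency resolution is simply the sample rate divided by the record length.

```python
# FFT frequency resolution = sample_rate / fft_length.
# Rough illustration of why a 64k-point FFT is coarse at high sample rates.

def fft_bin_width_hz(sample_rate_hz: float, fft_points: int) -> float:
    """Width of one FFT bin in Hz."""
    return sample_rate_hz / fft_points

# At 4 GSa/s (typical for a 1 GHz-class scope):
coarse = fft_bin_width_hz(4e9, 65_536)      # 64k-point FFT
fine = fft_bin_width_hz(4e9, 2_000_000)     # 2M-point FFT

print(f"64k FFT: {coarse / 1e3:.1f} kHz per bin")  # 61.0 kHz per bin
print(f"2M FFT:  {fine / 1e3:.1f} kHz per bin")    # 2.0 kHz per bin
```

With bins tens of kilohertz wide, closely spaced spurs simply merge, which is why a long FFT record matters far more than the headline memory number suggests.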
I guess you haven't had to use their high end scopes (e.g. the DSO90k) where some firmware versions suffered from jumping encoders, wrong indications (e.g. showing channels as set to GND when they weren't) and other silly things. Granted, these problems have been fixed, but Agilent didn't exactly rush out to squash these bugs.
Curious, what do you tend to work on (either at home or work) that frequently needs longer sample records and larger FFT kernels?
Quote: To offer a scope like the Keysight DSOX4k in the price range it sits in with 4M memory and 64k FFT is a pretty bad joke in this day and age.
How the memory is used is much more important than how much there is, in my experience.
Everyone's application is different, but from what I've seen in the scope business, it is far too easy to get caught up in "bench racing" and specsmanship. You end up with a crappier instrument, albeit one with larger numbers on the sell sheet.
It's much like the thread on microscopes, where Rupunzell points out that quality can't really be inferred from the specs alone.
I don't care that much about my microscope, but I rely on my oscilloscope to cover my own shortcomings as an engineer, so it's almost literally a part of me.
Parameters like record length aren't limited only by marketing considerations in the Agilent/Keysight scopes, the way they are in everyone else's. They are limited by the ASICs that they use to process and display data.
The sample record is shorter in my MSO6000-series scope than it would be in an optioned-out Tek MDO, but the fact that I don't have to fiddle with it to get the display I want counts for a lot. Let's see, which button do I press now? Do I want a "fast acquisition" or a "long acquisition"? Gee, I dunno. How about you just show me the signal?
Quote: I guess you haven't had to use their high end scopes (e.g. the DSO90k) where some firmware versions suffered from jumping encoders, wrong indications (e.g. showing channels as set to GND when they weren't) and other silly things. Granted, these problems have been fixed, but Agilent didn't exactly rush out to squash these bugs.
This is true. I've been lucky so far, in that I've been able to stick with the last generation of Agilent scopes to run VxWorks. The controls are responsive and reliable (if not the best-feeling.) Not many firmware bugs, either.
I don't look forward to "upgrading" to a scope running a heavyweight desktop OS...
Sometimes you can compensate for or work around sample memory limitations to an extent, but at the end of the day, if the sample memory is small then it will limit a scope's performance.
OK, convince me for a "typical case" user of an oscilloscope looking at the time-domain representation of a signal. Oscilloscope != spectrum/vector/network analyser.
If you have a long-duration signal, when isn't it sufficient to digitise at the full rate? Provided you then present three values at each displayed point (mean, min and max) so as to mimic the delightfully informative "fuzziness" that can be seen on analogue scopes.
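That display scheme can be sketched in a few lines of Python (a hypothetical helper, assuming the raw record is simply chunked into one window per displayed column):

```python
# Min/max/mean decimation: compress a long sample record to one
# (min, max, mean) triple per displayed column, preserving the
# "fuzziness" an analogue trace shows. Illustrative sketch only.

def decimate_min_max_mean(samples, columns):
    """Split samples into `columns` chunks; return (min, max, mean) per chunk."""
    chunk = len(samples) // columns
    out = []
    for i in range(columns):
        window = samples[i * chunk:(i + 1) * chunk]
        out.append((min(window), max(window), sum(window) / len(window)))
    return out

# 8 samples mapped onto 2 display columns:
print(decimate_min_max_mean([0, 4, 2, 2, 1, 1, 1, 1], 2))
# → [(0, 4, 2.0), (1, 1, 1.0)]
```

The min/max pair preserves glitches that plain averaging would erase, while the mean carries the analogue-style intensity information.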
Background: I've been using analogue scopes for 40 years, and digital scopes for 25 years. I have my own opinions as to the advantages, disadvantages, and overlap of each. I like analogue displays, and eye diagrams (and similar "heat maps") when appropriate. I once (in ~1992) had a student build a deep digitiser for just your use case, analysis of comms signals.
I'm a strong believer in debugging digital signals in the digital domain, and that almost all signals are analogue (except where we choose to interpret them as digital).
Hence I will claim that anything equivalent to using a scope to capture and decode, for example, the data on a serial peripheral interface (SPI) is misguided. After all, noise/thresholds/settings may cause the receiver to interpret the analogue signal in a different way. Use a scope to ensure signal integrity, then debug what the receiver reckons it has received.
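The threshold point can be shown with a toy example (made-up voltages, not a real SPI decoder): the same analogue samples can slice to different bit patterns depending on where a decoder, or the real receiver, places its logic threshold.

```python
# Toy illustration: one analogue waveform, two logic thresholds,
# two different decoded bit patterns. A scope's protocol decode may
# therefore disagree with what the actual receiver sees.

def slice_to_bits(voltages, threshold):
    """Compare each sampled voltage against a logic threshold."""
    return [1 if v > threshold else 0 for v in voltages]

# A marginal waveform: one sample sits at 1.2 V, between common thresholds.
samples = [0.1, 3.2, 1.2, 3.1, 0.2]

print(slice_to_bits(samples, 0.8))  # low threshold  → [0, 1, 1, 1, 0]
print(slice_to_bits(samples, 1.5))  # high threshold → [0, 1, 0, 1, 0]
```

A marginal level that one threshold calls a 1 and another calls a 0 is exactly the case where a decode on the scope and the behaviour of the receiver diverge.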
Quote: Hence I will claim that anything equivalent to using a scope to capture and decode, for example, the data on a serial peripheral interface (SPI) is misguided. After all, noise/thresholds/settings may cause the receiver to interpret the analogue signal in a different way. Use a scope to ensure signal integrity, then debug what the receiver reckons it has received.
Great, so instead of using one instrument which has the capabilities to thoroughly analyze the signal on both the physical and the logical level to make sure the input signal is really OK (and just looking at the signal integrity is a far cry from making sure it is!), you want to rely on a receiver which (especially if it's part of the UUT) might or might not work. And if it doesn't, you can't really say whether it's a problem with the sink or the source, because you don't know what's actually going in (and it's not just the physical signal where problems can occur).

You could of course then try another receiver, and if it works, conclude that the first receiver is broken. But it might just be that there *is* a problem with the source, and the second receiver simply has a wider tolerance in some of its specifications (or a different firmware, or some other minor difference) which allows it to work where the first receiver didn't. Which means that while you now believe your transmission system is working again, there's still a problem, and it will stop working as soon as you replace the sink with one with slightly tighter specs.

All the while, you could have just used the scope to listen to what's going on over the transmission line, which would have shown instantly that there's a flaw in some of the message formats sent by the source.
You stated that you have over 40 years of experience with analog scopes, and please don't take this the wrong way, but I think it shows. The example above is pretty much taken from reality (a case I witnessed where the EE relied on the receiver to verify that the transmission system was working when it really wasn't; as it turned out, he did so because that's the approach he learnt many moons ago, and he didn't really know how to work with the advanced features of the scope on his bench). It reflects a problem I see pretty often with older engineers who trained, learnt and spent most of their time on analog scopes and simple DSOs, and who don't really have a grip on how to make use of the advanced analysis capabilities of a modern mid-range or high-end DSO (which at the end of the day is pretty much a signal analyzer). It doesn't mean they are bad engineers (a lot of them are actually pretty good), but quite often they waste time and effort working around things that are no longer a problem with modern scopes.
I have seen cases where people have relied on the interpretation of a signal by a general purpose oscilloscope, thought they did (or didn't!) see any fault - only to find out that the real receiver was interpreting the signal in subtly different ways. In those cases the "added functionality" actively misled them and delayed their finding the problem.

Quote: Being able to find the cause of a problem is a talent in itself and does not have much to do with the tools. More sophisticated tools only make it easier/faster to check/verify parts of a circuit, but it is still the person operating them who needs to have the brains to work out where the problem is.
You acknowledge that you have a fairly specialised requirement, and I don't disagree with your assessment of the right tool for your job. I suspect I would come to the same conclusion as you.
My question, which you chose not to address, was about tools for less specialised, more general requirements.
Well of course you have to examine the entire signal chain to determine where the fault is being introduced. That's statin' the bleedin' obvious!
N.B. receiver != radio receiver, except in special cases.
I have seen cases where people have relied on the interpretation of a signal by a general purpose oscilloscope, thought they did (or didn't!) see any fault - only to find out that the real receiver was interpreting the signal in subtly different ways. In those cases the "added functionality" actively misled them and delayed their finding the problem.
You have omitted to recognise that I have been using digital scopes for 25 years.
You could not possibly know that I have been working on novel instrumentation for 30 of the past 40 years, on many things from optical fibres through multiple LANs and wireless LANs to cellular phone systems (with diversions into road surfaces and building materials). Much of that was in HP Labs, working with the RF departments of local universities and with the people who designed the front-ends for their most advanced scopes.
So yes, I am very familiar with understanding new tools' strengths, limitations, and how they can be misleading - just like all tools.
I would add that a more sophisticated/complex tool has more and more subtle ways of leading people astray. That's especially true if they are not already familiar with it and can't/won't take the time to learn what it is doing and how to use it.
Quote: I would add that a more sophisticated/complex tool has more and more subtle ways of leading people astray. That's especially true if they are not already familiar with it and can't/won't take the time to learn what it is doing and how to use it.

Any professional should know their tools. Where I work it's pretty much expected that engineers take the time to familiarize themselves with new equipment they might work with. That includes having a look at the manual and taking, say, a new scope and playing around with it for half an hour.
If you want long recording memory,
check out Rigol DS6104, 140Mpts. Warning: loud fan and some minor software bugs, no big issues though.
For "industry standard" UI and spectrum analyzer: MDO3k/4k
For making Wuerstchenhund happy: LeCroy WaveRunner 3k.
Warning: not as widely used as HPAK or Tek.
The reason I say Tek is "standard" is that in so many academic papers the scope screenshots are from Teks.
Sorry for suggesting the wrong product, because I have no idea about LeCroy's line up, and I've never seen one in real life. In my place it is pretty much all Tek and HPAK.
Rigol has segmented memory and advanced triggers, just not as good as other brands'. If you think thoroughly about the signal to be tested, and tailor a proper trigger condition for it, it will work just fine.
BTW, ETS is not the biggest reason I returned that unit. The reason is that I found a $6000 MSOX3104A offer, plus the carelessness of Rigol's quality control (an angled BNC connector) and the really loud fan.
I managed to measure phase delay with other methods; as usual, an improvised tool works best for its intended job.
And it's not just the memory, search and triggers that are not as good as those of other scopes in that price range; there's also the FFT, which does what, 4k points or so?
What data set size is used to calculate the FFT on the DS series of scopes?
The FFT is calculated using the displayed data set.
DS1000E/D/B/CA series scopes = 600 data points
DS1000Z series = 1,200 points (displayed), 16,384 (memory)
DS2000 series = 2,048 points
DS4000 series = 2,048 points
DS6000 series = 2,048 points
It's actually 2048 points according to the following app note: FFT data depth. Though this app note says it's 700 for the 4k and 6k, the brief for that app note again says 2048:

Quote: What data set size is used to calculate the FFT on the DS series of scopes?
The FFT is calculated using the displayed data set.
DS1000E/D/B/CA series scopes = 600 data points
DS1000Z series = 1,200 points (displayed), 16,384 (memory)
DS2000 series = 2,048 points
DS4000 series = 2,048 points
DS6000 series = 2,048 points
I ended up with an R&S RTE1204; the extra bandwidth did not cost much more, and we got a pretty good deal trading in an old scope and buying used demo probes.
LeCroy did not have any opportunity to come and do a demo (it was very short notice...), and I did not want to order without trying them out first.
The scope has arrived now, but I'm away from work, so it's not yet unboxed.