Hi Martin. Will Siglent be able to decode an IR remote control signal?
Hi,
No, neither RECS-80 nor RC-5 nor Hitachi code.
Hm... if it could do that in a future firmware update, that would be unique…
Just wondering whether the SDS1104X-E would be enough, because it is significantly cheaper. I know it has only 1 GS/s and about 14 Mpts of memory. My current Agilent also has 1 GS/s, but only 4 kpts of memory. And I have no idea what the advantage of 50 Ohm inputs would be. It just nags me whether the cheaper 4-channel scope would be sufficient for me.
As always, it depends on what you want to do with it.
I've owned a Siglent SDS1104X-E and was pleased with it. Maybe it has everything you need; if so, go for it and save a lot of money.
Can you briefly list just the basic differences between them? I mean, besides the memory and the better sampling rate, what else does the 2000 model have or do better? Things like math functions?
Hi,
Bigger screen (10" instead of 7"), zone trigger, auto-sense probe inputs, 50 Ohm inputs, formula editor (math), a "real" built-in AWG, touchscreen. Have a look at the two spec sheets:
SDS1000X-E / SDS2000X Plus
By the way, I just noticed this now…
Tested it again and lowered the timebase to 100 ms: it decodes 15000 packets and that is the limit; at 200 or 500 ms the count stays fixed at 15000.
LOL, the maximum number of frames is already given in the specs…
> Bigger screen (10" instead of 7"), zone trigger, auto-sense probe inputs, 50 Ohm inputs, formula editor (math), a "real" built-in AWG, touchscreen…
Add mouse and keyboard control, power analysis, built-in MSO hardware, a smart fan? (like the SDS5000X), 2 Mpts FFT, RTC, histograms, DMM mode, vertical zoom…
…and 1 ppm timebase accuracy instead of 25 ppm, and so on.
So it's clear. Although I won't use everything, I would cry about not having it. So no compromises :-) SDS2104X Plus it is.
Martin, can you confirm the 2kX Plus also has a smart fan?
My SDS5kX fan runs at full power at boot, but slows to a much quieter level within a minute.
Just checked it out:
No, it stays at the same (quiet) level.
For the record: there is a thread about the SDS5000X (a total mess though, due to corrupt screenshots which make the forum software hide all following posts on page 1) which indicates that measurements are not performed at all for large memory/capture buffer sizes. I.e., it seems as if even something as simple as an "edge counter" measurement is not performed unless you reduce the memory size or zoom in. So it seems the SDS5000X is not able to perform measurements on its whole deep memory, which would somewhat kill the idea of deep memory for certain applications and would be the final nail in the coffin for me. So I wonder if someone has checked this on the SDS2000X+.
Maybe I'm getting it wrong, but I took a 1 MHz square wave (I borrowed my STB-3 today); zoomed in, it measures the edges, and zoomed out it measures them too.
But it couldn't measure more than 200000 edges.
> Maybe I'm getting it wrong, but I took a 1 MHz square wave; zoomed in it measures the edges, zoomed out it measures them too.
Trolls looking for bugs that ain't there !
> But it couldn't measure more than 200000 edges.
Yes and it indicates that.
At least it's measuring something, but this "saturation" is somewhat alarming. I wonder if other measurements, like DC level and the averages etc. calculated from them, are also only performed on part of the measured data.
Besides: does this counter count both edges, or did you configure it that way? With only one edge counted, I would have expected 100k edges at 10 ms/div with a 1 MHz square wave.
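The expected counts are simple arithmetic; here is a quick sketch, assuming the typical 10-division horizontal grid of these Siglent scopes:

```python
# Expected edge counts for a 1 MHz square wave captured at 10 ms/div.
# Assumption: 10 horizontal divisions visible on screen.
timebase = 10e-3    # s/div
divisions = 10      # horizontal divisions
freq = 1e6          # Hz, square wave frequency

capture_time = timebase * divisions      # 0.1 s of signal on screen
periods = round(capture_time * freq)     # 100,000 full periods captured

rising_only = periods                    # one rising edge per period
both_edges = 2 * periods                 # rising + falling edges

print(rising_only, both_edges)  # 100000 200000
```

So the observed 200000 would match counting both edges, and the limit would coincide exactly with what one full screen holds at this timebase.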
> At least it's measuring something, but this "saturation" is somewhat alarming.
And what is the practical use of such a measurement?
> At least it's measuring something, but this "saturation" is somewhat alarming.
For me it's not.
In fact, I only want to measure what's actually on the screen.
A practical example: a triangle waveform with an offset.
To measure it correctly, the whole screen was needed, and only one period.
This scope can handle it, so no worries whatsoever.
> And what is the practical use of such a measurement?
Think about tracking math, where you can see a demodulated PWM waveform. Or, as I wrote before: numerical analysis of a long train of pulses to know the min/max and average. A useful application is measuring the time a microcontroller spends inside an interrupt routine over a longer period of time.
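As a sketch of the kind of pulse-train analysis meant here, done offline on exported sample data: `pulse_widths` is a made-up helper, not a Siglent API, and the synthetic waveform merely stands in for a real capture.

```python
import numpy as np

def pulse_widths(samples, threshold, sample_rate):
    """Return the widths (in seconds) of all complete high pulses."""
    high = samples > threshold
    edges = np.diff(high.astype(np.int8))
    rising = np.flatnonzero(edges == 1) + 1    # first sample of each high pulse
    falling = np.flatnonzero(edges == -1) + 1  # first sample after each pulse
    # Drop a leading falling edge whose pulse started before the capture.
    if falling.size and rising.size and falling[0] < rising[0]:
        falling = falling[1:]
    # Pair each rising edge with the next falling edge.
    n = min(rising.size, falling.size)
    return (falling[:n] - rising[:n]) / sample_rate

# Synthetic capture: 1 kHz square wave, 25 % duty cycle, 1 MS/s, 10 ms long.
fs = 1e6
num_samples = 10_000
idx = np.arange(num_samples)
wave = ((idx % 1000) < 250).astype(float)

widths = pulse_widths(wave, 0.5, fs)
print(len(widths), widths.min(), widths.max(), widths.mean())
```

From the widths array you can then read off min/max/average directly, e.g. the worst-case time spent in an interrupt routine across the whole capture.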
> Think about tracking math, where you can see a demodulated PWM waveform. Or, as I wrote before: numerical analysis of a long train of pulses to know the min/max and average. A useful application is measuring the time a microcontroller spends inside an interrupt routine over a longer period of time.
Statistics and histograms are the more appropriate tools for this.
Performa01 posted screenshots using these functions earlier in this thread, I believe.
> In fact, I only want to measure what's actually on the screen.
"On the screen" is a bit vague. I'm not talking about off-screen data (there shouldn't be any considerable amount of it, if any at all). But the measurements have to be done on the data buffer and not on some screen presentation of the data (like former Agilents did). The saturation hints at a limitation of the measurements, i.e. not all the captured (and displayed) data is used for them. This is obviously a different approach to speeding up measurements, but one that can make deep-memory measurements invalid and thus unusable for certain applications.
BTW: did you check this in single or normal trigger mode (not stopped)?
> Statistics and histograms are the more appropriate tools for this.
But those are also devalued by limiting the measurements to a certain amount of data. Is this threshold (200k) at least documented somewhere?
> Statistics and histograms are the more appropriate tools for this.
But where do the statistics come from if the amount of data that can actually be analysed is limited? You keep thinking in terms of periodic signals rather than an entire waveform belonging to a test case you want to analyse. In my world, periodic signals are very uninteresting. The whole point of deep memory is being able to analyse entire waveforms.
> But those are also devalued by limiting the measurements to a certain amount. Is this threshold (200k) at least documented somewhere?
No idea until I get one, and I'm not about to investigate it with a 5kX, as it may well be different.
Only Martin and other 2kX Plus owners can investigate the perceived limits of the simple measurements, whereas statistics and histograms are live measurements constrained only by how long they are left to accumulate data, which they show in the format they do.
If 'counts' are your thing, maybe a 10+ digit counter would be a better instrument, instead of robbing valuable DSO display real estate with 6+ digit numbers.
The pot is full of delicious soup, and everyone is satisfied.
Only one person is looking for a hair in the soup that isn't there. That one has to buy the soup he believes has no hair in it, even if it costs 10 to 20 times more than the tasty, cheap soup the others are enjoying.
It's important to know the limits of the instrument you're using, wouldn't you agree? Especially for statistics, it's important to understand the sample size that forms the basis of the computation. Only then can you know whether the instrument fits the intended use.