I have used analog oscilloscopes before, but now I have bought my first digital one. Until now I simply believed what I saw on the screen; now I have enough time and the means to compare.
Which of these signals is correct? It is the same signal - I only changed the input resistance and the bit resolution on the oscilloscope. I did not touch the generator. (Connection via RG58U cable.) SIGLENT2352 Plus
I repeat my answer from the other thread:
1. The 10 bit mode captures cannot be correct because, as we all should know, a 100 MHz bandwidth is far too low to reproduce a 50 MHz square wave.
2. Basic transmission line theory rules out the 1M "measurements" as well, because we cannot expect any signal fidelity/accuracy at frequencies that high with a severe mismatch like this.
You wanted to say ns, not ms, right?
You did not deliberately attach screenshots that don't have enough resolution to clearly see the parameters, did you?
You have had a look into the manual, haven't you? Still, you've obviously missed the part where it says that 10 bit mode is always limited to 100 MHz bandwidth.
- Yes, it is 1 ns, not 1 ms - I already corrected it.
- I can read the details in the picture I attached without problems. I can even read all the scope settings in my pictures.
It’s fine that you can “even read your settings”. If you want help from others, you might want to make sure the settings are actually readable, not just guessable, because it’s no fun trying to decipher that pixel mush.
My fault, sorry for that.
- I read the manual; that is why the tested signal is 50 MHz and not more. Isn't a 50 MHz signal within a 100 MHz bandwidth? Or is it something else?
Pardon? Are you kidding me? We are talking about square waves here. A 100 MHz bandwidth isn’t sufficient to reproduce _any_ square wave at 50 MHz, no matter what the transition times are, because even the 3rd harmonic (150 MHz) is already outside the bandwidth.
These are absolute basics, and it’s the very same on analog scopes…
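To make the harmonic argument concrete, here is a minimal Python sketch, assuming an ideal square wave (only odd harmonics, with relative amplitude 4/(πn)). It simply checks which harmonics of a 50 MHz square wave fall inside a 100 MHz bandwidth:

```python
import numpy as np

F0 = 50e6   # fundamental of the square wave (Hz)
BW = 100e6  # scope bandwidth in 10 bit mode (Hz)

# An ideal square wave contains only odd harmonics n*F0,
# each with relative amplitude 4/(pi*n).
harmonics = [(n * F0, 4 / (np.pi * n)) for n in (1, 3, 5, 7, 9)]

for f, a in harmonics:
    status = "passes" if f <= BW else "blocked"
    print(f"{f / 1e6:4.0f} MHz  amplitude {a:.3f}  {status}")

# Only the 50 MHz fundamental fits inside 100 MHz, so the best the
# scope can ever display here is essentially a 50 MHz sine wave.
passed = [f for f, _ in harmonics if f <= BW]
```

Everything above 100 MHz is stripped away, which is exactly why the 10 bit captures look like sine waves rather than square waves.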
I thought the higher the input impedance, the lower the source loading, and as a result the cleaner, more readable, less distorted the measured signal would be.
This is so at DC and in the audio range of frequencies - and can even be stretched up to a few MHz in many circumstances. But as soon as the length of a link exceeds about 1/20 of the wavelength of the signal that is to be transferred, we enter serious territory, known as HF. Now (among other things) transmission line theory becomes increasingly important and there is no room for mismatched connections anymore.
For 50 MHz fundamental frequency, the wavelength is about 6 m. This means that any link length greater than 30 cm is critical already. Take the NVP (Nominal Velocity of Propagation) of RG58 coax cable into consideration (0.66), then it’s only 20 cm. And this is only for the fundamental frequency! If you want a reasonably undistorted signal, all relevant harmonics must be correctly transferred too, so you can easily divide the number by e.g. 7 (to ensure proper transfer up to the 7th harmonic), and the link must not have a physical length of more than 3 cm - as long as it is not properly terminated, that is. There is a reason why better (= high bandwidth) scopes have a switchable 50 ohms input impedance (and the really high bandwidth ones might have only 50 ohms and nothing else)!
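To put numbers on the rule of thumb above (a sketch using the 1/20-wavelength criterion and the RG58 NVP of 0.66 from the post; the variable names are just for illustration):

```python
C = 299_792_458   # speed of light in vacuum (m/s)
F0 = 50e6         # fundamental frequency (Hz)
NVP = 0.66        # nominal velocity of propagation of RG58 coax

wavelength_cable = C * NVP / F0      # wavelength inside the coax, ~4 m
critical = wavelength_cable / 20     # 1/20 rule of thumb, ~20 cm
critical_7th = critical / 7          # clean transfer up to the 7th harmonic, ~3 cm

print(f"wavelength in RG58:            {wavelength_cable:.2f} m")
print(f"critical length (fundamental): {critical * 100:.0f} cm")
print(f"critical length (7th harmonic): {critical_7th * 100:.1f} cm")
```

Any unterminated link longer than a few centimeters is therefore already suspect at these frequencies.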
Yes, the manual does say that the "10 bit" resolution works up to "about" 100 MHz. It looks like it is only a fake resolution, maybe for a "zoom function" on LF signals. Now I understand the problem a bit better.
Are you sure? 😉
So you call a 100 MHz bandwidth “LF signals” now. An interesting statement for someone who doesn’t seem to even know transmission line theory (or the bandwidth requirements of a square wave).
Apart from R&S (10 bit) and LeCroy (up to 12 bit), who have scopes with true high resolution converters, most of the better known DSO manufacturers (Keysight, Tek and also R&S) offer “fake” resolutions, sometimes even up to 16 bits. It works well, it trades bandwidth for resolution, it’s common and, most importantly, it’s an option. Take it or leave it.
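For what it's worth, these high resolution modes are typically some form of boxcar averaging of the 8 bit samples. Here is a hypothetical sketch of the idea, not any particular vendor's implementation: averaging n samples gains roughly 0.5·log2(n) effective bits, but the averaging is a low-pass filter, so bandwidth is sacrificed.

```python
import numpy as np

def hires(samples: np.ndarray, n: int) -> np.ndarray:
    """Average groups of n consecutive samples (boxcar filter).

    Averaging n samples suppresses uncorrelated noise by ~sqrt(n),
    i.e. gains ~0.5*log2(n) effective bits, but it acts as a
    low-pass filter, so the effective bandwidth drops accordingly.
    """
    trimmed = samples[: len(samples) // n * n]
    return trimmed.reshape(-1, n).mean(axis=1)

rng = np.random.default_rng(42)
# Simulated 8 bit ADC record: a DC level plus noise, quantized to integers
raw = np.round(100.0 + rng.normal(0.0, 2.0, 4096))
avg = hires(raw, 16)
# The averaged trace is much quieter than the raw 8 bit samples.
print(f"raw std: {raw.std():.2f}, averaged std: {avg.std():.2f}")
```

The extra bits are real as long as the signal content you care about sits well below the reduced bandwidth, which is exactly the trade-off being discussed here.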
I suspected the funny results at 10 bit resolution had something to do with memory & sample rates (isn't it always?), and expected some clues on the display, but no!
The results are not “funny”; that is just what you get when you feed a 50 MHz square wave into a system with only 100 MHz bandwidth.
I have attached a screenshot that is actually readable. Have a look at it!
Logically, a DSO capable of native 10 bit resolution would, by default, run in that mode, so is it always "fake", or does it just "run out of grunt" at higher frequencies?
As explained earlier, this is an 8 bit scope, and nowhere is it advertised as 10 bit. It happens to have a 10 bit HiRes mode; it's there if someone wants to trade bandwidth for the higher resolution. In this forum there are already a number of posts (with screenshots) that explain and demonstrate the value of that mode.
A lot less "traps for young players" with my old Tek 7613!
I know. It was so much better in the good old days.