I've read in Yokogawa and Tek app notes that the bandwidth of a DSO is sampling_rate/2.5. For an analog o'scope I can intuitively understand the -3 dB point, but for digital types, could someone explain how this figure is derived?
BW and sampling rate are completely different things.
"could someone explain how it is derived"

It is not derived. All the app note is talking about is the minimum sampling rate required to correctly represent a signal of a particular frequency.
The factor 2.5 is basically the Nyquist criterion (a factor of 2) plus a margin that gives the anti-aliasing filter room to roll off and the signal reconstruction algorithms (so your eyes can see the signal as a signal) enough samples to do their work.
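A quick sketch of the rule of thumb (the function names and the 2.5 default are my own, just for illustration):

```python
# Rule-of-thumb relationship between DSO sample rate and usable bandwidth.
# A factor of 2.0 would be the bare Nyquist minimum; 2.5 adds margin for
# the anti-aliasing filter roll-off and the reconstruction/interpolation step.

def min_sample_rate(bandwidth_hz, factor=2.5):
    """Minimum sample rate for a given analog bandwidth (rule of thumb)."""
    return bandwidth_hz * factor

def usable_bandwidth(sample_rate_hz, factor=2.5):
    """Usable bandwidth implied by a given sample rate (rule of thumb)."""
    return sample_rate_hz / factor

# A 100 MHz scope should sample at 250 MSa/s or faster:
print(min_sample_rate(100e6) / 1e6)   # 250.0 (MSa/s)
# Conversely, 500 MSa/s supports up to 200 MHz of bandwidth:
print(usable_bandwidth(500e6) / 1e6)  # 200.0 (MHz)
```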
Say you have a 4-channel scope with a 100 MHz bandwidth spec. That spec means the same thing it does for an analog scope: no worse than -3 dB at 100 MHz. Being a cheap scope, it has only 500 MSa/s of sample rate. That's all fine if you measure a 100 MHz signal with only one channel enabled, because that channel gets the full 500 MSa/s. But most scopes share their sample rate between channels. Enable a second channel and you get only 250 MSa/s, which is marginally enough (100 MHz x 2.5). Enable 3 or 4 channels and you still have the full 100 MHz analog bandwidth, but only 125 MSa/s of sample rate, and your signal won't be shown correctly on the display, just some sort of distorted garbage.

All that assumes the scope has good sin(x)/x interpolation; otherwise a much higher sample rate is required to restore the waveform acceptably.
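To see why sin(x)/x interpolation matters at only 2.5 samples per period, here is a small illustration (the frequencies and record length are made-up numbers, and NumPy is assumed). It samples a 100 MHz sine at 250 MSa/s and reconstructs it on a finer grid with the Whittaker-Shannon (sinc) formula; just connecting the raw dots at this rate would look like garbage, but the sinc reconstruction tracks the ideal waveform closely away from the record edges:

```python
import numpy as np

f_sig = 100e6                    # signal frequency
f_s = 250e6                      # sample rate: 2.5 samples per period
n = 64                           # number of captured samples
t_samp = np.arange(n) / f_s
samples = np.sin(2 * np.pi * f_sig * t_samp)

# Dense time grid on which to reconstruct (10x oversampled)
t_fine = np.linspace(t_samp[0], t_samp[-1], n * 10)

# Whittaker-Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - n*T) / T)
# (np.sinc is the normalized sinc, sin(pi*x)/(pi*x))
sinc_matrix = np.sinc((t_fine[:, None] - t_samp[None, :]) * f_s)
reconstructed = sinc_matrix @ samples
ideal = np.sin(2 * np.pi * f_sig * t_fine)

# The truncated sinc sum is inaccurate near the edges of the record,
# so check the error only in the middle half
mid = slice(len(t_fine) // 4, 3 * len(t_fine) // 4)
err = np.max(np.abs(reconstructed[mid] - ideal[mid]))
print(f"max mid-record reconstruction error: {err:.3f}")
```

Repeat the experiment at 125 MSa/s (1.25 samples per period, below Nyquist) and no interpolation can save you: the 100 MHz tone aliases to a different frequency entirely.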
Thank you everyone for the replies.