It's a 'trap' I did fall into years ago when I purchased my first scope, believing it would read amplitude correctly at the rated bandwidth.
While 3dB is an industry standard, so were feet and inches until the world went metric. Hopefully 100MHz BW will actually mean 100MHz BW @ full amplitude one day.
OK, for a start, what does "full amplitude" mean? I'm going to pretend you wrote something actually measurable there, like 0.1dB.
So, should Horowitz and Hill update their definition of bandwidth/corner frequency? From BW = 1/(2*pi*R*C) to BW "@ 0.1dB" = <hacky correction factor>/(R*C)? From a beautifully elegant definition of the point where the reactance of the C equals the resistance of the R, to some hacky pragmatic definition optimized for confused oscilloscope buyers?
Shall we redefine the spectral linewidth of a laser to not be the FWHM measurement (which is another name for -3dB)? Acoustic systems? Mechanical filters?
Or should we do the most horrible thing of all, and redefine bandwidth to mean "@ 0.1dB" for oscilloscopes only? WTF?
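Since we're invoking the textbook definition anyway, here's a trivial Python sketch of it (component values are arbitrary examples):

```python
import math

# Single-pole RC low-pass filter, arbitrary illustrative values
R = 1_000       # ohms
C = 1.59e-9     # farads

# Horowitz & Hill's corner frequency: where the reactance of C equals R
f_c = 1 / (2 * math.pi * R * C)    # ~100 kHz with these values

def gain_db(f):
    """Magnitude of H(f) = 1 / (1 + j*f/f_c), in dB."""
    return 20 * math.log10(1 / math.sqrt(1 + (f / f_c) ** 2))

print(f"corner frequency: {f_c / 1e3:.1f} kHz")
print(f"gain at f_c:      {gain_db(f_c):.2f} dB")  # -3.01 dB = half power
```

Run it and you get -3.01 dB at f_c, i.e. the amplitude is down to 1/sqrt(2) and the power is exactly half. That's not a scam, it's algebra.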
I am not insinuating it, I am saying it loud and clear. 100MHz BW should mean exactly that as far as the instrument accuracy goes. Not half down the power scale. Or perhaps the units should be labelled "100MHz at half true power readings".
Guess what: 100 MHz bandwidth means 100 MHz at half true power readings. That's exactly what it means, and always has. You suggest the scopes should have "100 MHz at half true power readings" printed on them. That's what they already have written on them, unless you make unwarranted and wrong assumptions about what the word "bandwidth" means, and always has meant.
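And if you want numbers rather than rhetoric, here's what a rated 100 MHz bandwidth implies for amplitude readings, assuming a simple single-pole response (an illustrative model; real front-ends differ):

```python
import math

f_bw = 100e6   # the rated -3dB bandwidth

def reading(f, v_in=1.0):
    """Displayed amplitude of a v_in-volt sine at frequency f,
    for a single-pole response with its -3dB point at f_bw."""
    return v_in / math.sqrt(1 + (f / f_bw) ** 2)

for f in (10e6, 50e6, 100e6):
    v = reading(f)
    print(f"{f / 1e6:5.0f} MHz: reads {v:.3f} V"
          f" ({20 * math.log10(v):+.2f} dB, {v ** 2:.2f}x true power)")
```

Half the true power at exactly the rated bandwidth, just like the label says, and well under 1 dB of error at half the rated bandwidth.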
I worked in the advertising industry for a long time as a production engineer, and the ambiguity found in so-called engineering disciplines and nomenclature is unpleasant.
Ambiguity? Bandwidth is unambiguously the -3dB point, across all engineering disciplines. You're the one trying to introduce ambiguity here by proposing an alternative.
The industry does specify RF amplifiers at the 1dB compression point for nonlinearity, as well as the IP3 point. So it's not as though these things are unimportant.
WTF? Is there a precedent for measuring non-linearity at the "3dB compression point"? Like the innumerable precedents for -3dB as a bandwidth measurement, e.g. the elegant formulas for it in Horowitz and Hill? The very same datasheets for RF amplifiers still specify the bandwidth at -3dB, in perfect consistency with the definition of bandwidth in the rest of engineering. Your analogy makes no sense. Engineers understand and expect the bandwidth to be 3dB down, so that's what datasheets provide. Engineers understand and expect the non-linearity to be measured at the stated point, so that's what datasheets provide.
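For the record, both of the nonlinearity figures you mention fall out of the standard memoryless cubic model y = a1*x + a3*x^3. A toy Python sketch (arbitrary coefficients and textbook approximations, not any real amplifier):

```python
import math

# Toy memoryless amplifier: y = a1*x + a3*x**3 (a3 < 0 gives compression)
a1, a3 = 10.0, -0.2   # arbitrary illustrative coefficients

# Standard textbook results for this model:
A_1dB  = math.sqrt(0.145 * abs(a1 / a3))     # input amplitude at 1dB compression
A_IIP3 = math.sqrt((4 / 3) * abs(a1 / a3))   # input-referred third-order intercept

print(f"1dB compression input amplitude: {A_1dB:.2f}")
print(f"IIP3 input amplitude:            {A_IIP3:.2f}")
print(f"IIP3 sits {20 * math.log10(A_IIP3 / A_1dB):.1f} dB above P1dB")  # ~9.6 dB
```

Both figures are well defined, both get quoted, and neither requires redefining "bandwidth".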
The status quo always needs upgrading as technology moves ahead.
The "improvements" that you propose are far, far more confusing and ugly than the reality we have today. Only when viewed through the tunnel-vision of oscilloscope buying does it even
slightly make sense, and even then it's completely at odds with all we've all learned about filters bandwidths and spectral linewidths, not to mention other fields of engineering. I can't stress this enough, your campaign to redefine the word "bandwidth" will most surely fail. I encourage you to bring
your definition into line with the rest of the world.
Edit: besides, even a scope that was "100 MHz" by your definition would not handle 90 MHz square waves properly, since a square wave's odd harmonics (270 MHz, 450 MHz, and up) would still be stripped off. Is that an oscilloscope manufacturer scam as well, now? You can't get away from it, there's no substitute for actual understanding.
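A quick sketch of why, assuming an idealized brick-wall cutoff (real roll-offs are gradual, which only makes things worse):

```python
import math

f0 = 90e6    # square-wave fundamental
bw = 100e6   # the scope's bandwidth limit

# Fourier series of a square wave: odd harmonics with amplitude 4/(pi*n)
for n in range(1, 10, 2):
    f = n * f0
    fate = "passes" if f <= bw else "blocked"
    print(f"harmonic {n}: {f / 1e6:4.0f} MHz,"
          f" relative amplitude {4 / (math.pi * n):.3f} -> {fate}")
# Only the 90 MHz fundamental gets through: what's displayed is a sine wave.
```

Only the fundamental survives, so your "100 MHz at full amplitude" scope shows a clean sine where the square wave used to be.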