It's a common misconception that CRTs have an extremely wide gamut.
In fact, the gamut of a typical CRT is rather poor due to the phosphors available. The green phosphor in particular has a significantly wrong peak wavelength, way too yellowish. AFAIK, green phosphors closer to the right wavelength either have low light output (hence poor efficiency) or are expensive. Red is too orangish, too.
Think about "computer screen 0,255,0" and "grass green". The "computer green" is too yellow, and prevents some very good looking deep cyanish-green shades from being shown. This green region is something the sRGB standard lacks the most - and it's exactly because the standard has been defined constrained by the CRTs.
Some expensive CRTs designed for color grading work have a wider gamut. CRT projectors sometimes use filter dyes to absorb "the wrong side" of the broad emission peak, trading light output for a wider gamut.
It's just that early LCDs - and many modern LCDs still - have an even worse gamut. Since they work(ed) by subtracting colors from a white backlight, using sharply cut-off dyes in the filters would reduce brightness by wasting most of the backlight power as heat (see the toy sketch below).
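Here's a toy back-of-envelope illustration of that trade-off. All numbers are illustrative assumptions (a perfectly flat backlight spectrum over 400-700 nm and idealized rectangular filter passbands), not data from any real panel:

```python
# Toy model: a subpixel's color filter passes only the backlight power
# inside its passband; everything else is absorbed as heat. Narrower
# passbands give more saturated primaries but waste more power.

BACKLIGHT_BAND_NM = 300.0  # assume a flat spectrum across 400-700 nm

def passed_fraction(passband_nm):
    """Fraction of the backlight power surviving one color filter."""
    return passband_nm / BACKLIGHT_BAND_NM

print(passed_fraction(100))  # broad filter: ~33% passed, washed-out primary
print(passed_fraction(30))   # sharp filter: ~10% passed, saturated primary
```

So tripling the saturation-friendly sharpness of the filters roughly triples the backlight power you need for the same brightness - which shows up as cost, power draw, and heat.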
Again, LCDs with wider gamuts have been available for a long time - and unsurprisingly, they were expensive and consumed a lot of power; you could literally feel the heat in front of such a monitor!
Some of the most important features of CRTs for graphics/photo-editing/color-grading applications are/were: high dynamic range, a predictable and clean response curve (which can be easily linearized), a wide viewing angle that retains the dynamic range and curve shape, and no quantization issues. But people always say it's the "gamut", since that's the fancy word they have heard - while, in reality, gamut is the most mediocre thing about CRTs.
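The "easily linearized" point is worth making concrete. A CRT's electron-gun response follows a clean power law, so one measured exponent characterizes the whole curve; no per-level lookup table is needed. A minimal Python sketch, with the gamma value chosen as an illustrative assumption (typical quoted values are around 2.2-2.5):

```python
# A clean power-law response L = V**gamma is trivially invertible:
# calibrating the display reduces to estimating a single number.

GAMMA = 2.4  # illustrative value for a typical CRT, not a measurement

def crt_luminance(v):
    """Relative luminance for a normalized drive level v in [0, 1]."""
    return v ** GAMMA

def linearize(l):
    """Invert the power law to recover the drive level for luminance l."""
    return l ** (1.0 / GAMMA)

# Round-trip check: one exponent describes the entire transfer curve.
for v in (0.25, 0.5, 0.75):
    assert abs(linearize(crt_luminance(v)) - v) < 1e-12
```

Contrast that with panels whose native response has kinks or viewing-angle-dependent shifts, where a single exponent can't describe the curve and calibration gets much messier.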