I've seen this happening with some 1980s TV sets and computer monitors. It's also showing up on one that I'm testing now, and I'm trying to understand the cause.
When the picture content is dark overall, the CRT renders the dark colors well, and small patches of the brightest possible white are very bright.
But when the picture is bright on average (say, lots of white), all colors become noticeably dimmer, even though the brightness/contrast controls haven't been touched.
More precisely...
- Let's say I'm showing something like a standard color-bars test screen, with the brightness set as high as it will go while black still stays black.
- Now switch to a very dark screen: the dim greys are now brighter than they were, and so is the black (faint scanlines become visible).
- Now switch to a very bright screen: the same dim greys are no longer visible (or only barely distinguishable from black), and the same white looks more like light grey.
So what's the physical explanation behind this? And why do later CRTs not show this phenomenon (or make it much less noticeable)?
I thought it might be related to blooming (a brighter image makes the raster slightly larger, so the same beam energy is spread over a larger area, which means slightly less luminance from each phosphor dot). But the size increase seems far too small to account for the very visible dimming, and some later CRTs that bloom a similar amount don't dim nearly as much.
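As a rough sanity check on that (the figures here are my own guesses, not measurements): if the raster grows linearly by a small factor $(1+\epsilon)$ while the total light output stays the same, the luminance falls by roughly

$$\frac{L'}{L} = \frac{1}{(1+\epsilon)^2} \approx 1 - 2\epsilon .$$

Even for a generous $\epsilon = 0.02$ (2% linear blooming), that's only about a 4% drop in luminance, which seems far smaller than the dimming I'm describing above.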
ATM I'm not really looking to 'fix' it, especially if it isn't actually a defect; I just want to understand why it happens.