CRT gets dimmer when average scene brightness is higher - why?
Cugel:
I've seen this happening with some 1980s TV sets and computer monitors. It's also showing up on one that I'm testing now, and I'm trying to understand the cause.
When the picture content is dark overall, the CRT shows those dark colors well enough, and small spots of the brightest possible white are very bright.
But when the picture is bright on average (say lots of white), without changing the brightness/contrast controls, all colors become noticeably dimmer.
More precisely...
* Let's say I'm showing something like a standard color-bars test screen, and set the brightness as high as it can get while black is still black.
* Now switch to a very dark screen: the dim greys are now brighter than they were, and so is black (faint scanlines become visible).
* Now switch to a very bright screen: the same dim greys are no longer visible (or are just barely distinguishable from black), and the same white looks more like light gray.
So what's the physical explanation behind this? And why do later CRTs not show this phenomenon, or show it far less?
I thought it might be related to blooming (brighter images make the picture slightly bigger, so the same beam energy is spread over a larger area, meaning slightly less luminance from each phosphor dot). But the size increase seems far too slight to account for the very visible dimming, and some later CRTs with similar blooming don't show nearly as much dimming.
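Back-of-envelope for that hunch (a minimal sketch in Python; it assumes luminance is just beam power per unit of picture area, and the size increases are made-up examples):

```python
# Rough check: how much dimming can blooming alone explain?
# Assumption: total beam power stays the same while the picture's
# linear size grows by `growth_pct`, so luminance falls as 1/size^2.
for growth_pct in (1, 2, 5):      # hypothetical picture-size increases
    k = 1 + growth_pct / 100      # linear scale factor
    dimming_pct = (1 - 1 / k**2) * 100
    print(f"{growth_pct}% larger picture -> ~{dimming_pct:.1f}% dimmer")
# 1% larger picture -> ~2.0% dimmer
# 2% larger picture -> ~3.9% dimmer
# 5% larger picture -> ~9.3% dimmer
```

Even an obvious 5% growth only buys about 9% less luminance, so blooming alone can't plausibly turn white into light gray.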
ATM I'm not really thinking of trying to 'fix' it, especially if it's not a defect. Just trying to understand why it happens.
james_s:
I strongly suspect what is happening is that the EHT (the anode high voltage) is sagging under load, resulting in reduced brightness. This will also make the picture expand slightly, but due to overscan that may not be noticeable.
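A toy model of that (a rough sketch; every number is invented, and it assumes magnetic deflection sensitivity scales as one over the square root of the anode voltage, which is why the raster grows as the EHT sags; the lower voltage also means each electron lands with less energy, hence less light):

```python
import math

# Toy model of EHT sag: the anode supply is an ideal 25 kV source
# behind an effective source resistance, and the beam current rises
# with average picture level (APL). All values are illustrative.
V0 = 25_000.0   # assumed unloaded anode voltage, volts
R_SRC = 4e6     # assumed effective source resistance, ohms

for scene, i_beam in (("dark scene", 0.2e-3), ("bright scene", 1.0e-3)):
    v = V0 - i_beam * R_SRC         # anode voltage sags under beam load
    growth = math.sqrt(V0 / v) - 1  # magnetic deflection scales ~1/sqrt(V)
    print(f"{scene}: anode ~{v / 1e3:.1f} kV, picture ~{growth * 100:.1f}% larger")
# dark scene: anode ~24.2 kV, picture ~1.6% larger
# bright scene: anode ~21.0 kV, picture ~9.1% larger
```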
RoGeorge:
From what I know, the cause is the electron gun no longer generating enough electrons. There were procedures to strike a small arc inside the tube in order to rejuvenate the area responsible for thermionic emission. Most of the time that didn't work, or at best the rejuvenation was effective only for a short period, a few more weeks.
Simply put: an old, worn-out filament.
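A deliberately crude way to picture that (a sketch under the assumption that a tired cathode can still manage brief peaks but not a sustained high average current; the cap value is invented):

```python
# Crude model of a worn cathode: it can still hit full current for
# brief bright details, but cannot sustain more than a limited average
# current over a whole frame, so high-APL frames get scaled down.
I_EMISSION_CAP = 0.4  # assumed sustainable average beam current, normalized

def white_brightness(apl):
    """Brightness of a full-white pixel in a frame whose average
    picture level is `apl` (0..1), under the average-current cap."""
    return min(1.0, I_EMISSION_CAP / apl) if apl > 0 else 1.0

print(white_brightness(0.05))  # small white spot on a dark field -> 1.0
print(white_brightness(0.90))  # mostly-white frame -> ~0.44, a grayish white
```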
T3sl4co1l:
Bloom and intensity loss should be proportional if it's the high voltage; since the bloom here is slight but the dimming is strong, that sounds unlikely.
Other possibilities are sag in the mid-voltage supplies (hundreds of volts), perhaps due in part to increased HV consumption, or maybe something about the cathode drivers (depends on whether the video is driven into the cathode or the grid; I think cathode is more common, but I forget, it's been ages since I looked inside a TV, though monitors usually drove the cathode). Or just plain old shitty DC restoration in the video path, or none at all, just flat AC coupling (top-level Muntzing).
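That last one matches the symptoms in the first post surprisingly well. A minimal sketch of what flat AC coupling does (the bias and signal levels are invented):

```python
# Toy model of flat AC coupling with no DC restoration. AC coupling
# strips the DC component, so each frame is referenced to its own
# average level instead of to true black, and the tube cuts off any
# drive that lands below zero.
BIAS = 0.3  # assumed fixed operating point the coupled signal sits on

def displayed(level, scene_avg):
    """Drive after AC coupling: the frame average is subtracted, the
    fixed bias is added back, and negative drive is cut off."""
    return round(max(0.0, level - scene_avg + BIAS), 3)

# The same dim grey (0.2) and the same white (1.0) in two scenes:
print(displayed(0.2, 0.1), displayed(1.0, 0.1))  # dark scene:   0.4 1.2
print(displayed(0.2, 0.8), displayed(1.0, 0.8))  # bright scene: 0.0 0.5
# Dark scene: grey and black both come up (lifted, scanlines visible).
# Bright scene: grey vanishes into black, white lands at light gray.
```

That reproduces both observations with nothing but a coupling capacitor, which would also explain why later sets with proper black-level clamping show the effect far less.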
Tim