Author Topic: CRT gets dimmer when average scene brightness is higher - why?  (Read 648 times)


Offline CugelTopic starter

  • Newbie
  • Posts: 1
  • Country: il
I've seen this happening with some 1980s TV sets and computer monitors.  It's also showing up on one that I'm testing now, and I'm trying to understand the cause.

When the picture content is dark overall, the CRT shows those dark colors well enough, and small spots of the brightest possible white are very bright.
But when the picture is bright on average (say, lots of white), all colors become noticeably dimmer, even without touching the brightness/contrast controls.

More precisely...
  • Let's say I'm showing something like a standard color-bars test screen, and I set the brightness as high as it will go while black still stays black.
  • Now switch to a very dark screen:  the dim greys are now brighter than they were, and so is the black (faint scanlines become visible).
  • Now switch to a very bright screen:  the same dim greys are no longer visible (or only barely distinguishable from black), and the same white looks more like light gray.

So what's the physical explanation behind this?  And why do later CRTs not show this phenomenon (or make it much less noticeable)?

I thought it might be related to blooming (brighter images make the picture a little bigger, so the same beam energy is spread over a larger area, i.e. slightly less luminance from each phosphor dot).  But the size increase seems too slight to account for the very visible dimming.  And some later CRTs with similar blooming don't show nearly as much dimming.
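
To put rough numbers on the blooming idea, here is a minimal back-of-the-envelope sketch (the 2% size increase is just an assumed figure for illustration, not a measurement):

Code:
# Rough sanity check on the blooming explanation.
size_increase = 0.02                      # assume the raster grows ~2% in each direction
area_ratio = (1 + size_increase) ** 2     # same beam power spread over this much more area
luminance_ratio = 1 / area_ratio
print(f"Luminance with bloom: {luminance_ratio:.3f} of original "
      f"({(1 - luminance_ratio) * 100:.1f}% dimmer)")
# -> only about 4% dimmer, far less than what I see on screen

So even a fairly generous amount of bloom only accounts for a few percent of dimming.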

ATM I'm not really planning to 'fix' it, especially if it's not a defect; I'm just trying to understand why it happens.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: CRT gets dimmer when average scene brightness is higher - why?
« Reply #1 on: April 02, 2022, 07:53:49 pm »
I strongly suspect what is happening is that the EHT is sagging under load, resulting in reduced brightness. This will also make the picture expand slightly, but due to overscan that may not be noticeable.
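
As a rough illustration (all of these numbers are invented, not taken from any real chassis), you can model the EHT supply as an ideal source behind an effective source resistance; the beam current rises with average picture level, the anode voltage sags, the spot gets dimmer, and the raster grows slightly:

Code:
# Toy model of EHT (anode voltage) sag under beam loading.
V_NOMINAL = 25_000.0   # anode voltage with no beam current, volts (assumed)
R_SOURCE  = 5.0e6      # effective source resistance of the EHT supply, ohms (assumed)
I_MAX     = 1.0e-3     # beam current for a full-white raster, amps (assumed)

def eht_voltage(apl):
    """Anode voltage at a given average picture level (0 = black, 1 = white)."""
    return V_NOMINAL - apl * I_MAX * R_SOURCE

for apl in (0.05, 0.5, 0.95):
    v = eht_voltage(apl)
    # Less accelerating voltage -> less energy per electron -> dimmer phosphor.
    # Magnetic deflection also grows roughly as 1/sqrt(V), so the raster expands.
    size_scale = (V_NOMINAL / v) ** 0.5
    print(f"APL {apl:.2f}: EHT ~{v/1000:.1f} kV "
          f"({(1 - v / V_NOMINAL) * 100:.0f}% sag), "
          f"raster ~{(size_scale - 1) * 100:.0f}% larger")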
 
The following users thanked this post: Cugel

Online RoGeorge

  • Super Contributor
  • ***
  • Posts: 7012
  • Country: ro
Re: CRT gets dimmer when average scene brightness is higher - why?
« Reply #2 on: April 02, 2022, 08:10:19 pm »
From what I know, the cause is not enough electrons being generated by the electron gun.  There were procedures to strike a small arc inside the tube, in order to rejuvenate the area responsible for thermionic emission.  Most of the time that didn't work, or the rejuvenation was effective only for a short period, a few more weeks.

Simply said, old filament.
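
A very rough toy model of that (the emission limit and the image statistics are invented for illustration, not taken from a real tube): if a tired cathode can only sustain a limited average beam current, a mostly dark scene with a few white spots stays within the limit, but a mostly white scene exceeds it and the whole picture gets pulled down:

Code:
import numpy as np

I_PEAK_DEMAND = 1.0        # relative beam current asked for by full white
I_AVG_LIMIT   = 0.3        # average current the worn cathode can sustain (assumed)

def displayed(frame):
    """Scale the whole frame down if its average demand exceeds the emission limit."""
    demand = frame * I_PEAK_DEMAND
    avg = demand.mean()
    scale = min(1.0, I_AVG_LIMIT / avg) if avg > 0 else 1.0
    return demand * scale

dark_frame = np.full(1000, 0.05)
dark_frame[:20] = 1.0                      # dark scene with a few white spots
bright_frame = np.full(1000, 0.90)         # mostly white scene

for name, frame in (("dark", dark_frame), ("bright", bright_frame)):
    out = displayed(frame)
    print(f"{name:6s}: white displays at {out.max():.2f} of full brightness")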
 
The following users thanked this post: Cugel

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 22436
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Re: CRT gets dimmer when average scene brightness is higher - why?
« Reply #3 on: April 02, 2022, 08:16:05 pm »
Bloom and intensity should track each other if it's the high voltage, so that sounds unlikely here.

Other possibilities are sag in the mid-voltage supplies (100s of volts), perhaps due in part to increased HV consumption; maybe something about the cathode drivers (depends on whether video is driven into the cathode or the grid, I think cathode is more common?  I forget, haven't looked at a TV in ages, though monitors usually did it that way).  Or just plain old shitty DC restore in the video path, or none at all, just flat AC coupling (top-level Muntzing).
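
A minimal sketch of the AC-coupling case (normalized levels and invented sample values, ignoring the video inversion at the cathode): with only a coupling cap and no clamp, the signal ends up referenced to its own average, so peak white in a high-APL line drives the CRT much less than the same white in a dark line; a back-porch clamp (DC restore) brings both back to the same level:

Code:
import numpy as np

def through_coupling_cap(video):
    # A series coupling cap blocks DC, so the signal settles around its own
    # average rather than around a fixed black level.
    return video - video.mean()

def with_dc_restore(video, porch_index=0):
    # A clamp re-references the signal to the back-porch (black) level each
    # line, so picture levels no longer depend on average picture level.
    ac = through_coupling_cap(video)
    return ac - ac[porch_index]

# Each "line" starts with a back-porch (black) sample, then picture samples.
dark_line   = np.array([0.0, 0.05, 0.05, 1.0, 0.05, 0.05])   # dark, one white spot
bright_line = np.array([0.0, 0.95, 1.0, 0.95, 1.0, 0.95])    # mostly white

for name, line in (("dark", dark_line), ("bright", bright_line)):
    print(f"{name:6s}: peak white drives the CRT at "
          f"{through_coupling_cap(line).max():+.2f} without DC restore, "
          f"{with_dc_restore(line).max():+.2f} with it")

Without the clamp, white in the bright line collapses toward the average, which is exactly the symptom described.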

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 
The following users thanked this post: Cugel

