Acceleration voltage effect on VFD phosphor lifetime
f36:
I'm thinking about using a small VFD (a funky Soviet single-pixel tube; kinda like a magic eye, but without the variable shadow) in a future project. It's rated for 50V on the anode, but I already have a ~200V rail in the project for other purposes. Assuming the tube doesn't arc over internally, 200V should light it up just fine, and I should be able to reduce the heater power and grid voltage to bring the emission current back into spec.
My concern is the lifetime of the phosphor (maybe ZnS or CdS?) and how the higher-energy electrons might affect it. It seems like the main (?) degradation mechanism is electrons reducing the metal cations back to neutral metal, which lowers the phosphor's efficacy. However, I haven't found much information on the factors influencing the rate of this process. Is it related to electron energy (anode voltage), to the rate of electron arrival / accumulated charge (current), or is it a thermal thing related to power?
My intuition says it would have more to do with emission current than acceleration voltage. I would think the difference between 50eV and 200eV electrons isn't going to be the difference between breaking ionic bonds and not, since both are well above typical bond energies of a few eV, but more free electrons available could accelerate the damage. If so, limiting the emission current should be sufficient to preserve the tube's lifetime, and the max voltage would be limited by arc-over or your Geiger counter starting to tick...
Ian.M:
Underrunning your heater is asking for trouble with cathode poisoning, or, if the grid voltage ever spikes, cathode stripping. Why not simply include a dropper resistor in series with the anode, calculated to bring the anode voltage down to spec at the nominal anode current? You'll still need to verify it doesn't flash over or fail to blank due to the increased E field when the grid is grounded.
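For illustration, assuming a nominal anode current of 1mA (a made-up figure; use the tube's actual rating): R = (200V - 50V) / 1mA = 150k, dissipating 150V x 1mA = 150mW in the resistor.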
f36:
Yeah, that would be the obvious solution, but I was initially concerned that the resistor would quadruple the power consumption of the whole thing (this is going to be battery powered). But if the same current is required for the same brightness regardless of voltage, then it would make no difference, apart from possibly running at a lower heater temperature, which then brings the problems you mentioned.
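To spell out my earlier arithmetic, assuming the same anode current I gives the same brightness in both cases: at the rated 50V the tube alone would draw 50V x I, while feeding it from the 200V rail through the dropper makes the rail supply 200V x I (the resistor burning the other 150V x I), hence my 4x. But driving the anode at 200V directly would also pull 200V x I from the rail, so the dropper costs nothing extra.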
I am also planning on switching the heater on and off to save power, so I would think a lower temperature could improve lifetime there. I suppose I could program the heater to run hotter with the anode voltage disabled to clean the cathode periodically, if that would help; something like the sketch below. Or I could just not care and put a 10% bigger battery in ::)
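A rough sketch of what I mean (heater_set(), anode_enable(), sleep_ms() and all the levels/timings are made-up placeholders for whatever the real hardware ends up providing):

#include <stdint.h>
#include <stdbool.h>

#define HEATER_RUN_PWM    180   /* reduced heater drive for normal display duty (guess) */
#define HEATER_CLEAN_PWM  255   /* full drive for the periodic clean cycle */
#define CLEAN_PERIOD     1000   /* run a clean cycle every N display cycles (guess) */

extern void heater_set(uint8_t pwm);  /* hypothetical HAL call */
extern void anode_enable(bool on);    /* hypothetical HAL call */
extern void sleep_ms(uint32_t ms);    /* hypothetical HAL call */

void display_cycle(uint32_t cycle)
{
    if (cycle % CLEAN_PERIOD == 0) {
        /* Periodic clean: anode off, heater at full temperature so any
           poisoning layer can evaporate before normal running resumes. */
        anode_enable(false);
        heater_set(HEATER_CLEAN_PWM);
        sleep_ms(5000);               /* dwell time is a guess; tune empirically */
    }
    heater_set(HEATER_RUN_PWM);       /* back to the power-saving level */
    anode_enable(true);
}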
David Hess:
--- Quote from: f36 on February 26, 2024, 01:41:37 am ---
My concern is the lifetime of the phosphor (maybe ZnS or CdS?) and how the higher-energy electrons might affect it. It seems like the main (?) degradation mechanism is electrons reducing the metal cations back to neutral metal, which lowers the phosphor's efficacy.
--- End quote ---
I thought phosphor degradation was primarily from heating, which explains why aluminized phosphors are so much tougher.
f36:
--- Quote from: David Hess on February 26, 2024, 02:31:22 am ---I thought phosphor degradation was primarily from heating, which explains why aluminized phosphors are so much tougher.
--- End quote ---
I was mainly going off of this section on Wikipedia, which mentions a number of different factors, most of which could presumably be accelerated by increased temperature:
https://en.wikipedia.org/wiki/Phosphor#Phosphor_degradation