The Mehikon - broadcast TV color eraser
tooki:
--- Quote from: Bud on August 21, 2020, 05:57:52 am ---With the interlaced method of transmission you send half of the picture 50 or 60 times per second. With the progressive method you send the full picture at full resolution each time, therefore doubling the bandwidth. I can speculate that 1080i may have to do with lower-bandwidth media, including perhaps interconnecting cables. It would be less demanding of system throughput than 1080p.
--- End quote ---
Interlacing was originally invented to reduce the demands on the circuitry of TV receivers, limited by 1930s technology.
It (sorta) made sense to preserve that in digital video standards intended to be displayed on CRT SD TVs. For HD, though, they should have just made it 1080p30 instead of 1080i. (The content where you’d want the extra frame rate of p60 is also the content where you want the clean motion of progressive scan, which is why sports, for example, typically got broadcast in 720p rather than 1080i.)
NiHaoMike:
--- Quote from: basinstreetdesign on August 21, 2020, 04:44:07 am ---Yes it would, but you would have to move the sound carrier up by the same amount to avoid the original problem: if there were any non-linearities in the video system, then an annoying stationary dot pattern would be caused by the sound carrier at 4.5 MHz. In that case, all NTSC receivers would have to be retuned as well, and that wasn't about to happen.
--- End quote ---
So then move the sound carrier that slight amount.
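For a rough sense of the numbers being discussed, here is a back-of-the-envelope sketch in Python using the standard, well-documented NTSC-M relationships (an illustration only, not anything stated in the posts above):

--- Code: ---
# Standard NTSC-M relationships (sketch only).
sound_offset = 4.5e6        # sound carrier sits 4.5 MHz above the vision carrier
f_h = sound_offset / 286    # line rate was redefined as 4.5 MHz / 286
f_sc = (455 / 2) * f_h      # colour subcarrier: odd multiple of half the line rate
frame_rate = f_h / 525      # 525 lines per frame

print(f"line rate         = {f_h:.3f} Hz")          # ~15734.266 Hz
print(f"colour subcarrier = {f_sc / 1e6:.6f} MHz")  # ~3.579545 MHz
print(f"frame rate        = {frame_rate:.4f} fps")  # ~29.9700 fps

# Keeping exactly 30.000 fps (f_h = 15750 Hz) while preserving the same
# interleaving would instead have put the sound carrier at
# 286 * 15750 Hz = 4.5045 MHz -- i.e. moved it up by about 4.5 kHz,
# which is the "slight amount" under discussion.
--- End code ---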
--- Quote from: Bud on August 21, 2020, 05:57:52 am ---With the interlaced method of transmission you send half of the picture 50 or 60 times per second. With the progressive method you send the full picture at full resolution each time, therefore doubling the bandwidth. I can speculate that 1080i may have to do with lower-bandwidth media, including perhaps interconnecting cables. It would be less demanding of system throughput than 1080p.
--- End quote ---
1080p at 30 FPS would be exactly the same pixel rate. Also, since the video is compressed, doubling the frame rate does not double the bandwidth.
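To put rough numbers on that, here is a back-of-the-envelope comparison of raw active-pixel rates (ignoring blanking intervals, chroma subsampling and compression; an illustration only):

--- Code: ---
# Raw active-pixel rates for the formats being discussed (sketch only).
formats = {
    "1080i60 (60 fields of 540 lines)": 1920 * 540 * 60,
    "1080p30":                          1920 * 1080 * 30,
    "1080p60":                          1920 * 1080 * 60,
    "720p60":                           1280 * 720 * 60,
}
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixel/s")

# 1080i60 and 1080p30 both work out to ~62.2 Mpixel/s;
# 1080p60 doubles that, and 720p60 comes to ~55.3 Mpixel/s.
--- End code ---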
Instead of interlacing, they could have "checkerboarded" the half frames so that the artifacts would just appear as a blurred image instead of more visible lines.
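A minimal sketch of what that quincunx/"checkerboard" splitting could look like (hypothetical illustration of the idea only, not any actual broadcast scheme; the function names are made up for this example):

--- Code: ---
import numpy as np

def checkerboard_fields(frame):
    """Split a 2-D luma frame into two half-resolution 'fields' on a
    checkerboard (quincunx) pattern instead of alternating whole lines."""
    h, w = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy + xx) % 2 == 0          # the "white" squares of the board
    return np.where(mask, frame, 0.0), np.where(~mask, frame, 0.0), mask

def reconstruct(field_a, field_b):
    """Naive receiver: merge the two fields back into one frame."""
    return field_a + field_b

# With a static image the reconstruction is exact; with motion between the
# two fields a receiver would interpolate the missing pixels from diagonal
# neighbours, so errors smear into blur rather than the comb/line artifacts
# familiar from line interlacing.
frame = np.arange(16, dtype=float).reshape(4, 4)
a, b, _ = checkerboard_fields(frame)
assert np.array_equal(reconstruct(a, b), frame)
--- End code ---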
janoc:
--- Quote from: tooki on August 21, 2020, 10:21:23 am ---With regards to CRT TVs? Absolutely. It’s clearly visible, especially in peripheral vision. 60Hz really is the lower bound for being flicker-free; 50Hz is just below it.
--- End quote ---
I think the issue is more that if you watched a 60Hz TV you didn't have any 60Hz light sources around (everything being 50Hz in Switzerland), so you didn't get the beating effect that makes it much more visible. Or you have superhuman eyes.
--- Quote from: tooki on August 21, 2020, 10:21:23 am ---There’s a reason 100Hz (50Hz x2) TVs were developed, but nobody ever had to make 120Hz (60Hz x2) TVs.
--- End quote ---
That's not true; for example, 120Hz CRT monitors were completely common (and expensive!). And pretty much all 100Hz TVs were multi-standard: they were able to double both 50Hz PAL/SECAM and 60Hz NTSC signals.
Bud:
--- Quote from: NiHaoMike on August 21, 2020, 01:22:45 pm ---1080p at 30 FPS would be exactly the same pixel rate. Also, since the video is compressed, doubling the frame rate does not double the bandwidth.
--- End quote ---
...and you would get horrible flicker as a bonus.
--- Quote ---Instead of interlacing, they could have "checkerboarded" the half frames so that the artifacts would just appear as a blurred image instead of more visible lines.
--- End quote ---
All those shenanigans just because someone wanted even 30Hz?
tooki:
--- Quote from: janoc on August 21, 2020, 01:33:31 pm ---
--- Quote from: tooki on August 21, 2020, 10:21:23 am ---With regards to CRT TVs? Absolutely. It’s clearly visible, especially in peripheral vision. 60Hz really is the lower bound for being flicker-free; 50Hz is just below it.
--- End quote ---
I think the issue is more that if you watched a 60Hz TV you didn't have any 60Hz light sources around (everything being 50Hz in Switzerland), so you didn't get the beating effect that makes it much more visible. Or you have superhuman eyes.
--- Quote from: tooki on August 21, 2020, 10:21:23 am ---There’s a reason 100Hz (50Hz x2) TVs were developed, but nobody ever had to make 120Hz (60Hz x2) TVs.
--- End quote ---
That's not true; for example, 120Hz CRT monitors were completely common (and expensive!). And pretty much all 100Hz TVs were multi-standard: they were able to double both 50Hz PAL/SECAM and 60Hz NTSC signals.
--- End quote ---
Sorry, your assumptions are just all wrong.
When I moved from the USA to Switzerland (obviously not bringing any TVs with us, since they wouldn’t work), I immediately noticed how flickery the TVs here were. Nothing to do with lighting artifacts, nor with a side-by-side comparison, but rather something I was readily able to detect in isolation. This has nothing to do with “superhuman” vision: it’s widely documented that human peripheral vision can detect much faster flicker than central vision.
It’s my understanding that if a 100Hz TV supported NTSC input (which was FAR from universal!), the 60Hz signal was not frame-doubled. Maybe some could do it. Either way, that’s neither here nor there, since NTSC countries didn’t have 120Hz TVs. They never existed, because 60Hz didn’t flicker the way 50Hz does. Nobody in an NTSC country would have had a 100Hz TV, even if it could do 120Hz.
120Hz-capable computer displays were common among high-end CRTs, but that’s got nothing to do with TVs.