Why does NTSC use a strange frequency for the chroma carrier?
NiHaoMike:
If you took all the frequencies and increased them slightly so the vertical refresh rate became exactly 60 Hz, it would play just fine on old TVs. That probably wasn't an option for broadcast back then, since it would have increased the required bandwidth, even if only by a small amount.
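
As a rough sketch of that arithmetic (just the published NTSC values run through Python's fractions module, nothing period-accurate): scale everything by 1001/1000 and the field rate lands back on exactly 60 Hz, with the line rate back at a round 15750 Hz.

--- Code: ---
# Sketch: scaling every NTSC color frequency up by 1001/1000 restores an
# exact 60 Hz field rate. The frequencies are the published ones; the scaling
# itself is just the "speed everything up slightly" thought experiment above.
from fractions import Fraction

F_SC = Fraction(315_000_000, 88)   # color subcarrier, 3.579545... MHz
F_H  = F_SC * 2 / 455              # line rate, 15734.2657... Hz
F_V  = F_H * 2 / 525               # field rate, 59.94005994... Hz

scale = Fraction(1001, 1000)       # the proposed slight speed-up

print(float(F_V))                  # 59.94005994...
print(float(F_V * scale))          # 60.0 exactly
print(float(F_H * scale))          # 15750.0 exactly
print(float(F_SC * scale))         # 3583125.0 Hz subcarrier after scaling
--- End code ---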

They really should have abandoned that oddball frame rate and interlacing with the move to digital; neither is at all helpful in modern times.
dmills:
In the early days of black and white telly, power grid sync was very much a thing, for a few reasons.

Firstly, a static hum bar is less problematic than a moving one on the RX side, and high voltage caps were EXPENSIVE enough that quality gear used chokes (of several henries at significant DC current) to help out with ripple reduction. When you are trying to build a $100 TV set (as 'Mad' Muntz did, fascinating story), expensive caps and large chokes don't cut it.

Secondly, TV stations were fairly local affairs: with a GOOD set you could be all of 40 miles from the transmitter, with a cheap one maybe 10 miles. It was mostly NOT a national sort of thing, so nationwide sync was not required.

Thirdly, outside broadcast needed the truck to be running at the same frame rate as the studio, most easily achieved by taking a reference from the local power grid; phase might differ somewhat, but there are ways around that. In a time before GPS or reasonable broadcast references (LORAN), the grid provided a locally reliable timing reference.

Of course, with black and white the need to match chroma phase didn't exist, so it was mainly about frame timing. (You could skew the line timing with massive coils of coax in the station's basement; there is a reason BBC Television Centre was built with the studios in a circle around a central building housing the galleries and apparatus rooms: it was so that all the cables were the same length!)

And yes, "Never Twice the Same Colour" assumed that the more modern gear could cope without being synced to the mains, and we have been swearing about 'drop frame' timecode ever since. (That 1000/1001 thing screws up converting frame counts to time in an easy way, and it is FAR too easy not to notice that the video is drop frame but the audio isn't...)
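
To illustrate the headache (a throwaway sketch, with arbitrary example frame counts rather than anything from a real job): a naive frames/30 conversion drifts steadily against the true 30000/1001 rate, which is exactly the gap that drop frame timecode papers over.

--- Code: ---
# Sketch: why 30000/1001 fps makes frame-count -> clock-time conversion messy.
# The frame counts below are arbitrary examples.
from fractions import Fraction

FPS = Fraction(30000, 1001)                    # actual NTSC color frame rate

for frames in (1800, 54000, 107892):           # ~1 min, ~30 min, ~1 hour
    naive_s  = frames / 30                     # pretend it is exactly 30 fps
    actual_s = float(frames / FPS)             # true elapsed wall-clock time
    print(f"{frames:6d} frames: naive {naive_s:7.2f} s, "
          f"actual {actual_s:7.2f} s, error {actual_s - naive_s:+.2f} s")
--- End code ---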

 
BrianHG:

--- Quote from: Ben321 on December 02, 2022, 12:51:46 pm ---
--- Quote from: BrianHG on December 01, 2022, 12:19:13 am ---
--- Quote from: james_s on November 30, 2022, 11:15:08 pm ---RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)

--- End quote ---
With that perfect half-integer number of subcarrier cycles per line, on older picture tubes which had slower phosphor, combined with the interlace doubling 240p to 480i, the continuous scan effect which can be seen as 'dot crawl' on modern, better-performing tubes was smeared almost to visual perfection on the display CRTs of the day when the standard was devised.  Yes, even for the older B&W TVs, which had a finer or no dot pitch, this dot-crawl effect was self-erasing, whereas any other frequency would have generated a random noise pattern over each saturated color.

The true frequency is 3.579545 MHz, with a repeating 45 at the end (exactly 315/88 MHz).
The 4x frequency, 14.31818 MHz, is the reference crystal used by many modern NTSC-compatible display generators like OSDs, small computers and cameras.  Through simple integer division, all the necessary constructs of an NTSC video signal can be generated.  E.g. take the 14318180 Hz and divide by 910 to get the horizontal 15734.26 Hz.  Divide that by the 525 lines of video in an interlaced source and get the 29.970 Hz frame rate, x2 = 59.94 Hz laced.  Or, for the 240p used by many older gaming consoles, 15734.26/262 = 60.05 Hz.

--- End quote ---

Nope. Actually the 240p used by older consoles (or at least the N64 I tested it with) uses 263 lines, and thus a frame rate of 59.83 fps. Using an odd number of lines guarantees that the chroma carrier phase inverts with each frame, so it doesn't produce a stationary dot pattern on the screen. The dot pattern inverts brightness levels with each frame, so it visually cancels out. If an even number of lines were used (like the 262 lines you suggested), the chroma carrier wouldn't invert phase each frame, and thus a stationary dot pattern would be visible on the screen.

--- End quote ---
Funny, my Atari 800, Commodore 64, Amiga 1000 in progressive mode, Nintendo Entertainment System, Apple IIc, and Tektronix reference color bar generator in progressive mode all had no dot crawl at the edges of their colored graphics.  They all produced a frozen zipper dot pattern, and they all used 262 lines.  The N64 had an interlaced output. (The divider arithmetic and the 262/263-line phase behavior are sketched below.)
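
If anyone wants to check the divider arithmetic quoted above, here is a quick throwaway sketch (my own script, just plugging in the published 4x-subcarrier crystal value); it also shows why a 262-line progressive frame leaves the subcarrier pattern frozen while 263 lines flips it every frame.

--- Code: ---
# Sketch: the 14.31818 MHz (4x subcarrier) divider chain, plus the subcarrier
# phase relationship for 262- vs 263-line progressive and 525-line interlaced.
from fractions import Fraction

F_XTAL = 4 * Fraction(315_000_000, 88)   # 14.3181818... MHz reference crystal
F_H    = F_XTAL / 910                    # 15734.2657... Hz line rate
F_SC   = F_XTAL / 4                      # 3.5795454... MHz chroma carrier

print(float(F_H))                        # 15734.265734...
print(float(F_H / 525) * 2)              # 59.94 Hz interlaced field rate
print(float(F_H / 262))                  # ~60.05 Hz, 262-line progressive
print(float(F_H / 263))                  # ~59.83 Hz, 263-line progressive

cycles_per_line = F_SC / F_H             # exactly 227.5
for lines in (262, 263, 525):
    cycles_per_frame = cycles_per_line * lines
    half_cycle_left = cycles_per_frame % 1 == Fraction(1, 2)
    print(lines, float(cycles_per_frame),
          "phase flips each frame" if half_cycle_left
          else "stationary dot pattern")
--- End code ---

With 227.5 subcarrier cycles per line, an even line count leaves a whole number of cycles per frame (stationary pattern), while an odd count leaves half a cycle over (pattern inverts every frame), which is the 262 vs 263 disagreement above in one place.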
EPAIII:
There's a lot of good information here and some that is just nonsense. As a career broadcast engineer I saw everything from the conversion from B&W to color TV through the conversion to digital. Here is my attempt to separate the facts from the BS.

B&W TV came first and was based on five numbers: 30 Hz Frame Rate, 60 Hz Field Rate, 15750 Hz Horizontal Rate, 525 Horizontal Lines per Frame, and 2 Fields per Frame. Those numbers are all exactly compatible.

15750 lines per second / 30 frames per second = 525 lines per frame

15750 lines per second / 60 fields per second = 262.5 lines per field

There were two fields per frame and strangely they were called Frame 1 and Frame 2.
The half-line fraction in the field rate provided the interlace: half line number 262.5, which started Frame 2, in theory began halfway across the screen, between lines 1 and 2 of Frame 1. This was done in B&W TV to minimize the appearance of flicker. Two half images, superimposed (interlaced) over each other, formed one complete image. So the flicker rate was 60 Hz but the information rate was only 30 Hz.

Since B&W TV was well established before color came along, a major goal was compatibility. The old B&W TVs had to receive and display, in black and white of course, the new color broadcasts. And the new color TVs had to do likewise for B&W broadcasts. So no major changes could be made to the five numbers I gave above. But TVs in those days used TUBE circuitry. I remember the TV service man coming to our house with a big case of TUBES, and most of the problems were fixed in ten or fifteen minutes with one or more new tubes. These tube circuits had loose tolerances, and often things like the horizontal and vertical frequency controls had to be adjusted: they were up front on most TVs of that day. So the older B&W and even the new color TVs would tolerate some small changes in those numbers.

James and his reference to Wikipedia explain the why of the 3.579545 MHz color subcarrier. It was chosen so that the energy of that subcarrier fell between the harmonics of the horizontal rate. But a small adjustment had to be made to that horizontal rate to keep all the numbers compatible with each other. Please see the Wikipedia explanation. So they adjusted the color horizontal rate to 15734+ Hz. And since there were exactly 525 horizontal lines per frame, the frame rate also had to be slightly lower.
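
For those who like to see the arithmetic, the published relationships can be checked in a few lines (this is just the textbook math, not a claim about how the committee actually ran the calculation):

--- Code: ---
# Sketch of the published NTSC color relationships: the B&W line rate scaled
# by 1000/1001, and the subcarrier placed at 227.5 times the new line rate so
# its energy interleaves with the luminance harmonics.
from fractions import Fraction

F_H_BW    = 15750                              # B&W line rate, Hz
F_H_COLOR = Fraction(F_H_BW) * 1000 / 1001     # 15734.2657... Hz
F_SC      = F_H_COLOR * Fraction(455, 2)       # 227.5 x line rate

print(float(F_H_COLOR))          # 15734.265734...
print(float(F_SC))               # 3579545.4545... Hz color subcarrier
print(float(F_H_COLOR / 525))    # 29.97 Hz frame rate falls out of the same numbers
--- End code ---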

All the talk about hum bars in the video may have been a factor in some B&W TVs, but I watched a lot of B&W TV and, while possible, it was not a major factor. For the most part it probably meant that the TV needed something beyond a new tube or two, so it went to the shop for repairs. In other words, the design engineers at RCA and the other TV equipment OEMs knew it could happen, but they chose to accept that possibility. It would probably sell more new color TVs.

As for TV stations using the 60 Hz power line frequency for synchronization, that was never done in any facility I worked in. While the long term accuracy of the power line frequency was and still is used to synchronize household and other clocks, in the short term (minutes and hours in duration) it could drift quite a bit as loads were added and shed by all the power customers across the country. They lost time in the daytime hours and sped their generators up to regain it at night. That was simply not good enough to keep a TV station, even a B&W station, within legal limits, so it was never used. And the only facilities that had rubidium standards were the three/four networks. I guess they used them for synchronization between their major facilities. But they were expensive, so they were not widely used.

TV stations had what were called sync generators, which had a crystal oscillator, usually in a temperature controlled oven, and dividers that generated all the lower frequency signals from the oscillator's frequency. Most stations had at least two, with an automatic changeover switch for reliability. I had a bad sync generator once in a B&W station and almost nothing else worked properly while it was bad. The sync generator was the heartbeat of the station.

All the sync generators had an external input for a signal that could be used to lock its oscillator, and all the additional signals it produced, into step with any external video signal that was provided. Of course, when this happened there was a very noticeable jump or roll in the video.

When either a regular TV station or a "remote truck" was going to send a live video feed to a network center, it was normally done by synchronizing that station's or remote truck's second sync generator to the network signal coming from the network's New York or LA facility. Then during a station break, but not a commercial, and BEFORE the feed to the network center was to begin, that second sync generator was switched to be the primary one "on line". The "on air" picture would jump, but not in a program or commercial. The station or remote truck was then synchronized with the network control center, and their feed was synchronized so the network center could do things like switches or fades or other video effects with no noticeable problems.

Shortly before the network center was going to use that video feed, the external signal at the station or remote truck would be disconnected, allowing their sync generator to free wheel. But the network would synchronize their generator to the incoming feed from that facility. The station or remote truck was the source of all timing for the duration. It was a timing dance between the network and the remote facility, and sometimes it went wrong. Then something like a major league ball game would be in a feedback loop with the network center, with both trying to lock to the other, and the pictures were totally scrambled. That was how it was done before digital synchronizers became affordable.

As to the use of sync generators in TV stations being modern or just today, that is total BS. My first sync generator, at a station in the 1960s, was a tube model. It was a nightmare to keep running properly. But it did the job. We replaced it with a transistor based one a few months later. The tolerance on the color SC (the 3.579545 MHz oscillator) was +/- 10 Hz. We often kept it within +/- 3 Hz, and the latest oscillators, found in sync generators for digital stations, can do better than +/- 1 Hz at that frequency by using higher frequency oscillators and dividing down. And yes, some TV stations do still use the older signals from their sync generators.
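
To put those tolerances in perspective, here is a quick back-of-envelope conversion to parts per million (my own arithmetic, not a figure from any spec sheet):

--- Code: ---
# Sketch: the subcarrier tolerances mentioned above expressed in ppm.
F_SC = 3_579_545.0                      # Hz, nominal color subcarrier

for tol_hz in (10.0, 3.0, 1.0):
    ppm = tol_hz / F_SC * 1e6
    print(f"+/-{tol_hz:4.1f} Hz at 3.579545 MHz is +/-{ppm:.2f} ppm")
--- End code ---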

Yes, the change from the "exact" 60 Hz field rate to 59.94 Hz did produce problems for video editing. But, for the most part, those problems only surfaced long after color TV was firmly established. They came to the fore when digital editing systems were developed. Before that, all video editing was done on video tape recorders which had either mechanical timers or simple digital readouts that worked from those mechanical timers. Time code recorded on the video tape came along somewhere between those mechanical timers and the first digital editors, and back then the editors just knew that adjustments were needed. In my last job, an editing facility, we had to be precise, and the digital editing systems we used did take this into account. There were two types of time code in use then. Drop frame time code omitted two frame counts at the start of most minutes (every minute except each tenth one) to keep the time code in sync with a real world clock. And non drop frame did not omit those counts, so the time code was about two seconds off by the end of a half hour program. We had calculators for this when needed. I wrote one using Excel.
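
For what it's worth, the counting rule those calculators implement can be sketched in a few lines (a rough stand-in for the kind of drop frame calculator described above, assuming standard 29.97 drop-frame counting; the function name and the test values are only illustrative):

--- Code: ---
# Sketch of a 29.97 drop-frame timecode calculation: frame NUMBERS 00 and 01
# are skipped at the start of every minute except minutes 00, 10, 20, ...
# (no actual frames are dropped, only the labels).
def frames_to_dropframe_tc(frame_count: int) -> str:
    drops_per_min    = 2
    frames_per_min   = 60 * 30 - drops_per_min            # 1798
    frames_per_10min = 10 * 60 * 30 - 9 * drops_per_min   # 17982

    ten_min_blocks, rem = divmod(frame_count, frames_per_10min)
    # The first minute of each 10-minute block keeps all 1800 frame numbers.
    if rem < 1800:
        extra_minutes = 0
    else:
        extra_minutes = 1 + (rem - 1800) // frames_per_min
    # Re-insert the skipped numbers, then split as if it were exactly 30 fps.
    adjusted = frame_count + drops_per_min * (9 * ten_min_blocks + extra_minutes)
    ff = adjusted % 30
    ss = (adjusted // 30) % 60
    mm = (adjusted // (30 * 60)) % 60
    hh = adjusted // (30 * 60 * 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

# One hour of real time is 107892 frames at 29.97 fps; drop-frame lands on it.
print(frames_to_dropframe_tc(107892))   # 01:00:00;00
print(frames_to_dropframe_tc(1800))     # 00:01:00;02  (00 and 01 were skipped)
--- End code ---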

Oh well, this is long enough.
BrianHG:
Some additional info: (Basically EPAIII's response in video form...)

