Author Topic: Why does NTSC use a strange frequency for the chroma carrier?  (Read 3541 times)


Offline Ben321Topic starter

  • Frequent Contributor
  • **
  • Posts: 894
Why does NTSC use a strange frequency for the chroma carrier?
« on: November 30, 2022, 11:10:31 pm »
Instead of something simple like 3.5MHz, it's supposed to be exactly 3.579545MHz and accurate to within 3PPM (parts per million), which at this frequency corresponds to a frequency error of no more than 10.738635Hz. So a more reasonable frequency like 3.5MHz is well outside the specification. Why did the inventors of NTSC decide on such an arbitrary frequency of 3.579545MHz for the chroma carrier frequency?
« Last Edit: November 30, 2022, 11:19:50 pm by Ben321 »
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #1 on: November 30, 2022, 11:15:08 pm »
Because one of the primary design goals was to make it compatible with existing B&W equipment.

Wikipedia actually has a reasonably detailed explanation.
The original black and white NTSC television standard specified a frame rate of 30 Hz and 525 lines per frame, or 15750 lines per second. The audio was frequency modulated 4.5 MHz above the video signal. Because this was black and white, the video consisted only of luminance (brightness) information. Although all of the space in between was occupied, the line-based nature of the video information meant that the luminance data was not spread uniformly across the frequency domain; it was concentrated at multiples of the line rate. Plotting the video signal on a spectrogram gave a signature that looked like the teeth of a comb or a gear, rather than smooth and uniform.

RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)

To provide sufficient bandwidth for the chrominance signal, yet interfere only with the highest-frequency (and thus least perceptible) portions of the luminance signal, a chrominance subcarrier near 3.6 MHz was desirable. 227.5 = 455/2 times the line rate was close to the right number, and 455's small factors (5 × 7 × 13) make a divider easy to construct.

However, additional interference could come from the audio signal. To minimize interference there, it was similarly desirable to make the distance between the chrominance carrier frequency and the audio carrier frequency a half-integer multiple of the line rate. The sum of these two half-integers implies that the distance between the frequency of the luminance carrier and audio carrier must be an integer multiple of the line rate. However, the original NTSC standard, with a 4.5 MHz carrier spacing and a 15750 Hz line rate, did not meet this requirement: the audio was 285.714 times the line rate.

While existing black and white receivers could not decode a signal with a different audio carrier frequency, they could easily use the copious timing information in the video signal to decode a slightly slower line rate. Thus, the new color television standard reduced the line rate by a factor of 1.001 to 1/286 of the 4.5 MHz audio subcarrier frequency, or about 15734.2657 Hz. This reduced the frame rate to 30/1.001 ≈ 29.9700 Hz, and placed the color subcarrier at 227.5/286 = 455/572 = 35/44 of the 4.5 MHz audio subcarrier.
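
For anyone who wants to check those ratios, here is the arithmetic as a short Python sketch (nothing here beyond the numbers quoted above):

# NTSC colour timing derived from the fixed 4.5 MHz sound carrier spacing.
audio_offset_hz = 4_500_000

line_rate_hz  = audio_offset_hz / 286     # 15734.2657... Hz (was 15750 Hz in B&W)
frame_rate_hz = line_rate_hz / 525        # 29.97002997... Hz (was 30 Hz)
field_rate_hz = 2 * frame_rate_hz         # 59.94 Hz (was 60 Hz)
chroma_hz     = 227.5 * line_rate_hz      # 3,579,545.45... Hz = 35/44 of 4.5 MHz

print(line_rate_hz, frame_rate_hz, field_rate_hz, chroma_hz)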
 
The following users thanked this post: tom66, N2IXK, brabus, tooki, schmitt trigger, karpouzi9

Offline Ben321Topic starter

  • Frequent Contributor
  • **
  • Posts: 894
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #2 on: November 30, 2022, 11:22:06 pm »
Because one of the primary design goals was to make it compatible with existing B&W equipment.

Wikipedia actually has a reasonably detailed explanation.
The original black and white NTSC television standard specified a frame rate of 30 Hz and 525 lines per frame, or 15750 lines per second. The audio was frequency modulated 4.5 MHz above the video signal. Because this was black and white, the video consisted only of luminance (brightness) information. Although all of the space in between was occupied, the line-based nature of the video information meant that the luminance data was not spread uniformly across the frequency domain; it was concentrated at multiples of the line rate. Plotting the video signal on a spectrogram gave a signature that looked like the teeth of a comb or a gear, rather than smooth and uniform.

RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)

To provide sufficient bandwidth for the chrominance signal, yet interfere only with the highest-frequency (and thus least perceptible) portions of the luminance signal, a chrominance subcarrier near 3.6 MHz was desirable. 227.5 = 455/2 times the line rate was close to the right number, and 455's small factors (5 × 7 × 13) make a divider easy to construct.

However, additional interference could come from the audio signal. To minimize interference there, it was similarly desirable to make the distance between the chrominance carrier frequency and the audio carrier frequency a half-integer multiple of the line rate. The sum of these two half-integers implies that the distance between the frequency of the luminance carrier and audio carrier must be an integer multiple of the line rate. However, the original NTSC standard, with a 4.5 MHz carrier spacing and a 15750 Hz line rate, did not meet this requirement: the audio was 285.714 times the line rate.

While existing black and white receivers could not decode a signal with a different audio carrier frequency, they could easily use the copious timing information in the video signal to decode a slightly slower line rate. Thus, the new color television standard reduced the line rate by a factor of 1.001 to 1/286 of the 4.5 MHz audio subcarrier frequency, or about 15734.2657 Hz. This reduced the frame rate to 30/1.001 ≈ 29.9700 Hz, and placed the color subcarrier at 227.5/286 = 455/572 = 35/44 of the 4.5 MHz audio subcarrier.


Also there's this issue though.
That is, it isn't actually the ideal chroma carrier frequency for a 60 Hz power line frequency, even though keeping it mathematically related to the power line frequency was theoretically the intent. The old black&white TV vertical scanning frequency was exactly 60 fields per second, the same as the power line frequency, but with the chroma carrier frequency they chose, the scanning frequency now needs to be 59.94 fields per second. Why not just select a chroma carrier whose frequency would work with the existing power line frequency? Using the same math they used to derive the vertical scanning frequency from the chroma carrier frequency, I did the reverse and found the ideal chroma carrier frequency for a 60 fields per second vertical scanning rate: 3.583125 MHz. That would also allow for the nicer horizontal scanning frequency of 15750 lines per second (the same as black&white TV) instead of the odd 15734.26 lines per second.

Yet there must be some technical reason they chose the exact chroma carrier frequency they did. Is it easier to make quartz crystal oscillators at that exact frequency or something?
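
As a quick check of that 3.583125 MHz figure, and of where it collides with the fixed 4.5 MHz sound spacing discussed above (a Python sketch of the arithmetic only):

# Hypothetical alternative: keep the original B&W line rate and put the chroma
# carrier at 227.5 times it, as proposed above.
line_rate_hz = 15750.0
chroma_hz = 227.5 * line_rate_hz       # 3,583,125 Hz = 3.583125 MHz
print(chroma_hz)

# But the sound carrier would then have to sit at an integer multiple of the
# line rate (286 x line rate) to stay a half-integer multiple away from the chroma:
print(286 * line_rate_hz)              # 4,504,500 Hz, not the existing 4.5 MHz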
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #3 on: November 30, 2022, 11:43:48 pm »
The technical reason is detailed in the text I posted; the line frequency is irrelevant. Maybe in the earliest days they tried to sync the vertical to the line frequency, but that has not been done in recent history.
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8605
  • Country: gb
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #4 on: November 30, 2022, 11:47:48 pm »
Instead of something simple like 3.5MHz, it's supposed to be exactly 3.579545MHz and accurate to within 3PPM (parts per million), which at this frequency corresponds to a frequency error of no more than 10.738635Hz. So a more reasonable frequency like 3.5MHz is well outside the specification. Why did the inventors of NTSC decide on such an arbitrary frequency of 3.579545MHz for the chroma carrier frequency?
If you see a funky frequency in use there are 2 explanations that cover almost every case:
  • Parts for that frequency were in high volume use, and very cheap
  • It's maths. There is some pattern in the operations that leads naturally to the chosen frequency
Your mistake is using terms like "reasonable" and "arbitrary". They blind you to reasonable thinking, and make you write arbitrary rubbish. The first possibility above is clearly backwards. The mass availability of cheap 3.579545MHz crystals means a lot of later things chose to use that frequency, purely for cheapness. However, those crystals were not widely available when the NTSC spec was written, so it has to be a pattern in the maths. Others have explained the actual pattern.

« Last Edit: November 30, 2022, 11:49:29 pm by coppice »
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 7934
  • Country: us
  • Retired, now restoring antique test equipment
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #5 on: November 30, 2022, 11:49:33 pm »
They couldn't make the frame rate coherent to the local 60 Hz power line, which is also unstable over relatively short times, and varies over geography with power grids.
The result of the ratio between 59.94 and 60 Hz can cause a "hum bar" to slowly march down the screen, with a roughly 16 second period (the 0.06 Hz beat between the two corresponds to 1/0.06 ≈ 16.7 s).
It has nothing to do with quartz manufacture, since near a particular frequency the achievable tolerance (in ppm) is essentially the same.
US TV stations used approximately 3.58 MHz derived from rubidium frequency standards for the chrominance subcarrier.
Thereafter, 3.58 MHz crystals and crystal oscillators became popular (after the NTSC standard was adopted), due to mass production for TV receivers.
« Last Edit: November 30, 2022, 11:51:19 pm by TimFox »
 

Online coppice

  • Super Contributor
  • ***
  • Posts: 8605
  • Country: gb
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #6 on: November 30, 2022, 11:53:44 pm »
They couldn't make the frame rate coherent to the local 60 Hz power line, which is also unstable over relatively short times, and varies over geography with power grids.
The result of the ratio between 59.94 and 60 Hz can cause a "hum bar" to slowly march down the screen, with a roughly 16 second period.
It has nothing to do with quartz manufacture, since near a particular frequency the achievable tolerance (in ppm) is essentially the same.
US TV stations used 3.58 MHz derived from rubidium frequency standards for the chrominance subcarrier.
The use of rubidium clocks was not to make the transmission super accurate. It was so various feeds into the studio for live events didn't slew relative to each other. They could mix the feeds by simply delaying them until their frame timing lined up. These days we would just reclock the data, but the technology for that wasn't around when rubidium clocks were first chosen for this job.
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #7 on: December 01, 2022, 12:19:13 am »
RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)
With that perfect '0.5 integer' number of oscillations per line, on older picture tubes with slower phosphor, combined with the interlace doubling 240p to 480i, the continuous-scan effect that shows up as 'dot crawl' on modern, better-performing tubes was smeared almost to visual perfection on the display CRTs of the day when the standard was devised.  Yes, on the older B&W TVs, which had a finer dot pitch (or none at all), this dot-crawl effect was self-erasing, rather than generating a random noise pattern over each saturated color as it would if any other frequency were used.

The true frequency is 3.579545 MHz with a repeating 45 at the end.
The 4x frequency, 14.31818 MHz, is the reference crystal used by many modern NTSC-compatible display generators like OSDs, small computers and cameras.  Through simple integer division, all the necessary constructs of an NTSC video signal can be generated.  EG: take the 14318180 Hz and divide by 910 to get the horizontal 15734.26 Hz.  Divide that by the 525 lines of video in an interlaced source and get the 29.970 Hz frame rate, x2 = 59.94 Hz interlaced.  Or, for the 240p used by many older gaming consoles, 15734.26/262 = 60.05 Hz.
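
As a rough sketch of that divider chain in Python (just the integer divisions described above, nothing more):

# Everything divided down from the common 14.31818 MHz (4x chroma) crystal.
ref_hz = 14_318_180

chroma_hz     = ref_hz / 4          # 3.579545 MHz colour subcarrier
line_rate_hz  = ref_hz / 910        # 15734.26 Hz horizontal rate (910 = 4 x 227.5)
frame_rate_hz = line_rate_hz / 525  # 29.97 Hz interlaced frame rate
field_rate_hz = 2 * frame_rate_hz   # 59.94 Hz field rate
rate_262      = line_rate_hz / 262  # ~60.05 Hz, the 262-line "240p" console rate
rate_263      = line_rate_hz / 263  # ~59.83 Hz if 263 lines are used instead

print(chroma_hz, line_rate_hz, frame_rate_hz, field_rate_hz, rate_262, rate_263)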
« Last Edit: December 01, 2022, 02:33:27 am by BrianHG »
 
The following users thanked this post: SeanB

Offline Ben321Topic starter

  • Frequent Contributor
  • **
  • Posts: 894
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #8 on: December 02, 2022, 12:51:46 pm »
RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)
With that perfect '0.5 integer' number of oscillations per line, on older picture tubes with slower phosphor, combined with the interlace doubling 240p to 480i, the continuous-scan effect that shows up as 'dot crawl' on modern, better-performing tubes was smeared almost to visual perfection on the display CRTs of the day when the standard was devised.  Yes, on the older B&W TVs, which had a finer dot pitch (or none at all), this dot-crawl effect was self-erasing, rather than generating a random noise pattern over each saturated color as it would if any other frequency were used.

The true frequency is 3.579545 MHz with a repeating 45 at the end.
The 4x frequency, 14.31818 MHz, is the reference crystal used by many modern NTSC-compatible display generators like OSDs, small computers and cameras.  Through simple integer division, all the necessary constructs of an NTSC video signal can be generated.  EG: take the 14318180 Hz and divide by 910 to get the horizontal 15734.26 Hz.  Divide that by the 525 lines of video in an interlaced source and get the 29.970 Hz frame rate, x2 = 59.94 Hz interlaced.  Or, for the 240p used by many older gaming consoles, 15734.26/262 = 60.05 Hz.

Nope. Actually the 240p used by older consoles (or at least the N64 I tested it with) uses 263 lines, and thus a frame rate of 59.83 fps. Using an odd number of lines guarantees that the chroma carrier phase inverts with each frame, so it doesn't produce a stationary dot pattern on the screen. The dot pattern inverts brightness levels with each frame, so it visually cancels out. If an even number of lines were used (like the 262 lines you suggested), the chroma carrier wouldn't invert phase each frame, and thus a stationary dot pattern would be visible on the screen.
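
The parity argument is easy to check numerically (a small Python sketch; the 227.5 cycles-per-line figure comes from earlier in the thread):

# Chroma cycles per frame = lines per frame x 227.5 cycles per line.
# A half-integer total flips the subcarrier phase 180 degrees each frame, so the
# dot pattern inverts and averages out; an integer total leaves it frozen.
for lines in (262, 263, 525):
    cycles = lines * 227.5
    inverts = (cycles % 1) != 0
    print(lines, cycles, "phase inverts each frame" if inverts else "stationary dots")

# 262 * 227.5 = 59605.0   -> stationary dot pattern
# 263 * 227.5 = 59832.5   -> pattern inverts each frame
# 525 * 227.5 = 119437.5  -> the full broadcast interlaced frame also inverts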
 

Online tom66

  • Super Contributor
  • ***
  • Posts: 6678
  • Country: gb
  • Electronics Hobbyist & FPGA/Embedded Systems EE
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #9 on: December 02, 2022, 01:32:15 pm »
A broadcast engineer once told me, earlier broadcasts were synchronised to mains frequency as it provided a convenient timing reference for the studio's equipment.  This would also eliminate interference from the slight flicker caused by the high intensity incandescent lamps in use in the studio.  This works even if the phases are different for lighting and cameras; the light level will vary throughout the scan for all phase options, but this would be much less noticeable than having a scrolling bar.  Placing lights on independent phases would further help reduce this issue.  I am not sure how common this was, but the lighting equipment would have consumed kilowatts so it seems reasonable to me.

The receiving sets themselves could not rely on the mains frequency being the same as the transmission frequency.  One reason was that the signal had to be broadcast some hundreds of kilometers for some users, and even the speed-of-light propagation delay over that distance was non-negligible (100 km ≈ 0.33 ms, or about 5 lines; meanwhile the phase of the power line varies according to the paths through the power grid, so it gets messy to analyse).  The second reason was that the US had independent power grids, and still does today.  Most span multiple states, but back then some were powering only a few cities.   Nowadays I recall the largest is the East Coast one covering five states, but it is still conceivable a studio could be on one grid and a television receiver on another.  So the televisions always synchronised to the vertical blanking interval, except perhaps for the very earliest experiments by Baird and the like.  I imagine some older sets did benefit if the frequency was the same, because any bulk capacitors they used for filtering out the mains variation on supply rails would be in sync with the broadcast, but I doubt this persisted much past the early days as it couldn't be relied upon.

Of course, nowadays studios have independent timing references and do not use the mains frequency as a broadcast reference.
« Last Edit: December 02, 2022, 01:34:16 pm by tom66 »
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 8973
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #10 on: December 02, 2022, 01:43:48 pm »
If you took all the frequencies and increased them slightly to increase the vertical refresh rate to exactly 60Hz, it will play just fine on old TVs. That probably wasn't an option for broadcast back then since it would increase the required bandwidth, even if only by a small amount.

They really should have abandoned that oddball frame rate and interlacing with the move to digital, it's not at all helpful in modern times.
Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #11 on: December 02, 2022, 03:07:43 pm »
In the early days of black and white telly, the power grid sync was very much a thing for a couple of reasons.

Firstly, a static hum bar is less problematic than a moving one on the rx side, and high voltage caps were EXPENSIVE enough that quality gear used chokes (of several henries at significant DC current) to help out with ripple reduction; when you are trying to build a $100 TV set (as 'Mad' Muntz did, fascinating story), expensive caps and large chokes don't cut it.

Secondly, TV stations were fairly local affairs: with a GOOD set you could be all of 40 miles from the transmitter, with a cheap one maybe 10 miles. It was mostly NOT a national sort of thing, so nationwide sync was not required.

Thirdly, outside broadcast needed the truck to be running at the same frame rate as the studio, most easily achieved by taking a reference from the local power grid; phase might differ somewhat, but there are ways around that. In a time before GPS or reasonable broadcast references (LORAN), the grid provided a locally reliable timing reference.

Of course with black and white the need to match chroma phase didn't exist, so it was mainly about frame timing. (You could skew the line timing with massive coils of coax in the station's basement; there is a reason BBC Television Centre was built with the studios in a circle around a central building housing the galleries and apparatus rooms: it was so that all the cables were the same length!)

And yes, "Never Twice Same Colour" assumed that the more modern gear could cope without being synced to the mains, and we have been swearing about 'drop frame' timecode ever since (That 1000/1001 thing screws up converting frame counts to time in an easy way, and it is FAR too easy to not notice that the video is drop frame but the audio isn't.....).

 
 
The following users thanked this post: SeanB

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #12 on: December 02, 2022, 10:38:25 pm »
RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)
With that perfect '0.5 integer' number of oscillations per line, on older picture tubes with slower phosphor, combined with the interlace doubling 240p to 480i, the continuous-scan effect that shows up as 'dot crawl' on modern, better-performing tubes was smeared almost to visual perfection on the display CRTs of the day when the standard was devised.  Yes, on the older B&W TVs, which had a finer dot pitch (or none at all), this dot-crawl effect was self-erasing, rather than generating a random noise pattern over each saturated color as it would if any other frequency were used.

The true frequency is 3.579545 MHz with a repeating 45 at the end.
The 4x frequency, 14.31818 MHz, is the reference crystal used by many modern NTSC-compatible display generators like OSDs, small computers and cameras.  Through simple integer division, all the necessary constructs of an NTSC video signal can be generated.  EG: take the 14318180 Hz and divide by 910 to get the horizontal 15734.26 Hz.  Divide that by the 525 lines of video in an interlaced source and get the 29.970 Hz frame rate, x2 = 59.94 Hz interlaced.  Or, for the 240p used by many older gaming consoles, 15734.26/262 = 60.05 Hz.

Nope. Actually the 240p used by older consoles (or at least the N64 I tested it with) uses 263 lines, and thus a frame rate of 59.83 fps. Using an odd number of lines guarantees that the chroma carrier phase inverts with each frame, so it doesn't produce a stationary dot pattern on the screen. The dot pattern inverts brightness levels with each frame, so it visually cancels out. If an even number of lines were used (like the 262 lines you suggested), the chroma carrier wouldn't invert phase each frame, and thus a stationary dot pattern would be visible on the screen.
Funny, my Atari 800, Commodore 64, Amiga 1000 in progressive mode, Nintendo Entertainment System, Apple IIc, Tektronix reference color bar generator in progressive mode, all had no dot crawl at the edge of their colored graphics.  They all produced a frozen zipper dot pattern and they all used 262 lines.  The N64 had an interlaced output.
« Last Edit: December 02, 2022, 10:53:38 pm by BrianHG »
 

Offline EPAIII

  • Super Contributor
  • ***
  • Posts: 1022
  • Country: us
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #13 on: December 03, 2022, 10:50:20 am »
There's a lot of good information here and some that is just nonsense. As a career broadcast engineer I saw everything from the conversion from B&W to color TV to the conversion to digital. Here is my attempt to separate the facts from the BS.

B&W TV came first and was based on five numbers: 30 Hz Frame Rate, 60 Hz Field Rate, 15750 Hz Horizontal Rate, 525 Horizontal Lines per Frame, and 2 Fields per Frame. Those numbers are all exactly compatible.

15750 lines per second / 30 frames per second = 525 lines per frame

15750 lines per second / 60 fields per second = 262.5 lines per field

There were two fields per frame and strangely they were called Frame 1 and Frame 2.
The half-line fraction in the field rate provided the interlace: half line number 262.5, which started Frame 2, in theory began half way across the screen and between lines 1 and 2 of Frame 1. This was done in B&W TV to minimize the appearance of flicker. Two half images, superimposed (interlaced) over each other, formed one complete image. So the flicker rate was 60 Hz but the information rate was only 30 Hz.

Since B&W TV was well established before color came along, a major goal was compatibility. The old B&W TVs had to receive and display, in black and white of course, the new color broadcasts. And the new color TVs had to do likewise for B&W broadcasts. So no major changes could be made to the five numbers I gave above. But TVs in those days used TUBE circuitry. I remember the TV service man coming to our house with a big case of TUBES and most of the problems were fixed in ten or fifteen minutes with one or more new tubes. These tube circuits had loose tolerances and often things like the horizontal and vertical frequency controls had to be adjusted: they were up front on most TVs of that day. So the older B&W and even the new color TVs would tolerate some small changes in those numbers.

James and his reference to Wikipedia explain the why of the 3.579545 MHz color subcarrier. It was chosen so that the energy due to that subcarrier fell between the harmonics of the horizontal line rate. But a small adjustment had to be made in that horizontal rate to keep all the numbers compatible with each other. Please see the Wikipedia explanation. So they adjusted the color horizontal rate to be 15734+. And since there were exactly 525 horizontal lines per frame, the frame also had to have a slightly lower frequency.

All the talk about hum bars in the video may have been a factor in some B&W TVs, but I watched a lot of B&W TV and, while possible, it was not a major factor. For the most part it probably meant that the TV needed something beyond a new tube or two, so it went to the shop for repairs. In other words, the design engineers at RCA and other TV equipment OEMs knew it could happen, but they chose to allow that possibility. It would probably sell more new color TVs.

As for TV stations using the 60 Hz power line frequency for synchronization, that was never done in any facility I worked in. While the long term accuracy of the power line frequency was and still is used to synchronize household and other clocks, in the short term (minutes and hours in duration) it could drift quite a bit as loads were added and shed by all the power customers across the country. They lost time in the daytime hours and sped their generators up to regain it at night. That was simply not good enough to keep a TV station, even a B&W station, within legal limits so it was never used. And the only facilities that had rubidium standards were the three/four networks. I guess they used them for synchronization between their major facilities. But they were expensive so were not widely used. TV stations had what were called sync generators which had a crystal oscillator, usually in a temperature controlled oven, and dividers that generated all the lower frequency signals from the oscillator's frequency. Most stations had at least two with an automatic changeover switch for reliability. I had a bad sync generator once in a B&W station and almost nothing else worked properly while it was bad. The sync generator was the heartbeat of the station.

All the sync generators had an external input for a signal that could be used to lock its oscillator and all the additional signals it produced into step with any external video signal that was provided. Of course, when this happened there was a very noticeable jump or roll in the video. When either a regular TV station or a "remote truck" was going to send a live video feed to a network center, it was normally done by synchronizing that station's or remote truck's second sync generator to the network signal coming from the network's New York or LA facility. Then during a station break, but not a commercial, and BEFORE the feed to the network center was to begin, that second sync generator was switched to be the primary one "on line". The "on air" picture would jump, but not in a program or commercial. The station or remote truck was then synchronized with the network control center and their feed was synchronized, so the network center could do things like switches or fades or other video effects with no noticeable problems. Shortly before the network center was going to use that video feed, the external signal at the station or remote truck would be disconnected, allowing their sync generator to free wheel. But the network would synchronize their generator to the incoming feed from that facility. The station or remote truck was the source of all timing for the duration. It was a timing dance between the network and the remote facility and sometimes it went wrong. Then something like a major league ball game would be in a feedback loop with the network center, with both trying to lock to the other, and the pictures were totally scrambled. That was how it was done before digital synchronizers became affordable.

As to the use of sync generators in TV stations being modern or just today, that is total BS. My first sync generator, at a station in the 1960s, was a tube model. It was a nightmare to keep running properly. But it did the job. We replaced it with a transistor based one a few months later. The tolerance on the color SC (the 3.579545 MHz oscillator) was +/- 10 Hz. We often kept it within +/- 3 Hz, and the latest oscillators in sync generators for digital stations can do better than +/- 1 Hz at that frequency by using higher frequency oscillators and dividing down. And yes, some TV stations do still use the older signals from their sync generators.

Yes, the change from the "exact" 60 Hz field rate to 59.9... did produce problems for video editing. But, for the most part, those problems only surfaced long after color TV was firmly established. They came to the fore when digital editing systems were developed. Before that, all video editing was done on video tape recorders, which had either mechanical timers or simple digital readouts that worked from those mechanical timers. Time code on the video tape was developed somewhere in between, and by the time the first digital editors became available, the editors just knew that adjustments were needed. In my last job, an editing facility, we had to be precise, and the digital editing systems we used did take this into account. There were two types of time code in use then. Drop frame time code omitted two counts on certain frames of the video to keep the time code in sync with a real world clock. Non drop frame did not omit those two counts, so the time code was a number of seconds off by the end of a half hour program. We had calculators for this when needed. I wrote one using Excel.
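
For the curious, here is a rough sketch of that drop-frame bookkeeping in Python. The drop rule used below (skip frame numbers 00 and 01 at the start of every minute except every tenth minute) is the standard SMPTE convention as I understand it, so treat the details as an assumption rather than a quote from the post above.

FPS_LABELLED = 30                  # timecode counts as if video ran at exactly 30 fps
REAL_FPS = 30000 / 1001            # actual NTSC colour frame rate, ~29.97 fps

def non_drop_lag(seconds):
    """How far plain (non-drop) timecode falls behind a wall clock, in seconds."""
    real_frames = seconds * REAL_FPS
    labelled_seconds = real_frames / FPS_LABELLED
    return seconds - labelled_seconds

def drop_frame_labels_per_hour():
    """Labels counted per hour when 2 are dropped every minute except minutes 0,10,...,50."""
    dropped = 2 * (60 - 6)
    return FPS_LABELLED * 3600 - dropped

print(non_drop_lag(1800))              # ~1.8 s off by the end of a half-hour program
print(drop_frame_labels_per_hour())    # 107892 labels per hour...
print(3600 * REAL_FPS)                 # ...vs 107892.1 real frames per hour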

Oh well, this is long enough.
« Last Edit: December 03, 2022, 11:01:56 am by EPAIII »
Paul A.  -   SE Texas
And if you look REAL close at an analog signal,
You will find that it has discrete steps.
 
The following users thanked this post: SeanB, rsjsouza, voltsandjolts, Mr.B, Fgrir, james_s, schmitt trigger, Exosia

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #14 on: December 03, 2022, 11:26:08 am »
Some additional info: (Basically EPAIII's response in video form...)


« Last Edit: December 03, 2022, 11:30:03 am by BrianHG »
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #15 on: December 03, 2022, 10:31:53 pm »
My feeling is that we forget how quickly the consumer electronics industry was moving back in the post-war years.
I have no trouble believing that locking the hum bar was a real advantage in the early days of black and white and was unimportant (even for most black and white sets) by the time colour arrived.
The UK switched from 405 line VHF to 625 line PAL UHF at around this time, so a lot of the older sets would have become obsolete relatively quickly even if the VHF transmitters were still operating, simply because people wanted colour and the higher resolution made a difference even if you could only afford a newer black and white set.

The BBC were still taking a mains hookup out of a nearby house to sync their generators, so that the generated power frequency would track the grid, as late as five years ago; it may well have been about flicker reduction on the camera side when shooting something in a street with LED streetlighting that had some flicker (or the once common domestic CFL that often had horrific flicker).

Now that is 50 fields per second, with no weird offset and the UK has one grid with no internal DC links or such so the whole thing tracks reasonably closely absent transient events.

It might just be something carried over as a 'best practise' thing from the just post war era, or as I say they might be shooting 50hz and doing it for flicker reduction. I have seen this when shooting features for later broadcast which means that global house sync is not a thing, obviously unworkable if chasing GPS for a live broadcast.

House sync is still very much a thing, Tek/Telestream being the usual suspects in an SDI plant, with Meinburg or sometimes Evertz doing the job for a baseband IP facility (There are alternative vendors for both, but probably 90% of the time these are what you will see, SPG8000 and such).
These days they chase GPS time as a global reference so as long as you have punched in the appropriate link delay offset you can have local video and remote contribution arrive at the switcher with at worst a few lines of offset. Things are MUCH easier when you have a global time reference that works with negligible offset.

A fun one is that while classic genlock has a delay equal to sync gen -> camera -> router, something like 2110 using PTP measures and compensates the delay from the sync gen -> camera, and separately does the delay sync gen -> router, so the amount of cable delay compensation is different... Had more than one headache explaining that one.

DF timecode became MUCH more of a pain as the audio workflow and the video side drifted apart, as the DAW became much more of a go-to weapon for audio while video was still mostly on tape.

Yea, broadcast engineers are usually time nuts, comes with the territory.
 
 
 
 

Offline schmitt trigger

  • Super Contributor
  • ***
  • Posts: 2205
  • Country: mx
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #16 on: December 04, 2022, 02:59:07 am »
Lengthy but very interesting information EPAIII.

It reminded me of my time as an intern in a local TV station during the summer of 1977. I distinctly remember the sync generator cabinet, full of tubes and the size of a fridge. It was being replaced with a transistorized unit. I was there helping with the cutting and preparing dozens of cables.
A couple of days before we were to turn the unit “live”, the anxiety was palpable. When it finally went live, in the wee hours of a Sunday morning, everyone cheered. My young, ignorant mind could not comprehend why all the fuss.
But as your explanation eloquently shows, without a sync generator you didn’t have a TV station. Period.
 

Offline Ben321Topic starter

  • Frequent Contributor
  • **
  • Posts: 894
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #17 on: December 29, 2022, 09:24:38 pm »
without a sync generator you didn’t have a TV station. Period.

Not entirely true. Having a single sync generator for your station is good to stop issues when switching between cameras. Without it, each camera would need to have its own internal sync pulse generator, and when switching cameras, the sync signals would switch to the next camera. The result is that when receiving the signal on a TV receiver, the TV would momentarily lose sync and the picture would flash momentarily as it locked onto the new camera's sync signal. This would maybe be annoying to TV watchers, but it wouldn't prevent the broadcast from being received. Remember, the sync pulses are detected by circuits in each TV receiver. So you don't actually need the national power grid to act as a master-clock for the entire country's TV system, nor does your TV station need a locally-generated master-clock (it only needs one to avoid this camera-switching visual artifact). There's nothing in terms of a technical impossibility that would prevent a TV station from operating with each camera containing its own internal sync pulse generator.
 

Offline NiHaoMike

  • Super Contributor
  • ***
  • Posts: 8973
  • Country: us
  • "Don't turn it on - Take it apart!"
    • Facebook Page
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #18 on: December 30, 2022, 04:54:45 am »
Not entirely true. Having a single sync generator for your station is good to stop issues when switching between cameras. Without it, each camera would need to have its own internal sync pulse generator, and when switching cameras, the sync signals would switch to the next camera. The result is that when receiving the signal on a TV receiver, the TV would momentarily lose sync and the picture would flash momentarily as it locked onto the new camera's sync signal. This would maybe be annoying to TV watchers, but it wouldn't prevent the broadcast from being received. Remember, the sync pulses are detected by circuits in each TV receiver. So you don't actually need the national power grid to act as a master-clock for the entire country's TV system, nor does your TV station need a locally-generated master-clock (it only needs one to avoid this camera-switching visual artifact). There's nothing in terms of a technical impossibility that would prevent a TV station from operating with each camera containing its own internal sync pulse generator.
Fran did a video on a 60s era video editing machine.

Cryptocurrency has taught me to love math and at the same time be baffled by it.

Cryptocurrency lesson 0: Altcoins and Bitcoin are not the same thing.
 

Offline EPAIII

  • Super Contributor
  • ***
  • Posts: 1022
  • Country: us
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #19 on: January 03, 2023, 10:20:43 am »
Oh yea? Try explaining that to any of the chief engineers I worked for, or to me on the occasions when I held that job. Let's see now, without ONE sync generator running the entire station:

Every time the station went from a program to a commercial, from one commercial to another, from a commercial to a station ID, from any of the above back to a program, the picture would roll on all those TV sets. And each and every sponsor who had PAID for those commercials would refuse to pay. As a side note, back in the day before video tape recorders, sponsors played games with the TV stations. They deliberately made the commercials so difficult to perform on a live basis that the announcer or other talent involved often made mistakes. And then the sponsor demanded what was called a "make good". That was a free airing of the same commercial. A two for the price of one bargain. Oh, and THAT is the real reason why the local stations purchased video tape recorders. They recorded the commercials to be sure they didn't need to give away free air time. And the sync generator kept those video tape recorders in step with the programs so no rolling picture and no free air time.

If the station had a live or recorded studio program with more than one camera (they all did) then the picture would roll whenever a switch was made from one camera to another. That would have been a horrible distraction and it is one that I doubt that anyone here can ever recall seeing.

If even the simplest text message was to be placed in either a live or a recorded program, then the program's source and the text's source had to be synchronized. There's that pesky old SINGLE sync generator again.

And don't even get me started with the news programs. Those people could drive you to the insane asylum if even the tiniest thing went wrong.

No, I am sorry, but my TV career started in the 1960s and even long before that point, a TV station without a sync generator was totally unthinkable. And probably cause for losing your job if it failed. That's why there were always at least two with an automatic changeover switch. Some stations even had three. But ONLY ONE was in use at any given time.

And I don't know where this power line business came from. The sync generators in the individual stations had their own master oscillators. The only nation wide clock was the network that the station belonged to and they only synchronized to the network's signal when necessary. The power lines then, and now drifted far too much to ever use even for just a black and white facility. At least from the 1960s to the present day the power line frequency was NEVER used by TV stations. Or the networks.



without a sync generator you didn’t have a TV station. Period.

Not entirely true. Having a single sync generator for your station is good to stop issues when switching between cameras. Without it, each camera would need to have its own internal sync pulse generator, and when switching cameras, the sync signals would switch to the next camera. The result is that when receiving the signal on a TV receiver, the TV would momentarily lose sync and the picture would flash momentarily as it locked onto the new camera's sync signal. This would maybe be annoying to TV watchers, but it wouldn't prevent the broadcast from being received. Remember, the sync pulses are detected by circuits in each TV receiver. So you don't actually need the national power grid to act as a master-clock for the entire country's TV system, nor does your TV station need a locally-generated master-clock (it only needs one to avoid this camera-switching visual artifact). There's nothing in terms of a technical impossibility that would prevent a TV station from operating with each camera containing its own internal sync pulse generator.
Paul A.  -   SE Texas
And if you look REAL close at an analog signal,
You will find that it has discrete steps.
 
The following users thanked this post: SeanB, rsjsouza

Offline EPAIII

  • Super Contributor
  • ***
  • Posts: 1022
  • Country: us
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #20 on: January 03, 2023, 10:27:31 am »
PS: The broadcast video cameras did and still do have their own internal sync generator circuits. This allows them to be used on a stand-alone basis. But when they are in a studio environment or part of a remote pickup with more than one camera, they are always synchronized with a sync generator or a single camera used as the master. Broadcast level cameras always have an input jack for importing that synchronizing signal.
Paul A.  -   SE Texas
And if you look REAL close at an analog signal,
You will find that it has discrete steps.
 
The following users thanked this post: tom66, SeanB, rsjsouza

Offline SeanB

  • Super Contributor
  • ***
  • Posts: 16272
  • Country: za
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #21 on: January 06, 2023, 11:34:53 am »
Yes, broadcast cameras can either run on external sync or run free, and generally they now all are free running, because the control van normally has a frame buffer and resynchroniser for every analogue camera; for digital inputs that is provided by the input side as it receives full frames of data and sends them back out to the control. Having a sync though does mean all the cameras will run at exactly the same frame rate, so there is no chance of a frame being repeated or dropped as the clocks drift over long broadcasts. Modern digital stores are great, even on things as mundane as the broadcast VTR, which has one both at input and output. That allows a perfect freeze frame with seamless resume: it simply records the freeze frame off the live video, then stops the tape and reverses back a few frames, so that it can resume at exactly that frame. All due to cheap memory, which was a game changer.

IIRC the BBC still broadcast 405 line TV till 1985, starting in 1937. When the transmitters were turned off they reckoned that there were possibly 10 viewers who still had a working 405 line TV set. 48 years of the same standard, which is pretty much the longest TV standard to be in continual use.
 

Offline rsjsouza

  • Super Contributor
  • ***
  • Posts: 5980
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #22 on: January 07, 2023, 11:07:47 pm »
I loved this information, EPAIII! Quite interesting to see how things are much more complex than my then young mind would simply take for granted.
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Offline Ben321Topic starter

  • Frequent Contributor
  • **
  • Posts: 894
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #23 on: January 14, 2023, 05:18:46 am »
And I don't know where this power line business came from. The sync generators in the individual stations had their own master oscillators. The only nation wide clock was the network that the station belonged to and they only synchronized to the network's signal when necessary. The power lines then, and now drifted far too much to ever use even for just a black and white facility. At least from the 1960s to the present day the power line frequency was NEVER used by TV stations. Or the networks.

The power line business came from the fact that in whatever country a TV standard was designed in, the field rate is always the same as the power line frequency (at least for the black&white version of the standard). In the US, the TV field rate is 60Hz. In Europe it's 50Hz. If the field rate was arbitrary, and not based on the power line frequency, then why is it that TV signal standards always use the same frequency as the power line frequency in that country for the video signal field rate?
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7660
  • Country: ca
Re: Why does NTSC use a strange frequency for the chroma carrier?
« Reply #24 on: January 14, 2023, 10:59:25 am »
And I don't know where this power line business came from. The sync generators in the individual stations had their own master oscillators. The only nation wide clock was the network that the station belonged to and they only synchronized to the network's signal when necessary. The power lines then, and now drifted far too much to ever use even for just a black and white facility. At least from the 1960s to the present day the power line frequency was NEVER used by TV stations. Or the networks.

The power line business came from the fact that in whatever country a TV standard was designed in, the field rate is always the same as the power line frequency (at least for the black&white version of the standard). In the US, the TV field rate is 60Hz. In Europe it's 50Hz. If the field rate was arbitrary, and not based on the power line frequency, then why is it that TV signal standards always use the same frequency as the power line frequency in that country for the video signal field rate?

In the days of vacuum tubes and not-so-well-regulated power supplies for CRTs, whose beam position is set by the magnetic field generated by the electronics, if in Europe you created a 55 Hz video standard where the power grid is 50 Hz, the picture would visibly 'wobble' 5 times a second, because of both the limited precision of the internal circuitry and external 50 Hz magnetic fields coming from stray power lines or transformers.  This was before the days of feedback-regulated drive circuitry for the magnetic 'yoke' which positions the electron beam forming the picture's raster.

With a 50 Hz picture and 50 Hz interference from your source power, this ever-so-slight bend in the picture would not be visible, as it would be stationary or scroll vertically too slowly to be noticed.  (If this bend were to scroll vertically fast enough, i.e. the 5 Hz mentioned above for a 55 Hz field rate, it would appear as a wobble in the image.)
« Last Edit: January 14, 2023, 11:39:23 am by BrianHG »
 

