General > General Technical Chat

Why does NTSC use a strange frequency for the chroma carrier?

(1/7)

Ben321:
Instead of something simple like 3.5MHz, it's supposed to be exactly 3.579545MHz and accurate to within 3PPM (parts per million), which at this frequency corresponds to a frequency error of no more than 10.738635Hz. So a more reasonable frequency like 3.5MHz is well outside the specification. Why did the inventors of NTSC decide on such an arbitrary frequency of 3.579545MHz for the chroma carrier frequency?
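
Just to spell out the ppm arithmetic behind that figure, here's a quick informal check in Python (my own illustration, the variable names are mine, nothing official):

--- Code: ---
# Informal check of the tolerance arithmetic (illustrative only, not from the spec text).
f_sc = 3_579_545.0          # nominal NTSC chroma subcarrier, Hz
tol_ppm = 3                 # allowed error, parts per million

max_err_hz = f_sc * tol_ppm / 1e6
print(max_err_hz)           # 10.738635 Hz
--- End code ---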

james_s:
Because one of the primary design goals was to make it compatible with existing B&W equipment.

Wikipedia actually has a reasonably detailed explanation.
The original black and white NTSC television standard specified a frame rate of 30 Hz and 525 lines per frame, or 15750 lines per second. The audio was frequency modulated 4.5 MHz above the video signal. Because this was black and white, the video consisted only of luminance (brightness) information. Although all of the space in between was occupied, the line-based nature of the video information meant that the luminance data was not spread uniformly across the frequency domain; it was concentrated at multiples of the line rate. Plotting the video signal on a spectrogram gave a signature that looked like the teeth of a comb or a gear, rather than smooth and uniform.

RCA discovered that if the chrominance (color) information, which had a similar spectrum, was modulated on a carrier that was a half-integer multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data and interference was minimized. It was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce this interference further using a comb filter.)
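
To make the interleaving concrete, here's a rough sketch in Python (my own illustration, using the original 15750 Hz line rate and a hypothetical subcarrier at a half-integer multiple of it):

--- Code: ---
# Illustrative sketch: a subcarrier at a half-integer multiple of the line rate
# puts its components exactly midway between the luminance harmonics.
fh = 15_750.0               # line rate, Hz (original B&W value)
fsc = 227.5 * fh            # a half-integer multiple of the line rate

for m in range(3):
    chroma = fsc + m * fh                  # chroma energy also clusters at fh spacings
    luma_below = (chroma // fh) * fh       # nearest luminance harmonic below
    print(chroma - luma_below)             # always fh/2 = 7875 Hz, i.e. dead centre
--- End code ---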

To provide sufficient bandwidth for the chrominance signal, yet interfere only with the highest-frequency (and thus least perceptible) portions of the luminance signal, a chrominance subcarrier near 3.6 MHz was desirable. A subcarrier at 227.5 (= 455/2) times the line rate was close to the right number, and 455's small prime factors (5 × 7 × 13) make a divider easy to construct.
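
A quick informal check of those numbers, and of what the small factors buy you (the divide-by-455 chain here is just the arrangement those factors suggest, not a circuit from the standard):

--- Code: ---
# 227.5 x the original line rate lands near the desired 3.6 MHz.
fh_bw = 15_750.0
print(227.5 * fh_bw)        # 3,583,125 Hz

# 455 factors into small primes, so a divide-by-455 counter chain is easy to build.
assert 5 * 7 * 13 == 455
# Since f_sc = (455/2) * fh, dividing the subcarrier by 455 yields half the line rate:
print(227.5 * fh_bw / 455)  # 7,875 Hz = fh_bw / 2
--- End code ---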

However, additional interference could come from the audio signal. To minimize interference there, it was similarly desirable to make the distance between the chrominance carrier frequency and the audio carrier frequency a half-integer multiple of the line rate. The sum of these two half-integers implies that the distance between the frequency of the luminance carrier and audio carrier must be an integer multiple of the line rate. However, the original NTSC standard, with a 4.5 MHz carrier spacing and a 15750 Hz line rate, did not meet this requirement: the audio was 285.714 times the line rate.
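
Spelling that out informally in Python (the 227.5 and 58.5 spacings used here are the ones the colour standard ends up with):

--- Code: ---
# The original B&W numbers fail the "integer multiple" condition:
print(4_500_000 / 15_750)       # 285.714... -- not an integer

# If chroma sits (a + 0.5) line-rates above luma, and audio sits (b + 0.5)
# line-rates above chroma, then audio sits (a + b + 1) line-rates above luma,
# which is an integer.
a_half, b_half = 227.5, 58.5
print(a_half + b_half)          # 286.0 -- an integer multiple of the line rate
--- End code ---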

While existing black and white receivers could not decode a signal with a different audio carrier frequency, they could easily use the copious timing information in the video signal to decode a slightly slower line rate. Thus, the new color television standard reduced the line rate by a factor of 1.001 to 1/286 of the 4.5 MHz audio subcarrier frequency, or about 15734.2657 Hz. This reduced the frame rate to 30/1.001 ≈ 29.9700 Hz, and placed the color subcarrier at 227.5/286 = 455/572 = 35/44 of the 4.5 MHz audio subcarrier.
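
And a quick informal check that the final numbers hang together:

--- Code: ---
f_audio = 4_500_000.0           # audio carrier offset, Hz (kept unchanged)

fh    = f_audio / 286           # new colour line rate
frame = fh / 525                # frame rate
f_sc  = 227.5 * fh              # chroma subcarrier

print(fh)                       # 15734.2657... Hz
print(frame)                    # 29.9700... Hz
print(f_sc)                     # 3579545.4545... Hz
print(f_audio * 35 / 44)        # same number, via the reduced 35/44 fraction
--- End code ---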

Ben321:

--- Quote from: james_s on November 30, 2022, 11:15:08 pm ---Because one of the primary design goals was to make it compatible with existing B&W equipment.

Wikipedia actually has a reasonably detailed explanation.

--- End quote ---

Also there's still this issue, though.
The chosen value isn't actually the ideal chroma carrier frequency for a 60Hz power line, even though keeping it mathematically related to the power line frequency was theoretically the intent. The old black&white TV vertical scanning frequency was exactly 60 fields per second, the same as the power line frequency, but with the chroma carrier frequency they chose, the scanning frequency now needs to be 59.94 fields per second. Why not just select a chroma carrier whose frequency would work with the existing power line frequency? Using the same math they used to derive the vertical scanning frequency from the chroma carrier frequency, I did the reverse and found the ideal chroma carrier frequency for a 60 fields per second vertical scanning rate: 3.583125MHz. That would also allow for a nicer horizontal scanning frequency of 15750 lines per second (the same as used on black&white TV) instead of the weird 15734.26 lines per second.
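
For what it's worth, here's that reverse calculation spelled out in Python (my own sketch; it also shows the trade-off it would have forced):

--- Code: ---
# Keep the original B&W scan rates and put the subcarrier at the same
# half-integer multiple (227.5x) of the line rate.
fields = 60.0                       # fields per second, locked to the power line
fh_bw  = fields / 2 * 525           # 15,750 lines per second
print(227.5 * fh_bw)                # 3,583,125 Hz = 3.583125 MHz

# Trade-off: 4.5 MHz is not an integer multiple of 15,750 Hz, so the
# audio-interference condition described earlier is not satisfied.
print(4_500_000 / fh_bw)            # 285.714...
--- End code ---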

Yet there must be some technical reason they chose the exact chroma carrier frequency they did. Is it easier to make quartz crystal oscillators at that exact frequency or something?

james_s:
The technical reason is detailed in the text I posted. The power line frequency is irrelevant; maybe in the earliest days they tried to sync the vertical rate to the mains, but that has not been done in recent history.

coppice:

--- Quote from: Ben321 on November 30, 2022, 11:10:31 pm ---Instead of something simple like 3.5MHz, it's supposed to be exactly 3.579545MHz and accurate to within 3PPM (parts per million), which at this frequency corresponds to a frequency error of no more than 10.738635Hz. So a more reasonable frequency like 3.5MHz is well outside the specification. Why did the inventors of NTSC decide on such an arbitrary frequency of 3.579545MHz for the chroma carrier frequency?

--- End quote ---
If you see a funky frequency in use there are 2 explanations that cover almost every case:

* Parts for that frequency were in high volume use, and very cheap
* It's maths. There is some pattern in the operations that leads naturally to the chosen frequency.

Your mistake is using terms like "reasonable" and "arbitrary". They blind you to reasonable thinking, and make you write arbitrary rubbish. The first possibility above is clearly backwards: the mass availability of cheap 3.579545MHz crystals means a lot of later things chose to use that frequency, purely for cheapness. However, those crystals were not widely available when the NTSC spec was written, so it has to be a pattern in the maths. Others have explained the actual pattern.

