Author Topic: How were color bars broadcast on TV?  (Read 592 times)


Offline Ben321

  • Frequent Contributor
  • **
  • Posts: 743
How were color bars broadcast on TV?
« on: October 28, 2021, 05:17:35 am »
I know that some stations would broadcast the SMPTE color bars or other similar color bar patterns when they weren't broadcasting content. I also know there are two types of color bar patterns: 75% intensity and 100% intensity. With 75% intensity, the maximum excursion of the chroma carrier for the yellow bar is 100 IRE, but the white bar is actually at approximately 77 IRE (75*0.925 + 7.5 = 76.875). The other standard color bars are the 100% intensity colors, which put the white bar at 100 IRE, but the maximum excursion of the chroma carrier for the yellow bar is then approximately 131 IRE. However, I think this could pose a problem for transmitters. The NTSC standard treats 120 IRE as full modulation, and because transmitters use negative (inverted) AM modulation, at full modulation the carrier stops transmitting entirely. Above full modulation, the carrier's phase is actually inverted. So feeding a 131 IRE signal level for a full-intensity yellow bar into a transmitter that reaches full modulation at only 120 IRE would cause a problem.

When a phase-inverted signal reaches the AM demodulator in the TV receiver, it produces various artifacts in the demodulated output. How is this situation resolved when a station transmits a full-intensity color bar pattern? Or do they simply never transmit a color bar pattern above 75% intensity over the air, and use the 100% intensity color bars only for calibrating equipment connected directly by wire in the studio?
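
To show where those numbers come from, here is the arithmetic as a quick Python sketch (my own back-of-the-envelope calculation, assuming the standard NTSC luma weights, 7.5 IRE setup, and a 92.5 IRE span from black to reference white):

Code:
import math

def bar_levels(r, g, b):
    # NTSC luma and scaled color-difference components
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    luma_ire = 7.5 + 92.5 * y               # luma pedestal in IRE (7.5 IRE setup)
    chroma_ire = 92.5 * math.hypot(u, v)    # peak subcarrier amplitude in IRE
    return luma_ire, luma_ire + chroma_ire  # pedestal, positive chroma excursion

print(bar_levels(0.75, 0.75, 0.0))   # 75% yellow  -> (~69.0, ~100.0 IRE)
print(bar_levels(1.00, 1.00, 0.0))   # 100% yellow -> (~89.5, ~130.8 IRE)
print(bar_levels(0.75, 0.75, 0.75))  # 75% white   -> (76.875, 76.875 IRE)

So the 75% yellow bar tops out right at 100 IRE, while 100% yellow peaks at about 131 IRE, which is where my concern about the 120 IRE full-modulation point comes from.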
« Last Edit: October 28, 2021, 05:19:31 am by Ben321 »
 

Offline mansaxel

  • Super Contributor
  • ***
  • Posts: 3203
  • Country: se
  • SA0XLR
    • My very static home page
Re: How were color bars broadcast on TV?
« Reply #1 on: October 28, 2021, 06:14:27 am »
The 75% bars are the only ones used in production and transmission as far as I can tell. At least here in what-used-to-be-PAL-land this is now 98% archaeology, since for the last 20 years the production chain has been SDI (or compressed and carried over IP, to be decoded back to SDI later) and transmission is DVB-T / DVB-T2.

These days transmission is tested using Wireshark for protocols like SMPTE ST2110. Yes, our instrument has bars, but most of the work is done using the timing and packet stream analysis tools.

"All broadcast engineers are closet time-nuts"
            (Another forum member to me, in privmsg)

Offline Ben321

  • Frequent Contributor
  • **
  • Posts: 743
Re: How were color bars broadcast on TV?
« Reply #2 on: October 28, 2021, 07:44:45 am »
The 75% bars are the only ones used in production and transmission as far as I can tell.
By production do you mean making video intended to be transmitted? Or do you mean calibrating studio monitors?

At least here in what-used-to-be-PAL-land this is now 98% archaeology, since for the last 20 years the production chain has been SDI (or compressed and carried over IP, to be decoded back to SDI later) and transmission is DVB-T / DVB-T2.
I don't know much about European technology, as I live in the US, and my question was about NTSC (even the spec about 120 IRE being full modulation is an NTSC standard, so I don't think info about the PAL system will help me much here).

These days transmission is tested using Wireshark for protocols like SMPTE ST2110. Yes, our instrument has bars, but most of the work is done using the timing and packet stream analysis tools.

"All broadcast engineers are closet time-nuts"
            (Another forum member to me, in privmsg)

Wireshark, really? I thought that was a network protocol analyzer for internet and LAN type networks (like 802.11 wireless and wired Ethernet). I didn't know it could analyze protocols that weren't designed specifically for computer networking.

Also, I'm surprised you even need bars in digital video. Digital video makes sure the output is a bit-for-bit accurate reproduction of the input, so there's no possibility of a wrong chroma-carrier phase or the other issues that make the colors come out wrong in analog video. Digital video means all the colors are precisely accurate.
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 6532
  • Country: au
Re: How were color bars broadcast on TV?
« Reply #3 on: October 28, 2021, 08:09:30 am »
I know that some stations would broadcast the SMPTE color bars or other similar color bar patterns when they weren't broadcasting content. I also know there are two types of color bar patterns: 75% intensity and 100% intensity. With 75% intensity, the maximum excursion of the chroma carrier for the yellow bar is 100 IRE, but the white bar is actually at approximately 77 IRE (75*0.925 + 7.5 = 76.875). The other standard color bars are the 100% intensity colors, which put the white bar at 100 IRE, but the maximum excursion of the chroma carrier for the yellow bar is then approximately 131 IRE. However, I think this could pose a problem for transmitters. The NTSC standard treats 120 IRE as full modulation, and because transmitters use negative (inverted) AM modulation, at full modulation the carrier stops transmitting entirely. Above full modulation, the carrier's phase is actually inverted. So feeding a 131 IRE signal level for a full-intensity yellow bar into a transmitter that reaches full modulation at only 120 IRE would cause a problem.


When a phase-inverted signal reaches the AM demodulator in the TV receiver, it produces various artifacts in the demodulated output. How is this situation resolved when a station transmits a full-intensity color bar pattern? Or do they simply never transmit a color bar pattern above 75% intensity over the air, and use the 100% intensity color bars only for calibrating equipment connected directly by wire in the studio?

Analog TV transmitters do not "cut carrier" under any circumstances.
Even if you did hit it with excessive video modulation levels, the inbuilt circuitry would clip the signal before that happened.
 

Offline mansaxel

  • Super Contributor
  • ***
  • Posts: 3203
  • Country: se
  • SA0XLR
    • My very static home page
Re: How were color bars broadcast on TV?
« Reply #4 on: October 28, 2021, 11:54:40 am »
The 75% bars are the only ones used in production and transmission as far as I can tell.
By production do you mean making video intended to be transmitted? Or do you mean calibrating studio monitors?

Production of TV.

At least here in what-used-to-be-PAL-land this is now 98% archaeology, since for the last 20 years the production chain has been SDI (or compressed and carried over IP, to be decoded back to SDI later) and transmission is DVB-T / DVB-T2.
I don't know much about European technology, as I live in the US, and my question was about NTSC (even the spec about 120 IRE being full modulation is an NTSC standard, so I don't think info about the PAL system will help me much here).

As I wrote, this can only be of historical interest now. It's SDI (SD, HD or 3G) all the way unless it's gone ST2110 or NDI. The signal is analog only up to the CCD in the camera; the rest is bits and bytes until it's decoded and displayed on people's TVs. Except for the poor sods who have analog cable.

We recently did some channel and encoding reshuffles here, turning off a lot of the DVB-T network and moving things to DVB-T2 instead. Viewers on small cable networks that relied on central decoding into analog channels were hit in a lot of places, because these back-asswards systems couldn't cope.

These days transmission is tested using Wireshark for protocols like SMPTE ST2110. Yes, our instrument has bars, but most of the work is done using the timing and packet stream analysis tools.

"All broadcast engineers are closet time-nuts"
            (Another forum member to me, in privmsg)

Wireshark, really? I thought that was a network protocol analyzer for internet and LAN type networks (like 802.11 wireless and wired Ethernet). I didn't know it could analyze protocols that weren't designed specifically for computer networking.

Modern TV production networks are just that, networks. Video is encoded into multicast RTP streams in the camera controller and consumers just siphon off the multicast. Mind you, this does take some network; a 1080i stream is ~1500 Mbit/s. Wireshark can drink from that packet hydrant just fine, but there's some processing and sorting required to make sense of it.
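
Not that anyone would do real analysis this way, but just to illustrate the kind of packet-level fields we end up staring at, here's a minimal Python sketch that watches RTP sequence numbers on a multicast stream (the group address and port below are made-up examples; ST2110 video rides on ordinary RTP):

Code:
import socket
import struct

# Made-up example group/port; the real values come from the flow's SDP/configuration.
MCAST_GRP, PORT = "239.0.0.1", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", PORT))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY))

last_seq = None
while True:
    payload, _ = sock.recvfrom(2048)
    # RTP fixed header: V/P/X/CC byte, M/PT byte, 16-bit sequence, 32-bit timestamp
    _, _, seq, ts = struct.unpack("!BBHI", payload[:8])
    if last_seq is not None and seq != (last_seq + 1) & 0xFFFF:
        print(f"sequence gap: expected {(last_seq + 1) & 0xFFFF}, got {seq} (ts={ts})")
    last_seq = seq

Wireshark does this same decoding and a great deal more; the hard part is capture hardware and storage that can keep up with the bit rate.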

Also, I'm surprised you even need bars in digital video. Digital video makes sure the output is a bit-for-bit accurate reproduction of the input, so there's no possibility of a wrong chroma-carrier phase or the other issues that make the colors come out wrong in analog video. Digital video means all the colors are precisely accurate.

Yes, the colours are reliably as weird as they were in the studio. When I entered the field of broadcast engineering 23 years ago, the entry prize was a Spectrol trimmer to put in your pocket protector, for EQing distribution amps. These days, it's a laptop with a web browser and an SSH client.

Offline EPAIII

  • Regular Contributor
  • *
  • Posts: 172
  • Country: us
Re: How were color bars broadcast on TV?
« Reply #5 on: October 30, 2021, 08:44:53 am »
In over 45 years of TV engineering I never used 100% color bars. The only color bar signal I would ever have sent to the transmitter was the 75% version, and even that only on rare occasions. For the most part, other test signals (such as the Indian-head pattern) with a station ID were used much more frequently. Those signals told us more about the performance of the transmitter than color bars ever could.

And even if 100% bars were sent to the transmitter, there was almost always a clipping circuit that would cut them off at about 101 on the IRE scale. 100 IRE was the maximum amplitude of the 75% color bars, so that was only 1% higher. An AGC video amplifier usually preceded that clipping circuit to control normal video signals.

Because the video signal was transmitted "upside down" (sync was the maximum RF level and white the lowest, so the black areas of the picture, where noise is most noticeable, sat at a fairly high carrier level), a video signal at 120 IRE would cut the RF carrier off completely and cause interference with the audio and adjacent channels. The 100% color bars would exceed even that 120 IRE level and cause severe interference. That could never be allowed to happen.
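
To put rough numbers on that, here is the relationship as a simple sketch (using the nominal NTSC depth-of-modulation points: sync tip at 100% carrier, blanking at 75%, reference white at 12.5%):

Code:
# Nominal NTSC negative modulation: carrier amplitude falls linearly with video level.
# Blanking (0 IRE) -> 75% carrier, reference white (100 IRE) -> 12.5% carrier,
# a slope of -0.625% of carrier per IRE, so the carrier reaches zero at 120 IRE.
def carrier_fraction(ire):
    return 0.75 - 0.00625 * ire

for ire in (-40, 0, 100, 120, 131):
    print(f"{ire:4d} IRE -> {100 * carrier_fraction(ire):6.1f}% carrier")

# -40 IRE (sync tip)  -> 100.0%
#    0 IRE (blanking) ->  75.0%
#  100 IRE (white)    ->  12.5%
#  120 IRE            ->   0.0%  (carrier cut off)
#  131 IRE            ->  -6.9%  (phase-inverted, hence the clipping)

Which is why video heading to the transmitter was clipped well before it could get anywhere near that point.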

I am sure there must have been a legitimate use for 100% color bars, but I never ran across it. Perhaps it was used to set up the color encoders in the early color studio cameras. The VERY early color cameras.
 

Offline EPAIII

  • Regular Contributor
  • *
  • Posts: 172
  • Country: us
Re: How were color bars broadcast on TV?
« Reply #6 on: October 30, 2021, 09:00:25 am »
The color bar test signal had a number of uses but these were primarily in the studio equipment, not at the transmitter.

The widest and perhaps most known use was as a standard reference level that was placed at the head of every video tape recording. About one minute of bars and audio tone was the first thing on virtually every reel of video tape. The color bar signal provided:

1. A luminance or black and white level reference.
2. A color subcarrier level reference.
3. A sync level reference.
4. A means of adjusting parameters like phase and amplitude errors that were luminance level dependent (aka: differential gain and phase) in video tape recorders.

These things were checked and adjusted prior to playing every video tape. This was a constant routine for the engineer working in the tape room during broadcast hours.

But there were many other uses for this signal. It was the primary signal used to perform maintenance setups on the encoders which produced the composite video in cameras and other items of equipment. It was also a quick way of checking and adjusting the performance of video amplifiers, which were everywhere. And it was used extensively in maintenance adjustments and troubleshooting of video tape recorders. In the TV studio it was the most used test signal.

It is not useless in a digital facility, but is not used as often as other test signals that can tell more about the digital signals and their transmission.


The 75% bars are the only ones used in production and transmission as far as I can tell. At least here in what-used-to-be-PAL-land this is now 98% archaeology, since for the last 20 years the production chain has been SDI (or compressed and carried over IP, to be decoded back to SDI later) and transmission is DVB-T / DVB-T2.

These days transmission is tested using Wireshark for protocols like SMPTE ST2110. Yes, our instrument has bars, but most of the work is done using the timing and packet stream analysis tools.

"All broadcast engineers are closet time-nuts"
            (Another forum member to me, in privmsg)
 

Offline mansaxel

  • Super Contributor
  • ***
  • Posts: 3203
  • Country: se
  • SA0XLR
    • My very static home page
Re: How were color bars broadcast on TV?
« Reply #7 on: October 30, 2021, 09:16:13 am »

It is not useless in a digital facility, but is not used as often as other test signals that can tell more about the digital signals and their transmission.


Indeed. Yesterday I spent some time hunting for a transmission error that turned out to be PEBKAC at the remote end; a helmet camera connected to a cellular encoder/modem setup showed signs of being connected, but wasn't transmitting any video. We did get bars through after some time, but the real litmus test, of course, was seeing a moving picture, confirming we had signal from the source with negligible losses.

