| Why does NTSC use a strange frequency for the chroma carrier? |
| dmills:
My feeling is that we forget how quickly the consumer electronics industry was moving back in the post-war years. I have no trouble believing that locking the hum bar was a real advantage in the early days of black and white, and was unimportant (even for most black and white sets) by the time colour arrived. The UK switched from 405-line VHF to 625-line PAL UHF at around this time, so a lot of the older sets would have become obsolete relatively quickly even while the VHF transmitters were still operating, simply because people wanted colour, and the higher resolution made a difference even if you could only afford a newer black and white set.

As late as five years ago the BBC were still taking a mains hookup from a nearby house to sync their generators so that the generated power frequency would track the grid. It may well have been about flicker reduction on the camera side when shooting something in a street with LED streetlighting that had some flicker (or the once-common domestic CFL that often had horrific flicker). Now that is 50 fields per second with no weird offset, and the UK has one grid with no internal DC links or such, so the whole thing tracks reasonably closely absent transient events. It might just be something carried over as a 'best practise' thing from the just-post-war era, or as I say they might be shooting 50 Hz and doing it for flicker reduction. I have seen this done when shooting features for later broadcast, where global house sync is not a thing; it is obviously unworkable if you are chasing GPS for a live broadcast.

House sync is still very much a thing, Tek/Telestream being the usual suspects in an SDI plant, with Meinberg or sometimes Evertz doing the job for a baseband IP facility (there are alternative vendors for both, but probably 90% of the time these are what you will see, SPG8000 and such). These days they chase GPS time as a global reference, so as long as you have punched in the appropriate link delay offset you can have local video and remote contribution arrive at the switcher with at worst a few lines of offset. Things are MUCH easier when you have a global time reference that works with negligible offset. A fun one is that while classic genlock has a delay equal to sync gen -> camera -> router, something like 2110 using PTP measures and compensates the delay from sync gen -> camera, and separately the delay from sync gen -> router, so the amount of cable delay compensation is different... Had more than one headache explaining that one.

DF timecode became MUCH more of a pain as the audio workflow and the video side drifted apart, once the DAW became much more of a go-to weapon for audio while video was still mostly on tape.

Yea, broadcast engineers are usually time nuts, comes with the territory. |
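(Aside on the DF timecode mentioned above, for anyone who has not fought with it: NTSC runs at 30 × 1000/1001 ≈ 29.97 frames per second, so timecode that labels frames at a nominal 30 fps falls behind wall-clock time by roughly 3.6 seconds per hour. Drop-frame timecode skips the labels ;00 and ;01 at the start of every minute except each tenth minute, 108 labels per hour, which removes almost all of that drift. Below is a minimal Python sketch of the usual frame-count-to-label conversion; the function name and layout are mine, not taken from any particular broadcast library.)

```python
# Sketch: convert a frame count (frames since midnight at 29.97 fps) into an
# NTSC drop-frame timecode label.  The rule: labels ;00 and ;01 are skipped
# at the start of every minute except minutes 0, 10, 20, ... of each hour.
def frames_to_dropframe(frame_count: int) -> str:
    frames_per_minute = 30 * 60 - 2                      # 1798 frames in a "drop" minute
    frames_per_10min = 9 * frames_per_minute + 30 * 60   # 17982 real frames per 10 minutes
    d, m = divmod(frame_count, frames_per_10min)
    if m < 2:
        # First two frames of a tenth minute: no labels dropped yet in this block.
        frame_count += 18 * d
    else:
        frame_count += 18 * d + 2 * ((m - 2) // frames_per_minute)
    ff = frame_count % 30
    ss = (frame_count // 30) % 60
    mm = (frame_count // (30 * 60)) % 60
    hh = (frame_count // (30 * 3600)) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"        # ';' marks drop-frame

# The first frame of minute 1 is labelled ;02 because ;00 and ;01 are dropped;
# tenth minutes do not drop at all.
print(frames_to_dropframe(1799))    # 00:00:59;29
print(frames_to_dropframe(1800))    # 00:01:00;02
print(frames_to_dropframe(17982))   # 00:10:00;00
```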
| schmitt trigger:
Lengthy but very interesting information, EPAIII. It reminded me of my time as an intern at a local TV station during the summer of 1977. I distinctly remember the sync generator cabinet, full of tubes and the size of a fridge. It was being replaced with a transistorized unit, and I was there helping cut and prepare dozens of cables. A couple of days before we were to turn the unit “live”, the anxiety was palpable. When it finally went live, in the wee hours of a Sunday morning, everyone cheered. My young, ignorant mind could not comprehend why all the fuss. But as your explanation eloquently shows, without a sync generator you didn’t have a TV station. Period. |
| Ben321:
--- Quote from: schmitt trigger on December 04, 2022, 02:59:07 am ---without a sync generator you didn’t have a TV station. Period. --- End quote --- Not entirely true. Having a single sync generator for your station is good for avoiding glitches when switching between cameras. Without it, each camera would need its own internal sync pulse generator, and when switching cameras the sync timing would jump to that of the next camera. The result is that a TV receiver would momentarily lose sync and the picture would flash as it locked onto the new camera's sync signal. This might be annoying to TV watchers, but it wouldn't prevent the broadcast from being received. Remember, the sync pulses are detected by circuits in each TV receiver. So you don't actually need the national power grid to act as a master clock for the entire country's TV system, nor does your TV station need a locally generated master clock (it only needs one to avoid this camera-switching visual artifact). There is nothing technically impossible about a TV station operating with each camera containing its own internal sync pulse generator. |
| NiHaoMike:
--- Quote from: Ben321 on December 29, 2022, 09:24:38 pm ---Not entirely true. Having a single sync generator for your station is good for avoiding glitches when switching between cameras. [...] --- End quote --- Fran did a video on a '60s-era video editing machine. |
| EPAIII:
--- Quote from: Ben321 on December 29, 2022, 09:24:38 pm ---Not entirely true. Having a single sync generator for your station is good for avoiding glitches when switching between cameras. [...] There is nothing technically impossible about a TV station operating with each camera containing its own internal sync pulse generator. --- End quote ---

Oh yea? Try explaining that to any of the chief engineers I worked for, or to me on the occasions when I held that job. Let's see now, without ONE sync generator running the entire station:

Every time the station went from a program to a commercial, from one commercial to another, from a commercial to a station ID, or from any of the above back to a program, the picture would roll on all those TV sets. And each and every sponsor who had PAID for those commercials would refuse to pay. As a side note, back in the day before video tape recorders, sponsors played games with the TV stations. They deliberately made the commercials so difficult to perform on a live basis that the announcer or other talent involved often made mistakes. And then the sponsor demanded what was called a "make good": a free airing of the same commercial. A two-for-the-price-of-one bargain. Oh, and THAT is the real reason why the local stations purchased video tape recorders. They recorded the commercials to be sure they didn't need to give away free air time. And the sync generator kept those video tape recorders in step with the programs, so no rolling picture and no free air time.

If the station had a live or recorded studio program with more than one camera (they all did), then the picture would roll whenever a switch was made from one camera to another. That would have been a horrible distraction, and it is one that I doubt anyone here can recall ever seeing. If even the simplest text message was to be placed in either a live or a recorded program, then the program's source and the text's source had to be synchronized. There's that pesky old SINGLE sync generator again. And don't even get me started on the news programs. Those people could drive you to the insane asylum if even the tiniest thing went wrong.

No, I am sorry, but my TV career started in the 1960s, and even long before that point a TV station without a sync generator was totally unthinkable. And probably cause for losing your job if it failed. That's why there were always at least two, with an automatic changeover switch. Some stations even had three. But ONLY ONE was in use at any given time.

And I don't know where this power line business came from. The sync generators in the individual stations had their own master oscillators. The only nationwide clock was the network the station belonged to, and stations only synchronized to the network's signal when necessary. The power line, then and now, drifted far too much to ever use even for just a black and white facility. At least from the 1960s to the present day, the power line frequency was NEVER used by TV stations. Or the networks. |