There's a lot of good information here and some that is just nonsense. As a career broadcast engineer I saw everything from the conversion from B&W to color TV through the conversion to digital. Here is my attempt to separate the facts from the BS.
B&W TV came first and was based on five numbers: 30 Hz Frame Rate, 60 Hz Field Rate, 15750 Hz Horizontal Rate, 525 Horizontal Lines per Frame, and 2 Fields per Frame. Those numbers are all exactly compatible.
15750 lines per second / 30 frames per second = 525 lines per frame
15750 lines per second / 60 fields per second = 262.5 lines per field
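If you want to check that arithmetic yourself, here is a trivial sketch (plain Python, just my own illustration):

```
# The five B&W numbers, checked against each other.
H_RATE = 15750        # horizontal lines per second
FRAME_RATE = 30       # frames per second
FIELD_RATE = 60       # fields per second

print(H_RATE / FRAME_RATE)      # 525.0  lines per frame
print(H_RATE / FIELD_RATE)      # 262.5  lines per field
print(FIELD_RATE / FRAME_RATE)  # 2.0    fields per frame
```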
There were two fields per frame and, strangely, they were called Frame 1 and Frame 2.
The half-line fraction in the field count is what provided the interlace: line 262.5, which started Frame 2, in theory began halfway across the screen and fell between lines 1 and 2 of Frame 1. This was done in B&W TV to minimize the appearance of flicker. Two half images, superimposed (interlaced) over each other, formed one complete image. So the flicker rate was 60 Hz but the information rate was only 30 Hz.
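To make the interleaving concrete, here is a rough sketch of how two 262.5-line fields mesh into one 525-line frame. The numbering here is simplified, not the broadcast spec's exact line numbering, which included blanking:

```
# Simplified: odd display lines come from one field, even lines from
# the other. 263 + 262 = 525, which is where the half line comes from.
LINES = 525
field_a = list(range(1, LINES + 1, 2))   # lines 1, 3, 5, ...
field_b = list(range(2, LINES + 1, 2))   # lines 2, 4, 6, ...

print(len(field_a), len(field_b))        # 263 262
```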
Since B&W TV was well established before color came along, a major goal was compatibility. The old B&W TVs had to receive and display, in black and white of course, the new color broadcasts. And the new color TVs had to do likewise for B&W broadcasts. So no major changes could be made to the five numbers I gave above. But TVs in those days used TUBE circuitry. I remember the TV serviceman coming to our house with a big case of TUBES, and most of the problems were fixed in ten or fifteen minutes with one or more new tubes. These tube circuits had loose tolerances, and things like the horizontal and vertical frequency controls often had to be adjusted; they were up front on most TVs of that day. So the older B&W and even the new color TVs would tolerate some small changes in those numbers.
James and his reference to Wikipedia explain the why of the 3.579545 MHz color subcarrier. It was chosen so that the energy of that subcarrier fell between the harmonics of the horizontal rate. But a small adjustment had to be made to the horizontal rate to keep all the numbers compatible with each other. Please see the Wikipedia explanation. So they adjusted the color horizontal rate to be about 15734.26 Hz. And since there were still exactly 525 horizontal lines per frame, the frame rate also had to come down slightly.
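The relationships are easy to verify: the subcarrier was defined as exactly 315/88 MHz, which is 455/2 times the new horizontal rate, and everything else follows by the same fixed ratios (another quick sketch of my own):

```
# NTSC color timing, derived the way the standard defined it.
F_SC = 315e6 / 88        # 3,579,545.45... Hz color subcarrier (exact)
F_H  = F_SC * 2 / 455    # 15,734.265... Hz horizontal rate
F_V  = F_H * 2 / 525     # 59.94005... Hz field rate
F_F  = F_H / 525         # 29.97002... Hz frame rate

print(F_SC, F_H, F_V, F_F)
```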
All the talk about hum bars in the video may have been a factor in some B&W TVs, but I watched a lot of B&W TV and, while possible, it was not a major factor. For the most part it probably meant that the TV needed something beyond a new tube or two, so it went to the shop for repairs. In other words, the design engineers at RCA and the other TV equipment OEMs knew it could happen, but they chose to accept that possibility. It would probably sell more new color TVs.
As for TV stations using the 60 Hz power line frequency for synchronization, that was never done in any facility I worked in. While the long-term accuracy of the power line frequency was and still is used to synchronize household and other clocks, in the short term (minutes and hours in duration) it could drift quite a bit as loads were added and shed by all the power customers across the country. The utilities lost time in the daytime hours and sped their generators up to regain it at night. That was simply not good enough to keep a TV station, even a B&W station, within legal limits, so it was never used. And the only facilities that had rubidium standards were the three/four networks. I guess they used them for synchronization between their major facilities. But they were expensive, so they were not widely used.
TV stations had what were called sync generators, which had a crystal oscillator, usually in a temperature-controlled oven, and dividers that generated all the lower frequency signals from the oscillator's frequency. Most stations had at least two, with an automatic changeover switch for reliability. I had a bad sync generator once in a B&W station and almost nothing else worked properly while it was bad. The sync generator was the heartbeat of the station.
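In color days the whole chain typically hung off one crystal; a common choice was four times subcarrier. The exact divider hardware varied from model to model, but the idea was always fixed ratios off one oscillator, something like this sketch:

```
# One master crystal, everything else by fixed ratios, so nothing
# inside the station could drift relative to anything else.
MASTER = 4 * (315e6 / 88)   # 14,318,181.8... Hz, a common 4x-SC crystal

f_sc = MASTER / 4           # color subcarrier
f_h  = f_sc * 2 / 455       # horizontal rate
f_v  = f_h * 2 / 525        # field (vertical) rate

print(f_sc, f_h, f_v)
```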
All the sync generators had an external input for a signal that could be used to lock the generator's oscillator, and all the additional signals it produced, into step with any external video signal that was provided. Of course, when this happened there was a very noticeable jump or roll in the video.
When either a regular TV station or a "remote truck" was going to send a live video feed to a network center, it was normally done by synchronizing that station's or remote truck's second sync generator to the network signal coming from the network's New York or LA facility. Then during a station break, but not a commercial, and BEFORE the feed to the network center was to begin, that second sync generator was switched to be the primary one "on line". The "on air" picture would jump, but not in a program or commercial. The station or remote truck was then synchronized with the network control center, and its feed was synchronized so the network center could do things like switches or fades or other video effects with no noticeable problems.
Shortly before the network center was going to use that video feed, the external signal at the station or remote truck would be disconnected, allowing its sync generator to freewheel. But the network would synchronize its generator to the incoming feed from that facility. The station or remote truck was the source of all timing for the duration. It was a timing dance between the network and the remote facility, and sometimes it went wrong. Then something like a major league ball game would end up in a feedback loop with the network center, with both trying to lock to the other, and the pictures were totally scrambled. That was how it was done before digital synchronizers became affordable.
As to the use of sync generators in TV stations being modern or just today, that is total BS. My first sync generator, at a station in the 1960s, was a tube model. It was a nightmare to keep running properly. But it did the job. We replaced it with a transistor-based one a few months later. The tolerance on the color SC (the 3.579545 MHz oscillator) was +/- 10 Hz. We often kept it within +/- 3 Hz, and the latest oscillators, which are in sync generators for digital stations, can do better than +/- 1 Hz at that frequency by using higher frequency oscillators and dividing down. And yes, some TV stations do still use the older signals from their sync generators.
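For perspective, here are those tolerances expressed as fractional accuracy of the oscillator (my own arithmetic):

```
F_SC = 3_579_545            # Hz, nominal color subcarrier
for tol in (10, 3, 1):
    print(f"+/- {tol} Hz is about {tol / F_SC * 1e6:.2f} ppm")
# +/- 10 Hz ~ 2.79 ppm, +/- 3 Hz ~ 0.84 ppm, +/- 1 Hz ~ 0.28 ppm
```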
Yes, the change from the "exact" 60 Hz field rate to 59.94 Hz did produce problems for video editing. But, for the most part, those problems only surfaced long after color TV was firmly established. They came to the fore when digital editing systems were developed. Before that, all video editing was done on video tape recorders, which had either mechanical timers or simple digital readouts that worked from those mechanical timers. Time code recorded on the tape itself came along somewhere between those mechanical timers and the first digital editors, and the editors just knew that adjustments were needed. In my last job, an editing facility, we had to be precise, and the digital editing systems we used did take this into account. There were two types of time code in use then. Drop frame time code omitted two frame counts at the start of most minutes to keep the time code in sync with a real-world clock. And non-drop-frame did not omit those counts, so the time code was roughly two seconds off by the end of a half hour program. We had calculators for this when needed. I wrote one using Excel.
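Since I mentioned that Excel calculator, here is the same idea as a short Python sketch. It follows the standard drop-frame rule (skip counts 00 and 01 at the start of every minute except multiples of ten); the function name and layout are just mine:

```
def frames_to_dropframe(frame_count: int) -> str:
    """Raw 29.97 fps frame count -> drop-frame time code (hh:mm:ss;ff)."""
    FP_MIN = 1798      # frames in a dropped minute (1800 - 2)
    FP_10MIN = 17982   # frames in 10 minutes (1800 + 9 * 1798)
    blocks, rem = divmod(frame_count, FP_10MIN)
    drops = 0 if rem < 1800 else 2 * (1 + (rem - 1800) // FP_MIN)
    n = frame_count + 18 * blocks + drops   # re-add the skipped counts
    return f"{n // 108000:02d}:{n // 1800 % 60:02d}:{n // 30 % 60:02d};{n % 30:02d}"

# A half hour of real time is 1800 s * (30000/1001) fps ~ 53946 frames:
print(frames_to_dropframe(53946))   # 00:30:00;00 -- stays on the clock
```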
Oh well, this is long enough.