
The 29.97 frames per NTSC standard

  1. Sep 4, 2009 #1
    I have read before that when color television was developed the frame rate was changed from 30 frames per second to 29.97 frames per second to avoid some kind of problem with the circuitry, but I have never seen an explanation of just what this technical problem was. My somewhat educated guess is that maybe it had something to do with not wanting to start a new screen refresh cycle exactly at the same time the power supply was charging, to prevent glitches getting into the chroma demodulator/amplifier system? Or something?
  3. Sep 4, 2009 #2


    Science Advisor

    Prior to color TV, the frame rate was, indeed, 30 fps. When color came along, they needed a small amount of additional information in order to render it, but the bandwidth was already fixed by channel allocations--adding more information to a signal necessarily increases its bandwidth. Their solution was to dial back the frame rate ever so slightly to lower the bandwidth of the existing signal by an amount that would leave room for the chroma signal. That amount turned out to be about .03 fps.
  4. Sep 4, 2009 #3
    Here is why: Originally, the vertical scanning rate was 30 hertz (interlaced, so it was 60 fields per second, 30 frames per second) and the horizontal scanning rate was 15750 hertz. Notice that 15750 is divisible by 30. When the black and white video signal modulates a carrier wave, we get sidebands every 15750 hertz out away from the carrier. Most of the information is close to the carrier, but there is still some information we cannot tolerate losing that is several MHz away from the carrier.

    The scheme with NTSC color was to put a subcarrier for color several MHz out away from the main carrier. They picked a subcarrier frequency of 3.579545 MHz. Then they moved the horizontal scanning frequency to 15734 hertz and the vertical scanning frequency to 59.94 hertz (the frame rate to 29.97 hertz). Why? All these frequencies are phase related. Black and white information still falls in sidebands out away from the carrier as it always did. Color information is found in sidebands on either side of the subcarrier, every 15734 hertz.

    Here is the genius of the whole thing: The color sidebands fall in between the black and white sidebands. It's like having a bucket of large rocks and saying it is full, then pouring sand in and letting it settle in between the rocks. More stuff in the same bucket.

    A comb filter is used to separate the color information from the black and white, and a fairly simple scheme was used to do this. Each line of video information is, on average, very similar to the previous one. Also, the phase of the color information changes by 180 degrees each line. Knowing this, they split the video signal into 2 paths, delay one by 1/15734 second (one scan line), and add the two. The color information cancels itself out because of the 180 degree phase shift and the black and white signal is left. To get the color signal without the black and white, the same thing is done except one of the signal paths has an amplifier with a gain of -1 in it.

    Most if not all black and white TV sets didn't know the difference. The scanning frequencies are close enough to the originals that they worked fine.
    Want to know more? I used to work on NTSC video equipment. Signal generators, insertion generators, video measuring equipment (waveform monitors, vectorscopes, etc.) cable TV RF signal level meters, etc. I know what it's like to be obsoleted. :(
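To make the comb-filter trick concrete, here is a toy numpy sketch (my own illustration, not anything from actual NTSC hardware; the picture content and amplitudes are made up). Two adjacent lines carry the same luma, the chroma flips phase 180 degrees between lines because there are 227.5 subcarrier cycles per line, so a one-line delay plus an add or a subtract separates the two signals:

```python
# Toy 1H comb filter: sum of adjacent lines keeps luma, difference keeps chroma.
import numpy as np

SAMPLES_PER_LINE = 910              # an arbitrary sampling choice for the demo
t = np.arange(SAMPLES_PER_LINE) / SAMPLES_PER_LINE

def make_line(line_number):
    # identical "picture" content on every line
    luma = 0.5 + 0.2 * np.sin(2 * np.pi * 3 * t)
    # 227.5 subcarrier cycles per line => adjacent lines are 180 degrees apart
    chroma = 0.1 * np.sin(2 * np.pi * 227.5 * (line_number + t))
    return luma + chroma

line_a = make_line(0)
line_b = make_line(1)

luma_out = (line_a + line_b) / 2    # chroma cancels (opposite phase)
chroma_out = (line_a - line_b) / 2  # luma cancels (identical on both lines)
```

The subtraction path is the "gain of -1" amplifier described above; in a real set the picture content differs slightly from line to line, so the cancellation is only approximate.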
  5. Sep 4, 2009 #4
    Thank god we switched to digital.
  6. Sep 4, 2009 #5


    Gold Member

    I used to be able to get 4 channels when the antenna was pointing East. Now I get 2.

    Used to get 6 channels when pointing south. Now I get 3 if I'm lucky.

    Tell me again why digital is better. :confused:
  7. Sep 4, 2009 #6
    Because digital video (e.g. 30p, progressive) doesn't have to jump through all the hoops of analog video modulation that Averagesupernova described. It's more elegant in design.

    You may not be picking up all the stations for many reasons (bad antenna, bad wiring, a receiver that needs rescanning, or simply an area that hasn't upgraded yet).
  8. Sep 4, 2009 #7


    Gold Member

    The brother of the previous owner of this place did custom TV antenna installations, and I have a really high-end VHF-UHF antenna on top of a ~35' mast. It's not the antenna, the converter (I have rescanned 'til I'm sick of it), or the coax. Digital signals are "all or nothing" and they are highly directional and sensitive to terrain. When you're in hilly country and the nearest TV transmitter is over 40 miles away, digital sucks. This is all so the big telecom outfits can get some publicly-owned bandwidth cheap and sell us services using it.
  9. Sep 4, 2009 #8


    Science Advisor
    Homework Helper

    The ones you don't get include "So You Think You Can Dance Canada"?
  10. Sep 4, 2009 #9


    Gold Member

    I miss the hot political shows, though, including "New Brunswick vs Nova Scotia: Provincial-Government Ultimate Fighting" and the ever-popular "What's up wit' dem hosers, eh?"
  11. Sep 5, 2009 #10
    Let's see if I understand this correctly, first for plain black and white: if the carrier were simply modulated by the 15750 Hz frequency, we'd just get two sidebands + and - 15750 Hz off the carrier frequency, but if I remember correctly analog TV uses some kind of vestigial sideband thing where the upper sideband is partially suppressed. However, the carrier is also being modulated by the luminance information, so is this the reason that there isn't just a single sideband frequency but multiple sidebands stretching away from the carrier spaced every 15750 Hz? Then, if I understand the subcarrier concept correctly, the chroma information is first AM modulated onto a 3.579545 MHz signal, and then that already-modulated signal is used to again modulate the main carrier frequency. I'm having a tough time visualizing what the spectrum would look like in the frequency domain; do you have a link to a pretty diagram that might help by any chance? :biggrin:
  12. Sep 5, 2009 #11
    bitrex, you pretty much have it right except for a few things mixed up with the sidebands. Don't have time at the moment to elaborate but I can post more later. Incidentally, with a good spectrum analyzer you can see the sidebands I referred to where they fall in between each other.
  13. Sep 5, 2009 #12


    Science Advisor

    Hi turbo. Your comments are interesting because while I'm enjoying so many advantages of digital TV, I have to agree that the "all or nothing" aspect of it can be a problem in some areas. In my area I know of quite a few people who needed to upgrade their antenna to get decent digital reception. In most cases the initial reception seemed to be good (on installation of the set-top box or new TV), but over a longer time period drop-outs would often be noticed, particularly in adverse weather conditions.

    The main problem is that when you get a drop-out with digital TV you get a complete freeze of the picture for several seconds and sometimes (depending on the equipment) a large and very annoying audio glitch. Personally I get pretty good DTV reception with only a fairly basic antenna, so I'm lucky, but just lately I've started to notice a few drop-outs in windy conditions. At the moment these are few and far between enough to not be a real problem, though I might need to upgrade my antenna soon.

    Lately I've been wondering if they shouldn't have "wasted" a bit more bandwidth on error correction to make DTV a little more robust in marginal reception areas. Say, for example, that you put all the bandwidth of a high definition channel into a standard definition channel with a massive amount of error correction redundancy; I think the result would be unbelievably robust. (OK, maybe that example goes a bit too far in "waste", but you get the point.) Does anyone else have an opinion on this? I mean, do you think the standard for DTV broadcast could have wasted a bit more bandwidth on making it more robust?
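The bandwidth-for-robustness trade can be illustrated in the crudest possible way with a toy repetition code in Python (nothing like the convolutional or Reed-Solomon coding actually used in DTV standards, just the principle): spend 3x the bandwidth per bit and any single corrupted copy per triple becomes correctable.

```python
# Toy 3x repetition code with majority-vote decoding.
def repeat3(bits):
    # spend 3x the bandwidth: send every bit three times
    return [b for b in bits for _ in range(3)]

def majority(rx):
    # decode each triple by majority vote
    return [int(sum(rx[i:i + 3]) >= 2) for i in range(0, len(rx), 3)]

bits = [1, 0, 1, 1, 0]
tx = repeat3(bits)       # 15 transmitted bits
tx[0] ^= 1               # corrupt one copy in the first triple
tx[7] ^= 1               # ... and one copy in the third triple
decoded = majority(tx)   # still equals the original bits
```

Real systems use far more efficient codes, but the trade is the same: more redundancy, fewer usable channels, later cliff.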
  14. Sep 5, 2009 #13



    that divisor is 525 which means, including the flyback, there are 525 horizontal lines per frame and there used to be 30 frames per second ("now", before DTV, it was 29.97).

    Averagesupernova, there is a reason why they bumped fH down from 15750 to 15734, and i don't think you got completely to it. it has to do with the sound carrier sitting at 4.5 MHz above the video carrier.

    originally fH = 15750, and since adjacent horizontal lines are often largely identical, the raster scan signal is sorta periodic with fundamental frequency fH, so you will see in the video spectrum spikes (concentrations of energy) at integer multiples of fH. they wanted to locate the color information exactly halfway between two of these harmonics (the 227th and 228th), largely for the reasons you described: for B&W TVs the interference of the color subcarrier on one line will be exactly out of phase with the interference on the adjacent line and will tend to visually cancel.

    the question is how does the intensity (B&W) signal interfere with the chroma signal, and it's exactly the same. from the POV of the color base band (sitting at 227.5 x fH), all of those spikes of the intensity signal are halfway between the harmonics of the chroma signal so they tend to cancel visually on every adjacent line.
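That interleaved line spectrum can be seen in a small numpy demo (my own sketch; the choices of 910 samples per line and 64 lines are arbitrary). With time measured in line periods, fH = 1, picture energy at an integer multiple of fH lands exactly on an FFT bin that is a harmonic of fH, and a tone at 227.5 x fH lands exactly halfway between two such harmonics:

```python
# Line spectrum of a raster signal: luma spikes at n*fH, chroma at (n+1/2)*fH.
import numpy as np

LINES = 64                  # analyze a block of 64 identical scan lines
SPL = 910                   # samples per line (arbitrary for the demo)
N = LINES * SPL
t = np.arange(N) / SPL      # time in units of one line period, so fH = 1

luma = np.sin(2 * np.pi * 3 * t)        # picture content at 3 x fH
chroma = np.sin(2 * np.pi * 227.5 * t)  # subcarrier at 227.5 x fH

spectrum = np.abs(np.fft.rfft(luma + chroma))

# FFT bin k corresponds to frequency k / LINES in units of fH
luma_bin = 3 * LINES             # energy exactly on a harmonic of fH
chroma_bin = int(227.5 * LINES)  # energy exactly halfway between harmonics
```

All the energy piles up in those two bins, with essentially nothing in between: the "rocks and sand" picture from earlier in the thread.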

    but the sound carrier (to the right of where they put the chroma) didn't quite land exactly halfway between chroma harmonics, and that caused visible distortion to the chroma. to get the sound carrier exactly halfway between chroma harmonics, they needed to make it precisely an integer times fH. so let's see:

    [tex] \frac{4.5 \mathrm{MHz}}{f_H} = 285.714 [/tex]

    so they want to make it exactly an integer. which integer do you think they'll pick? the closest one, which is 286. they want the sound carrier to be at exactly 286 x fH.

    there are two ways of doing that: they could bump the sound carrier up from 4.5 MHz to 286 x 15750 Hz = 4.5045 MHz, or they could bump fH down to (4.5 MHz)/286 = 15734.2657 Hz, which is what they chose to do. i think they decided it would be harder for existing TVs to latch onto a slightly raised sound carrier than onto a slightly lowered fH. that decision to bump fH down also bumped fV down to 29.97 Hz, which continued to be fH/525. because of that decision in the 1950s, video engineers now have to deal with that nasty drop-frame timecode and audio engineers had to deal with an occasional 44.056 kHz sampling rate instead of 44.1 kHz. what a pain in the arse.
    Last edited: Sep 5, 2009
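The arithmetic in the post above can be checked in a few lines of Python (the numbers are exactly the ones quoted, nothing new):

```python
# NTSC frequency derivation: force the 4.5 MHz sound carrier to be an
# integer multiple of the horizontal rate fH.
sound_carrier = 4.5e6            # Hz, aural carrier offset from the visual carrier
fH_old = 15750.0                 # original B&W horizontal rate, Hz

ratio = sound_carrier / fH_old   # 285.714... -> round to the nearest integer, 286
fH_new = sound_carrier / 286     # 15734.2657... Hz
field_rate = 2 * fH_new / 525    # 59.9400... Hz (525 lines per frame, 2 fields)
frame_rate = fH_new / 525        # 29.9700... frames per second
```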
  15. Sep 5, 2009 #14
    Why is it that the sound carrier is interfering with the chroma signal? In this diagram: http://people.seas.harvard.edu/~jones/cscie129/nu_lectures/lecture8/analog_tv/spec_ntsc_col.gif it looks like the audio sidebands fall entirely outside the luminance and chrominance bands. And if that diagram isn't accurate and the audio sidebands do fall inside the main video signal, won't making the audio carrier an integer multiple of the horizontal frequency cause interference in the luminance signal again? And it looks from the diagram like the chrominance Q signal is being transmitted with both sidebands, but the I signal is being transmitted with only a vestigial sideband! Gaaah <head explodes>
  16. Sep 5, 2009 #15


    Science Advisor
    Gold Member

  17. Sep 5, 2009 #16
  18. Sep 5, 2009 #17
    No, not quite. The only way you will ever get just one upper and one lower sideband is if you modulate with a pure sine wave. No matter what kind of luminance info you modulate with, you will always have sidebands at multiples of 15750 because it is non-sinusoidal. The vestigial sideband is done to conserve bandwidth; only one sideband is needed to convey all the information. I'm told the reason the lower sideband (not the upper) is partially suppressed instead of simply eliminated is that the technology didn't exist to completely remove it.

    See above.

    The chroma info is AM modulated onto the 3.579545 MHz subcarrier using a balanced modulator. This way, the subcarrier itself is automatically suppressed; it simply isn't needed. The colorburst signal you may have heard of is in phase with the color subcarrier. It is used to recover the chroma signal in the TV set: it phase-locks an oscillator in the set to the original phase of the chroma subcarrier at the TV station.
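A bare-bones numpy sketch of that suppressed-carrier quadrature scheme (the constants are arbitrary; a real receiver uses the burst-locked oscillator and a proper low-pass filter, not a mean, and I and Q vary across the line):

```python
# Suppressed-carrier quadrature modulation and synchronous demodulation.
import numpy as np

FSC = 3.579545e6            # color subcarrier, Hz
FS = 4 * FSC                # sample rate: 4 samples per subcarrier cycle
N = 4096
t = np.arange(N) / FS

i_sig, q_sig = 0.3, -0.2    # hold I and Q constant for simplicity

# balanced modulators: I on the cosine phase, Q on the sine phase, 90
# degrees apart; note no subcarrier term appears on its own
chroma = i_sig * np.cos(2 * np.pi * FSC * t) + q_sig * np.sin(2 * np.pi * FSC * t)

# synchronous demodulation: multiply by a reference locked to the original
# subcarrier phase, then low-pass (a plain mean suffices here since the
# block spans whole subcarrier cycles)
i_rec = 2 * np.mean(chroma * np.cos(2 * np.pi * FSC * t))
q_rec = 2 * np.mean(chroma * np.sin(2 * np.pi * FSC * t))
```

The cross terms (cosine reference times sine component and vice versa) average to zero, which is why the two signals can share one subcarrier without interfering.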
  19. Sep 6, 2009 #18
    Upon seeing the link in post #14 some of the details started to come back to me about NTSC video, and I decided it would be a good idea to do a more detailed description of what happens in color transmission. I have to admit I needed to do a bit of digging to come up with some of the numbers below. Thanks to the book Basic Television and Video Systems by Bernard Grob.

    The 3 color signals out of the camera: Red, Green, and Blue are the 3 primary colors. Technically there is a camera for each optically filtered color. We get 3 signals that feed into a matrix to form the Y (luminance) signal, the I signal and the Q signal.
    The Y signal is simply all three color signals added up in the following proportions: Y = .3R + .59G + .11B. If only the Y signal formed by this matrix were transmitted, all TV sets would receive a black and white signal and look the same as any other black and white TV set.
    The I signal is the three color signals added up in the following proportions: I = .6R - .28G - .32B.
    The Q signal is the three color signals added up in the following proportions: Q = .21R - .52G + .31B.
    The I and Q signals and the Y signal collectively contain all the video information needed to transmit a color TV signal.
    Now the Y signal modulates the carrier the same way it would have in the days before color. The I and Q signals modulate the 3.579545 subcarrier but they do it 90 degrees out of phase. This is why the letter Q was picked. It means quadrature. Each signal (I and Q) modulates a separate modulator and then those outputs are combined.
    The I signal is given more bandwidth on the chroma subcarrier than the Q signal. There is a reason for this: the color information it carries requires more detail. The diagram in the link in post #14 shows this to be the case. However, as bitrex pointed out, the I signal IS vestigial (this is one of those details I mentioned previously in this post). This is to keep it out of the sound channel. RBJ, you may have had some confusion over this.
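The matrix described above is easy to sandbox in Python (coefficients exactly as quoted in the post; channel values taken as 0..1 for illustration):

```python
# RGB -> YIQ matrix, using the coefficients from the post.
def rgb_to_yiq(r, g, b):
    y = 0.30 * r + 0.59 * g + 0.11 * b   # luminance: what a B&W set displays
    i = 0.60 * r - 0.28 * g - 0.32 * b
    q = 0.21 * r - 0.52 * g + 0.31 * b
    return y, i, q

# pure white gives full luminance and zero chroma (I = Q = 0), which is
# exactly why a B&W set can ignore the color channel entirely
white = rgb_to_yiq(1.0, 1.0, 1.0)
```

Note the I and Q coefficient rows each sum to zero, so any gray (R = G = B) produces no chroma at all.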
  20. Sep 6, 2009 #19
    Say what you want about NTSC (Never Twice the Same Color) but for what they had to work with, and the requirement to stay compatible with black and white, I think that adding a color signal to the already existing black and white signal was quite an accomplishment.
  21. Sep 6, 2009 #20



    you're doing pretty good with this. now ask yourself how they picked 3.579545 MHz? ponder that a little, Av.

    then punch into your calculator: 227.5 x 4500000 / 286, and see what you get.

    ::edit:: and take that number and divide by 227.5 (to get fH) and again by 525 and you'll get the number in the subject line of this thread. ::/edit::
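Carrying out that calculator exercise in Python (same numbers as in the post):

```python
# rbj's exercise: derive the color subcarrier, fH, and the frame rate.
fsc = 227.5 * 4_500_000 / 286   # 3579545.45... Hz, the color subcarrier
fH = fsc / 227.5                # 15734.2657... Hz, the horizontal rate
frames = fH / 525               # 29.9700..., the number in the thread title
```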

    it's in the textbooks, at least it was in my day (and, for an historical question, old textbooks are better, all other considerations equal). see if your library has a copy of A. Bruce Carlson. those drawings on wikipedia are just drawings; that one does not depict it as the line spectrum it nearly is, which is the whole motivation behind interlacing the chroma harmonics (or sidebands) up there, centered halfway between the 227th and 228th harmonics of fH. anyway, in a less-than-perfect reality, the spectrum does not end there. at least we know the chroma channel receiver does not have infinite-dB rejection at 4.5 MHz, especially back in them olden days. selectivity cost money. it was more inductors and capacitors, not simply more taps on an FIR filter.

    when the audio is silence or close to it, the energy of the FM audio signal is concentrated at the carrier frequency. of course, we would not expect the audio sidebands to interlace with either the intensity or chroma sidebands. now ask yourself, what would be the visual effect of whatever sound carrier that leaks through? ask in three cases:

    1. if the sound carrier is exactly an integer times fH?
    2. if the sound carrier is exactly an integer+1/2 times fH?
    3. if the sound carrier is somewhere in between?

    what do you see in each case, and which, in your judgment, is better?

    Av, i'm not confused about this a bit.

    r b-j
    Last edited: Sep 6, 2009