
The 29.97 frames per second NTSC standard

by bitrex
Tags: 2997, frames, ntsc, standard
bitrex
#1
Sep4-09, 05:37 PM
P: 196
I have read before that when color television was developed, the frame rate was changed from 30 frames per second to 29.97 frames per second to avoid some kind of problem with the circuitry, but I have never seen an explanation of just what that technical problem was. My somewhat educated guess is that maybe it had something to do with not wanting to start a new screen refresh cycle at exactly the same time the power supply was charging, to prevent glitches from getting into the chroma demodulator/amplifier system? Or something?
negitron
#2
Sep4-09, 06:20 PM
Sci Advisor
P: 842
Prior to color TV, the frame rate was, indeed, 30 fps. When color came along, they needed to add a small amount of extra information to the signal in order to render it, but the bandwidth was already fixed by the channel allocations, and adding information to a signal necessarily increases its bandwidth. Their solution was to dial back the frame rate ever so slightly, lowering the bandwidth the existing signal required by just enough to leave room for the chroma signal. That amount turned out to be about 0.03 fps.
Averagesupernova
#3
Sep4-09, 07:10 PM
P: 2,497
Here is why: Originally, the frame rate was 30 hertz (interlaced, so 60 fields per second) and the horizontal scanning rate was 15750 hertz. Notice that 15750 is divisible by 30. When the black and white video signal modulates a carrier wave, we get sidebands every 15750 hertz away from the carrier. Most of the information is close to the carrier, but there is still information we cannot tolerate losing that is several MHz away from it. The scheme with NTSC color was to put a subcarrier for color several MHz away from the main carrier. They picked a subcarrier frequency of 3.579545 MHz. Then they moved the horizontal scanning frequency to 15734 hertz and the vertical scanning frequency to 59.94 hertz (frame rate to 29.97 hertz). Why? All these frequencies are phase related. Black and white information still falls in sidebands away from the carrier as it always did. Color information falls in sidebands every 15734 hertz on either side of the subcarrier.
-
Here is the genius of the whole thing: the color sidebands fall in between the black and white sidebands. It's like having a bucket of large rocks and saying it is full, then pouring in sand and letting it settle in between the rocks. More stuff in the same bucket.
-
A comb filter is used to separate the color information from the black and white, and a fairly simple scheme was used to do this. Each line of video information is, on average, very similar to the previous one. Also, the phase of the color information changes by 180 degrees each line. Knowing this, they split the video signal into two paths, delay one by 1/15734 second (one scan line) and add the two. The color information cancels itself out because of the 180 degree phase shift and the black and white signal is left. To get the color signal without the black and white, the same thing is done except one of the signal paths has an amplifier with a gain of -1 in it. Most if not all black and white TV sets didn't know the difference; the scanning frequencies are close enough to the originals that they worked fine.
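To make the delay-line trick concrete, here is a sketch of the comb idea in Python. The array layout and the 4x-subcarrier sampling are my own framing for illustration, not a description of any real decoder:

[code]
import numpy as np

# Sampling at 4x the color subcarrier gives 910 samples per scan line
# (the subcarrier sits at 227.5 times the line rate, and 4 x 227.5 = 910).
SAMPLES_PER_LINE = 910

def comb_separate(composite):
    """Split composite video into luma and chroma with a one-line comb.

    Adjacent lines are nearly identical, and chroma flips phase 180
    degrees from one line to the next, so:
      current + previous line -> chroma cancels, luma remains
      current - previous line -> luma cancels, chroma remains
    """
    delayed = np.roll(composite, SAMPLES_PER_LINE)   # the 1/15734 s delay
    luma = (composite + delayed) / 2
    chroma = (composite - delayed) / 2
    return luma, chroma
[/code]

The subtraction path is the "amplifier with a gain of -1" in the analog version.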
-
Want to know more? I used to work on NTSC video equipment: signal generators, insertion generators, video measuring equipment (waveform monitors, vectorscopes, etc.), cable TV RF signal level meters, and so on. I know what it's like to be obsoleted. :(

waht
#4
Sep4-09, 07:35 PM
P: 1,636
Thank god we switched to digital.
turbo
#5
Sep4-09, 07:51 PM
PF Gold
P: 7,363
Quote Quote by waht View Post
Thank god we switched to digital.
I used to be able to get 4 channels when the antenna was pointing East. Now I get 2.

Used to get 6 channels when pointing south. Now I get 3 if I'm lucky.

Tell me again why digital is better.
waht
#6
Sep4-09, 08:11 PM
P: 1,636
Quote Quote by turbo-1 View Post
Tell me again why digital is better.
Because digital video modulation at 30p (progressive) doesn't have to jump through all the hoops of analog video modulation that Averagesupernova described. It's a more elegant design.

You may not be picking up all the stations for any of several reasons (bad antenna, bad wiring, a receiver that needs rescanning, or an area that simply hasn't finished upgrading).
turbo
#7
Sep4-09, 08:18 PM
PF Gold
P: 7,363
The brother of the previous owner of this place did custom TV antenna installations, and I have a really high-end VHF-UHF antenna on top of a ~35' mast. It's not the antenna, the converter (I have rescanned 'til I'm sick of it), or the coax. Digital signals are "all or nothing", and they are highly directional and sensitive to terrain. When you're in hilly country and the nearest TV transmitter is over 40 miles away, digital sucks. This is all so the big telecom outfits can get some publicly-owned bandwidth cheap and sell us services using it.
mgb_phys
#8
Sep4-09, 08:29 PM
Sci Advisor
HW Helper
P: 8,954
Quote Quote by turbo-1 View Post
Tell me again why digital is better.
The ones you don't get include "So You Think You Can Dance Canada"?
turbo
#9
Sep4-09, 08:37 PM
PF Gold
P: 7,363
Quote Quote by mgb_phys View Post
The ones you don't get include "So You Think You Can Dance Canada"?
I miss the hot political shows, though, including "New Brunswick vs Nova Scotia: Provincial-Government Ultimate Fighting" and the ever-popular "What's up wit' dem hosers, eh?"
bitrex
#10
Sep5-09, 12:02 AM
P: 196
Quote Quote by Averagesupernova View Post
Here is why: Originally, the frame rate was 30 hertz (interlaced, so 60 fields per second) and the horizontal scanning rate was 15750 hertz. Notice that 15750 is divisible by 30. When the black and white video signal modulates a carrier wave, we get sidebands every 15750 hertz away from the carrier. Most of the information is close to the carrier, but there is still information we cannot tolerate losing that is several MHz away from it. The scheme with NTSC color was to put a subcarrier for color several MHz away from the main carrier. They picked a subcarrier frequency of 3.579545 MHz. Then they moved the horizontal scanning frequency to 15734 hertz and the vertical scanning frequency to 59.94 hertz (frame rate to 29.97 hertz).
Let's see if I understand this correctly, first for plain black and white: if the carrier were simply modulated by the 15750 Hz frequency, we'd just get two sidebands + and - 15750 Hz off the carrier frequency, but if I remember correctly analog TV uses some kind of vestigial sideband thing where the upper sideband is partially suppressed. However, the carrier is also being modulated by the luminance information, so is this the reason that there isn't just a single sideband frequency but multiple sidebands stretching away from the carrier spaced every 15750 Hz? Then, if I understand the subcarrier concept correctly, the chroma information is first AM modulated onto a 3.579545 MHz signal, and then that already-modulated signal is used to again modulate the main carrier frequency. I'm having a tough time visualizing what the spectrum would look like in the frequency domain; do you have a link to a pretty diagram that might help by any chance?
Averagesupernova
#11
Sep5-09, 09:22 AM
P: 2,497
bitrex, you pretty much have it right except for a few things mixed up with the sidebands. I don't have time at the moment to elaborate, but I can post more later. Incidentally, with a good spectrum analyzer you can see the sidebands I referred to, falling in between each other.
uart
#12
Sep5-09, 11:31 AM
Sci Advisor
P: 2,751
Quote Quote by turbo-1 View Post
I used to be able to get 4 channels when the antenna was pointing East. Now I get 2.

Used to get 6 channels when pointing south. Now I get 3 if I'm lucky.

Tell me again why digital is better.
Hi turbo. Your comments are interesting because, while I'm enjoying the many advantages of digital TV, I have to agree that the "all or nothing" aspect of it can be a problem in some areas. Around here I know of quite a few people who needed to upgrade their antennas to get decent digital reception. In most cases the initial reception seemed good (on installation of the set-top box or new TV), but over a longer period drop-outs would often show up, particularly in adverse weather.

The main problem is that when you get a drop-out with digital TV you get a complete freeze of the picture for several seconds, and sometimes (depending on the equipment) a large and very annoying audio glitch. Personally I get pretty good DTV reception with only a fairly basic antenna, so I'm lucky, but just lately I've started to notice a few drop-outs in windy conditions. At the moment they are few and far between enough not to be a real problem, though I might need to upgrade my antenna soon.

Lately I've been wondering whether they shouldn't have "wasted" a bit more bandwidth on error correction to make DTV a little more robust in marginal reception areas. Say, for example, that you put all the bandwidth of a high-definition channel into a standard-definition channel with a massive amount of error-correction redundancy; I think the result would be unbelievably robust (OK, maybe that example goes a bit too far in "waste", but you get the point). Does anyone else have an opinion on this? I mean, do you think the DTV broadcast standard could have spent a bit more bandwidth on making the signal more robust?
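As a toy illustration of that trade-off, here is a sketch of a naive 3x repetition code over a random bit-flip channel. Real DTV uses far stronger Reed-Solomon and convolutional coding, and every number below is invented, so treat this purely as an illustration:

[code]
import numpy as np

rng = np.random.default_rng(0)
p = 0.05          # raw bit-flip probability of a marginal channel (made up)
n = 1_000_000     # number of bits simulated

# Uncoded: a bit is lost whenever the channel flips it.
uncoded_ber = np.mean(rng.random(n) < p)

# 3x repetition code: send each bit three times and majority-vote at the
# receiver. It costs 3x the bandwidth, and a decoded bit fails only when
# two or more of the three copies flip.
flips = rng.random((n, 3)) < p
coded_ber = np.mean(flips.sum(axis=1) >= 2)

print(f"uncoded BER ~ {uncoded_ber:.4f}")   # ~0.05
print(f"coded BER   ~ {coded_ber:.5f}")     # ~3p^2 - 2p^3, about 0.007
[/code]

Tripling the bandwidth spent per bit knocks a 5% raw error rate down to well under 1%, which is the kind of headroom I mean.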
rbj
#13
Sep5-09, 11:54 AM
P: 2,251
Quote Quote by Averagesupernova View Post
Here is why: Originally, the frame rate was 30 hertz (interlaced, so 60 fields per second) and the horizontal scanning rate was 15750 hertz. Notice that 15750 is divisible by 30.
that divisor is 525, which means (including the flyback) there are 525 horizontal lines per frame; there used to be 30 frames per second ("now", meaning before DTV, it was 29.97).

When the black and white video signal modulates a carrier wave, we get sidebands every 15750 hertz away from the carrier. Most of the information is close to the carrier, but there is still information we cannot tolerate losing that is several MHz away from it. The scheme with NTSC color was to put a subcarrier for color several MHz away from the main carrier. They picked a subcarrier frequency of 3.579545 MHz. Then they moved the horizontal scanning frequency to 15734 hertz and the vertical scanning frequency to 59.94 hertz (frame rate to 29.97 hertz). Why?
Averagesupernova, there is a reason why they bumped fH down from 15750 to 15734, and i don't think you quite got to it. it has to do with the sound carrier sitting 4.5 MHz above the video carrier.

originally fH = 15750, and since adjacent horizontal lines are often largely identical, the raster scan signal is sorta a periodic function with fundamental frequency fH, so in the video spectrum you see spikes (concentrations of energy) at integer multiples of fH. now they wanted to locate the color information exactly halfway between two of these harmonics (the 227th and 228th), largely for the reasons you described: on a B&W TV the interference from the color subcarrier on one line is exactly out of phase with the interference on the adjacent line, so it tends to cancel visually.

the question is how the intensity (B&W) signal interferes with the chroma signal, and it's exactly the same story. from the POV of the color base band (sitting at 227.5 x fH), all of those spikes of the intensity signal are halfway between the harmonics of the chroma signal, so they too tend to cancel visually on every adjacent line.

but the sound carrier (to the right of where they put the chroma) didn't land exactly halfway between chroma harmonics, and that caused visible distortion to the chroma. to get the sound carrier exactly halfway between chroma harmonics, they needed to make it precisely an integer times fH. so let's see:

[tex] \frac{4.5 \mathrm{MHz}}{f_H} = 285.714 [/tex]

so they want to make it exactly an integer. which integer do you think they'll pick? the closest one, which is 286. they want the sound carrier to be at exactly 286 x fH.

there are two ways of doing that: they could bump the sound carrier up from 4.5 MHz to 286 x 15750 Hz = 4.5045 MHz, or they could bump fH down to (4.5 MHz)/286 = 15734.2657 Hz, which is what they chose to do. i think they decided that existing TVs would have more trouble latching onto a slightly detuned sound carrier than onto a slightly lowered fH. that decision bumped fH down, which in turn bumped fV down to 29.97 Hz (still fH/525),... and because of that decision in the 1950s, video engineers now have to deal with that nasty drop-frame timecode and audio engineers had to deal with an occasional 44.056 kHz sampling rate instead of 44.1 kHz. what a pain in the arse.
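the arithmetic is easy to check; a few lines of Python reproduce all the magic numbers (the variable names are mine):

[code]
# Reproduce the NTSC color frequency juggling described above.
f_sound = 4.5e6               # sound carrier offset from the video carrier, Hz
f_H = f_sound / 286           # new horizontal rate
f_frame = f_H / 525           # frames per second
f_field = 2 * f_frame         # fields per second (interlaced)
f_sc = 227.5 * f_H            # chroma subcarrier, halfway between the
                              # 227th and 228th harmonics of f_H

print(f"f_H     = {f_H:.4f} Hz")          # 15734.2657 Hz
print(f"f_frame = {f_frame:.5f} Hz")      # 29.97003 Hz
print(f"f_field = {f_field:.4f} Hz")      # 59.9401 Hz
print(f"f_sc    = {f_sc / 1e6:.6f} MHz")  # 3.579545 MHz
[/code]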
bitrex
#14
Sep5-09, 02:47 PM
P: 196
Why is it that the sound carrier interferes with the chroma signal? In this diagram: http://people.seas.harvard.edu/~jone...c_ntsc_col.gif it looks like the audio sidebands fall entirely outside the luminance and chrominance bands. And if that diagram isn't accurate and the audio sidebands do fall inside the main video signal, won't making the audio carrier an integer multiple of the horizontal frequency cause interference in the luminance signal again? And it looks from the diagram like the chrominance Q signal is transmitted with both sidebands, but the I signal is transmitted with only a vestigial sideband! Gaaah <head explodes>
dlgoff
#15
Sep5-09, 05:26 PM
Sci Advisor
PF Gold
P: 2,702

The audio lies outside the video frequencies.
Wikipedia has a little info on composite video.
Averagesupernova
#16
Sep5-09, 07:49 PM
P: 2,497
Quote Quote by rbj View Post
but the sound carrier (to the right of where they put the chroma) didn't land exactly halfway between chroma harmonics, and that caused visible distortion to the chroma.

I don't think the audio sidebands are guaranteed to fall anywhere in particular. The audio is FM, and you typically have sidebands falling at harmonics of the modulating signal all the way out to the edge of the signal's bandwidth. Max deviation is 25 kHz, so if you are modulating with a 500 hertz tone you have sidebands from the sound carrier out to roughly 25 kHz away from it, at 500 hertz intervals. Any interference with chroma is a shortcoming of the TV: intermodulation is unwanted heterodyning of the sound and chroma signals within a stage in the TV set.
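If you want to see that numerically, here is a quick sketch. Only the 25 kHz deviation and the 500 hertz tone come from the paragraph above; the carrier and sample rate are toy values:

[code]
import numpy as np

fs = 1_000_000     # sample rate, Hz (toy value)
fc = 200_000       # toy carrier; stands in for the 4.5 MHz sound carrier
fm = 500           # modulating tone, Hz
dev = 25_000       # peak deviation, Hz

t = np.arange(0, 1.0, 1 / fs)
# FM: the instantaneous phase is the integral of the instantaneous
# frequency, so a tone at fm with peak deviation dev gives a
# modulation index of dev/fm (= 50 here).
signal = np.cos(2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t))

spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# Sidebands appear at fc +/- n*fm and die away quickly a little beyond
# the peak deviation (Carson's rule: most energy within dev + fm of fc).
strong = freqs[spectrum > 0.001]
print(f"significant sidebands span {strong.min():.0f} .. {strong.max():.0f} Hz")
[/code]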
Averagesupernova
#17
Sep5-09, 08:21 PM
P: 2,497
Quote Quote by bitrex View Post
Let's see if I understand this correctly, first for plain black and white: if the carrier were simply modulated by the 15750 Hz frequency, we'd just get two sidebands + and - 15750 Hz off the carrier frequency, but if I remember correctly analog TV uses some kind of vestigial sideband thing where the upper sideband is partially suppressed.
No, not quite. The only way you will ever get just one upper and one lower sideband is if you modulate with a pure sine wave. No matter what kind of luminance info you modulate with, you will always have sidebands at multiples of 15750 because the signal is non-sinusoidal. The vestigial sideband is done to conserve bandwidth; only one sideband is needed to convey all the information. I'm told the reason the lower sideband (not the upper) is partially suppressed rather than simply eliminated is that the technology didn't exist at the time to remove it completely.
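To make the "non-sinusoidal modulation means a whole comb of sidebands" point concrete, here is a toy simulation. Apart from the 15750 Hz line rate every number is invented, and real NTSC is vestigial-sideband on a much higher carrier:

[code]
import numpy as np

fs = 2_000_000       # sample rate, Hz (toy)
f_line = 15_750      # line rate, Hz
fc = 500_000         # toy carrier frequency, Hz

t = np.arange(0, 0.1, 1 / fs)
# A non-sinusoidal waveform that repeats every scan line: its Fourier
# series has energy at every multiple of 15750 Hz.
video = (np.mod(t * f_line, 1.0) < 0.3).astype(float)   # crude pulse per line
am = (0.5 + 0.5 * video) * np.cos(2 * np.pi * fc * t)   # ordinary AM

spectrum = np.abs(np.fft.rfft(am)) / len(am)
freqs = np.fft.rfftfreq(len(am), 1 / fs)

# Peaks land at fc +/- n*15750: a comb of sidebands, not a single pair.
peaks = freqs[spectrum > 0.005]
print(np.round((peaks - fc) / f_line, 2))   # offsets come out as integers
[/code]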

Quote Quote by bitrex View Post
However, the carrier is also being modulated by the luminance information, so is this the reason that there isn't just a single sideband frequency but multiple sidebands stretching away from the carrier spaced every 15750 Hz?
See above.

Quote Quote by bitrex View Post
Then, if I understand the subcarrier concept correctly, the chroma information is first AM modulated onto a 3.579545 MHz signal, and then that already-modulated signal is used to again modulate the main carrier frequency. I'm having a tough time visualizing what the spectrum would look like in the frequency domain; do you have a link to a pretty diagram that might help by any chance?
The chroma info is AM modulated onto the 3.579545 MHz subcarrier using a balanced modulator. This way the carrier is automatically suppressed; it simply isn't needed. The colorburst signal you may have heard of is in phase with the color subcarrier. It is used to recover the chroma signal in the TV set: it phase-locks an oscillator in the set to the original phase of the chroma subcarrier at the TV station.
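Here is the balanced-modulator / synchronous-detector idea as a sketch. The flat I and Q values stand in for real chroma (a solid patch of one color), and the receiver-side oscillators below play the role of the burst-locked oscillator:

[code]
import numpy as np

f_sc = 3_579_545              # chroma subcarrier, Hz
fs = 4 * f_sc                 # sample at 4x the subcarrier
t = np.arange(0, 0.001, 1 / fs)

i_sig = 0.4                   # toy I value
q_sig = -0.2                  # toy Q value

# Balanced modulator = plain multiplication: no carrier term is added in,
# so nothing is transmitted where the chroma values are zero (suppressed
# carrier). I and Q ride the same subcarrier 90 degrees apart (quadrature).
chroma = (i_sig * np.cos(2 * np.pi * f_sc * t)
          + q_sig * np.sin(2 * np.pi * f_sc * t))

# Receiver: multiply by oscillators locked to the burst phase, then
# low-pass to remove the 2*f_sc products.
def lowpass(x):
    return np.convolve(x, np.ones(64) / 64, mode="same")   # crude filter

i_rec = 2 * lowpass(chroma * np.cos(2 * np.pi * f_sc * t))
q_rec = 2 * lowpass(chroma * np.sin(2 * np.pi * f_sc * t))
print(i_rec[500], q_rec[500])   # ~0.4 and ~-0.2: I and Q recovered
[/code]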
Averagesupernova
#18
Sep6-09, 11:49 AM
P: 2,497
Upon seeing the link in post #14, some of the details started to come back to me about NTSC video, and I decided it would be a good idea to give a somewhat more detailed description of what happens in color transmission. I have to admit I needed to do a bit of digging to come up with some of the numbers below. Thanks to the book Basic Television and Video Systems by Bernard Grob.

The three color signals out of the camera, Red, Green, and Blue, are the three primary colors. Technically there is a separate pickup tube for each optically filtered color. The three signals feed into a matrix to form the Y (luminance) signal, the I signal, and the Q signal.
-
The Y signal is simply all three color signals added in the following proportions: Y = .3R + .59G + .11B. If only the Y signal formed by this matrix were transmitted, every TV set would receive a black and white signal that looked the same as on any black and white set.
-
The I signal is the three color signals added up in the following proportions: I = .6R - .28G - .32B.
-
The Q signal is the three color signals added up in the following proportions: Q = .21R - .52G + .31B.
-
The I and Q signals and the Y signal collectively contain all the video information needed to transmit a color TV signal.
-
Now the Y signal modulates the carrier the same way it would have in the days before color. The I and Q signals modulate the 3.579545 MHz subcarrier, but they do it 90 degrees out of phase with each other. This is why the letter Q was picked: it means quadrature. Each signal (I and Q) feeds a separate modulator, and those outputs are combined.
-
The I signal modulates the chroma subcarrier with more bandwidth than the Q signal does. There is a reason for this: the color information it carries requires more detail. The diagram in the link in post #14 shows this as well. However, as bitrex pointed out, the I signal IS vestigial (this is one of those details I mentioned earlier in this post). This is to keep it out of the sound channel. RBJ, you may have had some confusion over this.
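Those three matrix equations drop straight into code; here is a literal transcription in Python, with the coefficients exactly as quoted above:

[code]
import numpy as np

# RGB -> YIQ matrix, rows taken from the proportions listed above.
RGB_TO_YIQ = np.array([
    [0.30,  0.59,  0.11],   # Y  (luminance)
    [0.60, -0.28, -0.32],   # I  (in-phase chroma)
    [0.21, -0.52,  0.31],   # Q  (quadrature chroma)
])

def rgb_to_yiq(rgb):
    """Convert an (..., 3) array of R, G, B values to Y, I, Q."""
    return rgb @ RGB_TO_YIQ.T

# Pure white comes out as Y=1, I=0, Q=0: no chroma at all, which is
# exactly what a black and white set expects to see.
print(rgb_to_yiq(np.array([1.0, 1.0, 1.0])))   # ~[1. 0. 0.]
[/code]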

