gnome:
I read (in W. Stallings, *Data and Computer Communications*) that

> delay distortion occurs because the velocity of propagation of a signal through a guided medium varies with frequency. For a bandlimited signal, the velocity tends to be highest near the center frequency and fall off toward the two edges of the band.

Does that second sentence make sense? It seems to imply that if, for example, several signals of, say, 6 MHz bandwidth each are multiplexed on a cable, so that maybe one signal occupies the 1-7 MHz range, another 10-16 MHz, and another 19-25 MHz, then the portions of the signals at 4 MHz, 13 MHz, and 22 MHz would propagate faster than the frequencies between those points. (I have no idea whether these specific numbers are realistic; I'm just using them as an arbitrary example.) Why would the signal velocity vary up-down-up-down-up-down... as frequency increases?
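To make the interpretation I'm asking about concrete, here's a tiny Python sketch of that reading of the quote. The velocity model is completely made up (a parabolic dip toward each channel's band edges); the channel bands are the same arbitrary example numbers as above, and none of this is meant to reflect a real cable:

```python
# Toy illustration of the "up-down-up-down" velocity pattern implied by
# reading the quoted statement per multiplexed channel. All velocity
# numbers are hypothetical, chosen only to show the shape of the idea.

channels = [(1e6, 7e6), (10e6, 16e6), (19e6, 25e6)]  # Hz, arbitrary example bands

def velocity(f_hz, v_max=2.0e8, dip=0.05e8):
    """Hypothetical model: velocity peaks at a channel's center frequency
    and falls off parabolically toward its two edges."""
    for lo, hi in channels:
        if lo <= f_hz <= hi:
            center = (lo + hi) / 2
            half_bw = (hi - lo) / 2
            return v_max - dip * ((f_hz - center) / half_bw) ** 2
    return None  # frequency falls outside every channel

for f_mhz in (1, 4, 7, 10, 13, 16, 19, 22, 25):
    v = velocity(f_mhz * 1e6)
    print(f"{f_mhz:2d} MHz -> {v:.3e} m/s")
```

Under this model the velocity would be highest at 4, 13, and 22 MHz and lowest at the channel edges (1, 7, 10, 16, 19, 25 MHz), which is exactly the oscillating pattern that seems strange to me.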