How can wires carry analog signal?

AI Thread Summary
Analog signals can be effectively transmitted over cables despite concerns about voltage drop and cable length affecting signal quality. The key lies in the use of frequency modulation, which makes analog signals less sensitive to variations in amplitude. Attenuation in cables primarily affects the signal's amplitude, not its frequency, allowing for amplification to restore the original signal strength. Additionally, the group delay of frequencies within a standard TV channel remains relatively consistent, preserving the waveform shape during transmission. Overall, while analog transmission is complex, it remains feasible due to modulation techniques and amplification methods.
ShawnD
Digital cable seems to be a newer thing, which would imply cable TV up to this point has been analog. How? How can a cable accurately carry an analog signal, presumably based on voltage, if cable length greatly affects voltage? Wouldn't that cause all sorts of crazy problems, such as something appearing blue if the cable is 1 m long but red if it's 50 m? Or appearing as a rectangle with a 5 m cable but as a circle at 100 m? It's not too crazy to think the voltage can drop from 10 V to 5 V if the cable is too long.

Given that an analog signal requires very accurate detection, wouldn't small changes in voltage cause extreme changes in the picture?
I'm still trying to wrap my head around the concept of analog. It's infinitely more complicated than digital.
 
Attenuation in a cable affects the amplitude of a signal, not its frequency (neglecting any dispersion effects). So all you have to do is run the attenuated signal through an amplifier to boost it back up to its original level.
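
As a rough numerical illustration of that point (everything here is an assumed toy model, not anything from this thread): if cable loss is treated as a plain scale factor, the tone's frequency survives untouched and a matching gain brings the amplitude back.

```python
import numpy as np

fs = 1e6                      # sample rate in Hz (arbitrary choice)
t = np.arange(0, 1e-3, 1/fs)  # 1 ms of samples
signal = 2.0 * np.sin(2 * np.pi * 10e3 * t)   # 10 kHz tone, 2 V amplitude

attenuation = 0.25            # cable loss modeled as a simple scale factor
received = attenuation * signal

# The dominant frequency is the same before and after the attenuation...
freqs = np.fft.rfftfreq(len(t), 1/fs)
peak_before = freqs[np.argmax(np.abs(np.fft.rfft(signal)))]
peak_after = freqs[np.argmax(np.abs(np.fft.rfft(received)))]
print(peak_before, peak_after)         # both ~10 kHz

# ...so an amplifier with gain 1/attenuation restores the original level.
restored = received / attenuation
print(np.max(np.abs(restored)))        # back to ~2.0 V
```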

TV (whether analog or digital) uses frequency modulation of a VHF or UHF carrier signal, so I don't think it's very sensitive to variations in overall amplitude to begin with. Which is a good thing, because with over-the-air broadcast TV the signal strength from the receiving antenna can vary tremendously from one station to another, depending on the transmitter power, distance from the transmitter, sensitivity (gain) of the receiving antenna, physical obstacles such as buildings or mountains, and atmospheric conditions.
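
To make the amplitude-insensitivity of FM concrete, here is a hedged sketch (the carrier, deviation, and message frequencies are invented for illustration, not taken from any TV standard): the message rides in the instantaneous frequency, so scaling the modulated wave down by a factor of ten changes nothing in what the demodulator recovers.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1e6
t = np.arange(0, 5e-3, 1/fs)
message = np.sin(2 * np.pi * 1e3 * t)          # 1 kHz "program" signal

fc = 100e3                                     # carrier frequency, Hz
kf = 5e3                                       # frequency deviation per unit of message
phase = 2 * np.pi * fc * t + 2 * np.pi * kf * np.cumsum(message) / fs
fm = np.cos(phase)                             # frequency-modulated wave

def demodulate(x, fs, fc, kf):
    # Recover the message from the slope of the instantaneous phase.
    analytic = hilbert(x)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    return (inst_freq - fc) / kf

strong = demodulate(fm, fs, fc, kf)            # full-strength signal
weak = demodulate(0.1 * fm, fs, fc, kf)        # same signal, heavily attenuated

# The recovered messages agree despite the 10x amplitude difference.
print(np.max(np.abs(strong - weak)))           # essentially zero
```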
 
jtbell said:
TV (whether analog or digital) uses frequency modulation of a VHF or UHF carrier signal...
Which is why FM radio is typically better in quality than AM (amplitude modulation) radio.
 
Shawn -- check out some of this info on modulation schemes (both analog and digital):

http://en.wikipedia.org/wiki/Modulation

Communication theory is a very broad, interesting and practical subject.
 
The bandwidth of a standard-definition NTSC TV channel is about 5 MHz. Even though the coax cables used by cable networks are lossy, and their loss is frequency-dependent, the group delay of all the frequencies between, say, 50 and 55 MHz is roughly equal. The signal's waveform (shape) is thus not changed (much) by the transmission. Only the amplitude is attenuated, and this can be fixed with an amplifier, as long as the signal-to-noise ratio is still acceptable after the attenuation.

In the limit of smaller and smaller bandwidth signals, channel effects become less and less significant. Obviously, if you're only transmitting a single frequency (zero bandwidth), a cable's not going to affect its shape -- all the cable can do is decrease its amplitude.

- Warren
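
Here is a hedged sketch of that group-delay argument (the sample rate, delay, and loss are made-up numbers, and a real cable is not this ideal): if every frequency in a narrow channel sees the same delay and roughly the same loss, the received waveform is just a scaled, time-shifted copy of what was sent.

```python
import numpy as np

fs = 200e6                                   # sample rate, high enough for a ~50 MHz channel
t = np.arange(0, 20e-6, 1/fs)
n = len(t)

# Toy channel signal: a 52 MHz carrier whose envelope wiggles at 1 MHz,
# so its spectrum sits roughly between 51 and 53 MHz.
signal = (1 + 0.5 * np.sin(2 * np.pi * 1e6 * t)) * np.cos(2 * np.pi * 52e6 * t)

A = 0.3                                      # flat loss across the band
tau = 2e-6                                   # identical group delay for every frequency
freqs = np.fft.rfftfreq(n, 1/fs)
H = A * np.exp(-2j * np.pi * freqs * tau)    # constant-delay, constant-loss channel

received = np.fft.irfft(np.fft.rfft(signal) * H, n)

# The received waveform matches the original, only smaller and shifted by tau.
shift = int(round(tau * fs))
error = received[shift:] - A * signal[:n - shift]
print(np.max(np.abs(error)))                 # floating-point noise only
```

If the delay varied strongly across the channel (dispersion), the waveform would smear instead of merely shifting, which is exactly the effect being neglected above.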
 