# Wave length and transmission line

1. Nov 17, 2008

### likephysics

I think pretty much everybody knows that the length of a transmission line should be less than the wavelength of the RF signal.
How do you explain it to someone who is not from an engineering background?

I myself can't get it sometimes.
Say for example the wavelength of a wave is 2" (the long dashed line below):

------------------

The conductor length is one tenth of that, 2"/10 = 0.2" (approximately the short '-' below):

-

If you draw a sine wave of wavelength 2", the voltage amplitude is different over the
0.2" line. It does vary by some amount and it's definitely not flat. Maybe it would be flat over 0.001".

So why is it OK if the conductor length is 1/10th of the wavelength, even though the amplitude of the wave varies over the length of the conductor?
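The size of that variation can be put in numbers. A minimal sketch, using the 2" wavelength and λ/10 line length from the post (the sine wave is an idealized snapshot of a traveling wave, not a claim about any particular line):

```python
import math

wavelength = 2.0              # inches, the example value from the post
line_len = wavelength / 10    # 0.2 inches, the lambda/10 rule of thumb

# Phase difference between the two ends of the line at one instant:
phase = 2 * math.pi * line_len / wavelength
print(math.degrees(phase))    # ~36 degrees

# Snapshot voltage difference between the ends, as a fraction of the peak,
# if one end happens to sit at a zero crossing of the sine wave:
frac = abs(math.sin(phase) - math.sin(0))
print(frac)                   # ~0.59, so the wave really is not flat over the line
```

This confirms the intuition in the question: over a λ/10 conductor the instantaneous voltage can differ noticeably from end to end, which is exactly why the replies below bring in matching and reflections rather than "flatness".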

3. Nov 18, 2008

### Pumblechook

A transmission line can be and in many cases is many times longer than a wavelength.

The coaxial cable from your TV aerial might be 10 metres long while the wavelength is 50 cm ... 20 wavelengths.

The cable from your satellite dish might be 10 metres with signals at 15 cm ..... 67 wavelengths.

The voltage and current amplitudes will be constant along the line (apart from the AC variation in time) if the line is perfectly matched to the load.
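"Perfectly matched" can be made concrete with the standard voltage reflection coefficient, Γ = (ZL − Z0)/(ZL + Z0). A small sketch; the 75 Ω and 300 Ω impedance values are illustrative assumptions, not numbers from the thread:

```python
# Reflection coefficient and voltage standing-wave ratio (VSWR) for a line of
# characteristic impedance Z0 terminated in a load ZL.
Z0 = 75.0    # ohms, a common TV-coax characteristic impedance
ZL = 75.0    # matched load

gamma = (ZL - Z0) / (ZL + Z0)                # voltage reflection coefficient
vswr = (1 + abs(gamma)) / (1 - abs(gamma))   # standing-wave ratio
print(gamma, vswr)    # 0.0 1.0 -> no reflection, flat amplitude along the line

# A mismatched load, e.g. a hypothetical 300-ohm termination:
gamma_mis = (300.0 - Z0) / (300.0 + Z0)
print(round(gamma_mis, 2))    # 0.6 -> strong reflections and standing waves
```

With Γ = 0 nothing is reflected, so the amplitude envelope is the same everywhere on the line no matter how many wavelengths long it is; with a mismatch, the forward and reflected waves interfere and the envelope varies along the line.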

4. Nov 18, 2008

### likephysics

Ah! Thanks.
In the case of 60 Hz AC, if the transmission line is longer than the wavelength, how do you terminate it? I mean, with what impedance?
So the goal is to stop or minimize reflection from the load?

5. Nov 18, 2008

### Corneo

60 Hz is way too low in frequency for transmission line theory to come into effect. When RF people talk about transmission lines, they aren't talking about the big lines that deliver your AC power.
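A back-of-the-envelope check of why 60 Hz is usually "too low" (this assumes free-space propagation speed; waves in a real cable travel somewhat slower, which only shortens the wavelength a little):

```python
# Wavelength of 60 Hz mains compared with typical power-line lengths.
c = 3.0e8          # m/s, speed of light in free space (approximate)
f = 60.0           # Hz, mains frequency
wavelength_m = c / f
print(wavelength_m / 1000)        # 5000.0 -> wavelength is about 5000 km

# With the common lambda/10 rule of thumb, lumped-circuit analysis is fine
# for lines shorter than roughly:
print(wavelength_m / 10 / 1000)   # 500.0 -> about 500 km
```

So for most power wiring the line is a tiny fraction of a wavelength and transmission-line effects are negligible, though very long high-voltage lines can approach the length scale where they matter, which is what the next post touches on.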

6. Nov 19, 2008

### dlgoff

High-voltage power transmission lines (3-phase) have their conductors spaced to minimize losses. And at the substations, you'll find inductors and capacitors that do the impedance matching. If not, you would be transmitting power to the moon.