I think pretty much everybody knows that the length of a transmission line should be much less than the wavelength of the RF signal it carries. But how do you explain it to someone who is not from an engineering background? I can't quite get it myself sometimes.

Say, for example, the wavelength of the wave is 2" (the long line below) and the conductor is one tenth of that, 0.2" (roughly the short dash below):

wavelength, 2":   ------------------
conductor, 0.2":  --

If you draw a sine wave with a 2" wavelength, the voltage amplitude is different at different points along that 0.2" conductor. It does vary by some amount and it's definitely not flat; it would only look flat over something like 0.001". So why is it OK to ignore this when the conductor length is 1/10th of the wavelength, even though the amplitude of the wave varies over the length of the conductor?
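
To put a number on "it does vary by some amount", here is a quick Python sketch (just my own back-of-the-envelope illustration of the example above, not from any reference) that works out the phase across a line one tenth of a wavelength long, and the instantaneous voltage difference between its two ends for a unit-amplitude travelling sine wave:

    import numpy as np

    wavelength = 2.0                 # inches, as in the example above
    line_length = wavelength / 10    # one tenth of a wavelength = 0.2 inch

    # Electrical length of the line: the phase the wave accumulates
    # while travelling from one end to the other
    phase_deg = 360.0 * line_length / wavelength
    print(f"phase across the line: {phase_deg:.0f} degrees")   # 36 degrees

    # Sample the travelling wave v(x, t) = sin(2*pi*x/wavelength - 2*pi*t)
    # over one full period and record the voltage at both ends of the line.
    t = np.linspace(0.0, 1.0, 1000)                       # normalised time
    v_near = np.sin(-2 * np.pi * t)                       # x = 0
    v_far  = np.sin(2 * np.pi * line_length / wavelength
                    - 2 * np.pi * t)                      # x = line_length
    print(f"worst-case end-to-end difference: "
          f"{np.max(np.abs(v_far - v_near)):.2f} of the peak amplitude")
    # prints about 0.62 for a 36-degree-long line

So at some instants the two ends of a λ/10 conductor can disagree by a sizeable fraction of the peak voltage, which is exactly what puzzles me about the rule of thumb.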