
Transmission line question

  1. Mar 20, 2008 #1
    Is it true that lossless transmission lines don't distort signals?

    To me, they seem like low-pass filters.

    I had this class a while ago, but I guess I forgot this or never understood it.

  3. Mar 20, 2008 #2
    I was under the impression they don't really exist... hmm.

    You can probably get very close, at least.
  4. Mar 20, 2008 #3
    Are you speaking of things like 50 ohm coaxial cable, or of the ideal model, i.e. the L-C ladder network in the limit as the ladder is ever more finely divided?
    Last edited: Mar 20, 2008
  5. Mar 20, 2008 #4



    No. Losslessness is a property independent of distortion-free propagation.

    I don't remember the equations that express [itex]\alpha[/itex] and [itex]\beta[/itex] (in nepers per unit length and radians per unit length, respectively) in terms of the R, L, G, C parameters of the transmission line. All of these parameters are functions of frequency [itex]\omega[/itex]. If [itex]\alpha[/itex] and [itex]\frac{\beta}{\omega}[/itex] are both constant with frequency, then you have a distortionless line. If [itex]\alpha=0[/itex], then you have a lossless line.
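    For reference, the standard telegrapher's-equation result for those quantities (quoted from memory here, so worth double-checking against a textbook):

    [tex]\gamma = \alpha + j\beta = \sqrt{(R + j\omega L)(G + j\omega C)}[/tex]

    The Heaviside condition [tex]\frac{R}{L} = \frac{G}{C}[/tex] makes [itex]\alpha[/itex] independent of frequency and [itex]\beta[/itex] proportional to [itex]\omega[/itex], i.e. a distortionless (but still lossy) line.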
  6. Jan 7, 2009 #5
    [itex]\gamma = \alpha + j\beta[/itex]
    where [itex]\gamma[/itex] is the propagation constant,
    [itex]\alpha \approx \frac{R}{2 Z_0}[/itex]
    where [itex]\alpha[/itex] is the attenuation constant (this is the low-loss approximation, with G neglected),
    [itex]\beta = \frac{2\pi}{\lambda}[/itex]
    where [itex]\beta[/itex] is the phase constant,
    and [itex]Z_0[/itex] is the characteristic impedance.
    For a lossless line, [itex]\alpha = 0[/itex].
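    As a sanity check on these formulas, here is a small sketch (mine, not from the thread) using the exact relation [itex]\gamma = \sqrt{(R + j\omega L)(G + j\omega C)}[/itex]; the per-metre L and C values are assumed, roughly typical of 50 ohm coax:

```python
import cmath
import math

def propagation_constant(R, L, G, C, omega):
    """Return (alpha, beta) from gamma = sqrt((R + jwL)(G + jwC))."""
    gamma = cmath.sqrt((R + 1j * omega * L) * (G + 1j * omega * C))
    return gamma.real, gamma.imag  # alpha in Np/m, beta in rad/m

# Lossless case: R = G = 0 should give alpha = 0 and beta = omega*sqrt(L*C).
L_, C_ = 250e-9, 100e-12           # assumed per-metre values
omega = 2 * math.pi * 100e6        # 100 MHz
alpha, beta = propagation_constant(0.0, L_, 0.0, C_, omega)
print(alpha)                                          # ~0
print(abs(beta - omega * math.sqrt(L_ * C_)) < 1e-9)  # True
```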
  7. Jan 8, 2009 #6
    Theoretically, yes!!!

    But there are no truly lossless Tx lines, only more or less lossy ones. And they are not exactly low-pass. For a lossy Tx line, different frequencies have slightly different propagation velocities, so over a long line the components of a signal don't arrive at the same time. Of course, the higher the frequency, the more attenuation on the line, because the attenuation constant grows with frequency.
  8. Jan 8, 2009 #7
    I can't see any way that coax can distort a signal.

    The velocity is more or less constant with frequency.

    There is no frequency cut-off as such, except when the diameter is half a wavelength or more and waveguide modes appear. This limits the maximum diameter of coax at a given frequency.
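    A back-of-the-envelope sketch (my own, not from the post) of that half-wavelength rule of thumb; the dielectric constant is an assumed value, and the true waveguide-mode cutoff in coax depends on both conductor radii:

```python
import math

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def max_coax_diameter(f_hz, eps_r=2.25):
    """Rough rule of thumb from the post above: keep the diameter below
    half a wavelength (in the dielectric) to avoid waveguide modes."""
    wavelength = C0 / (f_hz * math.sqrt(eps_r))
    return wavelength / 2

# e.g. at 10 GHz in a polyethylene-like dielectric (eps_r ~ 2.25, assumed):
print(max_coax_diameter(10e9))  # ~0.01 m, i.e. about 1 cm
```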
  9. Jan 8, 2009 #8
    Not true; it depends on how long the coax is. I have experience with this very issue. In the 80s I worked for LeCroy, which designed digital scopes and transient recorders. In those days you couldn't buy an 8-bit 100 MHz ADC, so we had to design a subranging configuration using two 4-bit 100 MHz ADCs. We had to sum two signals, with the delay implemented using coax as delay lines. The waveform got distorted, and I had to design compensation networks for different frequencies to line the signal up again. We started out with RG-175, which had so much distortion that no network could compensate for it; I ended up using big RG-58 and four compensating networks to make it work.

    For single-frequency applications, as in typical RF work, you don't see this problem because the frequencies are close together; you see only attenuation, and it is easy to recover the signal. For broadband signals it is very, very obvious!!! We were using only about 12 feet and it really showed.

    Check out "group velocity" in EM books; they discuss exactly this issue. There are no lossless dielectrics, only more or less lossy ones. Not even Teflon or any of the fancy dielectrics from Rogers or 3M.

    Velocity [itex]= \frac{1}{\sqrt{\mu\epsilon}}[/itex], where [itex]\epsilon[/itex] is frequency dependent: it is complex, with imaginary part [itex]-\frac{\sigma}{\omega}[/itex], i.e. [itex]\epsilon_c = \epsilon' - j\frac{\sigma}{\omega}[/itex].
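    A quick numerical sketch (my own; the [itex]\epsilon_r[/itex] and [itex]\sigma[/itex] values are assumed, not from the thread) of how a complex permittivity makes the phase velocity frequency dependent, which is the dispersion being described:

```python
import cmath
import math

EPS0 = 8.854e-12            # vacuum permittivity, F/m
MU0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m

def phase_velocity(eps_r, sigma, f):
    """v_p = omega / Re(k), with complex eps = eps0*eps_r - j*sigma/omega."""
    omega = 2 * math.pi * f
    eps_c = EPS0 * eps_r - 1j * sigma / omega
    k = omega * cmath.sqrt(MU0 * eps_c)
    return omega / k.real

# Assumed slightly conductive dielectric: eps_r = 2.25, sigma = 1e-4 S/m.
v_low = phase_velocity(2.25, 1e-4, 1e6)    # 1 MHz
v_high = phase_velocity(2.25, 1e-4, 1e9)   # 1 GHz
print(v_low < v_high)   # True: low frequencies travel slower, so a
                        # broadband edge smears out on a long line
```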
    Last edited: Jan 8, 2009