# Transmission line question

## Main Question or Discussion Point

Is it true that lossless transmission lines don't distort signals?

To me, they seem like low-pass filters.

I had this class a while ago, but I guess I forgot it, or never understood it.

thanks

I was under the impression they don't really exist... hmm

You can probably get very close, at least.

Are you speaking of things like a 50 ohm coaxial cable, or the ideal model, i.e. the L-C ladder network in the limit as the ladder is ever more finely divided?

rbj
Is it true that lossless transmission lines don't distort signals?
no. losslessness and distortion-free propagation are independent properties of a line.

i don't remember the equations that express $\alpha$ and $\beta$ (in nepers per unit length and radians per unit length, respectively) in terms of the R, L, G, C of the transmission line. all of these parameters are functions of frequency $\omega$. if $\alpha$ and $\frac{\beta}{\omega}$ are both constant with frequency, then you have a distortionless line. if $\alpha=0$, then you have a lossless line.
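Filling in the equations the post alludes to (standard transmission-line results, with illustrative made-up per-unit-length values): the propagation constant is $\gamma = \sqrt{(R+j\omega L)(G+j\omega C)}$, and the Heaviside condition $R/L = G/C$ makes the line distortionless. A minimal numeric sketch:

```python
import cmath

def propagation_constant(R, L, G, C, w):
    """gamma = alpha + j*beta for a uniform line with per-unit-length R, L, G, C."""
    return cmath.sqrt((R + 1j * w * L) * (G + 1j * w * C))

# Heaviside (distortionless) condition: R/L == G/C.
# Per-unit-length values below are chosen to satisfy it (illustrative only).
L_, C_ = 250e-9, 100e-12           # H/m, F/m
R_ = 0.5                           # ohm/m
G_ = R_ * C_ / L_                  # S/m, forces R/L == G/C

for f in (1e6, 10e6, 100e6):
    g = propagation_constant(R_, L_, G_, C_, 2 * cmath.pi * f)
    alpha, beta = g.real, g.imag
    print(f"{f/1e6:6.0f} MHz  alpha = {alpha:.6f} Np/m  beta/w = {beta/(2*cmath.pi*f):.3e} s/m")
```

With the Heaviside condition satisfied, $\alpha$ and $\beta/\omega$ come out the same at every frequency, which is exactly the "constant with frequency" criterion the post states; the line still attenuates ($\alpha \neq 0$), so it is distortionless but not lossless.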

$\gamma = \alpha + j\beta$
where $\gamma$ is the propagation constant,
$\alpha = \frac{R}{2Z_0}$
where $\alpha$ is the attenuation constant,
$\beta = \frac{2\pi}{\lambda}$
where $\beta$ is the phase constant, and
$Z_0$ is the characteristic impedance.
For a lossless line, $\alpha$ is 0.
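For completeness (a standard low-loss expansion, not from the original post): when $R \ll \omega L$ and $G \ll \omega C$,

$$\gamma = \sqrt{(R + j\omega L)(G + j\omega C)} \approx \frac{R}{2Z_0} + \frac{G Z_0}{2} + j\omega\sqrt{LC}, \qquad Z_0 = \sqrt{\frac{L}{C}}$$

which recovers $\alpha = \frac{R}{2Z_0}$ when $G = 0$, and $\beta = \omega\sqrt{LC} = \frac{2\pi}{\lambda}$.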

Is it true that lossless transmission lines don't distort signals?

to me, they seem like low pass filters.

i had this class a while ago, but I guess i forgot this, or never understood it

thanks
Theoretically, yes!

But there are no truly lossless Tx lines, only degrees of loss. They are not exactly low-pass either. For a lossy Tx line, different frequencies have slightly different propagation velocities, so on a long line the frequency components of a signal don't arrive at the same time. And of course, the higher the frequency, the more attenuation on the line, because the attenuation constant increases with frequency.
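A toy numeric illustration of that arrival-time spread (the cable length and velocity factors below are made up for illustration, not measured data):

```python
# Two spectral components traveling at slightly different velocities
# down the same line arrive at different times; the pulse shape smears.
length = 100.0                 # meters of cable (hypothetical)
c = 299_792_458.0              # speed of light, m/s
v_low = 0.66 * c               # assumed velocity factor at a low frequency
v_high = 0.655 * c             # assumed, slightly slower at a higher frequency

t_low = length / v_low
t_high = length / v_high
print(f"arrival skew over {length:.0f} m: {(t_high - t_low) * 1e9:.2f} ns")
```

Even a half-percent velocity difference produces a skew of a few nanoseconds over 100 m, which is large compared to the edges of a fast broadband pulse.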

I can't see any way that coax can distort a signal.

The velocity is more or less constant with frequency.

There is no frequency cut-off as such, except when the diameter is half a wavelength or more and waveguide modes appear. This limits the maximum diameter of coax at a given frequency.
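To put a rough number on where that higher-order mode appears, a common textbook approximation for the first waveguide (TE11) mode in coax uses $k_c \approx 2/(a+b)$, with $a$, $b$ the inner and outer conductor radii. A sketch with assumed RG-58-like dimensions (the radii and dielectric constant here are illustrative, not datasheet values):

```python
import math

def te11_cutoff_hz(a, b, eps_r):
    """Approximate first higher-order (TE11) mode cutoff of coax,
    using the common approximation kc ~ 2/(a+b); a, b in meters."""
    c = 299_792_458.0
    return c / (math.pi * (a + b) * math.sqrt(eps_r))

# RG-58-ish dimensions (assumed, for illustration only)
f_c = te11_cutoff_hz(0.45e-3, 1.47e-3, 2.25)
print(f"approx TE11 cutoff: {f_c/1e9:.1f} GHz")
```

For small-diameter coax this lands in the tens of GHz, which is why the mode limit matters mainly for large-diameter cable at high frequency.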

I can't see any way that coax can distort a signal.

The velocity is more or less constant with frequency.

There is no frequency cut-off as such, except when the diameter is half a wavelength or more and waveguide modes appear. This limits the maximum diameter of coax at a given frequency.
Not true; it depends on how long the coax is. I have experience with this very issue. I worked for LeCroy in the 80s, designing digital scopes and transient recorders. In those days you could not buy an 8-bit 100 MHz ADC, so we had to design a subranging configuration using two 4-bit 100 MHz ADCs. We had to sum two signals and delay one of them using coax as a delay line. The waveform got distorted, and I had to design compensation networks for different frequencies to line the signal up again. We started out with RG-175, which had so much distortion that no network could compensate for it. I ended up using big RG-58 and 4 compensating networks in order to make it work.

For a single-frequency application, as in typical RF work, you don't see this problem because the frequencies are close together. You see only attenuation, and it is easy to recover the signal. For broadband signals it is very, very obvious! We were using about 12 feet and it really showed.

Check out "group velocity" in EM books; they talk about this exact issue. There are no lossless dielectrics, just a question of how much loss. Not even Teflon or any of the fancy dielectrics from Rogers or 3M.

Velocity $= \frac{1}{\sqrt{\epsilon\mu}}$, where $\epsilon$ is frequency dependent; its imaginary part is $\sigma/\omega$.
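Putting that formula in concrete terms (a sketch: $\epsilon$ here is the complex permittivity $\epsilon_0\epsilon_r - j\sigma/\omega$, and the conductivity value is an assumed toy number, not real cable data):

```python
import cmath
import math

EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m
MU0 = 4e-7 * math.pi           # vacuum permeability, H/m

def wavenumber(f, eps_r, sigma):
    """Complex wavenumber k = w*sqrt(mu*eps) with eps = eps0*eps_r - j*sigma/w.
    Re(k) is the phase constant beta; -Im(k) is the attenuation alpha."""
    w = 2 * math.pi * f
    eps = EPS0 * eps_r - 1j * sigma / w
    return w * cmath.sqrt(MU0 * eps)

# Slightly lossy polyethylene-like dielectric (sigma is an assumed value)
k = wavenumber(100e6, 2.25, 1e-5)
beta, alpha = k.real, -k.imag
print(f"beta = {beta:.3f} rad/m, alpha = {alpha:.6f} Np/m")
```

A nonzero $\sigma$ makes $k$ complex, so the wave both propagates ($\beta$) and decays ($\alpha$); since $\epsilon$ varies with $\omega$, so does the velocity, which is the dispersion discussed above.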
