> Is it true that lossless transmission lines don't distort signals?

No. Losslessness and distortion-free propagation are independent properties.
> Is it true that lossless transmission lines don't distort signals?

Theoretically, yes!
To me, they seem like low-pass filters. I had this class a while ago, but I guess I forgot this, or never understood it.
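A quick numerical sketch of the "theoretically yes" answer (my own illustration, with made-up sample rate, length, and velocity): for an ideal lossless line the per-unit-length L and C are constant, so the phase velocity v = 1/sqrt(LC) is the same at every frequency and a length d acts as a pure delay, H(f) = exp(-j·2πf·d/v). A pulse comes out shifted but not reshaped:

```python
import numpy as np

# Illustration (assumed numbers): propagate a pulse through an ideal
# lossless line modelled in the frequency domain. The transfer function
# of length d is a pure delay with no amplitude change, so the pulse
# shape is preserved exactly - no low-pass behaviour at all.

fs = 10e9                                   # sample rate, 10 GS/s
t = np.arange(4096) / fs
pulse = np.exp(-((t - 50e-9) / 5e-9) ** 2)  # Gaussian test pulse

v = 2e8                                     # ~0.66c, typical solid-PE coax velocity
d = 10.0                                    # 10 m of line -> 50 ns delay
f = np.fft.rfftfreq(t.size, 1 / fs)
H = np.exp(-2j * np.pi * f * d / v)         # pure delay, |H| = 1 everywhere

out = np.fft.irfft(np.fft.rfft(pulse) * H, n=t.size)

# Output equals the input shifted by d/v samples; shape error ~ 0.
delay_samples = int(round(d / v * fs))
shape_error = np.max(np.abs(out[delay_samples:] - pulse[:t.size - delay_samples]))
print(shape_error)
```

If the output shape differed from the delayed input, the line would be dispersive; here the error is at numerical-precision level.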
> I can't see that there are any ways that coax can distort a signal.

Not true; it depends on how long the coax is. I have experience with this very issue. I worked for LeCroy in the 80s, designing digital scopes and transient recorders. In those days you couldn't buy an 8-bit 100 MHz ADC, so we designed a subranging configuration using two 4-bit 100 MHz ADCs. We had to sum two signals and delay one of them using coax as a delay line. The waveform got distorted, and I had to design compensation networks for different frequencies to line the signal up again. We started out with RG-175, which had so much distortion that no network could compensate for it. I ended up using big RG-58 and four compensating networks to make it work.
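The distortion described above is largely the skin effect: conductor loss in coax grows roughly as sqrt(f), so a long delay line attenuates the high-frequency content of a pulse more than the low. A rough sketch (illustrative loss constant, not LeCroy's actual cable data) of what that does to a narrow pulse:

```python
import numpy as np

# Sketch (made-up numbers): model a lossy coax delay line whose
# attenuation grows as sqrt(f), a common skin-effect approximation.
# The matching sqrt(f) term in the phase is the usual companion of
# sqrt(f) loss. The pulse comes out lower and wider - distorted.

fs = 10e9
t = np.arange(8192) / fs
pulse = np.exp(-((t - 200e-9) / 2e-9) ** 2)   # narrow Gaussian test pulse

f = np.fft.rfftfreq(t.size, 1 / fs)
a0 = 1.0   # nepers of loss at 100 MHz over this length (illustrative value)
H = np.exp(-(1 + 1j) * a0 * np.sqrt(f / 1e8))

out = np.fft.irfft(np.fft.rfft(pulse) * H, n=t.size)

fwhm_in = np.sum(pulse > pulse.max() / 2)     # input width, in samples
fwhm_out = np.sum(out > out.max() / 2)        # output width, in samples
print(out.max(), fwhm_in, fwhm_out)           # lower peak, wider pulse
```

Because the loss is not flat with frequency, no single gain setting can undo it; that is why frequency-dependent compensation networks were needed.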
The velocity is more or less constant with frequency.
There is no frequency cut-off as such, except when the diameter approaches half a wavelength or more and waveguide modes appear; this limits the maximum diameter of the coax at a given frequency.
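The size limit mentioned above can be estimated: the first non-TEM mode in coax (TE11) appears roughly when the average circumference of the dielectric equals one wavelength, i.e. λ_c ≈ π(a + b), with a and b the inner and outer conductor radii. Plugging in approximate RG-58 dimensions (my assumption, not from the thread):

```python
import math

# Estimate of the TE11 cutoff for RG-58-sized coax (approximate
# dimensions and the lambda_c ~ pi*(a+b) rule of thumb).
c = 2.998e8          # speed of light, m/s
a = 0.45e-3          # inner conductor radius, m (approx. RG-58)
b = 1.47e-3          # dielectric outer radius, m (approx. RG-58)
er = 2.25            # solid polyethylene dielectric

lambda_c = math.pi * (a + b)            # approximate cutoff wavelength
f_c = c / (math.sqrt(er) * lambda_c)    # cutoff frequency in the dielectric
print(f"TE11 cutoff ~ {f_c / 1e9:.0f} GHz")   # tens of GHz
```

So for ordinary small coax the moded region sits in the tens of GHz, far above where the skin-effect distortion discussed earlier already matters.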