
Transmission Line Simulation Questions

  1. Sep 23, 2013 #1
    I am simulating an unshielded twisted wire pair of a communication network, Controller Area Network.

    Bus Speeds taken into consideration are 250/500/1000/2000 kbps.

    Depending on the physical dimensions of the wire and the insulation's dielectric constant, I have computed the R, L, C, G parameters. Since the wire length will be smaller than 300 meters (i.e. considering a 1 MHz frequency), I am using lumped parameters for each section of the wire for simulation purposes.
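
    As a rough illustration of that step, here is a minimal sketch using the standard two-wire (parallel-pair) line formulas; the conductor diameter, spacing, dielectric constant and loss tangent below are placeholder assumptions, not the actual cable's values:

```python
# Sketch: per-unit-length RLCG of a parallel-wire pair from geometry,
# using the standard two-wire line formulas.  All dimensions and material
# values below are illustrative assumptions, not the thread's cable.
import math

mu0  = 4e-7 * math.pi          # H/m
eps0 = 8.854e-12               # F/m
rho  = 1.72e-8                 # ohm*m, copper resistivity

d     = 0.6e-3                 # conductor diameter (assumed)
s     = 1.2e-3                 # centre-to-centre spacing (assumed)
eps_r = 2.3                    # insulation dielectric constant (assumed)
tan_d = 0.01                   # dielectric loss tangent (assumed)
f     = 1e6                    # highest signalling frequency of interest

acosh_term = math.acosh(s / d)

L = mu0 / math.pi * acosh_term                 # H/m, external inductance
C = math.pi * eps0 * eps_r / acosh_term        # F/m
skin_depth = math.sqrt(rho / (math.pi * f * mu0))
R = 2 * rho / (math.pi * d * skin_depth)       # ohm/m, both conductors, skin effect
G = 2 * math.pi * f * C * tan_d                # S/m, dielectric loss at f

Z0 = math.sqrt(L / C)                          # lossless characteristic impedance
td = math.sqrt(L * C)                          # delay per metre

print(f"R = {R:.3f} ohm/m, L = {L*1e9:.0f} nH/m, "
      f"C = {C*1e12:.1f} pF/m, G = {G*1e6:.2f} uS/m")
print(f"Z0 ~ {Z0:.0f} ohm, delay ~ {td*1e9:.2f} ns/m")
```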

    The general topology of the circuit is a 3-node network, with line terminations of 120Ω at the two ends of the bus. For simulation purposes, one node of the network will be transmitting and the others will be receiving. Each line section is represented by the generic lumped RLCG model.

    There are a few things I cannot get my head around and need your help with:

    1. The orientation of the model: generally, the model has R and L at the node where the current enters the circuit. Is this a rule?

    2. If I make the model more distributed, e.g. instead of one R, L, C, G model for a 1 m section I split it into 10 models of 0.1 m each (see the sketch after question 3), the result is a very distorted waveform when the node is not transmitting (at 2.5 V, every time, everywhere). When I use just one lumped model, the waveform is even and almost follows the measured values. Why?

    3. As I move farther away from the transmitting node, the initial spikes (transients) keep increasing. This is not the case in the measured voltage waveforms. Why does this happen?
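
    The sketch referred to in question 2: one way to split a section into N subsections is to give each subsection 1/N of the per-metre R, L, C and G. The netlist syntax is generic SPICE-style and the per-metre values (taken from the placeholder sketch above) are assumptions:

```python
# Sketch: split one lumped RLCG section of a given length into N shorter
# subsections, each carrying 1/N of the per-metre parameters.  The netlist
# is generic SPICE-style; all numeric values are illustrative assumptions.
def rlcg_ladder(name, n1, n2, length_m, N, R, L, C, G):
    """Emit N series-R/L, shunt-C/G subsections between nodes n1 and n2."""
    dR, dL, dC, dG = (x * length_m / N for x in (R, L, C, G))
    lines, node = [], n1
    for k in range(N):
        nxt = n2 if k == N - 1 else f"{name}_n{k+1}"
        mid = f"{name}_m{k+1}"
        lines.append(f"R{name}{k+1} {node} {mid} {dR:.4g}")
        lines.append(f"L{name}{k+1} {mid} {nxt} {dL:.4g}")
        lines.append(f"C{name}{k+1} {nxt} 0 {dC:.4g}")
        lines.append(f"RG{name}{k+1} {nxt} 0 {1.0/dG:.4g}")  # shunt G as a resistor
        node = nxt
    return "\n".join(lines)

# One metre of cable split into 10 subsections of 0.1 m each:
print(rlcg_ladder("seg1", "in", "out", 1.0, 10,
                  R=0.28, L=527e-9, C=48.6e-12, G=3e-6))
```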

    I have attached files: 'Capture' is the simulated circuit, '3 Node at Sender' shows the waveforms at the transmitting end, and '3 Node at Farthest end' is, well, self-explanatory. '004' is the measured waveform from the lab.

    Thank you for your time.
     


  3. Sep 24, 2013 #2

    meBigGuy


    I don't have all the answers for you. I still need to digest exactly what you are doing.
    First off: the idea of relating 300 m to 1 MHz is erroneous. Electrical wavefronts travel at roughly 0.5 to 0.75 times the speed of light, i.e. about 1.3 to 2 ns per foot, depending on the cable's velocity of propagation. You will see reflections whenever the distance travelled is larger than the distance the wavefront propagates during a rise/fall time; the bit period is not the issue. If you want an accurate simulation, your model should have enough sections to resolve the rise/fall time.
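
    A back-of-the-envelope sketch of that rule: keep each section's delay well below the rise/fall time. The 1/10 criterion and the numbers (5 ns/m cable delay, 100 ns rise time, 40 m run) are illustrative assumptions only:

```python
# Sketch: how many lumped sections a cable run needs so that each section's
# delay stays well below the rise/fall time.  The 1/10 rule of thumb and
# all numeric values are assumptions for illustration.
import math

delay_per_m = 5e-9       # s/m, propagation delay of the cable (assumed)
t_rise      = 100e-9     # s, transceiver rise/fall time (assumed)
run_length  = 40.0       # m, length of one cable run (assumed)

max_section_delay  = t_rise / 10                      # rule of thumb
max_section_length = max_section_delay / delay_per_m  # m
n_sections = math.ceil(run_length / max_section_length)

print(f"max section length ~ {max_section_length:.1f} m, "
      f"so use at least {n_sections} sections for {run_length:.0f} m")
```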

    Here are some model examples:
    http://www.ece.uci.edu/docs/hspice/hspice_2001_2-269.html

    I didn't find any numbers for commercial cables, but they must be out there.

    Note that multi-drop transmission systems will always have edge distortion in the middle taps.
     
  4. Sep 25, 2013 #3
    By this you mean:

    "Let's say the propagation delay of the wire is 5ns/meter, (1.52ns/foot). Because it has to propagate back through the return wire, a delay of 5*2=10ns/m.

    With the transceivers transition (rise/fall) time being in the range of 60ns to 160ns.

    Lets assume, total time delay of the cable to be 10ns/m. Rise/Fall time of transceiver be 100ns. Does it mean that 100ns/10 = 10meters, be the critical length? Meaning, if I have a section larger than 10m, I have a problem."
    Is my understanding right?

    If that is the case, I have accounted for it, and most of my sections are around 0.2 to 2 meters.

    By middle taps, do you mean the connecting points between two sections? Is there a way to mitigate it?
    Does it happen due to "charging" of the following section?

    Thank you for your reply!
     
  5. Sep 26, 2013 #4

    meBigGuy


    Maybe I erred. The middle taps would only see distortion, in the way I was thinking of it, if you were driving a series-terminated line (open ends).

    If you are driving with low-impedance drivers, then you need terminators to avoid reflections (I see what look like terminators). In that case, if the taps change the impedance, they will have effects.

    One problem with lumped models is that you tend to think in terms of "charging". In a distributed model the transmission line always presents its characteristic impedance, since the capacitance and inductance (as well as the losses) are distributed evenly along the cable.
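
    A minimal sketch of that point, using the placeholder per-metre values from earlier in the thread: the distributed line's characteristic impedance Z0 = sqrt((R + jωL)/(G + jωC)) settles near sqrt(L/C) at the signalling frequencies, regardless of line length:

```python
# Sketch: frequency-dependent characteristic impedance of a distributed
# line, Z0 = sqrt((R + jwL) / (G + jwC)).  At low frequencies the loss
# terms dominate and |Z0| rises; at signalling frequencies it settles
# near sqrt(L/C).  Per-metre values are illustrative assumptions.
import cmath, math

R, L, C, G = 0.28, 527e-9, 48.6e-12, 3e-6   # per-metre values (assumed)

for f in (10e3, 100e3, 1e6):
    w = 2 * math.pi * f
    Z0 = cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))
    print(f"f = {f/1e3:7.0f} kHz  |Z0| ~ {abs(Z0):6.1f} ohm")
```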

    As for the glitches you are seeing, I think I understand. It looks like an imbalance in the propagation paths (delay) and crosstalk between the lines. The purple path is earlier than the green path.
     
  6. Sep 26, 2013 #5
    The transceiver works this way: during the recessive state both pins are at 2.5 V. In the dominant state, one is pulled up to 3.5 V while the other is pulled down to 1.5 V.
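
    For reference, a tiny sketch of those levels (the bit pattern and bit rate are arbitrary assumptions): dominant bits drive the two lines to 3.5 V and 1.5 V, recessive bits leave both at 2.5 V:

```python
# Sketch of the transceiver behaviour described above: recessive bits give
# CANH = CANL = 2.5 V, dominant bits give CANH = 3.5 V and CANL = 1.5 V.
# Bit pattern and bit rate are illustrative assumptions.
bit_rate = 500e3                      # bit/s (assumed)
bits     = [0, 1, 1, 0, 1, 0, 0, 1]   # 0 = dominant, 1 = recessive (assumed)

def can_levels(bit):
    """Return (CANH, CANL) voltages for one bit."""
    return (3.5, 1.5) if bit == 0 else (2.5, 2.5)

t_bit = 1.0 / bit_rate
for i, b in enumerate(bits):
    canh, canl = can_levels(b)
    print(f"t = {i*t_bit*1e6:5.1f} us  CANH = {canh} V  CANL = {canl} V  "
          f"Vdiff = {canh - canl} V")
```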

    You mention that the glitches are caused by crosstalk and propagation delay. I thought about it and googled too, but I can't seem to find any explanation of this phenomenon (can you please elaborate a little?) or of what could be done to avoid it.

    The cable has been designed to have a delay of 5 ns/m with a characteristic impedance of 120 Ω. It is terminated at both ends with 120 Ω resistors.
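
    A quick consistency check of those two figures, using the lossless-line relations L = Z0·td and C = td/Z0 (values taken from the post above):

```python
# Sketch: per-metre L and C implied by Z0 = 120 ohm and td = 5 ns/m,
# using the lossless-line relations L = Z0*td and C = td/Z0.
Z0 = 120.0      # ohm, characteristic impedance (from the post)
td = 5e-9       # s/m, propagation delay (from the post)

L = Z0 * td     # H/m
C = td / Z0     # F/m

print(f"L ~ {L*1e9:.0f} nH/m, C ~ {C*1e12:.1f} pF/m")   # ~600 nH/m, ~41.7 pF/m
```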

    From what I understand, the current in the inductance is causing the spikes at those points. Is it common for that to happen even in the real world, or is it just a limitation of the lumped models or a limitation in design?
     
  7. Sep 27, 2013 #6

    meBigGuy



    The transceiver needs to be open circuit during idle. If you have long cables to each transceiver, that will be an issue.
     
  8. Sep 27, 2013 #7

    meBigGuy


  9. Sep 30, 2013 #8
    Thanks!

    A quick look at the circuit reveals that the new R and L values will be half of the original values, arranged according to the image you linked.

    It has solved the issue! Thank you. I feel foolish for not thinking of it earlier.
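
    One plausible reading of that arrangement (the linked image is not reproduced here) is a symmetric section with half of the R and L on each side of the shunt C and G; a SPICE-style sketch with the placeholder per-metre values used earlier:

```python
# Sketch: a symmetric "T" section puts half of the section's R and L on
# each side of the shunt C and G, so the model looks the same from either
# end.  SPICE-style netlist; all numeric values are illustrative.
def t_section(name, n1, n2, R, L, C, G):
    mid = f"{name}_mid"
    a, b = f"{name}_a", f"{name}_b"
    return "\n".join([
        f"R{name}a {n1} {a} {R/2:.4g}",
        f"L{name}a {a} {mid} {L/2:.4g}",
        f"C{name} {mid} 0 {C:.4g}",
        f"RG{name} {mid} 0 {1.0/G:.4g}",   # shunt conductance as a resistor
        f"L{name}b {mid} {b} {L/2:.4g}",
        f"R{name}b {b} {n2} {R/2:.4g}",
    ])

# One 1 m section with the placeholder per-metre values:
print(t_section("seg1", "in", "out", R=0.28, L=527e-9, C=48.6e-12, G=3e-6))
```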

    I have one more question, about the connector impedance.

    As I am using fairly low bit rates (250/500/1000/2000 kbps) for the simulation, and the connector length will be no more than 0.5 mm to 1 cm, does the connector impedance really have a substantial impact on the signal?

    From what I have read, it shouldn't, since the wavelength of the signal is much larger than the length of the connector. But then what exactly is the reasoning behind that?
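
    A small sketch of that argument, comparing the connector length with the length of cable one signal edge occupies; the rise time, cable delay and connector length are illustrative assumptions:

```python
# Sketch of the "electrically short" argument: compare the connector length
# with the spatial extent of a signal edge on the cable.  All values are
# illustrative assumptions.
t_rise      = 100e-9    # s, transceiver rise/fall time (assumed)
delay_per_m = 5e-9      # s/m, cable propagation delay (assumed)
connector   = 0.01      # m, connector length (assumed, 1 cm)

edge_extent = t_rise / delay_per_m     # metres of cable occupied by one edge
ratio = edge_extent / connector

print(f"edge spans ~{edge_extent:.0f} m of cable, "
      f"~{ratio:.0f}x the connector length, so its impedance barely matters")
```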
     
  10. Sep 30, 2013 #9

    meBigGuy


    I agree that the connector will probably not affect the results. You need pretty high-speed signals to see a change like that. You should probably try simulating it anyway, just to get a feel for it.
     