
Transmission line radiation

  1. Feb 28, 2016 #1
    This question is about the transmission line that I use between my ham radio and the antenna.

    We have a transmission line made of two parallel conductors close together. The current in one conductor is exactly the same as the current in the other, but flowing in the opposite direction. The line's characteristic impedance is 50 ohms and it is terminated in a 50 ohm load. Every text says that the fields of the two conductors will cancel each other, so there will be no radiation at a distance "d" that is much larger than the conductor separation.

    That sounds logical, but we still have two wave sources, one from each wire. Even if their fields cancel each other at a distance, each wire must still lose power to radiation.

    So why doesn't the transmission line radiate all the power away?

    There are no standing waves on the transmission line. Do we have to have standing waves in order to radiate something away? In that case, if we left the transmission line unterminated, would it radiate, since standing waves would then occur on it?
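    To put a number on "cancel at a distance much larger than the separation", here is a minimal numeric sketch (the frequency, spacing and observation distance are assumed values, purely for illustration). It adds the far fields of two antiphase sources and shows the residual is down by roughly 2πs/λ compared with one wire alone:

```python
# Minimal sketch: far fields of two antiphase sources separated by s,
# observed in the plane of the wires (worst-case direction) at a distance
# d >> s.  Frequency, spacing and distance are assumed for illustration.
import numpy as np

lam = 3.0e8 / 3.5e6        # wavelength at 3.5 MHz, ~85.7 m (assumed band)
k = 2 * np.pi / lam
s = 0.05                   # conductor spacing in metres (assumed)
d = 1000.0                 # observation distance, d >> s (assumed)

r1, r2 = d - s / 2, d + s / 2
E1 = np.exp(-1j * k * r1) / r1          # field of one wire alone
E2 = -np.exp(-1j * k * r2) / r2         # the other wire, in antiphase
print("one wire alone :", abs(E1))
print("both together  :", abs(E1 + E2))
print("ratio          :", abs(E1 + E2) / abs(E1))   # roughly 2*pi*s/lam
```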
     
  3. Feb 28, 2016 #2

    tech99 (Gold Member)

    There is radiation from each wire, but it almost cancels at some distance away because the wires are close together in terms of the wavelength. It can also be thought of as a mutual impedance between the two wires which reduces the radiation. The line can be considered as two antennas mounted close to each other and driven in antiphase.
    By the way, I would be surprised if you have a two-wire line with a characteristic impedance of 50 ohms - it is more likely to be a few hundred.
    If standing waves exist on the line, the current will be larger at some parts of the line, which will increase the radiation, but as we have seen it is still negligible. It is usually acceptable to run an open-wire line at very high VSWR - even 20 or 30 is still satisfactory - and this allows the matching adjustment to be made at the transmitter end of the line, which is very convenient.
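    To see why a few hundred ohms rather than 50: the standard air-spaced two-wire formula is Z0 = (η0/π)·arccosh(D/d), with D the centre-to-centre spacing and d the wire diameter. A quick sketch (the spacings and the 2 mm wire are example values only):

```python
# Characteristic impedance of an air-spaced two-wire line,
# Z0 = (eta0/pi) * arccosh(D/d).  Spacings and wire size are examples only.
import numpy as np

eta0 = 376.73                       # impedance of free space, ohms

def z0_two_wire(spacing_m, wire_diam_m):
    return (eta0 / np.pi) * np.arccosh(spacing_m / wire_diam_m)

for spacing_mm in (10, 25, 75, 150):
    z0 = z0_two_wire(spacing_mm / 1000, 0.002)
    print(f"D = {spacing_mm:3d} mm, d = 2 mm  ->  Z0 = {z0:4.0f} ohms")

# Getting down to 50 ohms would need D/d = cosh(50*pi/eta0) ~ 1.09,
# i.e. the wires almost touching - hence "a few hundred" in practice.
```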
     
  4. Feb 28, 2016 #3
    So how much does each wire radiate? Let's say that I have a half-wave dipole for 3.5 MHz, about 40 m total antenna length, and 40 m of open-wire feedline. Let's say that the open-wire line has a characteristic impedance of 450 ohms and the dipole has an impedance of 75 ohms. How much would each wire in the open-wire feedline radiate?
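    For scale, here is the back-of-the-envelope arithmetic for these numbers. The 25 mm spacing is an assumption, since it is not stated in the post (roughly what 450 ohm window line uses):

```python
# Back-of-the-envelope numbers for 3.5 MHz with an assumed 25 mm spacing.
# For two equal antiphase currents, the residual far field scales roughly
# as 2*pi*spacing/lam, so the radiated power is down by about its square.
import math

lam = 3.0e8 / 3.5e6            # wavelength, ~85.7 m
spacing = 0.025                # m, assumed conductor spacing

field_factor = 2 * math.pi * spacing / lam
power_factor = field_factor ** 2

print(f"spacing / wavelength : {spacing / lam:.1e}")
print(f"field suppression    : {field_factor:.1e}")
print(f"power suppression    : {power_factor:.1e}  "
      f"({10 * math.log10(power_factor):.0f} dB)")
```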
     
  5. Feb 28, 2016 #4
    There should be a tiny dipole effect, I think. The power loss would be insignificant, but you might try shielding if it starts to interfere with something.

    But how much could that matter when you are driving an antenna a few metres away? The antenna would swamp any dipole effect from the line. Still, if one were building something like a radar that had to be well isolated from the transmission line, it could be a problem.
     
  6. Feb 28, 2016 #5
    I don't really know how to explain this question better. I am feeding the dipole antenna with an open-wire feedline. The feedline does not radiate, seen from a distance, because the currents in the two wires have the same amplitude but flow in opposite directions. Does that mean that each wire radiates an EM wave, but the two waves cancel each other when they interfere? If that is true, how much does each wire radiate?
     
  7. Feb 28, 2016 #6
    I would think you would have more of a problem with near-field coupling than with radiative losses.

    Still, to measure the radiation (the far-field effect), attach a dummy load and measure the radiation pattern far from the line (at least 10 wavelengths away; more is better).

    We can't even begin to estimate such a small effect without knowing a lot more, including the ground plane, the support structure, and other possible sources of interference.

    Radiation is an effect of special relativity. It happens when charges accelerate, changing their relativistic reference frame. Antennas guide such accelerating charges while not providing easy ways for the emitted photons to be reabsorbed. Your transmission line fails that last requirement: photons emitted by one wire are almost exactly reabsorbed by the other.

    The photons are, very roughly, comparable to the wavelength in size, so they are much bigger than the distance between the wires. Most of them never have a chance to break free.

    Near-field effects are classical and follow Maxwell (as opposed to relativistic Maxwell, by the way). Your emitted fields might cause currents in the ground, for example, which would then incur I²R losses.
     
  8. Feb 28, 2016 #7
    Radiation implies power, which means moving energy. Technically, neither wire carries any energy (except for resistive losses). The energy is carried in the electric and magnetic fields around the wires; the wires just act to guide these waves. As such, your question has little meaning as posed.

    We can divide the fields and their energy into components based on the changing (accelerating) currents, but that is a somewhat artificial division. Under this division, we can say one wire is handing energy off to the other. How much depends on how much energy you are pumping through the circuit. At some instants all of the current will be in each wire, at others none (it's AC, after all). However, none of the energy is ever in either wire (except the I²R losses).
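    To attach a number to "how much you are pumping through the circuit": on a matched line the power carried by the fields between the wires is just V·I. A tiny sketch, with the 100 W and 450 ohm figures assumed purely as an example:

```python
# Power carried by the fields (E x H) between the wires of a matched line.
# The 100 W and 450 ohm figures are assumed, not from the thread.
import math

P = 100.0                  # transmitter power, W (assumed)
Z0 = 450.0                 # line characteristic impedance, ohms (assumed)

V_rms = math.sqrt(P * Z0)  # voltage between the wires
I_rms = math.sqrt(P / Z0)  # current in each wire (equal and opposite)

print(f"V between wires : {V_rms:.0f} V rms")
print(f"I in each wire  : {I_rms:.2f} A rms")
print(f"V * I           : {V_rms * I_rms:.0f} W carried in the fields")
```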
     
  9. Feb 28, 2016 #8
    So that means that the dipole radiates because the two wires are 180 degrees apart and they can't absorb each other's fields?
     
  10. Feb 28, 2016 #9
    No. Because they are out of phase, they almost "absorb" (more correctly, "cancel") each other's fields.

    The dipole effect is caused by the distance between the two wires. If the two wires were in exactly the same place, their fields would exactly cancel. Since they are separated, they only almost cancel.

    Suppose the wires are mounted flat, one next to the other (which is typical). Looking from above and at a distance, the fields exactly cancel. But from the side, one wire is slightly closer than the other, which makes its field slightly stronger, so the fields don't exactly cancel in that direction.

    There is a tiny opportunity for a photon to escape in that direction, but it will be many dB lower than from an antenna, and directional to boot. Since it is near the antenna, the effect will be unnoticeable for most purposes.
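    Here is a small numeric sketch of that picture (the spacing-to-wavelength ratio is an assumed example): the relative field of the pair, compared with one wire alone, as the observation direction swings from "above" the pair round to edge-on:

```python
# Relative far field of two antiphase sources a small fraction of a
# wavelength apart, versus observation angle.  Spacing/wavelength assumed.
import numpy as np

s_over_lam = 0.01                    # spacing / wavelength (assumed)
k_s = 2 * np.pi * s_over_lam

angles = np.array([0, 30, 60, 90])   # 0 deg = looking from "above" the pair,
                                     # 90 deg = edge-on, in the plane of the wires
af = np.abs(2 * np.sin(0.5 * k_s * np.sin(np.radians(angles))))

for a, f in zip(angles, af):
    print(f"angle {a:2d} deg : relative field {f:.4f}")
# Exactly zero from above; a peak of about 2*pi*s/lam edge-on - still tiny.
```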
     
  11. Feb 28, 2016 #10
    Which fields are we talking about? Are we only talking about the magnetic fields of the two wires interacting, or about EM waves from the two wires with both their magnetic and electric fields interacting?

    EDIT: I think I understand. We are looking at the total magnetic field from the two wires. It doesn't matter that there are two wires with two separate magnetic fields; what matters is the total magnetic field, which is near zero. Am I getting any closer?
     
    Last edited: Feb 28, 2016
  12. Feb 28, 2016 #11

    tech99 (Gold Member)

    In this case (the dipole), the two wires are fed at opposite ends but 180 degrees out of phase, so the overall effect is additive, i.e. in-phase radiation.
     
  13. Feb 28, 2016 #12
    Both the electric field and the magnetic field (which are typically at 90° to each other) add linearly. So the + electric field cancels the − electric field, and the north magnetic field cancels the south magnetic field.
     
  14. Feb 28, 2016 #13
    You are not talking about destructive interference?
    [attached image: two-source interference pattern]
    That way we would have areas with no radiation and areas with radiation. That's obviously not how transmission lines work.

    EDIT:
    I mean, why would one wire "care" where the other wire is? If we pulled them apart so that they looked like a dipole, they would radiate. If they are parallel, they don't radiate. It's not interference, because there is no interference pattern; and besides, if it were interference, all of the power would be radiated before it even reached the antenna.

    I understand how magnetic fields can cancel each other, and how electric fields can cancel each other, but I can't understand how they can cancel each other in the feedline when nothing is radiated away from the feedline.
     
    Last edited: Feb 28, 2016
  15. Feb 28, 2016 #14
    No. The sources (wires?) in that diagram are λ apart. In your transmission line, the wires are close together (typically λ/10 or less).
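    The difference is easy to see numerically (my own illustration, not from the diagram): compare two antiphase sources a full wavelength apart with the same pair a tenth of a wavelength apart.

```python
# Relative far field of two antiphase sources versus angle, for a spacing
# of 1.0 wavelength (textbook interference picture) and 0.1 wavelength.
import numpy as np

def pair_field(s_over_lam, theta_deg):
    k_s = 2 * np.pi * s_over_lam
    return np.abs(2 * np.sin(0.5 * k_s * np.sin(np.radians(theta_deg))))

angles = np.arange(0, 91, 15)
for s in (1.0, 0.1):
    row = " ".join(f"{pair_field(s, a):4.2f}" for a in angles)
    print(f"spacing {s:3.1f} lam : {row}")
# At 1.0 lam the pattern swings between deep nulls and strong lobes;
# at 0.1 lam it only reaches about 0.6 and shrinks as the spacing drops further.
```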
     
  16. Feb 28, 2016 #15

    sophiecentaur (Science Advisor, Gold Member)

    The amount that each one radiates is affected greatly by the current flowing in the other. You cannot consider them separately if you want to calculate the 'total power'.
    One way of looking at it is that the two wires constitute a very thin loop antenna. Such an antenna is very badly matched to free space, so it will not radiate much power.
    But a simple two-wire feeder does produce finite radiation. This radiation can cause crosstalk between nearby open-wire feeders (and audio lines), which can be partially cancelled by reversing the connections every so often - and by twisting the pairs.
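    For an order-of-magnitude feel for "very badly matched", one can treat a short segment of the feeder as an electrically small loop and use the standard small-loop radiation resistance R_r = 320π⁴(A/λ²)². The segment length, spacing and frequency below are assumed values, and this ignores how the contributions of successive segments combine along a 40 m line - it is only a scale estimate:

```python
# Small-loop radiation resistance of one short segment of a two-wire feeder,
# R_r = 320 * pi**4 * (A / lam**2)**2.  All dimensions are assumed examples.
import math

lam = 3.0e8 / 3.5e6        # wavelength at 3.5 MHz, ~85.7 m
seg_len = 1.0              # m, one short piece of the feeder (assumed)
spacing = 0.025            # m, conductor spacing (assumed)

A = seg_len * spacing      # loop area of that segment
R_r = 320 * math.pi**4 * (A / lam**2)**2

print(f"loop area           : {A:.3f} m^2")
print(f"radiation resistance: {R_r:.1e} ohms for that 1 m segment")
# Compare with roughly 73 ohms for a half-wave dipole: the feeder is a
# hopeless radiator, which is the "badly matched to free space" point.
```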
     
  17. Feb 28, 2016 #16

    sophiecentaur (Science Advisor, Gold Member)

    That diagram is not appropriate here. A and B are a small fraction of a wavelength apart, so the interference pattern will have only very shallow maxima and minima. The absolute level will also be small because of the mismatch.
     
  18. Feb 29, 2016 #17
    Is there a book that you would recommend I read about this?

    The impedance of free space is 377 ohms, and we have an open-wire feedline with a 350 ohm characteristic impedance. How can it be badly matched to free space?
    The only thing that I can find that is different in the antenna is the shape of the electric field lines.

    [attached image: diagram of electric field lines]


    Does it have to do with that?

    OK, basically what I read was that in order for something to radiate, it must have radiation resistance, and radiation resistance depends on the geometry of the antenna. So a two-wire feedline doesn't have radiation resistance? Or should I say, it has a very small radiation resistance?
     
    Last edited: Feb 29, 2016
  19. Feb 29, 2016 #18

    tech99 (Gold Member)

    I think that diagram, which is by J. D. Kraus I believe, is very misleading. It is not enough just to create a 377 ohm transmission line to obtain radiation. Sorry, I cannot recommend a book which describes the process of radiation in a correct and simple manner.
     
  20. Feb 29, 2016 #19
    OK, can you at least confirm if this is correct?

    We have two parallel wires and we are looking at some small length segment dL.
    Is the magnetic field from the left wire segment inducing a current in the right wire segment?
     
  21. Feb 29, 2016 #20
    Yes. But once again it is a dipole-ish effect. (But for a different reason.)

    Magnetic fields circle current-carrying wires. This is so fundamental that the IEEE uses it as its symbol. Similarly, a circling magnetic field induces a current.

    So the first wire produces a field. But that field circles the first wire, not the second. Instead, the second wire sees a slight curl, because its far side is farther from the first wire than its near side. So some current will be induced, but not much. Furthermore, the amount will depend on the thickness of the second wire.

    On radiation resistance: radiation resistance is how most antenna designers account for the relativistic effect of the photons jumping into space - i.e. far-field energy loss. Few texts wish to go into the relativity side of it, so I doubt you will encounter it put that way. It is highly dependent on geometry (as is the underlying relativistic effect). EEs like to think in fields rather than particles (both are valid and mathematically equivalent).
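    To put numbers on "the far side is farther away than the near side", here is the field of the first wire evaluated across the second conductor. The spacing, wire diameter and current are assumed values, for illustration only:

```python
# Field of one long straight wire, B = mu0*I/(2*pi*r), evaluated at the near
# and far sides of the other conductor.  All dimensions/current are assumed.
import math

mu0 = 4e-7 * math.pi
I = 1.0                    # A in the first wire (assumed)
spacing = 0.025            # m, centre-to-centre (assumed)
wire_d = 0.002             # m, diameter of the second wire (assumed)

def B(r):
    return mu0 * I / (2 * math.pi * r)

B_near = B(spacing - wire_d / 2)
B_far = B(spacing + wire_d / 2)

print(f"B at near side : {B_near:.3e} T")
print(f"B at far side  : {B_far:.3e} T")
print(f"imbalance      : {100 * (B_near - B_far) / B_near:.1f} %")
# This imbalance across the conductor (about 8% here) is the "slight curl"
# described above; it grows with wire thickness and shrinks with spacing.
```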
     