
Why use 75 ohms on TV cables

  1. Apr 13, 2010 #1
    We learned it could be a trade-off between loss and flexibility: if it were around 50 ohms or so, the wire would need to be too thick.

    We also learned that the minimum loss occurs at around 77 ohms.

    What do you guys think?
     
  3. Apr 13, 2010 #2
    Many years ago I recall calculating the optimum impedance and found ~77 ohms to give the lowest attenuation for a fixed coax OD and dielectric type (solid, foam, etc.). The minimum-attenuation impedance may depend on the propagation velocity (i.e., dielectric type). As I recall, at other impedances the skin-effect losses on the center conductor were higher. Foam polyethylene dielectric is better than solid. See

    http://www.dxzone.com/cgi-bin/dir/jump2.cgi?ID=14693

    Bob S
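    The calculation described above can be sketched numerically. This is a minimal sketch, assuming conductor (skin-effect) loss dominates: for a fixed outer diameter D, conductor attenuation scales as (1/d + 1/D)/Z0, and with x = D/d that is proportional to (1 + x)/ln(x). Minimizing over x gives x·ln(x) = 1 + x, i.e. D/d ≈ 3.59, which for an air dielectric corresponds to roughly 77 ohms (the function and constants here are illustrative, not from any particular cable spec):

    ```python
    import math

    def min_loss_impedance(eps_r=1.0):
        """Impedance minimizing skin-effect conductor loss for fixed outer
        diameter.  Loss ~ (1 + x)/ln(x) with x = D/d, so the optimum x
        solves x*ln(x) = 1 + x; then Z0 = (59.96/sqrt(eps_r)) * ln(x)."""
        # Solve x*ln(x) = 1 + x by bisection (root lies between 2 and 5)
        lo, hi = 2.0, 5.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if mid * math.log(mid) < 1.0 + mid:
                lo = mid
            else:
                hi = mid
        x = 0.5 * (lo + hi)
        # Z0 = (eta0 / (2*pi*sqrt(eps_r))) * ln(D/d), eta0/(2*pi) ~ 59.96 ohms
        z0 = (59.96 / math.sqrt(eps_r)) * math.log(x)
        return x, z0

    x, z_air = min_loss_impedance()        # air dielectric -> ~77 ohms
    _, z_pe = min_loss_impedance(2.25)     # solid polyethylene -> ~51 ohms
    print(f"D/d = {x:.2f}, Z0(air) = {z_air:.1f} ohms, Z0(PE) = {z_pe:.1f} ohms")
    ```

    Note how the same geometry with a solid-polyethylene dielectric lands near 51 ohms, which is one common explanation for the 50-ohm standard as well.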
     
  4. Apr 13, 2010 #3
    The real reason is that 75 ohms is close to the real part of the impedance of a half-wave dipole, which is about 73 ohms.
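    A quick check of how small that mismatch actually is, using the standard transmission-line formulas (illustrative numbers only):

    ```python
    import math

    # A ~73-ohm dipole feeding a 75-ohm line: how bad is the mismatch?
    z_load, z_line = 73.0, 75.0
    gamma = abs(z_load - z_line) / (z_load + z_line)  # |reflection coefficient|
    vswr = (1 + gamma) / (1 - gamma)                  # voltage standing wave ratio
    return_loss_db = -20 * math.log10(gamma)
    print(f"|Gamma| = {gamma:.4f}, VSWR = {vswr:.3f}, "
          f"return loss = {return_loss_db:.1f} dB")
    ```

    The reflection coefficient comes out around 0.014 (VSWR ≈ 1.03), so almost no power is reflected at the antenna-to-line junction.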
     