
Time it takes to transmit signal to satellite (modern physics)

  1. Jun 12, 2014 #1
    1. The problem statement, all variables and given/known data

    A communication satellite is orbiting 36 000 km above Earth’s surface. Two cities, 3500 km apart, are transmitting signals to and receiving signals from each other. Find the time required to transmit a signal from one city to the other. The cities are equidistant from the satellite.

    h = 36000 km = 3.6 × 10⁷ m
    d = 3500 km = 3.5 × 10⁶ m
    t = ?

    2. Relevant equations

    Pythagorean theorem to find l, but I'm not sure what equation to use to find time! Any help?

    3. The attempt at a solution

    I found the length of the hypotenuse between each town and the satellite:

    l = √((3.6 × 10⁷)² + (3.5 × 10⁶)²) = 3.60 × 10⁷ m.

    How do I proceed from there?
     
  2. Jun 12, 2014 #2

    BvU
    Science Advisor, Homework Helper, Gold Member

    But your hypotenuse calculation seems to suggest the satellite is right above one of the cities?! (Either that, or the towns are 7000 km apart...)

    Another thing: are you a member of the flat earth society? Make a drawing! Not a big correction, but nevertheless...

    You may assume the signal is not a carrier pigeon but an electromagnetic wave (e.g. a radio signal or a light signal). Such waves travel at the speed of light. Distance / speed = time! Bingo!
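
    In code, that last step might look something like the following rough sketch (my own variable names; it assumes, as in your attempt, a flat triangle with the satellite directly above the midpoint between the cities):

        import math

        c = 2.998e8   # speed of light in m/s
        h = 3.6e7     # satellite altitude in m
        d = 3.5e6     # distance between the two cities in m

        # Each city is d/2 from the point directly below the satellite,
        # so the city-to-satellite range is the hypotenuse of (h, d/2).
        slant = math.sqrt(h**2 + (d / 2)**2)

        # The signal travels city -> satellite -> city, i.e. twice that range.
        t = 2 * slant / c
        print(f"range: {slant:.3e} m   one-way city-to-city time: {t:.3f} s")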
     
  3. Jun 12, 2014 #3
    velocity = distance/time

    The problem never states that the satellite must be directly above the midpoint of the line between the cities. The satellite could be just above the horizon in both cities, and then roughly the radius of the earth would be added to the distance.
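
    For what it's worth, here is a rough sketch of that extreme case (my own assumption: a mean Earth radius of about 6371 km). With the satellite exactly on a city's horizon, the line of sight is tangent to the surface, so Pythagoras gives the range directly:

        import math

        R = 6.371e6   # assumed mean Earth radius in m
        h = 3.6e7     # satellite altitude above the surface in m

        # Sight line tangent to the Earth: range^2 + R^2 = (R + h)^2
        horizon_range = math.sqrt((R + h)**2 - R**2)
        print(f"range to a satellite on the horizon: {horizon_range / 1e3:.0f} km")

    That comes out near 42 000 km, compared with about 36 000 km when the satellite is overhead.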
     
  4. Jun 12, 2014 #4
    I got the answer as being 0.24 seconds which I think is right! Thank you. :)
     
  5. Jun 12, 2014 #5

    BvU
    Science Advisor, Homework Helper, Gold Member

    What would you get as an answer if the two cities were 1 km apart?

    By the way, l = √((3.6 × 10⁷)² + (3.5 × 10⁶)²) = 3.60 × 10⁷ m is wrong. Not by much, I concede.

    Wim points out something sensible, but I dare assume the satellite is above the halfway point.

    Even then, a correction for the earth being round is needed, both to the 36 000 km and to the 3500 km! Make the drawing.
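
    Something along these lines shows the corrected geometry (my own sketch, taking a mean Earth radius of about 6371 km, the satellite above the midpoint of the surface arc, and the 3500 km measured along the surface):

        import math

        c = 2.998e8   # speed of light in m/s
        R = 6.371e6   # assumed mean Earth radius in m
        h = 3.6e7     # satellite altitude above the surface in m
        s = 3.5e6     # city-to-city separation along the surface in m

        # Seen from Earth's centre, each city sits (s/2)/R radians away
        # from the point directly below the satellite.
        theta = (s / 2) / R

        # Law of cosines in the triangle Earth-centre / city / satellite.
        slant = math.sqrt(R**2 + (R + h)**2 - 2 * R * (R + h) * math.cos(theta))

        t = 2 * slant / c   # city -> satellite -> city
        print(f"corrected range: {slant / 1e3:.0f} km   one-way time: {t:.4f} s")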

    Good thing I'm not your teacher. 0.24 s wouldn't cut it in my class...
     