A communication satellite is orbiting 36 000 km above Earth's surface. Two cities, 3500 km apart, are transmitting signals to and receiving signals from each other via the satellite. They are equidistant from the satellite. Find the time required to transmit a signal from one city to the other.
h = 36 000 km = 3.6 x 10^7 m
d = 3500 km = 3.5 x 10^6 m
t = ?
I used the Pythagorean theorem to find l, but I'm not sure what equation to use to find the time! Any help?
The Attempt at a Solution
I found the length of the hypotenuse between each town and the satellite:
Since the cities are equidistant from the satellite, each leg of the signal path is the hypotenuse of a right triangle with sides h and d/2:
l = √((3.6 x 10^7)^2 + (1.75 x 10^6)^2) = 3.60 x 10^7 m.
How do I go on from there?
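As a sanity check, the geometry and the remaining time step can be sketched in Python. The key assumption is that the radio signal travels at the speed of light, so the missing equation is t = distance / c, with the total path being city → satellite → city, i.e. 2l:

```python
import math

# Given values from the problem statement
h = 3.6e7   # satellite altitude above Earth's surface, m
d = 3.5e6   # separation between the two cities, m
c = 3.0e8   # speed of light, m/s (radio signals travel at c)

# The cities are equidistant from the satellite, so each slant path
# is the hypotenuse of a right triangle with legs h and d/2.
l = math.sqrt(h**2 + (d / 2)**2)
print(f"slant range l = {l:.3e} m")

# The signal goes city -> satellite -> city, a total path of 2*l.
t = 2 * l / c
print(f"transit time t = {t:.3f} s")
```

This gives l ≈ 3.60 x 10^7 m, matching the hypotenuse above, and a transit time of roughly a quarter of a second, which is why geostationary links have a noticeable delay.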