1. The problem statement, all variables and given/known data

A telephone connection between Europe and North America can be carried by cable or by a geosynchronous communication satellite. Estimate the time it takes for a signal to travel 8000 km via cable, assuming the speed is close to the speed of light. How does this compare to the time required for the same signal to travel via satellite, at a near-geosynchronous altitude of 40200 km from the center of Earth? (Assume the satellite is directly above the sending and receiving locations of the signal.)

2. Relevant equations

time = distance / speed

3. The attempt at a solution

So far I have the first part of this question correct, which asks to estimate the time it takes for a signal to travel 8000 km:

(8000 km × 1000 m/km) / (3×10^8 m/s) = 0.02666 seconds.

However, I am confused about the next part, which asks how this "compares" to the time it takes for the same signal to reach the satellite 40200 km away. First I converted the kilometers to meters, giving 4.02×10^7 m. Am I supposed to find the time as above and somehow come up with a ratio? In that case I would do:

(4.02×10^7 m) / (3×10^8 m/s) = 0.134 seconds.

This is where I am stuck. I tried taking time1/time2 and time2/time1, but neither of these ratios works for the comparison. Any help would be appreciated. Thank you.
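For reference, here is a minimal sketch of the arithmetic in the attempt above, using the same numbers (signal speed rounded to 3×10^8 m/s, and the one-way distances exactly as given in the problem statement; whether a round trip or the distance above Earth's surface is intended is a separate question):

```python
SPEED = 3e8  # m/s, approximate speed of light

def travel_time(distance_m, speed=SPEED):
    """time = distance / speed"""
    return distance_m / speed

t_cable = travel_time(8000e3)       # 8000 km via cable -> ~0.0267 s
t_satellite = travel_time(40200e3)  # 40200 km to satellite -> ~0.134 s

print(t_cable)                  # ~0.0267 seconds
print(t_satellite)              # ~0.134 seconds
print(t_satellite / t_cable)    # ratio of the two times, ~5.0
```

The ratio 40200/8000 ≈ 5 simply restates that the satellite path is about five times longer at the same speed.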