How Long Is the Delay for a Signal Traveling via Two Geostationary Satellites?

AI Thread Summary
The discussion focuses on calculating the delay for a signal traveling between two geostationary satellites and returning to Earth. The signal first ascends to the first satellite, then travels 50,000 km to the second satellite, and finally descends back to Earth. The speed of light is used to determine the time taken for each leg of the journey. The total expected delay for the signal is approximately 0.40 seconds, but the calculations presented yield a different result, prompting further clarification on the correct approach. Understanding the three segments of the signal's path is crucial for accurately determining the total delay.
JasonR2

Homework Statement



Question: If a broadcast of a sporting event went up to a geostationary satellite, then traveled 50 000 km to another geostationary satellite, then came back to Earth, what would be the delay in the signal? All electromagnetic waves travel at the speed of light, and most of the route would be in a vacuum. Also, remember that the path starts and ends at the Earth's surface, not at the center of the Earth.

Variables:

c = 299 792 458 m / s
distance between two satellites = 50,000 km or 5.0 x 10^7 m

Other variables:

G = 6.674 x 10^-11 m^3 / (kg s^2)
Me (mass of earth) = 5.97219 x 10^ 24 kg
sd (Seconds in a day) = 86400 seconds

Homework Equations

L = 2 * pi * r (orbital circumference)

V = L / T (orbital speed = circumference / period)

V = sqrt(G * Me / r) (circular orbital speed)

From those the equation below is derived...

r = ((sd * sqrt(G * Me)) / (2 * pi))^(2/3)

v = d/t
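
As a quick check of that derived radius, here is a minimal numerical sketch (the variable names are my own, not part of the problem):

Code:
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
Me = 5.97219e24     # mass of the Earth, kg
sd = 86400          # seconds in a day, as given in the problem

# Setting the orbital period equal to one day:
# 2*pi*r / sqrt(G*Me/r) = sd  =>  r = (sd*sqrt(G*Me)/(2*pi))**(2/3)
r = (sd * math.sqrt(G * Me) / (2 * math.pi)) ** (2 / 3)
print(f"geostationary radius = {r:.3e} m")  # about 4.22 x 10^7 m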

The Attempt at a Solution



I found that the geostationary radius (r) is 4.22 x 10^7 m, or 4.22 x 10^4 km, which is confirmed correct.

My problem is that I don't know where to go from here. I'm assuming I have to use the geostationary radius (r) I calculated and the speed of light (c), so the formula V = D/T should be the one to use.

The answer is supposed to be 0.40 seconds.

I thought that I might need to find the hypotenuse, so I tried a^2 + b^2 = c^2 using 4.22 x 10^7 m as a and 5.0 x 10^7 m as b. I end up getting 6.54 x 10^7 m. I then plugged that into the V = D/T equation to solve for T, with V being the speed of light.

T = 6.54 x 10^7 m / 3.0 x 10^8 m/s = 0.218 seconds

Unfortunately I come out with 0.218 seconds, not even close to the answer.

Where do I go from here? Thanks!
 

Attachments: Earth Orbit.png
JasonR2 said:
Where do I go from here? Thanks!
Read the problem statement again carefully.

The signal is transmitted from the surface of the Earth up to a satellite in geostationary orbit. The signal then bounces off that first satellite over to a second satellite, located 50,000 km away and also in geostationary orbit. The signal then travels back down to the surface of the Earth. The total time delay is how long it takes the signal to travel over these three legs combined.
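
To make the arithmetic concrete, here is a minimal sketch of that three-leg calculation. It assumes each up/down leg is approximately the satellite's altitude, i.e. the geostationary radius minus the Earth's mean radius (about 6.37 x 10^6 m, a value not given in the problem):

Code:
c = 2.998e8          # speed of light, m/s
r_geo = 4.22e7       # geostationary orbital radius found above, m
R_earth = 6.371e6    # mean radius of the Earth, m (assumed)
d_sats = 5.0e7       # distance between the two satellites, m

up = r_geo - R_earth     # leg 1: ground station up to the first satellite
across = d_sats          # leg 2: first satellite to the second satellite
down = r_geo - R_earth   # leg 3: second satellite back down to the ground

t = (up + across + down) / c
print(f"total path = {up + across + down:.3e} m, delay = {t:.3f} s")  # roughly 0.4 s

With these numbers the total path is about 1.22 x 10^8 m, giving a delay of roughly 0.4 s, in line with the expected answer.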
 