Time it takes to transmit signal to satellite (modern physics)

In summary, a communication satellite orbits 36 000 km above Earth's surface, and two cities 3500 km apart transmit to and receive signals from each other through it. Sending a signal from one city to the other takes approximately 0.24 seconds. If the two cities were only 1 km apart, the signal would take nearly the same time, since the trip up to the satellite and back down dominates the total path length.
  • #1
salmayoussef

Homework Statement



A communication satellite is orbiting 36 000 km above Earth's surface. Two cities, 3500 km apart, are transmitting to and receiving signals from each other. Find the time required to transmit a signal from one city to the other. They are equidistant from the satellite.

h = 36 000 km = 3.6 × 10⁷ m
d = 3500 km = 3.5 × 10⁶ m
t = ?

Homework Equations



Pythagorean theorem to find l, but I'm not sure what equation to use to find time! Any help?

The Attempt at a Solution



I found the length of the hypotenuse between each town and the satellite:

l = √((3.6 × 10⁷)² + (3.5 × 10⁶)²) = 3.60 × 10⁷ m.

How do I go on from there?
 
  • #2
They are equidistant from the satellite
But your hypotenuse calculation seems to suggest the satellite is right above one of the cities?! (Either that, or the towns are 7000 km apart...)

Another thing: are you a member of the Flat Earth Society? Make a drawing! Not a big correction, but nevertheless...

You may assume the signal is not a carrier pigeon but an electromagnetic wave (e.g. a radio or light signal). Such waves travel at the speed of light. Distance / speed = time! Bingo!
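
A minimal sketch of that arithmetic in Python (assuming, as the problem intends, the satellite is directly above the midpoint between the two cities):

```python
import math

C = 3.0e8  # speed of light in m/s (approximate)
h = 3.6e7  # satellite altitude above the surface, m
d = 3.5e6  # separation between the two cities, m

# Slant range from either city to the satellite, with the satellite
# directly above the midpoint (flat-ground approximation):
slant = math.sqrt(h**2 + (d / 2)**2)

# The signal goes city -> satellite -> city: two slant ranges.
t = 2 * slant / C
print(f"one-way city-to-city time: {t:.3f} s")  # ~0.240 s
```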
 
  • #3
salmayoussef said:

Homework Statement



A communication satellite is orbiting 36 000 km above Earth's surface. Two cities, 3500 km apart, are transmitting to and receiving signals from each other. Find the time required to transmit a signal from one city to the other. They are equidistant from the satellite.

h = 36 000 km = 3.6 × 10⁷ m
d = 3500 km = 3.5 × 10⁶ m
t = ?

Homework Equations



Pythagorean theorem to find l, but I'm not sure what equation to use to find time! Any help?

velocity = distance/time

The problem never states that the satellite must be directly above the midpoint of the line between the cities. The satellite could be just above the horizon as seen from both cities, in which case somewhat more than the radius of the Earth is added to the distance.
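
A rough sketch of that extreme case (the Earth radius of about 6.37 × 10⁶ m is an assumed value, not given in the problem):

```python
import math

C = 3.0e8   # speed of light, m/s (approximate)
R = 6.37e6  # Earth's radius, m (assumed value, not given in the problem)
h = 3.6e7   # satellite altitude above the surface, m

# With the satellite on the horizon, the line of sight is tangent to the
# Earth, so city, Earth's centre and satellite form a right triangle:
#   slant**2 + R**2 == (R + h)**2
slant = math.sqrt((R + h)**2 - R**2)

t = 2 * slant / C
print(f"worst-case one-way time: {t:.3f} s")  # ~0.279 s
```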
 
  • #4
I got 0.24 seconds as the answer, which I think is right! Thank you. :)
 
  • #5
What would you get as the answer if the two cities were 1 km apart?

By the way, l = √((3.6 × 10⁷)² + (3.5 × 10⁶)²) = 3.60 × 10⁷ m is wrong. Not by much, I concede.

Wim points out something sensible, but I dare to assume the satellite is above the halfway point.

Even then, a correction for the Earth being round is needed, both to the 36 000 km and to the 3500 km! Make the drawing.
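
A sketch of that correction (assuming the 3500 km is arc length along the curved surface, the satellite is above the halfway point, and an Earth radius of about 6.37 × 10⁶ m, which is an assumption):

```python
import math

C = 3.0e8    # speed of light, m/s (approximate)
R = 6.37e6   # Earth's radius, m (assumed value)
h = 3.6e7    # satellite altitude above the surface, m
arc = 3.5e6  # city separation measured along the curved surface, m

# Each city sits at an angle theta from the sub-satellite point,
# measured at the Earth's centre (half the arc on either side).
theta = (arc / 2) / R

# Law of cosines in the triangle city - Earth's centre - satellite:
slant = math.sqrt(R**2 + (R + h)**2 - 2 * R * (R + h) * math.cos(theta))

t = 2 * slant / C
print(f"one-way time with curvature: {t:.4f} s")  # ~0.2419 s
```

The correction is small, which is BvU's point: it only shows up beyond two significant figures.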

Good thing I'm not your teacher; 0.24 s wouldn't cut it in my class...
 

1. How does the time it takes to transmit a signal to a satellite affect communication?

The time it takes for a signal to reach a satellite can impact the speed and reliability of communication with that satellite. If the signal takes too long, there may be delays or even lost connections. This is especially important for real-time communication, such as video conferencing or live streaming.

2. What factors affect the time it takes for a signal to reach a satellite?

The main factors that affect the time it takes for a signal to reach a satellite include the distance between the satellite and the source of the signal, the speed of the signal, and any potential interference or obstructions along the signal's path.

3. Can the time it takes for a signal to reach a satellite be reduced?

In general, the time it takes for a signal to reach a satellite cannot be significantly reduced due to the limitations of the speed of light. However, advancements in technology and signal processing can help to minimize delays and improve overall communication speeds.

4. How is the time it takes to transmit a signal to a satellite calculated?

The time it takes for a signal to reach a satellite is calculated by dividing the distance between the source and the satellite by the speed of the signal, which is the speed of light. This gives the one-way travel time; doubling it gives the round-trip time.
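
As a minimal sketch of that division, using the 36 000 km altitude from the thread above:

```python
C = 3.0e8         # speed of light, m/s (approximate)
distance = 3.6e7  # source-to-satellite distance, m (the 36 000 km above)

one_way = distance / C    # time to reach the satellite
round_trip = 2 * one_way  # up and back down again
print(one_way, round_trip)  # 0.12 s and 0.24 s
```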

5. Is there a difference in the time it takes to transmit a signal to different types of satellites?

Yes, there can be differences in the time it takes for a signal to reach different types of satellites. For example, a geostationary satellite, which sits in a fixed position about 36 000 km above the Earth's equator, has a much longer signal time than a low Earth orbit satellite, which is far closer to the Earth's surface. The propagation speed is the speed of light in both cases; the difference comes from the distance travelled.
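
A quick comparison (the LEO altitude of 550 km is an illustrative assumed figure, not taken from the thread):

```python
C = 3.0e8  # speed of light, m/s (approximate)

# Illustrative altitudes (assumed figures): a geostationary orbit and a
# typical low-Earth-orbit altitude.
altitudes_m = {"GEO": 3.6e7, "LEO": 5.5e5}

for name, h in altitudes_m.items():
    print(f"{name}: one-way {h / C * 1e3:.1f} ms")
# GEO: 120.0 ms, LEO: 1.8 ms
```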
