jemjabella42
Radio waves travel at the speed of light. A satellite is in a "geosynchronous orbit." A radio signal is sent from the ground to the satellite and then the satellite sends the signal back down to the ground. Satellites in geosynchronous orbit are 36,000km above the surface of the earth. How much time does it take for a signal to go from the ground to the satellite and back to the ground?
I plugged numbers into the equation speed = distance/time to get:
2.99792458×10^8 m/s = 36,000 km / T
Then I multiplied both sides by T to get:
(2.99792458×10^8 m/s)(T) = 36,000 km
I'm not sure how to isolate my variable from here because the labels are throwing me off. I know 1 km = 1000 m. Do I need to use dimensional analysis to get past this step? It's very confusing for me, and I don't know why! It seems like it should be simple math, but my brain just isn't used to it.
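For reference, here is a short sketch of the unit-conversion step being asked about, written as Python rather than algebra. It assumes the signal covers the 36,000 km distance twice (up to the satellite and back down), which is why the one-way time is doubled; the variable names are mine, not from the problem statement.

```python
# Sketch of the unit conversion and solve-for-T step (not the OP's work).
c = 2.99792458e8         # speed of light, in meters per second
distance_km = 36_000     # altitude of geosynchronous orbit, in km
distance_m = distance_km * 1000   # 1 km = 1000 m: convert BEFORE dividing

# speed = distance / time  =>  time = distance / speed
one_way = distance_m / c          # seconds, ground to satellite
round_trip = 2 * one_way          # seconds, up and back down

print(round(one_way, 4))     # one-way travel time, ~0.1201 s
print(round(round_trip, 4))  # round-trip travel time, ~0.2402 s
```

The key move is converting kilometers to meters first so that the meters cancel when you divide by meters per second, leaving only seconds.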