# Time question

1. Mar 18, 2014

### a.k

1. The problem statement, all variables and given/known data
The average speed of an orbiting satellite is 20,000 mph. How much time is required for the satellite to orbit Earth? (the satellite is orbiting 250 miles above the Earth’s surface, and remember that the Earth has a radius of 3,963 miles.)

2. Relevant equations
t=d/v
d = 2πr

3. The attempt at a solution
3963+250=4213

2π(4213)
6.28(4213)
26457.84 miles

26457.84/20000
=1.32 hours × 60 mins
=79.2 mins

Did I do this correctly?

2. Mar 18, 2014

### nasu

It looks OK.

3. Mar 18, 2014

### a.k

Thank you for the response.

4. Mar 18, 2014

### vanceEE

I don't see anything wrong with your solution. But for the future, try to keep exact (or near-exact) values throughout the problem. For example, when you multiplied $6.28$ by $4213$ you got $26457.84$ mi, whereas the exact answer is $8426\pi$ mi, or about $26471.06$ mi — rounding $\pi$ early cost you about 13 miles of circumference, which shifts the time from about 79.4 min to 79.2 min.
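To illustrate the point about carrying exact values, here is a minimal sketch of the calculation in Python, keeping `math.pi` at full precision until the final rounding (variable names are my own, not from the thread):

```python
import math

v = 20000.0       # average orbital speed, mph
r_earth = 3963.0  # Earth's radius, miles
altitude = 250.0  # orbit altitude above the surface, miles

r = r_earth + altitude    # orbital radius: 4213 mi
d = 2 * math.pi * r       # circumference; exactly 8426*pi mi
t_hours = d / v           # t = d / v
t_minutes = t_hours * 60

print(round(d, 2))          # 26471.06 (mi)
print(round(t_minutes, 1))  # 79.4 (min)
```

Rounding only at the very end gives ~79.4 min rather than the 79.2 min you get from using 6.28 for $2\pi$.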
