1. The problem statement, all variables and given/known data

How fast must a satellite 10,000 miles above Earth's surface travel, and how long does it take to complete one orbit of Earth?

2. Relevant equations

[itex]v^2 = gr[/itex]

3. The attempt at a solution

Orbital radius: 10,000 mi + radius of Earth (≈ 4,000 mi) = 14,000 mi ≈ [itex]2.253 \times 10^7\,\mathrm{m}[/itex]

[itex]v^2 = (9.8\,\mathrm{m/s^2})(2.253 \times 10^7\,\mathrm{m})[/itex]

[itex]v \approx 14{,}859\,\mathrm{m/s}[/itex]

The given answer is 9,400 miles per hour, which is about 4,202 m/s. Is there something I'm doing wrong or overlooking?
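For a numerical check, here is a minimal Python sketch of the arithmetic above. The Earth radius of ≈ 3,959 mi and the altitude-scaled g line are my own assumptions, not given in the problem; scaling g with distance is one possible source of the discrepancy, since 9.8 m/s² only holds at the surface.

[code=python]
import math

# Unit conversions (exact definitions)
MI_TO_M = 1609.344           # meters per mile
MPH_TO_MS = MI_TO_M / 3600   # (m/s) per mph

# Assumed values (not stated in the problem):
R_EARTH_MI = 3959.0          # Earth's mean radius, miles
g_surface = 9.8              # m/s^2, valid only at Earth's surface

altitude_mi = 10_000.0
r = (altitude_mi + R_EARTH_MI) * MI_TO_M   # orbital radius in meters
print(f"r = {r:.4g} m")                    # ~2.25e7 m, matching the attempt

# The attempt: v^2 = g*r using surface g
v_attempt = math.sqrt(g_surface * r)
print(f"v (surface g) = {v_attempt:.0f} m/s")

# Alternative: gravity weakens with distance, g(r) = g_surface * (R_E / r)^2
R_E = R_EARTH_MI * MI_TO_M
g_at_r = g_surface * (R_E / r) ** 2
v_scaled = math.sqrt(g_at_r * r)
print(f"v (scaled g)  = {v_scaled:.0f} m/s")

# Quoted answer for comparison
print(f"book answer   = {9400 * MPH_TO_MS:.0f} m/s")
[/code]

Running this prints roughly 14,800 m/s for the surface-g version and about 4,200 m/s for the scaled-g version, which matches the quoted 9,400 mph answer.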