1. The problem statement, all variables and given/known data

How fast must a satellite leave Earth's surface to reach an orbit with an altitude of 895 km?

2. Relevant equations

v = √(GM/r)

3. The attempt at a solution

G = 6.67 × 10^-11 N·m²/kg²
M = 5.98 × 10^24 kg
r = (6.38 × 10^6 m) + (8.95 × 10^5 m) = 7.275 × 10^6 m

v = √[(6.67 × 10^-11)(5.98 × 10^24)/(7.275 × 10^6)]
v ≈ 7404.5 m/s

But this is the speed the satellite needs to stay in orbit. Is this the same speed it needs when leaving Earth's surface? Or do I need to use v = √(2GM/r) instead, to find the velocity it needs to escape the gravitational field at Earth's surface and go into orbit?
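As a quick sanity check on the arithmetic, here's a short Python sketch using the same numbers from the attempt (the variable names are mine). It computes the circular-orbit speed at the given altitude and, for comparison, the escape speed from Earth's surface:

```python
import math

# Values from the attempt above
G = 6.67e-11       # gravitational constant, N*m^2/kg^2
M = 5.98e24        # mass of Earth, kg
R_earth = 6.38e6   # radius of Earth, m
altitude = 895e3   # orbit altitude, m

r = R_earth + altitude  # orbital radius measured from Earth's center

# Circular orbital speed at radius r: v = sqrt(GM/r)
v_orbit = math.sqrt(G * M / r)

# Escape speed from the surface: v = sqrt(2GM/R_earth)
v_escape = math.sqrt(2 * G * M / R_earth)

print(f"orbital radius: {r:.4e} m")          # 7.2750e+06 m
print(f"orbital speed:  {v_orbit:.1f} m/s")  # ~7404.5 m/s
print(f"escape speed:   {v_escape:.1f} m/s") # ~11182 m/s
```

Note the two formulas answer different questions: √(GM/r) is the speed needed to *stay* in a circular orbit at radius r, while √(2GM/R) uses the starting radius (here the surface) and gives the speed to escape Earth's gravity entirely, not merely to reach orbit.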