Hello, I had a bit of trouble figuring out this problem:

1. The problem statement, all variables and given/known data

Given the following, determine the distance in miles above the Earth's surface of a geosynchronous satellite.

MEarth = 5.98E24 kg
REarth = 4,000 miles
1 mile = 1604 m

2. Relevant equations

Fg = (G m1 m2)/r^2
Fc = (m v^2)/r

3. The attempt at a solution

Setting the gravitational force equal to the centripetal force:

((6.67E-11 m^3/(kg s^2))(mSatellite)(5.98E24 kg))/(6416000 m + x)^2 = (mSatellite vSatellite^2)/(6416000 m + x)

For the satellite's speed, I said the velocity equals the circumference of the orbit divided by 86,400 seconds, so I have:

(3.98866E14)/(4.1165056E13 + 12832000x + x^2) = ((6.2832x/86400)^2)/(6416000 + x)

I then solved this equation for x with Wolfram Alpha and found that x equaled 40,216,400 meters, or 11,658 miles. Is this correct?
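For what it's worth, here is a minimal Python sketch I'd use to sanity-check the setup. It solves the same force balance Fg = Fc with the problem's given values, but note one assumption: it takes the orbital speed as the full circumference 2π(REarth + x) divided by the period, i.e., the radius measured from Earth's center rather than from the surface. The variable names are my own.

```python
import math

# Given values from the problem statement
G = 6.67e-11            # gravitational constant, m^3 / (kg s^2)
M_earth = 5.98e24       # mass of Earth, kg
R_earth = 4000 * 1604   # Earth's radius in meters (4,000 mi at the given 1,604 m/mi)
T = 86400               # orbital period in seconds (one day)

# Setting G*m1*m2/r^2 = m*v^2/r with v = 2*pi*r/T and solving for r gives
#   r^3 = G * M * T^2 / (4 * pi^2),
# where r is the distance from Earth's center.
r = (G * M_earth * T**2 / (4 * math.pi**2)) ** (1 / 3)

x_m = r - R_earth       # altitude above the surface, meters
x_mi = x_m / 1604       # converted to miles at the given 1,604 m/mi

print(f"orbital radius  r = {r:.3e} m")
print(f"altitude        x = {x_m:.3e} m  ({x_mi:,.0f} mi)")
```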