1. The problem statement, all variables and given/known data

Sky TV is broadcast from a satellite in orbit at an altitude of 35,600 km. By considering the difference in gravitational potential between the satellite and the surface of the Earth, find the frequency shift of a 1 GHz signal broadcast to the Earth. (I have been given the radius and mass of the Earth.)

2. Relevant equations

g = GM/r^2
f(observed) = f(emitted) (1 ± gh/c^2)

3. The attempt at a solution

Find g; f(emitted) = 1 GHz. How do I calculate f(observed)?
(G*M) / (r * c^2) = (λo − λe) / λe

where λo is the wavelength of the photon as measured by a distant observer, λe is the wavelength of the photon as measured at the source of emission, and r is the distance from the centre of the gravitating mass. Also, λ = c/f (c is the speed of light).

----------------
Pooya Afaghi
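Putting the pieces together: since the altitude (35,600 km) is not small compared with the Earth's radius, it is safer to use the exact potential difference, Δf/f = GM (1/R_earth − 1/r_sat) / c², rather than the gh/c² approximation. A minimal numeric sketch, assuming standard textbook values for G, the Earth's mass, and the Earth's radius (not quoted in the thread):

```python
# Gravitational frequency shift of a 1 GHz signal broadcast from a
# satellite at 35,600 km altitude down to the Earth's surface.
# Uses Delta_f / f = G*M * (1/R_earth - 1/r_sat) / c^2, i.e. the
# exact difference in gravitational potential between the two points.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (assumed value)
M = 5.972e24         # mass of the Earth, kg (assumed value)
R_earth = 6.371e6    # radius of the Earth, m (assumed value)
c = 2.998e8          # speed of light, m/s

h = 3.56e7           # satellite altitude, m (35,600 km, from the problem)
f_emit = 1e9         # emitted frequency, Hz (1 GHz, from the problem)

r_sat = R_earth + h  # orbital radius measured from the Earth's centre

# Difference in gravitational potential per unit mass between
# the surface and the satellite's orbit
delta_phi = G * M * (1.0 / R_earth - 1.0 / r_sat)

# Fractional shift: the signal falls into the potential well on the
# way down, so the received frequency is slightly *higher* (blueshift)
frac_shift = delta_phi / c**2
delta_f = f_emit * frac_shift

print(f"fractional shift = {frac_shift:.3e}")
print(f"frequency shift  = {delta_f:.3f} Hz")
```

With these values the shift comes out to roughly 0.6 Hz on the 1 GHz carrier. Note that the naive gh/c² estimate with g evaluated at the surface would give a few hertz, an overestimate, because g falls off substantially over 35,600 km.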