1. Feb 14, 2010

EmmaK

1. The problem statement, all variables and given/known data

A radio telescope is positioned at a height of h = 150 m on a cliff overlooking the sea. A very distant galaxy emitting radio waves, wavelength 3 m, is at an angle $$\theta$$ above the horizon. The radio waves reach the telescope both directly and after reflection off the water surface. Show that if $$\theta$$ = 57 degrees, then the two rays interfere destructively, so that the signal received is zero. Note that the refractive index, n, of the sea is greater than that of air.

2. Relevant equations
$$d\sin\theta=(m+\tfrac{1}{2})\lambda$$ for destructive interference.

3. The attempt at a solution
Tried substituting the values into the equation, with $$d=2h\sin\theta$$, expecting to get an integer m, but didn't...
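For what it's worth, a quick numerical sketch of this attempted substitution (assuming, as in the attempt, that the path difference is $$2h\sin\theta$$, with h, $$\lambda$$ and $$\theta$$ taken from the problem statement):

```python
import math

h = 150.0                  # telescope height above the sea (m)
lam = 3.0                  # wavelength of the radio waves (m)
theta = math.radians(57)   # elevation angle from the problem statement

# Path difference between the reflected and direct rays,
# assuming delta = 2*h*sin(theta)
delta = 2 * h * math.sin(theta)

# Number of wavelengths in the path difference
m = delta / lam
print(f"delta = {delta:.2f} m, delta/lambda = {m:.2f}")
# -> delta = 251.60 m, delta/lambda = 83.87
```

As the attempt found, delta/lambda comes out as neither an integer nor a half-integer, so plugging the numbers into the textbook formula doesn't produce the claimed result directly.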

2. Feb 14, 2010

EmmaK

anyone?

3. Feb 14, 2010

ideasrule

That equation doesn't apply here; I'll leave it to you to figure out why. You have to approach this problem from first principles. First, calculate the path length difference between the reflected ray and the direct ray.
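A note on the hint being alluded to here (this fills in the step the reply deliberately leaves open): because the refractive index of the sea is greater than that of air, the reflection at the water surface introduces an extra phase shift of $$\pi$$ (half a wavelength), so the usual constructive and destructive conditions are exchanged. With $$\Delta$$ the geometric path difference,

$$\Delta = m\lambda \quad \text{(destructive, including the reflection phase shift)}$$

$$\Delta = (m+\tfrac{1}{2})\lambda \quad \text{(constructive)}$$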

4. Feb 14, 2010

EmmaK

Oh, OK. Is the path difference not $$2h\sin\theta = 300\sin\theta$$ either?

5. Feb 14, 2010

ideasrule

I see no obvious reason why it should be equal to that.

6. Feb 14, 2010

EmmaK

OK... I just used trig: $$\sin(\text{angle}) = \text{opposite}/\text{hypotenuse}$$. I want to find the hypotenuse, with the opposite side equal to h, and there are two of these triangles, so the path difference is double this. Hard to explain without a diagram!
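One standard way to make this geometry precise (the mirror-image construction, which may or may not be the triangles meant here): the ray reflected off the sea arrives at the telescope as if it had travelled straight to the telescope's mirror image a distance h below the water surface. For a very distant source at elevation $$\theta$$, the plane wavefronts reach that image point by an extra distance

$$\Delta = 2h\sin\theta$$

which is the $$300\sin\theta$$ figure quoted earlier in the thread.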

7. Feb 15, 2010

Stonebridge

I have given you a hint (with diagram) on the TSR physics site.