1. The problem statement, all variables and given/known data

Light is incident on a reflecting surface tilted at a blaze angle [tex]\gamma[/tex] with respect to the horizontal non-reflecting screen, as shown in the figure (below). The angle of incidence with respect to the screen normal is [tex]\theta_i[/tex]. Consider two points along the reflecting surface separated by a distance z. Show that the path difference between the waves incident at these two points and then diffracted in the direction at angle [tex]\theta[/tex] (with respect to the screen normal) can be written as

[tex]\Delta = z[\sin(\theta + \gamma) - \sin(\theta_i - \gamma)][/tex]

2. Relevant equations

I'm not sure; I think simple geometry and trig can be used.

3. The attempt at a solution

I've drawn a diagram (not shown here) of the setup, with all the rays and important lines on it. From it I have extracted a triangle whose base z is also the hypotenuse, and whose longer side is [tex]\Delta[/tex], as shown below. From this I can see that [tex]\Delta = z\sin\phi[/tex], which tells me I have to find [tex]\phi[/tex] in terms of the other angles (or [tex]\sin\phi[/tex] in terms of the sines of the other angles). However, I'm struggling to see a relationship, unless I think of it as vector addition and say that [tex]z[\sin(\theta + \gamma) - \sin(\theta_i - \gamma)] = z\sin\phi[/tex], which I think is correct.

Is it then sufficient to say that I found this from the diagram and geometry, or do you think the question is looking for a more algebraic approach to the solution?
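As a quick numerical sanity check on that last identity, here is a short Python sketch. To be clear, this is just my own check, assuming my reading of the figure: the facet runs along the unit vector (cos γ, sin γ), the incident ray travels in direction (sin θ_i, −cos θ_i), and the diffracted ray in direction (sin θ, cos θ), with both angles measured from the screen normal; the specific angle values are arbitrary test inputs.

[code]
import numpy as np

# Arbitrary test values (radians); z is the separation along the facet.
z = 1.0
gamma = np.radians(15.0)    # blaze angle
theta_i = np.radians(40.0)  # angle of incidence, from the screen normal
theta = np.radians(25.0)    # diffraction angle, from the screen normal

d = np.array([np.cos(gamma), np.sin(gamma)])          # along the facet
k_in = np.array([np.sin(theta_i), -np.cos(theta_i)])  # incident direction
k_out = np.array([np.sin(theta), np.cos(theta)])      # diffracted direction

# Path difference from projecting the two ray directions onto the facet;
# the incident term enters with the opposite sign because the wave that
# strikes the farther point has travelled less on the way in.
delta_geometric = z * (k_out - k_in) @ d

# The formula the problem asks us to show.
delta_formula = z * (np.sin(theta + gamma) - np.sin(theta_i - gamma))

print(delta_geometric, delta_formula)  # agree to machine precision
[/code]

The two numbers come out equal, which is what you'd expect since (k_out − k_in)·d expands via the angle-addition formulas to exactly sin(θ + γ) − sin(θ_i − γ), so at least the identity itself seems consistent with the projection picture.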