1. The problem statement, all variables and given/known data

Let there be three points: A, B, and D. Light of wavelength 400 m is emitted from a source at A toward a detector at D, and also from a source at B toward D. The path AD is 100 m longer than BD. The light leaving A starts out pi/2 radians ahead in phase of the light leaving B. What is the phase difference between the two rays at point D?

2. Relevant equations

None really, just that path length differences cause phase differences.

3. The attempt at a solution

Let the phase of the light at source B be 0, so the starting phase at A is pi/2. Ray A also travels 100/400 = 1/4 of a wavelength farther than ray B, since it covers a longer distance, and that is equivalent to another pi/2 radians. Therefore, at point D, ray A leads ray B by pi radians.

This is actually problem 21 of chapter 35 in Halliday and Resnick, 8th edition. The back of the book says the answer is 0, but I have no idea why they took pi/2 - pi/2 instead of pi/2 + pi/2.
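For what it's worth, here is a small Python sketch of the arithmetic in my attempt (variable names are my own). It computes the phase accumulated over the extra 100 m of path and then prints both ways of combining it with the pi/2 head start, the sum I used and the difference the book apparently used:

```python
import math

wavelength = 400.0        # m, given in the problem
extra_path = 100.0        # m, AD is this much longer than BD
head_start = math.pi / 2  # rad, A's emission leads B's by pi/2

# Phase ray A accumulates over the extra 100 m of path:
# (path difference / wavelength) cycles, times 2*pi rad per cycle
path_phase = 2 * math.pi * extra_path / wavelength  # = pi/2

print(path_phase)               # pi/2, about 1.5708 rad
print(head_start + path_phase)  # pi, my answer
print(head_start - path_phase)  # 0, the book's answer
```

So the whole question comes down to whether the extra path's pi/2 should add to or subtract from the initial pi/2.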