1. The problem statement, all variables and given/known data

Two antennas located at points A and B are broadcasting radio waves of frequency 98.0 MHz, perfectly in phase with each other. The two antennas are separated by a distance d = 6.20 m. An observer P is located on the x axis, a distance x = 60.0 m from antenna A, so that APB forms a right triangle with PB as the hypotenuse.

2. Relevant equations

(phase difference) / (2 pi) = (r2 - r1) / lambda
d(y/L) = n * lambda if constructive
d(y/L) = (n + 1/2) * lambda if destructive

3. The attempt at a solution

There are three questions. I have managed to solve the first one, which is: What is the phase difference between the waves arriving at P from antennas A and B?

I found that the solution is:

phase = 2 pi * (sqrt((6.20)^2 + (60.0)^2) - 60.0 m) / 3.0612 m ≈ 0.6557 rad

(where lambda = c/f = (3.00 x 10^8 m/s) / (98.0 x 10^6 Hz) = 3.0612 m).

The two following parts are where I become confused. I have searched online for advice, but I cannot seem to solve them; here they are.

Now observer P walks along the x axis toward antenna A. What is P's distance from A when he first observes fully destructive interference between the two waves?

And finally, the third part: If observer P continues walking until he reaches antenna A, at how many places along the x axis (including the place you found in the previous problem) will he detect minima in the radio signal, due to destructive interference?

Note: If you do help, I very much appreciate it! However, please be as clear as possible. Thanks in advance!
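As a quick numerical check of part 1, here is a short Python sketch (my own, not from the textbook) that recomputes the path difference and the resulting phase difference from the given numbers; the variable names are just my own labels for the quantities in the problem.

```python
import math

# Given data from the problem statement
f = 98.0e6        # broadcast frequency, Hz
c = 3.00e8        # speed of light, m/s
d = 6.20          # antenna separation AB, m
x = 60.0          # distance from A to observer P along the x axis, m

lam = c / f                      # wavelength, ~3.0612 m
r1 = x                           # path length A -> P (leg of the right triangle)
r2 = math.sqrt(x**2 + d**2)      # path length B -> P (hypotenuse PB)
delta_r = r2 - r1                # path difference, ~0.3194 m

# Phase difference from (phase)/(2 pi) = (r2 - r1)/lambda
phase = 2 * math.pi * delta_r / lam
print(round(phase, 4))           # ~0.6557 rad
```

This agrees with the 0.6557 rad I got by hand, so I am fairly confident part 1 is right.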