1. The problem statement, all variables and given/known data

Two antennas located at points A and B are broadcasting radio waves of frequency 96.0 MHz, perfectly in phase with each other. The two antennas are separated by a distance d=12.40m. An observer, P, is located on the x axis, a distance x=55.0m from antenna A, so that APB forms a right triangle with PB as hypotenuse. What is the phase difference between the waves arriving at P from antennas A and B? Use units of "rad" for the answer. (If you are stuck, read the hint.)

2. Relevant equations

1 wavelength = 2pi radians

m * lambda / d = y / L (double-slit relation: d = slit separation, y = distance between maxima, L = distance to the screen)

c/f = lambda

3. The attempt at a solution

First, I converted 96.0 MHz to a wavelength: lambda = c/f = (3.00 x 10^8 m/s) / (96.0 x 10^6 Hz) = 3.125 m.

Then I wanted to find the path lengths AP and BP, but I don't know how to do this without knowing the distance between maxima (y). If I had y, I could find the order m, and then I would know the path-length difference. How do I solve for this?

Finally, I will convert the path-length difference to radians with the relation above (1 wavelength = 2*pi radians). I just don't understand how to get the path-length difference.
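The calculation described above can be sketched numerically. The key observation (an assumption on the editor's part, since the post asks how to proceed) is that APB is a right triangle with the right angle at A, so BP follows directly from the Pythagorean theorem and neither the fringe spacing y nor the order m is needed:

```python
import math

# Given values from the problem statement
f = 96.0e6   # frequency in Hz
c = 3.00e8   # speed of light in m/s (approximate)
d = 12.40    # antenna separation AB in m
x = 55.0     # distance AP in m

# Wavelength from lambda = c/f
lam = c / f  # 3.125 m

# Right angle at A, so BP is the hypotenuse of triangle APB
AP = x
BP = math.sqrt(x**2 + d**2)

# Path-length difference, converted to phase via
# 1 wavelength = 2*pi radians
delta = BP - AP
phase = 2 * math.pi * delta / lam

print(f"BP = {BP:.4f} m, delta = {delta:.4f} m, phase = {phase:.3f} rad")
```

With these numbers, delta comes out to roughly 1.38 m and the phase difference to roughly 2.78 rad, though the exact figures should be checked against the expected answer.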

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Phase Difference between two waves from antennas