# Phase Difference between two waves from antennas

#### skibum143

1. Homework Statement
Two antennas located at points A and B are broadcasting radio waves of frequency 96.0 MHz, perfectly in phase with each other. The two antennas are separated by a distance d=12.40m. An observer, P, is located on the x axis, a distance x=55.0m from antenna A, so that APB forms a right triangle with PB as hypotenuse. What is the phase difference between the waves arriving at P from antennas A and B? Use units of "rad" for the answer. (If you are stuck, read the hint.)

2. Homework Equations
m · λ / d = y / L (double slit: d = slit separation, y = spacing between maxima, L = distance to screen)
λ = c / f

3. The Attempt at a Solution
First, I converted 96 MHz to a wavelength: λ = c/f = 3.125 m.
Then I wanted to find the path length AP and the path length BP, but I don't know how to do this without knowing the distance between maxima (y). If I had y, I could find the order m, and then I would know the path-length difference. How do I solve for this?

Finally, I will convert the path-length difference to radians with the formula above. I just don't understand how to get the path-length difference.
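Since APB is a right triangle with the right angle at A, both path lengths follow directly from the Pythagorean theorem; no fringe order m is needed. A minimal sketch using the numbers from the problem statement (variable names are my own):

```python
import math

x = 55.0    # distance AP along the x axis, in m
d = 12.40   # antenna separation AB, in m

AP = x                        # direct path from antenna A to P
BP = math.sqrt(x**2 + d**2)   # hypotenuse from antenna B to P
delta_r = BP - AP             # path-length difference

print(BP, delta_r)
```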


#### skibum143

I know that AP = 55 m and BP = 56.4 m; I just don't know how to express the path-length difference in wavelengths.

#### skibum143

So I know the path difference is 1.4 m, but when I multiply it by the wavelength (3.125 m) and convert to radians, the answer is wrong.
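The standard conversion is to divide the path difference by the wavelength (not multiply), since each full wavelength of extra path corresponds to 2π radians of phase: Δφ = 2π · Δr / λ. A sketch of that step, using c = 3.00 × 10⁸ m/s as in the λ = 3.125 m figure above:

```python
import math

c = 3.00e8    # speed of light, m/s
f = 96.0e6    # broadcast frequency, Hz
lam = c / f   # wavelength = 3.125 m

# Path-length difference from the right-triangle geometry
delta_r = math.sqrt(55.0**2 + 12.40**2) - 55.0

# Each wavelength of extra path is 2*pi radians of phase
delta_phi = 2 * math.pi * delta_r / lam
print(delta_phi)
```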
