
Homework Help: Phase Difference Between Two Waves from Antennas

  1. Apr 12, 2010 #1
    1. The problem statement, all variables and given/known data
Two antennas located at points A and B are broadcasting radio waves of frequency 96.0 MHz, perfectly in phase with each other. The two antennas are separated by a distance d = 12.40 m. An observer, P, is located on the x axis, a distance x = 55.0 m from antenna A, so that APB forms a right triangle with PB as hypotenuse. What is the phase difference between the waves arriving at P from antennas A and B? Use units of "rad" for the answer. (If you are stuck, read the hint.)


    2. Relevant equations
1 wavelength of path difference = 2π rad of phase difference
m·λ / (distance between slits) = distance between maxima (y) / distance to screen
λ = c/f
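
A quick numeric check of the last relation (a minimal sketch in Python; c = 3.00×10^8 m/s is the standard value, not given in the problem):

```python
c = 3.00e8    # speed of light in m/s (standard value, assumed)
f = 96.0e6    # broadcast frequency in Hz
lam = c / f   # wavelength in m
print(lam)    # 3.125 m
```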

    3. The attempt at a solution
    First, I converted 96.0 MHz to a wavelength: λ = c/f = 3.125 m.
    Then I wanted to find the path lengths AP and BP, but I don't know how to do this without knowing the distance between maxima (y). If I had y, I could find the order m, and then I would know the path-length difference. How do I solve for this?

    Finally, I will convert the path-length difference to radians with the first formula above. I just don't understand how to get the path-length difference.
     


  2. Apr 12, 2010 #2
    I know that AP = 55 m and BP = 56.4 m; I just don't know how to express the path-length difference in wavelengths.
     
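    For reference, those path lengths come straight from the right-triangle geometry (a minimal sketch in Python; variable names are my own):

```python
import math

x = 55.0                     # AP: distance from antenna A to observer P, in m
d = 12.40                    # AB: antenna separation, in m

ap = x                       # the right angle is at A, so AP is a leg
bp = math.sqrt(x**2 + d**2)  # PB is the hypotenuse
print(bp)                    # ~56.38 m
print(bp - ap)               # path-length difference, ~1.38 m
```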
  3. Apr 12, 2010 #3
    So I know the difference in the paths is 1.4 m, but when I multiply that by the wavelength (3.125 m) and convert to radians, the answer is wrong.
     
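    Looking again at the first relation in post #1: one wavelength of path difference corresponds to 2π rad, so presumably the 1.4 m should be divided by the wavelength, not multiplied by it, before scaling by 2π. A sketch of that conversion (my own check, not a confirmed answer):

```python
import math

lam = 3.125   # wavelength in m, from c / f
delta = 1.38  # path-length difference in m, from the geometry above
phase = 2 * math.pi * delta / lam  # wavelength fractions, times 2*pi rad each
print(phase)  # ~2.77 rad
```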