Interference in light rays due to phase difference

  1. Sep 21, 2008 #1
    1. The problem statement, all variables and given/known data
    Let there be three points: A, B, and D. Light of wavelength 400 m is emitted from a source at A to a detector at D, and also from a source at B to D. Line segment AD is 100 m longer than BD. The starting phase of the light at A is pi/2 radians ahead of that at B. At point D, what is the phase difference between the two rays?

    2. Relevant equations
    None, really; just that path-length differences cause phase differences.

    3. The attempt at a solution
    Let the phase of the light at source B be 0; the starting phase at A is then pi/2. Ray A also goes through 100/400 of a wavelength more than ray B, since it travels the longer distance. This is equivalent to another pi/2 radians. Therefore, at point D, ray A leads B by pi radians.

    This is actually a problem from Halliday and Resnick, 8th edition, chapter 35 number 21. The back of the book says the answer is 0, but I have no idea why they took pi/2 - pi/2 instead of pi/2 + pi/2.
  2. Sep 21, 2008 #2
    Light ray A starts out with a lead of pi/2, but AD is a quarter-wavelength (pi/2) longer than BD, and that extra path delays A by the same amount, so A and B arrive in phase at D.

    Keep track of your signs when dealing with phase.
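
    If it helps, here's a quick numerical check (a Python sketch; the 1000 m value for the B path is an arbitrary assumption, since only the 100 m difference matters):

        import math

        wavelength = 400.0        # m
        phi0_A = math.pi / 2      # A is emitted pi/2 ahead of B
        phi0_B = 0.0
        r_B = 1000.0              # assumed B path length; only the difference matters
        r_A = r_B + 100.0

        # Phase on arrival at D: a longer path means more delay,
        # so the path contributes a lag of 2*pi*r/wavelength.
        phi_A = phi0_A - 2 * math.pi * r_A / wavelength
        phi_B = phi0_B - 2 * math.pi * r_B / wavelength

        # Wrap the difference into (-pi, pi]
        diff = phi_A - phi_B
        diff = math.atan2(math.sin(diff), math.cos(diff))
        print(diff)               # ~0 (up to float rounding): in phase at D

    The minus sign on the path term is the whole issue: a longer path is a delay, not an extra lead.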


  3. Sep 21, 2008 #3
    Believe me, I'm trying as hard as I can to see how the two phase shifts, one from A's head start and one from A travelling the greater path, could have opposite signs, but I just can't do it.

    Having a head start and travelling a greater path both make A lead MORE. In fact, the length of path B is a free variable in this problem. Let path B have length 0; then path A would have length 100 m. This would mean that B goes through zero change in phase and stays at phase = 0. A goes through 100 m, which is 100/400 of a wavelength, so it goes through another pi/2 radians. Since A already starts out at pi/2, it arrives at D with a phase of pi/2 + pi/2 and leads B by pi.

    I'm positive that Halliday and Resnick made an error in this problem.
  4. Sep 21, 2008 #4
    Here's the exact wording for this problem:

    Sources A and B emit long-range radio waves of wavelength 400 m, with the phase of the emission from A ahead of that from source B by 90 degrees. The distance Ra from A to detector D is greater than the corresponding distance Rb by 100 m. What is the phase difference of the waves at D?
  5. Sep 22, 2008 #5
    The extra 100 m in path A means that the wave from A arriving at D left its source earlier than the simultaneously arriving wave from B; during that extra travel time, B's emission advances by pi/2 and catches up to A, so the total phase difference is zero. B gains a pi/2 lead from the path difference because it is closer to the detector, and this exactly cancels the initial phase lead of A.
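
    The same conclusion falls out in the time domain. A sketch (the B path length and the cosine form of the waves are assumptions; c is the speed of light):

        import numpy as np

        c = 3.0e8                        # m/s
        wavelength = 400.0               # m
        f = c / wavelength
        omega = 2 * np.pi * f
        k = 2 * np.pi / wavelength

        r_B = 1000.0                     # assumed; only the 100 m difference matters
        r_A = r_B + 100.0

        t = np.linspace(0, 3 / f, 1000)  # a few periods
        E_A = np.cos(omega * t - k * r_A + np.pi / 2)  # emitted pi/2 ahead
        E_B = np.cos(omega * t - k * r_B)

        print(np.allclose(E_A, E_B))     # True: identical signals at D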
  6. Sep 22, 2008 #6
    "Leading phase" has to do with time. If A and B are emitted at the same time, A's phase is pi/2 relative to B (assuming positive-increasing phase convention). When you translate the phase references to D, A's path is pi/2 longer. When you subtract off the electrical path length to get the phase at D, A shifts back to zero relative to B.

    If A were instead lagging by pi/2 at the source, the two pi/2 lags would add, and the phase of A at D would be -pi relative to B.
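
    Same bookkeeping in a couple of lines (a sketch, using the same sign conventions as above):

        import math

        k = 2 * math.pi / 400.0               # wavelength 400 m
        path_diff = 100.0                     # Ra - Rb

        # A leading by pi/2 at the source:
        print(math.pi / 2 - k * path_diff)    # 0.0: in phase at D
        # A lagging by pi/2 instead:
        print(-math.pi / 2 - k * path_diff)   # -pi: A is pi behind B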

