1. The problem statement, all variables and given/known data

Sources A and B emit long-range radio waves of wavelength 550 m, with the phase of the emission from A ahead of that from source B by 90°. The distance rA from A to a detector is greater than the corresponding distance rB from B by 140 m. What is the magnitude of the phase difference at the detector?

2. Relevant equations

Δφ = 2π·Δr/λ (phase difference produced by a path-length difference Δr)

3. The attempt at a solution

Initially, source A leads source B by 90°, which is equivalent to 1/4 of a wavelength.

However, the wave from A also falls behind the wave from B, since rA is longer than rB by 140 m, which is 140/550 ≈ 0.2545 of a wavelength. So the net phase difference between A and B at the detector is −0.732 rad.

This is wrong, though. Can someone help me? =D
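For what it's worth, the two contributions can be checked numerically. This is a minimal sketch, under the sign convention that A's 90° head start counts as positive and the extra phase A accumulates over its longer path counts as negative:

```python
import math

WAVELENGTH = 550.0    # m
PATH_DIFF = 140.0     # rA - rB, in m
LEAD_A = math.pi / 2  # A's emission leads B's by 90 degrees

# Phase lag from the extra path length: 2*pi * (delta r / lambda)
path_phase = 2 * math.pi * PATH_DIFF / WAVELENGTH

# Net phase of A relative to B at the detector
net = LEAD_A - path_phase
print(net)       # prints about -0.0286 (rad)
print(abs(net))  # magnitude, about 0.0286 rad
```

Since 140/550 ≈ 0.2545 of a wavelength is only slightly more than the 1/4-wavelength head start, the two effects nearly cancel, so a small magnitude is what this convention predicts.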

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Phase Difference
