
## Homework Statement

Sources A and B emit long-range radio waves of wavelength 550 m, with the phase of the emission from A ahead of that from source B by 90°. The distance rA from A to a detector is greater than the corresponding distance rB from B by 140 m. What is the magnitude of the phase difference at the detector?

## Homework Equations

## The Attempt at a Solution

Initially, source A leads source B by 90°, which is equivalent to 1/4 of a wavelength.

However, source A's signal also falls behind source B's, since rA is longer than rB by 140 m, which is 140/550 = 0.2545 wavelengths. So I get a net phase difference between A and B at the detector of -0.732 rad.
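Here is a quick script for the arithmetic I'm describing (a sketch of my reasoning, assuming the emission lead and the path-length lag simply add with opposite signs):

```python
import math

wavelength = 550.0  # m
delta_r = 140.0     # m, how much rA exceeds rB

# A leads B by 90 degrees at emission
phase_lead = math.pi / 2

# extra phase A falls behind by over the longer path
phase_lag = 2 * math.pi * delta_r / wavelength

# net phase of A relative to B at the detector
net = phase_lead - phase_lag
print(net, abs(net))
```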

This is wrong, though. Can someone help me? =D
