Wave Interference Problem

1. Oct 28, 2008

Machodog

1. The problem statement, all variables and given/known data

Two equal point sources S1 and S2 are a distance d apart. They emit waves in phase, of wavelength $$\lambda$$. P is a point on the line passing through the mid-point of S1S2 and making an angle $$\theta$$ with the centre line; the distance of P from S1 or S2 is very much greater than d.

Show that the intensity I of the disturbance at P is given by

$$I=2I_0\left\lbrace1+\cos \left(\frac{2\pi d\sin{\theta}}{\lambda}\right)\right\rbrace$$
Or
$$I=4I_0\cos^2 \left(\frac{\pi d\sin{\theta}}{\lambda}\right)$$

where $$I_0$$ is the intensity emitted by each source.
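(As a quick sanity check, not part of the derivation: the two forms asked for are the same function, since $$1+\cos x = 2\cos^2(x/2)$$. A minimal numerical check, with an arbitrary value for $$I_0$$:)

```python
import math

# Verify numerically that 2*I0*(1 + cos(x)) == 4*I0*cos^2(x/2)
# for several sample values of x = 2*pi*d*sin(theta)/lambda.
I0 = 1.0  # arbitrary single-source intensity, for illustration only
for x in [0.0, 0.3, 1.0, 2.5, math.pi]:
    form1 = 2 * I0 * (1 + math.cos(x))
    form2 = 4 * I0 * math.cos(x / 2) ** 2
    assert abs(form1 - form2) < 1e-12
```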

2. Relevant equations
Since we are told the distance of P from the two sources is very much greater than d, the source separation, we can assume that the lines S1P and S2P are parallel, in which case the path difference will be:

$$d \sin{\theta}$$

And so the phase difference will be:

$$\frac{2\pi d\sin{\theta}}{\lambda}$$

3. The attempt at a solution
I've been trying to do this by adding two general waves with the same amplitude but out of phase like so:

$$A\sin{\omega t} + A\sin{(\omega t + \phi)}$$

Using a trig identity this becomes:

$$2A\sin{\left(\omega t + \frac{\phi}{2}\right)}\cos{\frac{\phi}{2}}$$
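(A quick numerical check that this sum-to-product step is right, sampled at a few values of $$\omega t$$ and $$\phi$$ with an arbitrary amplitude:)

```python
import math

# Check: A*sin(wt) + A*sin(wt + phi) == 2*A*sin(wt + phi/2)*cos(phi/2)
A = 1.0  # arbitrary common amplitude, for illustration only
for wt in [0.0, 0.7, 2.0]:
    for phi in [0.0, 0.5, 1.3, 3.0]:
        lhs = A * math.sin(wt) + A * math.sin(wt + phi)
        rhs = 2 * A * math.sin(wt + phi / 2) * math.cos(phi / 2)
        assert abs(lhs - rhs) < 1e-12
```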

Since intensity is proportional to the square of the amplitude, I just square this expression, but it doesn't really look like what I'm asked to show.

I can do it another way, using phasors and the cosine rule, but I was just wondering whether anybody could make the method above work, or explain to me why it doesn't.

Any help appreciated and apologies for not having a picture.