Thin film interference

1. Dec 29, 2009

timhunderwood

1. The problem statement, all variables and given/known data

Light of wavelength $$\lambda$$ = 500 nm, produced by an extended source, is incident at an angle of $$\phi$$ = 30 degrees from the normal upon a dielectric film of refractive index n = 2, supported on a solid planar substrate. Reflectivity minima are observed to have an angular separation of 0.05 radians. What is the thickness, d, of the film?

2. Relevant equations

Phase difference $$\delta = \frac{4\pi n d \cos\theta}{\lambda}$$, where $$\sin\phi = n\sin\theta$$ (Snell's law)

Intensity proportional to $$\sin^2(\delta/2)$$

3. The attempt at a solution

The 0th and 1st minima occur when $$\delta = 0$$ and $$\delta = 2\pi$$ respectively.

Hence:
$$\frac{4\pi n d \cos\theta}{\lambda} = 2\pi$$

which gives $$d = \frac{\lambda}{2n\cos\theta} \approx 129$$ nm (with $$\cos\theta$$ found from Snell's law),
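As a quick numerical check of the attempt's formula (a sketch only; the variable names are mine, and this just evaluates $$d = \lambda/(2n\cos\theta)$$ with $$\theta$$ from Snell's law):

```python
import math

lam = 500e-9            # wavelength (m)
phi = math.radians(30)  # angle of incidence from the normal (rad)
n = 2.0                 # refractive index of the film

# Snell's law: sin(phi) = n sin(theta) gives the angle inside the film
theta = math.asin(math.sin(phi) / n)

# First-minimum condition from the attempt: 2 n d cos(theta) = lambda
d = lam / (2 * n * math.cos(theta))
print(d)  # about 1.29e-7 m, i.e. roughly 129 nm
```

Note this evaluation of the attempt's own formula does not use the 0.05 rad angular separation anywhere, which is the gap flagged below.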

This seems very small, and I didn't have to use the angular separation to get my answer, which leads me to believe I'm doing something wrong.

I think my mistake is that I don't know how to involve the angular separation of 0.05 radians in the solution.

Help appreciated