The figure shows the interference pattern that appears on a distant screen when coherent light is incident on a mask with two identical, very narrow slits. Points P and Q are maxima; Point R is a minimum. The wavelength of the light that created the interference pattern is λ = 699 nm, the two slits are separated by d = 6 μm, and the distance from the slits to the center of the screen is L = 80 cm. The difference in path length at a point on the screen is ∆s = |s1 − s2|, where s1 and s2 are the distances from each slit to the point.
1. What is ∆s (in nm) at Point P?
2. What is ∆s (in nm) at Point Q?
3. What is ∆s (in nm) at Point R?
The Attempt at a Solution
I know the answer to the first problem is 0.
But I tried plugging in d_a = sqrt(80^2 + (699 − 3)^2) and I am getting it wrong. I tried converting all the units to nm, but then the expression just reduces to sqrt(80 nm^2), because the second term becomes a number close to zero.
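For what it's worth, here is a sketch of how I think the geometry works out, assuming Q is the first-order maximum (m = 1) and using the double-slit conditions Δs = mλ at maxima and Δs = (m + ½)λ at minima. The function name `path_difference_exact` and the choice m = 1 are my own assumptions, not from the figure:

```python
import math

# Given values from the problem statement, all converted to metres
lam = 699e-9   # wavelength, 699 nm
d   = 6e-6     # slit separation, 6 um
L   = 0.80     # slit-to-screen distance, 80 cm

def path_difference_exact(y):
    """Exact path-length difference |s1 - s2| for a screen point at height y.

    s1 and s2 are the straight-line distances from each slit (at +/- d/2)
    to the point (L, y) on the screen.
    """
    s1 = math.sqrt(L**2 + (y - d/2)**2)
    s2 = math.sqrt(L**2 + (y + d/2)**2)
    return abs(s1 - s2)

# At the central maximum P, y = 0, so s1 = s2 and delta_s = 0 exactly.
# ASSUMPTION: Q is the first-order (m = 1) maximum. In the small-angle
# approximation its position on the screen is y = m * lam * L / d.
y1 = 1 * lam * L / d

ds_exact  = path_difference_exact(y1)  # from the geometry directly
ds_approx = 1 * lam                    # from the maximum condition delta_s = m*lam

print(f"exact  delta_s = {ds_exact * 1e9:.1f} nm")
print(f"approx delta_s = {ds_approx * 1e9:.1f} nm")
```

The two numbers agree to within about a percent (they differ slightly because y = mλL/d uses tan θ ≈ sin θ, and the angle here is not tiny). The key point is that no square-root formula is needed for the answers themselves: at a maximum Δs is a whole multiple of λ, and at a minimum it is a half-odd multiple of λ.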