An interference experiment employs two narrow parallel slits of separation
0.25mm, and monochromatic light of wavelength [tex]\lambda[/tex] = 500 nm. Estimate
the minimum distance that the projection screen must be placed behind
the slits in order to obtain a far-field interference pattern.
[tex]\lambda[/tex] = 500 nm
d = 0.25 mm = 2.5 × 10^5 nm
D = distance to projection screen
I know that for a far-field (Fraunhofer) interference pattern to emerge, we need d/D << 1 and, more stringently, d^2/[tex]\lambda[/tex] << D.
The Attempt at a Solution
I don't know how to get an exact value for the minimum D. From the second condition, d^2/[tex]\lambda[/tex] = (2.5 × 10^5 nm)^2 / (500 nm) = 1.25 × 10^8 nm = 125 mm, so D must be much greater than 125 mm, but I don't know how to get anything more specific than that.
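The arithmetic above can be checked with a short sketch. Note that the factor-of-10 margin used to turn "much greater than" into a concrete estimate is my own assumption, not something given in the problem:

```python
# Sketch: evaluate the far-field (Fraunhofer) length scale d^2 / lambda
# for the double-slit setup described above.

d = 0.25e-3           # slit separation in metres (0.25 mm)
wavelength = 500e-9   # wavelength in metres (500 nm)

far_field_scale = d**2 / wavelength   # the quantity D must greatly exceed

# Assumption: take "much greater than" to mean roughly a factor of 10.
D_min_estimate = 10 * far_field_scale

print(f"d^2/lambda = {far_field_scale * 1e3:.0f} mm")   # 125 mm
print(f"estimated minimum D ~ {D_min_estimate:.2f} m")  # 1.25 m
```

So under that (assumed) factor-of-10 criterion, the screen would need to sit on the order of a metre behind the slits.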