Jupiter
Consider two wavelengths that differ by 0.6 nm and a 1 cm diffraction grating that has 10^5 slits. How great must D (the distance to the detector) be in order to resolve these wavelengths from a source containing both at better than ΔY = 0.1 mm?
From the intensity equation, I find that a maximum will occur whenever Yd/D = 2n\lambda. So I need (solving for D)
D=\frac{d\Delta Y}{2\Delta \lambda}=8.3\ \textrm{mm}. My work seems fine, but my answer seems off the mark. 0.6 nm is a small wavelength difference, so I'd expect a large D.
Verify anyone?
(d is the distance between slits, which I take to be 1/10^5\ \textrm{cm} = 10^{-5}\ \textrm{cm}.)
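For the record, with d = 10^{-5}\ \textrm{cm} = 10^{-7}\ \textrm{m}, \Delta Y = 0.1\ \textrm{mm} = 10^{-4}\ \textrm{m}, and \Delta\lambda = 0.6\ \textrm{nm} = 0.6\times 10^{-9}\ \textrm{m}, the numbers work out to

D = \frac{d\,\Delta Y}{2\,\Delta\lambda} = \frac{(10^{-7}\ \textrm{m})(10^{-4}\ \textrm{m})}{2(0.6\times 10^{-9}\ \textrm{m})} \approx 8.3\times 10^{-3}\ \textrm{m} = 8.3\ \textrm{mm},

so the arithmetic at least checks out with those values.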