# Diffraction grating

1. Jan 26, 2004

### Jupiter

Consider two wavelengths that differ by 0.6 nm and a 1 cm diffraction grating that has $10^5$ slits. How great must D (distance to the detector) be in order to resolve these wavelengths from a source containing them at better than $\Delta Y = 0.1$ mm?

From the intensity equation, I find that a maximum occurs whenever $Yd/D = n\lambda$. Solving for $D$ with $n = 2$:
$$D=\frac{d\,\Delta Y}{2\,\Delta \lambda}=8.3\ \textrm{mm}.$$
My work seems fine, but my answer seems off the mark: 0.6 nm is a small wavelength difference, so I'd expect a large D.
Can anyone verify?
(Here $d$ is the distance between slits, which I take to be $1/10^5$ cm.)
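As a quick sanity check on the arithmetic, here is a short sketch that plugs the numbers from the problem into the formula above (assuming order $n = 2$ and $d = 1\,\text{cm}/10^5$, as in my working):

```python
# Sanity check of D = d * dY / (n * dL) with the numbers from the post.
# Assumptions: diffraction order n = 2, slit spacing d = 1 cm / 1e5.
d = 1e-2 / 1e5          # slit spacing in metres (1 cm grating, 1e5 slits)
delta_y = 0.1e-3        # required separation at the detector, 0.1 mm
delta_lambda = 0.6e-9   # wavelength difference, 0.6 nm
n = 2                   # diffraction order

D = d * delta_y / (n * delta_lambda)
print(f"D = {D * 1e3:.1f} mm")   # -> D = 8.3 mm
```

So the 8.3 mm figure does follow from that formula; the question is whether the formula itself is right.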

Last edited: Jan 26, 2004
2. Jan 26, 2004

### Jupiter

I am surprised no one can help me. Has no one ever studied this before?

3. Jan 27, 2004

### Staff: Mentor

Your equation seems fishy, since it shows no dependence on the number of slits. The more slits in the grating, the greater the resolving power. (Also, the distance D need not be all that big if there are lots of slits.) Check this out:
http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/gratres.html
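To illustrate the point about slit count: the standard grating result (as on the HyperPhysics page above) is that the chromatic resolving power is $R = \lambda/\Delta\lambda = nN$, where $N$ is the number of illuminated slits. A quick sketch, assuming a visible-band wavelength of about 600 nm (the problem doesn't state one):

```python
# Chromatic resolving power of a grating: R = n * N
# (n = diffraction order, N = number of illuminated slits).
# Assumption: source wavelength ~600 nm, which the problem does not give.
N = 10**5           # slits in the 1 cm grating
n = 2               # diffraction order
lam = 600e-9        # assumed wavelength, metres

R = n * N
delta_lambda_min = lam / R
print(f"R = {R}")
print(f"smallest resolvable dL = {delta_lambda_min * 1e9:.4f} nm")
# smallest resolvable separation ~0.003 nm, far finer than the 0.6 nm
# splitting in the problem, so the grating itself resolves the lines easily
```

This is why the required D comes out small: with $10^5$ slits the grating's intrinsic resolution is not the bottleneck, and the 0.1 mm detector criterion is met at modest distances.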